| column | dtype | stats |
|---|---|---|
| sha | null | |
| last_modified | null | |
| library_name | stringclasses | 154 values |
| text | stringlengths | 1–900k |
| metadata | stringlengths | 2–348k |
| pipeline_tag | stringclasses | 45 values |
| id | stringlengths | 5–122 |
| tags | listlengths | 1–1.84k |
| created_at | stringlengths | 25–25 |
| arxiv | listlengths | 0–201 |
| languages | listlengths | 0–1.83k |
| tags_str | stringlengths | 17–9.34k |
| text_str | stringlengths | 0–389k |
| text_lists | listlengths | 0–722 |
| processed_texts | listlengths | 1–723 |
| tokens_length | listlengths | 1–723 |
| input_texts | listlengths | 1–61 |
| embeddings | listlengths | 768–768 |
null | null |
transformers
|
# Belgian GPT-2 🇧🇪
**A GPT-2 model pre-trained on a very large and heterogeneous French corpus (~60GB).**
## Usage
You can use BelGPT-2 with [🤗 transformers](https://github.com/huggingface/transformers):
```python
import random

from transformers import GPT2Tokenizer, GPT2LMHeadModel

# Load pretrained model and tokenizer
model = GPT2LMHeadModel.from_pretrained("antoiloui/belgpt2")
tokenizer = GPT2Tokenizer.from_pretrained("antoiloui/belgpt2")

# Generate a sample of text, starting from a randomly chosen BOS token
model.eval()
output = model.generate(
    bos_token_id=random.randint(1, 50000),
    do_sample=True,
    top_k=50,
    max_length=100,
    top_p=0.95,
    num_return_sequences=1,
)

# Decode the generated sequences
decoded_output = []
for sample in output:
    decoded_output.append(tokenizer.decode(sample, skip_special_tokens=True))
print(decoded_output)
```
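
The snippet above starts generation from a random BOS token, as in the original card. If you would rather condition the model on a French prompt, a minimal sketch using the standard `generate` API looks like this (the prompt is taken from the widget examples in the model metadata, and the sampling parameters simply mirror the ones above):

```python
import torch
from transformers import GPT2Tokenizer, GPT2LMHeadModel

tokenizer = GPT2Tokenizer.from_pretrained("antoiloui/belgpt2")
model = GPT2LMHeadModel.from_pretrained("antoiloui/belgpt2")
model.eval()

# Encode a French prompt and let the model continue it
prompt = "Hier, Elon Musk a"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

with torch.no_grad():
    output = model.generate(
        input_ids,
        do_sample=True,
        top_k=50,
        top_p=0.95,
        max_length=100,
        pad_token_id=tokenizer.eos_token_id,
        num_return_sequences=1,
    )

print(tokenizer.decode(output[0], skip_special_tokens=True))
```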
## Data
Below is the list of all French corpora used to pre-train the model:
| Dataset | `$corpus_name` | Raw size | Cleaned size |
| :------| :--- | :---: | :---: |
| CommonCrawl | `common_crawl` | 200.2 GB | 40.4 GB |
| NewsCrawl | `news_crawl` | 10.4 GB | 9.8 GB |
| Wikipedia | `wiki` | 19.4 GB | 4.1 GB |
| Wikisource | `wikisource` | 4.6 GB | 2.3 GB |
| Project Gutenberg | `gutenberg` | 1.3 GB | 1.1 GB |
| EuroParl | `europarl` | 289.9 MB | 278.7 MB |
| NewsCommentary | `news_commentary` | 61.4 MB | 58.1 MB |
| **Total** | | **236.3 GB** | **57.9 GB** |
## Documentation
Detailed documentation on the pre-trained model, its implementation, and the data can be found [here](https://github.com/antoiloui/belgpt2/blob/master/docs/index.md).
## Citation
For attribution in academic contexts, please cite this work as:
```
@misc{louis2020belgpt2,
author = {Louis, Antoine},
title = {{BelGPT-2: a GPT-2 model pre-trained on French corpora.}},
year = {2020},
howpublished = {\url{https://github.com/antoiloui/belgpt2}},
}
```
|
{"language": ["fr"], "license": ["mit"], "widget": [{"text": "Hier, Elon Musk a"}, {"text": "Pourquoi a-t-il"}, {"text": "Tout \u00e0 coup, elle"}]}
|
text-generation
|
antoinelouis/belgpt2
|
[
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"gpt2",
"text-generation",
"fr",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"fr"
] |
TAGS
#transformers #pytorch #tf #jax #safetensors #gpt2 #text-generation #fr #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
Belgian GPT-2 🇧🇪
================
A GPT-2 model pre-trained on a very large and heterogeneous French corpus (~60GB).
Usage
-----
You can use BelGPT-2 with transformers:
Data
----
Below is the list of all French corpora used to pre-train the model:
Documentation
-------------
Detailed documentation on the pre-trained model, its implementation, and the data can be found here.
For attribution in academic contexts, please cite this work as:
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #safetensors #gpt2 #text-generation #fr #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
65
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #safetensors #gpt2 #text-generation #fr #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
-0.028639141470193863,
0.014947979710996151,
-0.006127930246293545,
0.04458294063806534,
0.12104924768209457,
0.004401865415275097,
0.14052847027778625,
0.12381710112094879,
0.008473933674395084,
-0.035619474947452545,
0.1571531742811203,
0.21785230934619904,
-0.004171071574091911,
0.10547531396150589,
-0.0911974161863327,
-0.22303269803524017,
0.055211447179317474,
0.037809163331985474,
0.0020685791969299316,
0.11907226592302322,
0.11408305168151855,
-0.0384964756667614,
0.09094099700450897,
-0.03068036213517189,
-0.14595761895179749,
0.0015428068581968546,
0.07378454506397247,
-0.12606869637966156,
0.12379421293735504,
0.06862877309322357,
0.0616556815803051,
0.08499932289123535,
-0.03740778565406799,
-0.11845146119594574,
0.034964676946401596,
0.025508826598525047,
-0.10229408740997314,
0.052898768335580826,
0.08860985934734344,
-0.05379026010632515,
0.14942355453968048,
0.06470444798469543,
-0.02424013800919056,
0.06666626781225204,
-0.14341993629932404,
-0.13861088454723358,
-0.05573752894997597,
0.09274119138717651,
0.053100086748600006,
0.08977796137332916,
0.008396110497415066,
0.16250057518482208,
-0.05416123941540718,
0.12089262157678604,
0.13853424787521362,
-0.3633709251880646,
-0.011280334554612637,
0.09467604756355286,
0.0724971815943718,
0.04068112373352051,
-0.04631220921874046,
0.0584862045943737,
0.055655401200056076,
0.01802351325750351,
0.07312270253896713,
-0.07446875423192978,
-0.11500106751918793,
0.022328514605760574,
-0.09346907585859299,
-0.056223295629024506,
0.23879778385162354,
-0.045910727232694626,
0.007971714250743389,
-0.04274323210120201,
-0.0832672268152237,
-0.011824358254671097,
-0.007133773528039455,
-0.009466797113418579,
-0.043658267706632614,
0.07475101202726364,
0.03472782298922539,
-0.056500717997550964,
-0.14166899025440216,
-0.031832195818424225,
-0.15926668047904968,
0.14695128798484802,
0.02179562672972679,
0.04773717746138573,
-0.1721288114786148,
0.09736277908086777,
-0.00332476943731308,
-0.1089916080236435,
-0.0021182168275117874,
-0.09168744087219238,
0.0959397703409195,
-0.021827977150678635,
-0.021341778337955475,
-0.04788024351000786,
0.11541261523962021,
0.15903466939926147,
-0.0724048912525177,
0.0036011054180562496,
-0.06366056948900223,
0.08813481032848358,
-0.015621365047991276,
0.018683981150388718,
0.020498663187026978,
0.026068784296512604,
0.10807699710130692,
-0.10328968614339828,
0.015179246664047241,
-0.04372863844037056,
-0.1407398283481598,
-0.01610385812819004,
0.05697954073548317,
0.1073673740029335,
0.012122533284127712,
0.11384380608797073,
-0.022279998287558556,
0.022004062309861183,
0.11598382145166397,
-0.0643082708120346,
-0.019836794584989548,
0.00028547621332108974,
0.06030043959617615,
0.022706711664795876,
0.027296490967273712,
0.00420162221416831,
-0.08943133801221848,
0.06674226373434067,
-0.0679662674665451,
-0.03957204893231392,
-0.005610624328255653,
-0.06562324613332748,
0.06148536130785942,
-0.07007694989442825,
0.031407494097948074,
-0.18556152284145355,
-0.1825135201215744,
0.041558168828487396,
0.008826049044728279,
-0.009430701844394207,
-0.06548182666301727,
0.015763558447360992,
-0.06900633126497269,
0.03078087605535984,
-0.06262201815843582,
-0.019744502380490303,
-0.06797175854444504,
0.12397467344999313,
-0.04400493949651718,
0.03535782918334007,
-0.14945414662361145,
0.042102545499801636,
-0.1154608279466629,
-0.014528697356581688,
-0.06272400170564651,
0.002565228147432208,
-0.03570796176791191,
0.1338059902191162,
-0.008681598119437695,
-0.036977920681238174,
-0.060378510504961014,
0.04355721175670624,
-0.04782022908329964,
0.18184955418109894,
-0.08642719686031342,
-0.0817561000585556,
0.2689439058303833,
-0.14979204535484314,
-0.21786904335021973,
0.11149243265390396,
0.02161756530404091,
0.052729684859514236,
0.0907590389251709,
0.17404811084270477,
0.06275110691785812,
-0.08187348395586014,
0.06887820363044739,
0.12594904005527496,
-0.11281261593103409,
-0.10207499563694,
0.029283713549375534,
-0.014043337665498257,
-0.1440640389919281,
0.03811485692858696,
0.018617058172822,
0.08014124631881714,
-0.04539550468325615,
-0.031241945922374725,
-0.05967326834797859,
-0.008460106328129768,
0.020940423011779785,
0.005339990369975567,
0.074652299284935,
-0.08837076276540756,
-0.03602564334869385,
-0.03545992821455002,
-0.00923932995647192,
0.0033884295262396336,
0.01606721803545952,
-0.06029747426509857,
0.12957845628261566,
0.025291377678513527,
0.04537908732891083,
-0.12354202568531036,
-0.08479851484298706,
-0.002069835551083088,
0.10056263953447342,
-0.0010238741524517536,
0.06685652583837509,
0.06162893772125244,
0.01779100112617016,
-0.023845288902521133,
-0.014819974079728127,
0.157246395945549,
0.011598424054682255,
-0.03391144052147865,
-0.12188448756933212,
0.07447744160890579,
-0.05796048790216446,
0.02519914321601391,
-0.10782462358474731,
0.03771873936057091,
0.10132106393575668,
0.08950021117925644,
-0.0018740021623671055,
0.04781745374202728,
-0.030230175703763962,
-0.015097690746188164,
-0.07348605990409851,
-0.016200389713048935,
0.0892433151602745,
0.03668469563126564,
-0.06199319660663605,
0.2116057425737381,
-0.1592429131269455,
0.32633382081985474,
0.21159493923187256,
-0.17940765619277954,
-0.02783738449215889,
-0.048812951892614365,
-0.03491887450218201,
0.022929292172193527,
0.037001825869083405,
-0.0211927630007267,
0.016248323023319244,
-0.025952545925974846,
0.17614492774009705,
-0.09493966400623322,
-0.07630102336406708,
0.029450194910168648,
-0.05054973065853119,
-0.018625924363732338,
0.07074553519487381,
0.09815122932195663,
-0.21740444004535675,
0.1931043416261673,
0.22361664474010468,
0.05513012409210205,
0.17140018939971924,
-0.046107009053230286,
0.0057792579755187035,
0.05362690985202789,
0.05134192854166031,
0.003532256232574582,
-0.013244246132671833,
-0.13815264403820038,
0.008629626594483852,
0.06945516169071198,
0.01233556866645813,
0.056139152497053146,
-0.1385570764541626,
-0.07094357162714005,
-0.014228430576622486,
-0.04010353982448578,
-0.018559155985713005,
0.11346517503261566,
0.009257098659873009,
0.13989652693271637,
-0.05789191275835037,
-0.056693919003009796,
0.12390967458486557,
0.022195016965270042,
-0.12181653827428818,
0.1953197568655014,
-0.1196296215057373,
-0.2886985242366791,
-0.1210135892033577,
-0.10913142561912537,
-0.029956378042697906,
0.03051029145717621,
0.1380816549062729,
-0.05844435468316078,
-0.03286297246813774,
-0.04161781072616577,
0.001805102452635765,
-0.05203961580991745,
0.03114546462893486,
-0.08743658661842346,
0.042080942541360855,
-0.042069513350725174,
-0.10504894703626633,
-0.06680997461080551,
0.0005878011579625309,
-0.09061938524246216,
0.1553029865026474,
-0.07396169751882553,
0.06798012554645538,
0.12442141026258469,
-0.007132378872483969,
0.04410763084888458,
-0.07717833667993546,
0.1979970932006836,
-0.062298573553562164,
0.02961324341595173,
0.21448276937007904,
-0.025315897539258003,
0.07336888462305069,
0.10648800432682037,
0.013165525160729885,
-0.0839819610118866,
0.03286372870206833,
-0.047285567969083786,
-0.09235011786222458,
-0.2322997748851776,
-0.08151379227638245,
-0.11067664623260498,
0.09907970577478409,
0.04219783470034599,
0.09419561177492142,
0.163614422082901,
0.08154039084911346,
-0.04945817217230797,
0.004697481635957956,
0.06735024601221085,
0.09102387726306915,
0.18391184508800507,
-0.006928649730980396,
0.11666586995124817,
-0.08240814507007599,
-0.09574919193983078,
0.11591143161058426,
0.014575443230569363,
0.09928227216005325,
0.052330680191516876,
0.01271252240985632,
0.06405018270015717,
0.13813534379005432,
0.11421475559473038,
0.1424739807844162,
-0.005988267250359058,
-0.037124861031770706,
-0.030215611681342125,
-0.059544824063777924,
-0.01701434701681137,
0.0247616209089756,
-0.08213527500629425,
-0.11257768422365189,
-0.06318311393260956,
-0.09391050785779953,
0.07089889794588089,
0.09200930595397949,
0.03264601156115532,
-0.2197730839252472,
0.0014225097838789225,
0.06853029876947403,
0.006819179281592369,
-0.1003003716468811,
0.08579554408788681,
0.03191964700818062,
-0.1049867793917656,
0.09775576740503311,
-0.0585518442094326,
0.09414472430944443,
0.03590923547744751,
0.06829744577407837,
-0.00980748888105154,
-0.0654740184545517,
-0.0036873130593448877,
0.10673625767230988,
-0.33456987142562866,
0.21063286066055298,
0.00010648488387232646,
-0.006727878469973803,
-0.08402730524539948,
0.014521986246109009,
0.016791077330708504,
0.18396328389644623,
0.16248540580272675,
0.0008182261954061687,
-0.04661567136645317,
-0.05931723117828369,
-0.0013794341357424855,
0.03684799745678902,
0.08983749151229858,
-0.025178510695695877,
-0.014005464501678944,
-0.052866775542497635,
-0.004347463604062796,
0.004211642779409885,
-0.010295752435922623,
-0.04838424548506737,
-0.1429767906665802,
0.05574986711144447,
0.028761904686689377,
0.0739339217543602,
-0.05030415579676628,
-0.01646709069609642,
-0.12713122367858887,
0.18003056943416595,
-0.0981883630156517,
-0.0988662913441658,
-0.1254281848669052,
-0.09595660120248795,
0.009446024894714355,
-0.06509754806756973,
0.053357698023319244,
-0.0723283588886261,
0.015654312446713448,
-0.07450909912586212,
-0.2038601189851761,
0.12027934193611145,
-0.12734679877758026,
-0.08439745754003525,
-0.038822323083877563,
0.18014323711395264,
-0.08399538695812225,
-0.005125650204718113,
0.03829651698470116,
0.005854635499417782,
-0.07182873040437698,
-0.12237956374883652,
0.004575465805828571,
-0.06296009570360184,
0.03268948197364807,
-0.04151167348027229,
-0.09106376767158508,
-0.05841977521777153,
0.0020450716838240623,
-0.03188168630003929,
0.2276117503643036,
0.23811253905296326,
-0.05124679580330849,
0.15539497137069702,
0.1364348977804184,
-0.07241944223642349,
-0.3114365339279175,
-0.11310042440891266,
-0.1462559700012207,
-0.07377883046865463,
-0.00612487131729722,
-0.1138409972190857,
0.05796302855014801,
0.03005526401102543,
-0.04442965239286423,
0.15357470512390137,
-0.21360324323177338,
-0.09025705605745316,
0.14770841598510742,
0.04853643849492073,
0.3080137372016907,
-0.17658814787864685,
-0.09369845688343048,
-0.026632100343704224,
-0.1550852656364441,
0.19375699758529663,
-0.12957638502120972,
0.06626486778259277,
-0.019108576700091362,
0.0024983084294945,
0.021340180188417435,
-0.05530231446027756,
0.06507278978824615,
-0.03328337520360947,
0.05862134322524071,
-0.12343550473451614,
0.010238932445645332,
0.09563711285591125,
0.01149109099060297,
0.058803629130125046,
-0.13901562988758087,
0.04419315606355667,
-0.07372580468654633,
-0.04010768234729767,
-0.07872181385755539,
0.10076040029525757,
-0.004702773876488209,
-0.08914501965045929,
-0.006308861076831818,
-0.028678713366389275,
-0.012032036669552326,
-0.036758020520210266,
0.18581771850585938,
-0.02895098552107811,
0.2044336050748825,
0.14932484924793243,
0.1168534904718399,
-0.13905031979084015,
0.046419646590948105,
-0.057117678225040436,
-0.08243649452924728,
0.0641583725810051,
-0.11231493949890137,
0.04087536409497261,
0.08268840610980988,
-0.020763462409377098,
0.07569202780723572,
0.10124193131923676,
-0.003238174831494689,
-0.03162422776222229,
0.14366847276687622,
-0.2550586462020874,
-0.05420517921447754,
-0.07608823478221893,
-0.0033291401341557503,
0.08362746238708496,
0.09069585055112839,
0.1501718908548355,
-0.02032562345266342,
-0.019320502877235413,
-0.010340532287955284,
0.01620018668472767,
-0.04448861628770828,
0.040075164288282394,
0.03476322069764137,
0.016264410689473152,
-0.12642905116081238,
0.05091644451022148,
0.002837776206433773,
-0.1064305379986763,
0.0016529812710359693,
0.14834006130695343,
-0.14774368703365326,
-0.13715346157550812,
-0.004405898507684469,
0.08233921974897385,
-0.1273432821035385,
-0.05470055714249611,
-0.03605629876255989,
-0.16444465517997742,
0.0677514597773552,
0.20290403068065643,
0.05353614687919617,
0.09426124393939972,
0.011343632824718952,
-0.03104470483958721,
-0.04305750131607056,
0.03286324441432953,
-0.05159513279795647,
0.03389187157154083,
-0.12561067938804626,
0.06804248690605164,
-0.014863798394799232,
0.09559290111064911,
-0.0826006531715393,
-0.006389004178345203,
-0.15369358658790588,
0.001085000578314066,
-0.06781996786594391,
-0.02944917231798172,
-0.07300823926925659,
-0.026446551084518433,
-0.002714523347094655,
-0.033562611788511276,
-0.026684030890464783,
-0.016893640160560608,
-0.09644673019647598,
0.020553847774863243,
-0.0029724494088441133,
0.05422334000468254,
-0.10572342574596405,
-0.026542125269770622,
0.05256011337041855,
-0.03232484310865402,
0.14264149963855743,
0.09322793036699295,
-0.09530794620513916,
0.11324368417263031,
-0.23035478591918945,
-0.042383767664432526,
0.106581911444664,
0.013250053860247135,
0.012463856488466263,
0.06586742401123047,
0.036953166127204895,
0.08779197931289673,
-0.02033674530684948,
0.0643538236618042,
-0.014095577411353588,
-0.12192592024803162,
0.02263137884438038,
-0.00990405771881342,
-0.11583645641803741,
-0.0287120770663023,
-0.046522632241249084,
0.06197422742843628,
-0.024304678663611412,
0.14487294852733612,
-0.07590114325284958,
0.047885917127132416,
-0.10976725816726685,
0.015005446970462799,
0.01218568067997694,
-0.1796664297580719,
-0.14257821440696716,
-0.054920587688684464,
0.009388558566570282,
0.005355652887374163,
0.21931172907352448,
0.04606131836771965,
-0.0705222487449646,
0.05349917337298393,
0.037853632122278214,
0.06441128998994827,
-0.0076654586009681225,
0.2539096772670746,
0.0461212582886219,
-0.039329227060079575,
-0.13514642417430878,
0.03376442566514015,
0.005723148584365845,
-0.10722813755273819,
0.13889886438846588,
0.059847135096788406,
-0.06076778843998909,
0.04841991513967514,
0.04803996533155441,
0.00012187800166429952,
-0.037621431052684784,
-0.13780181109905243,
0.000698863179422915,
0.06724633276462555,
-0.014811432920396328,
0.07144448906183243,
0.20720404386520386,
-0.025094255805015564,
0.005928099155426025,
-0.02818671613931656,
-0.029793089255690575,
-0.18095967173576355,
-0.1550559550523758,
-0.07850854843854904,
-0.10610116273164749,
0.024701852351427078,
-0.08924473822116852,
0.03640243038535118,
0.0052856397815048695,
0.07595838606357574,
-0.0686269998550415,
0.06621316820383072,
0.05053326487541199,
-0.07580021023750305,
0.05234329774975777,
-0.016745517030358315,
0.028096895664930344,
-0.03186848387122154,
-0.023494094610214233,
-0.0702349990606308,
-0.037387680262327194,
-0.038145847618579865,
0.04448980465531349,
-0.004606923088431358,
0.04554752632975578,
-0.13664546608924866,
-0.07561177760362625,
-0.02230575866997242,
0.06944113224744797,
-0.018307717517018318,
0.1320158988237381,
0.009098277427256107,
-0.016597118228673935,
0.06760402023792267,
0.19164876639842987,
-0.04006991907954216,
-0.13064263761043549,
-0.015423612669110298,
0.24478979408740997,
0.037858299911022186,
0.09364669770002365,
0.008385858498513699,
0.009504921734333038,
-0.014135255478322506,
0.29889747500419617,
0.30004823207855225,
-0.03788847476243973,
0.05121822655200958,
-0.011520753614604473,
0.025553181767463684,
0.09029430150985718,
0.13686047494411469,
0.09464259445667267,
0.27684903144836426,
-0.05687650665640831,
0.010741443373262882,
-0.027110863476991653,
0.03473662957549095,
-0.09986727684736252,
0.09444345533847809,
0.03026428446173668,
-0.0640144795179367,
0.002440202049911022,
0.09284112602472305,
-0.16180483996868134,
0.08428634703159332,
-0.059923600405454636,
-0.10346035659313202,
-0.026519570499658585,
-0.006735200062394142,
0.08828098326921463,
0.005944910459220409,
0.04749281331896782,
-0.02833058498799801,
-0.05367141216993332,
0.03621014207601547,
0.011557101272046566,
-0.20306091010570526,
0.012004001997411251,
0.05390968546271324,
-0.05047689378261566,
0.0924939438700676,
-0.01090247742831707,
0.06606651842594147,
0.0824849084019661,
0.01524324156343937,
-0.06956581771373749,
0.09735319763422012,
0.007660307921469212,
-0.04184184595942497,
0.018214799463748932,
-0.0354037843644619,
0.021413855254650116,
-0.08267845958471298,
0.04058463126420975,
-0.10630705952644348,
0.04286649078130722,
-0.014769659377634525,
-0.07303408533334732,
-0.03886017948389053,
0.04982421547174454,
-0.058283135294914246,
0.06757193803787231,
0.061812374740839005,
-0.014192398637533188,
-0.006297148764133453,
-0.09025741368532181,
0.00927729532122612,
0.037350282073020935,
-0.10806956142187119,
-0.026358572766184807,
-0.09685148298740387,
-0.06064946576952934,
0.10547511279582977,
0.003938277717679739,
-0.23191659152507782,
0.0018748632865026593,
-0.12541954219341278,
0.044960055500268936,
-0.2109343558549881,
0.05837300792336464,
0.12189270555973053,
0.02327718213200569,
0.004786692094057798,
-0.023254089057445526,
0.014059648849070072,
0.07605072855949402,
-0.08107005804777145,
-0.0651739239692688
] |
null | null |
transformers
|
# NetBERT 📶
<img align="left" src="illustration.jpg" width="150"/>
<br><br><br>
NetBERT is a [BERT-base](https://huggingface.co/bert-base-cased) model further pre-trained on a huge corpus of computer networking text (~23GB).
<br><br>
## Usage
You can use the raw model for masked language modeling (MLM), but it's mostly intended to be fine-tuned on a downstream task, especially one that uses the whole sentence to make decisions such as text classification, extractive question answering, or semantic search.
You can use this model directly with a pipeline for [masked language modeling](https://huggingface.co/tasks/fill-mask):
```python
from transformers import pipeline
unmasker = pipeline('fill-mask', model='antoinelouis/netbert')
unmasker("The nodes of a computer network may include [MASK].")
```
You can also use this model to [extract the features](https://huggingface.co/tasks/feature-extraction) of a given text:
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained('antoinelouis/netbert')
model = AutoModel.from_pretrained('antoinelouis/netbert')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
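
Since the card mentions semantic search as a typical downstream use, here is a minimal sketch of how the extracted features could be compared across sentences, assuming a simple mean-pooling strategy (the pooling choice and the example sentences are illustrative, not prescribed by the model):

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained('antoinelouis/netbert')
model = AutoModel.from_pretrained('antoinelouis/netbert')
model.eval()

def embed(sentences):
    # Tokenize a batch of sentences and mean-pool the last hidden states
    inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state          # (batch, seq_len, dim)
    mask = inputs['attention_mask'].unsqueeze(-1).float()   # ignore padding tokens
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

query = embed(["How do routers forward packets?"])
docs = embed(["A router forwards packets between networks.",
              "BERT is a language representation model."])
scores = torch.nn.functional.cosine_similarity(query, docs)
print(scores)  # higher score = more semantically similar to the query
```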
## Documentation
Detailed documentation on the pre-trained model, its implementation, and the data can be found on [Github](https://github.com/antoiloui/netbert/blob/master/docs/index.md).
## Citation
For attribution in academic contexts, please cite this work as:
```
@mastersthesis{louis2020netbert,
title={NetBERT: A Pre-trained Language Representation Model for Computer Networking},
author={Louis, Antoine},
year={2020},
school={University of Liege}
}
```
|
{"language": ["en"], "license": "apache-2.0", "library_name": "transformers", "widget": [{"text": "The nodes of a computer network may include [MASK]."}]}
|
fill-mask
|
antoinelouis/netbert
|
[
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #en #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# NetBERT
<img align="left" src="URL" width="150"/>
<br><br><br>
NetBERT is a BERT-base model further pre-trained on a huge corpus of computer networking text (~23GB).
<br><br>
## Usage
You can use the raw model for masked language modeling (MLM), but it's mostly intended to be fine-tuned on a downstream task, especially one that uses the whole sentence to make decisions such as text classification, extractive question answering, or semantic search.
You can use this model directly with a pipeline for masked language modeling:
You can also use this model to extract the features of a given text:
## Documentation
Detailed documentation on the pre-trained model, its implementation, and the data can be found on Github.
For attribution in academic contexts, please cite this work as:
|
[
"# NetBERT \n\n<img align=\"left\" src=\"URL\" width=\"150\"/>\n<br><br><br>\n\n NetBERT is a BERT-base model further pre-trained on a huge corpus of computer networking text (~23Gb).\n\n<br><br>",
"## Usage\n\nYou can use the raw model for masked language modeling (MLM), but it's mostly intended to be fine-tuned on a downstream task, especially one that uses the whole sentence to make decisions such as text classification, extractive question answering, or semantic search.\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\n\nYou can also use this model to extract the features of a given text:",
"## Documentation\n\nDetailed documentation on the pre-trained model, its implementation, and the data can be found on Github.\n\nFor attribution in academic contexts, please cite this work as:"
] |
[
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #en #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# NetBERT \n\n<img align=\"left\" src=\"URL\" width=\"150\"/>\n<br><br><br>\n\n NetBERT is a BERT-base model further pre-trained on a huge corpus of computer networking text (~23Gb).\n\n<br><br>",
"## Usage\n\nYou can use the raw model for masked language modeling (MLM), but it's mostly intended to be fine-tuned on a downstream task, especially one that uses the whole sentence to make decisions such as text classification, extractive question answering, or semantic search.\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\n\nYou can also use this model to extract the features of a given text:",
"## Documentation\n\nDetailed documentation on the pre-trained model, its implementation, and the data can be found on Github.\n\nFor attribution in academic contexts, please cite this work as:"
] |
[
52,
75,
97,
42
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #en #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n# NetBERT \n\n<img align=\"left\" src=\"URL\" width=\"150\"/>\n<br><br><br>\n\n NetBERT is a BERT-base model further pre-trained on a huge corpus of computer networking text (~23Gb).\n\n<br><br>## Usage\n\nYou can use the raw model for masked language modeling (MLM), but it's mostly intended to be fine-tuned on a downstream task, especially one that uses the whole sentence to make decisions such as text classification, extractive question answering, or semantic search.\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\n\nYou can also use this model to extract the features of a given text:## Documentation\n\nDetailed documentation on the pre-trained model, its implementation, and the data can be found on Github.\n\nFor attribution in academic contexts, please cite this work as:"
] |
[
-0.08916964381933212,
-0.010895969346165657,
-0.00024323318211827427,
0.09631839394569397,
0.0992559939622879,
0.0107595669105649,
0.14384257793426514,
0.04135392606258392,
0.033118557184934616,
-0.02793012373149395,
0.11453589051961899,
0.04221650958061218,
-0.002641483210027218,
0.15341666340827942,
0.053739141672849655,
-0.27793821692466736,
-0.03602698817849159,
-0.04108068719506264,
0.0156681090593338,
0.12503445148468018,
0.12396015971899033,
-0.06373819708824158,
0.12820769846439362,
0.044372402131557465,
-0.0925060287117958,
0.03466084599494934,
0.0030006400775164366,
-0.06923183053731918,
0.08768703043460846,
0.11366058140993118,
0.02972809039056301,
-0.007626567501574755,
0.06533869355916977,
-0.1293288767337799,
0.054912105202674866,
0.027384186163544655,
-0.04163875803351402,
0.09189598262310028,
0.05734960734844208,
-0.04403861612081528,
0.1399022489786148,
0.014094194397330284,
0.0027794644702225924,
0.052614856511354446,
-0.1027902290225029,
-0.01284762378782034,
-0.01999790593981743,
0.09052670747041702,
0.05799562484025955,
0.08545911312103271,
-0.013120906427502632,
0.0653684213757515,
-0.084842748939991,
0.11586719751358032,
0.11698497086763382,
-0.18430396914482117,
-0.04705336317420006,
0.11681270599365234,
0.025408461689949036,
0.03502606227993965,
0.012656786479055882,
0.08609870076179504,
-0.03303639963269234,
0.0060255988501012325,
0.13325099647045135,
-0.03574495017528534,
-0.06759106367826462,
-0.031597089022397995,
-0.12882710993289948,
0.00756615586578846,
0.19291308522224426,
0.026306385174393654,
-0.029960155487060547,
-0.1211281418800354,
-0.052812568843364716,
0.11841589212417603,
-0.04207265004515648,
-0.016867484897375107,
-0.0034630894660949707,
0.01461015548557043,
-0.03135770559310913,
-0.14082063734531403,
-0.07417136430740356,
-0.08027492463588715,
-0.11268497258424759,
0.08615082502365112,
0.003324156627058983,
0.04847324267029762,
-0.10916314274072647,
0.07052386552095413,
-0.13657346367835999,
-0.09175015985965729,
-0.012899817898869514,
-0.06778782606124878,
-0.025371158495545387,
-0.022205539047718048,
-0.031046677380800247,
-0.13671371340751648,
0.03051077201962471,
0.1551126092672348,
0.0083574578166008,
-0.004240884445607662,
-0.00021827506134286523,
0.05984879285097122,
0.06423141062259674,
0.06034795567393303,
-0.21059976518154144,
-0.0006625759997405112,
0.09199593961238861,
-0.0851496085524559,
-0.03084135800600052,
-0.03546476736664772,
-0.1330920308828354,
0.015296105295419693,
-0.05973944813013077,
0.030703647062182426,
0.024159718304872513,
0.12248203158378601,
-0.004083350300788879,
-0.10730552673339844,
0.06137608736753464,
-0.10426335781812668,
-0.03202192857861519,
0.01965186558663845,
-0.047473713755607605,
-0.10302898287773132,
0.08139068633317947,
0.006327468901872635,
-0.035748835653066635,
-0.02979911118745804,
-0.09000874310731888,
-0.0005687220836989582,
-0.05837344750761986,
-0.1600698083639145,
0.002745778765529394,
0.01660456694662571,
0.0017346523236483335,
-0.15125428140163422,
-0.20425774157047272,
0.003330780193209648,
0.10969746857881546,
-0.02271164208650589,
0.03544333577156067,
-0.001981721492484212,
0.03711000084877014,
-0.035893864929676056,
-0.02007260173559189,
0.04954984039068222,
-0.05523432046175003,
-0.021848134696483612,
-0.03983592987060547,
0.07553271949291229,
-0.07297751307487488,
0.04090004786849022,
-0.0653771460056305,
-0.014348752796649933,
-0.1622619330883026,
0.15255500376224518,
-0.046914104372262955,
0.06541786342859268,
-0.12352326512336731,
-0.0334291085600853,
-0.039743825793266296,
0.045095015317201614,
0.06291117519140244,
0.13802090287208557,
-0.08779110759496689,
-0.04369634389877319,
0.18650807440280914,
-0.06903236359357834,
-0.09679106622934341,
0.11944019049406052,
-0.07961883395910263,
0.1321641206741333,
0.10083547234535217,
0.17368708550930023,
0.1141257956624031,
-0.08791913837194443,
0.10391811281442642,
0.10747800022363663,
-0.0569041408598423,
-0.012733496725559235,
0.10467267036437988,
0.03585764393210411,
-0.1306302845478058,
0.013154068030416965,
-0.12157828360795975,
0.06587774306535721,
-0.05686761811375618,
-0.0879269540309906,
0.05194869637489319,
-0.032455503940582275,
0.025310412049293518,
-0.020052742213010788,
0.0746278464794159,
0.025933722034096718,
-0.05251302570104599,
0.060207635164260864,
0.04247543588280678,
-0.07031378149986267,
-0.009085353463888168,
-0.09578179568052292,
0.03099524974822998,
-0.06587091833353043,
0.03502399101853371,
-0.22505198419094086,
-0.05137290805578232,
-0.010993318632245064,
0.025722837075591087,
0.1326483190059662,
-0.015374366194009781,
0.02701590768992901,
0.07607986778020859,
-0.03088073432445526,
0.04729072377085686,
0.0024790868628770113,
-0.006060512270778418,
-0.05053810775279999,
-0.14205896854400635,
-0.05452657863497734,
-0.041503578424453735,
0.03510185703635216,
-0.0565313957631588,
0.033291224390268326,
-0.03420503810048103,
-0.02479035034775734,
0.03707273676991463,
-0.007212507538497448,
0.05853942781686783,
-0.019193774089217186,
-0.009873578324913979,
-0.013531572185456753,
0.030543358996510506,
0.012236249633133411,
-0.10540676862001419,
0.10431066900491714,
-0.06499134749174118,
0.062580905854702,
0.09132828563451767,
-0.07912247627973557,
-0.06533342599868774,
-0.016700906679034233,
-0.0448032021522522,
0.015070895664393902,
-0.03529910743236542,
-0.010136209428310394,
0.19371217489242554,
0.005290616769343615,
0.10118785500526428,
-0.1109500601887703,
-0.03923296928405762,
-0.02062683179974556,
-0.055706318467855453,
-0.0009301404352299869,
0.08016063272953033,
0.08179425448179245,
-0.20714320242404938,
0.03232567012310028,
0.020658012479543686,
-0.04062584415078163,
0.16988426446914673,
0.05083214491605759,
-0.050748832523822784,
-0.024284347891807556,
-0.04026830196380615,
-0.013780685141682625,
0.029845038428902626,
-0.15252119302749634,
-0.034415353089571,
0.047011200338602066,
-0.013653568923473358,
0.05804664269089699,
-0.04798818752169609,
0.03308645635843277,
0.03967620059847832,
-0.056418731808662415,
-0.0824030265212059,
0.011110927909612656,
-0.07842700183391571,
0.06933611631393433,
0.05583229660987854,
0.009426583535969257,
-0.019295748323202133,
-0.030637133866548538,
-0.13605503737926483,
0.21952834725379944,
-0.09149190038442612,
-0.29468870162963867,
-0.13193389773368835,
-0.09293846786022186,
-0.10017810016870499,
0.03546801209449768,
0.017791004851460457,
-0.03917839005589485,
-0.06439390778541565,
-0.09562262147665024,
0.056607261300086975,
0.012405255809426308,
-0.021426385268568993,
-0.021347733214497566,
-0.0023986888118088245,
-0.012797967530786991,
-0.14857248961925507,
-0.002104197395965457,
-0.04295719787478447,
-0.012978116981685162,
0.032568011432886124,
-0.09067639708518982,
0.09244436025619507,
0.12857896089553833,
0.004716245923191309,
0.06713326275348663,
-0.026575878262519836,
0.2124788463115692,
-0.0062619773671031,
0.022975219413638115,
0.15557730197906494,
-0.019318263977766037,
0.015280681662261486,
0.1036779060959816,
0.07221561670303345,
-0.06859254091978073,
0.0631830170750618,
-0.0008998014964163303,
-0.08077143132686615,
-0.17582431435585022,
-0.12251795083284378,
-0.07253539562225342,
-0.03728431090712547,
0.06181478500366211,
0.02783280424773693,
0.03357642889022827,
0.057943448424339294,
0.001458776998333633,
0.022695334628224373,
0.08633887767791748,
0.057947684079408646,
0.04341123253107071,
-0.05060718581080437,
0.10585862398147583,
0.007024429738521576,
-0.022751405835151672,
0.09893311560153961,
-0.002846156945452094,
0.15080246329307556,
-0.013902523554861546,
0.16815443336963654,
0.12473741918802261,
0.036860570311546326,
0.012223567813634872,
0.11935275048017502,
-0.07167986780405045,
0.04590849205851555,
-0.05495462939143181,
-0.10570704936981201,
0.01094584260135889,
0.037331778556108475,
-0.032054170966148376,
-0.04680291563272476,
-0.05177798494696617,
0.025493836030364037,
0.05325794965028763,
0.18805097043514252,
0.011353406123816967,
-0.2721780240535736,
-0.11373159289360046,
-0.016512295231223106,
0.015043924562633038,
-0.04887229576706886,
-0.009497989900410175,
0.07781843841075897,
-0.09139134734869003,
0.019121510908007622,
0.014081564731895924,
0.13234882056713104,
0.0014177642296999693,
0.012288973666727543,
-0.0278471652418375,
0.09984744340181351,
-0.036623552441596985,
0.11295612901449203,
-0.23078039288520813,
0.22556748986244202,
0.052278243005275726,
0.038290444761514664,
-0.09663783758878708,
-0.002120265970006585,
0.08013830333948135,
0.13132810592651367,
0.1532614678144455,
0.03107479028403759,
0.10846555978059769,
-0.027406979352235794,
-0.10482434183359146,
0.03002725914120674,
0.0280083529651165,
-0.10603568702936172,
-0.0013344037579372525,
0.018277300521731377,
-0.039141420274972916,
0.01587403193116188,
0.05655989795923233,
-0.17558692395687103,
-0.08609110116958618,
0.028340430930256844,
-0.008324747905135155,
-0.04485590383410454,
-0.02200792171061039,
-0.04512990266084671,
0.046336859464645386,
0.23469947278499603,
-0.030598776414990425,
-0.0739574283361435,
-0.14759093523025513,
0.06582088768482208,
0.05814988911151886,
-0.10429075360298157,
0.041228681802749634,
-0.019156770780682564,
0.13067756593227386,
-0.02027961239218712,
-0.18024098873138428,
0.1356862485408783,
-0.1332361251115799,
0.0015308308647945523,
-0.022600863128900528,
0.014875420369207859,
0.10631213337182999,
0.03542710840702057,
0.05557568743824959,
-0.009768442250788212,
-0.0649794265627861,
-0.11396100372076035,
-0.08805587142705917,
0.08653388917446136,
0.028714947402477264,
0.10450869798660278,
-0.13673703372478485,
-0.12422386556863785,
-0.024861803278326988,
0.044100552797317505,
0.1669672727584839,
-0.010901673696935177,
-0.06783751398324966,
0.11280672252178192,
0.28802618384361267,
-0.06695371121168137,
-0.3172116279602051,
-0.0007910995627753437,
0.04479747265577316,
0.051585979759693146,
0.040562793612480164,
-0.20010530948638916,
0.12726688385009766,
0.009253783151507378,
-0.012190448120236397,
-0.029381878674030304,
-0.19047366082668304,
-0.15849244594573975,
0.18010461330413818,
0.08473912626504898,
0.07993315160274506,
-0.042181819677352905,
-0.03651405870914459,
-0.07661815732717514,
0.015200769528746605,
0.1459142416715622,
-0.18843013048171997,
0.08332853764295578,
0.04100522771477699,
-0.017721977084875107,
-0.0027862805873155594,
-0.037221308797597885,
0.059129294008016586,
0.0608331598341465,
0.05068871006369591,
-0.05560210347175598,
-0.006719131022691727,
0.14524415135383606,
-0.03174788877367973,
0.18742859363555908,
0.0031618941575288773,
0.09233181178569794,
-0.08883646875619888,
-0.08543404936790466,
-0.05017559975385666,
0.09880285710096359,
0.010512201115489006,
-0.1095050796866417,
-0.02238771878182888,
0.04417303577065468,
0.0265662744641304,
0.015238870866596699,
0.015048151835799217,
-0.0851285457611084,
0.03871196135878563,
0.10534809529781342,
0.1691179871559143,
-0.12537476420402527,
-0.033747926354408264,
0.017802996560931206,
-0.028045939281582832,
0.11628906428813934,
-0.13116106390953064,
0.038927674293518066,
0.05049685761332512,
0.04104992747306824,
0.10672101378440857,
0.029802577570080757,
-0.05867437645792961,
-0.017075780779123306,
0.06458209455013275,
-0.13079416751861572,
-0.03155776485800743,
-0.09045298397541046,
-0.04274303466081619,
-0.0669407993555069,
-0.0024730071891099215,
0.13178908824920654,
-0.03965950757265091,
-0.014710347168147564,
0.017212487757205963,
0.037607062608003616,
-0.04410165175795555,
0.10201654583215714,
0.07254570722579956,
0.016828564926981926,
-0.10455427318811417,
0.009814927354454994,
0.03579278290271759,
0.038504134863615036,
0.01670779287815094,
0.05542376637458801,
-0.13372331857681274,
-0.125060573220253,
-0.03745279088616371,
0.10580606758594513,
-0.03817179054021835,
-0.03893290087580681,
-0.0018975625280290842,
-0.05331501364707947,
0.037330515682697296,
0.09287476539611816,
0.060177214443683624,
-0.02751176245510578,
-0.08125040680170059,
-0.0328347347676754,
0.010630528442561626,
0.08611395955085754,
0.018242122605443,
-0.03706315904855728,
-0.038379967212677,
0.07004430890083313,
0.025923993438482285,
0.09462786465883255,
-0.07417386770248413,
-0.056107524782419205,
-0.11937917023897171,
0.028031524270772934,
-0.06638934463262558,
0.04681660979986191,
-0.09373148530721664,
0.02511175535619259,
-0.031095430254936218,
-0.019689131528139114,
-0.018559515476226807,
0.03675778955221176,
-0.08078599721193314,
-0.012757203541696072,
-0.048585787415504456,
0.06190487742424011,
-0.11545764654874802,
-0.008546639233827591,
0.022259443998336792,
-0.012026581913232803,
0.13017435371875763,
0.049769047647714615,
-0.055667877197265625,
0.03901653364300728,
-0.21329781413078308,
-0.027632983401417732,
0.003265928942710161,
0.07843288034200668,
0.028814924880862236,
-0.06670845299959183,
0.04302555322647095,
-0.0071947709657251835,
0.0067540849559009075,
-0.012629379518330097,
0.1035558208823204,
-0.1017020121216774,
0.02396601438522339,
-0.05223699286580086,
0.004044664558023214,
-0.04826325178146362,
0.0728255957365036,
0.024127714335918427,
0.07348595559597015,
0.08230309188365936,
-0.05491555854678154,
0.020904092118144035,
-0.11979269981384277,
0.023227708414196968,
0.004520460031926632,
-0.03619404137134552,
-0.13786590099334717,
-0.0967802032828331,
0.05946365371346474,
-0.055349837988615036,
0.23359017074108124,
0.12694473564624786,
0.001374105573631823,
-0.005449169781059027,
0.08195099979639053,
0.09456637501716614,
-0.021613484248518944,
0.1317329853773117,
-0.010138611309230328,
0.027062254026532173,
-0.014479435048997402,
0.08326195925474167,
0.02298891544342041,
-0.037907011806964874,
0.11206243187189102,
0.03773945942521095,
0.02805684506893158,
0.02263692580163479,
0.1160544753074646,
0.023887887597084045,
-0.06772660464048386,
-0.11763399839401245,
-0.0017614958342164755,
0.06480446457862854,
-0.04672616720199585,
-0.03293457627296448,
0.1503268927335739,
-0.1097816601395607,
0.10995207726955414,
0.07405032217502594,
-0.07787073403596878,
-0.13838937878608704,
-0.21370331943035126,
-0.05886254459619522,
-0.039770469069480896,
-0.03142128139734268,
-0.14580529928207397,
-0.0654190331697464,
0.01688416302204132,
0.051971517503261566,
0.003243057755753398,
0.16537432372570038,
-0.07062322646379471,
-0.0987730622291565,
0.0607294887304306,
-0.03339105471968651,
0.005305915605276823,
-0.027819102630019188,
-0.009758523665368557,
0.05085602402687073,
0.057036228477954865,
0.032107241451740265,
0.023592866957187653,
0.0036630777176469564,
0.03509202599525452,
-0.05647340044379234,
-0.06754438579082489,
-0.09015598148107529,
-0.0009700623340904713,
-0.0010366188362240791,
0.12062886357307434,
0.024902835488319397,
-0.11650843918323517,
0.03256719559431076,
0.12186513096094131,
-0.032892826944589615,
-0.06624062359333038,
-0.1515260934829712,
0.26383325457572937,
0.0018474431708455086,
0.01890658400952816,
-0.028070906177163124,
-0.03815600275993347,
-0.005130880977958441,
0.24619826674461365,
0.3407156467437744,
-0.07103578746318817,
0.007612635847181082,
0.037416961044073105,
0.006823876406997442,
0.04916536062955856,
0.14502213895320892,
0.0187817569822073,
0.27336564660072327,
-0.022485103458166122,
0.041809260845184326,
-0.006868932396173477,
-0.07039812207221985,
-0.05221797525882721,
-0.039966143667697906,
0.06848148256540298,
-0.05588299781084061,
-0.013955564238131046,
0.15277032554149628,
-0.13775534927845,
-0.12398526072502136,
-0.04364074394106865,
-0.026519646868109703,
-0.07654700428247452,
-0.07606479525566101,
-0.11390645056962967,
0.06626824289560318,
0.06902189552783966,
-0.029941201210021973,
0.021312318742275238,
0.03296424448490143,
0.04837736114859581,
-0.133428692817688,
-0.08698273450136185,
0.1195920780301094,
0.07310611009597778,
0.16079822182655334,
-0.01668482832610607,
0.05930905416607857,
0.08006696403026581,
-0.01908860169351101,
-0.058056265115737915,
0.07384268194437027,
-0.036407411098480225,
0.005059631075710058,
0.05690619349479675,
0.0004741645825561136,
-0.040704816579818726,
-0.027862807735800743,
0.004243544768542051,
-0.10232369601726532,
0.036840010434389114,
-0.06489962339401245,
-0.030617931857705116,
-0.10293823480606079,
0.057851411402225494,
-0.09226211905479431,
0.12044121325016022,
0.16307750344276428,
-0.030414016917347908,
-0.03591788932681084,
-0.08531999588012695,
0.052261821925640106,
0.017703544348478317,
-0.06997960060834885,
-0.06359841674566269,
-0.11604602634906769,
-0.030304228886961937,
-0.0661565512418747,
0.02452918142080307,
-0.30202460289001465,
-0.012930631637573242,
0.005985225085169077,
0.005483629181981087,
-0.00588432140648365,
0.057690225541591644,
0.1658705621957779,
0.04176924750208855,
-0.0061350357718765736,
-0.04380953311920166,
0.019412068650126457,
0.01647714339196682,
-0.20093472301959991,
-0.08420812338590622
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilhubert-ft-common-language
This model is a fine-tuned version of [ntu-spml/distilhubert](https://huggingface.co/ntu-spml/distilhubert) on the common_language dataset.
It achieves the following results on the evaluation set:
- Loss: 2.7214
- Accuracy: 0.2797
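
The card does not include a usage example; as a starting point, the model can presumably be loaded with the standard `transformers` audio-classification pipeline, as sketched below (the audio file name is a placeholder):

```python
from transformers import pipeline

# Sketch only: assumes the standard audio-classification pipeline API;
# "speech_sample.wav" is a placeholder for your own audio file.
classifier = pipeline("audio-classification", model="anton-l/distilhubert-ft-common-language")

for prediction in classifier("speech_sample.wav", top_k=5):
    print(f"{prediction['label']}: {prediction['score']:.3f}")
```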
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 4
- seed: 0
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
- mixed_precision_training: Native AMP
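
For reference, these hyperparameters map roughly onto the following `TrainingArguments`; this is a reconstruction from the list above, not the author's actual training script (the output directory is arbitrary, and the listed Adam betas/epsilon are the library defaults):

```python
from transformers import TrainingArguments

# Reconstructed sketch of the listed hyperparameters (not the original script).
training_args = TrainingArguments(
    output_dir="distilhubert-ft-common-language",  # arbitrary output directory
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=4,
    seed=0,
    gradient_accumulation_steps=4,   # 32 * 4 = 128 total train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10.0,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```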
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 3.6543 | 1.0 | 173 | 3.7611 | 0.0491 |
| 3.2221 | 2.0 | 346 | 3.4868 | 0.1352 |
| 2.9332 | 3.0 | 519 | 3.2732 | 0.1861 |
| 2.7299 | 4.0 | 692 | 3.0944 | 0.2172 |
| 2.5638 | 5.0 | 865 | 2.9790 | 0.2400 |
| 2.3871 | 6.0 | 1038 | 2.8668 | 0.2590 |
| 2.3384 | 7.0 | 1211 | 2.7972 | 0.2653 |
| 2.2648 | 8.0 | 1384 | 2.7625 | 0.2695 |
| 2.2162 | 9.0 | 1557 | 2.7405 | 0.2782 |
| 2.1915 | 10.0 | 1730 | 2.7214 | 0.2797 |
### Framework versions
- Transformers 4.12.0.dev0
- Pytorch 1.9.1+cu111
- Datasets 1.14.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["audio-classification", "generated_from_trainer"], "datasets": ["common_language"], "metrics": ["accuracy"], "model-index": [{"name": "distilhubert-ft-common-language", "results": []}]}
|
audio-classification
|
anton-l/distilhubert-ft-common-language
|
[
"transformers",
"pytorch",
"tensorboard",
"hubert",
"audio-classification",
"generated_from_trainer",
"dataset:common_language",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #hubert #audio-classification #generated_from_trainer #dataset-common_language #license-apache-2.0 #endpoints_compatible #region-us
|
distilhubert-ft-common-language
===============================
This model is a fine-tuned version of ntu-spml/distilhubert on the common\_language dataset.
It achieves the following results on the evaluation set:
* Loss: 2.7214
* Accuracy: 0.2797
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 3e-05
* train\_batch\_size: 32
* eval\_batch\_size: 4
* seed: 0
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.1
* num\_epochs: 10.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.12.0.dev0
* Pytorch 1.9.1+cu111
* Datasets 1.14.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 4\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 10.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #hubert #audio-classification #generated_from_trainer #dataset-common_language #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 4\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 10.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
57,
160,
4,
37
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #hubert #audio-classification #generated_from_trainer #dataset-common_language #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 4\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 10.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.12.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
-0.13944955170154572,
0.09749532490968704,
-0.0022293191868811846,
0.06074688956141472,
0.13571271300315857,
0.004952613729983568,
0.09805609285831451,
0.12043583393096924,
-0.10300640016794205,
0.08302368968725204,
0.09120486676692963,
0.08509926497936249,
0.04481355473399162,
0.09198489040136337,
-0.026559369638562202,
-0.3267236649990082,
0.012131194584071636,
0.03368677198886871,
-0.1791321188211441,
0.11675205081701279,
0.11332690715789795,
-0.09591410309076309,
0.034116875380277634,
0.04563039913773537,
-0.14081093668937683,
0.0073750680312514305,
-0.013397825881838799,
-0.06499702483415604,
0.09902187436819077,
0.044054463505744934,
0.10823817551136017,
0.02701273374259472,
0.07939531654119492,
-0.22488269209861755,
0.013299185782670975,
0.06680503487586975,
0.0422295518219471,
0.08075769245624542,
0.10258035361766815,
-0.012522784993052483,
0.1271856427192688,
-0.05817144364118576,
0.06947119534015656,
0.055771343410015106,
-0.11341512948274612,
-0.2959219813346863,
-0.10398013144731522,
0.040316637605428696,
0.0990997776389122,
0.08865472674369812,
-0.02410164102911949,
0.07044079899787903,
-0.067537821829319,
0.08264555782079697,
0.2275986671447754,
-0.24241399765014648,
-0.08806565403938293,
-0.0063020349480211735,
0.07284770905971527,
0.053014304488897324,
-0.12590454518795013,
-0.025882968679070473,
0.05917542427778244,
0.0370541550219059,
0.10509411990642548,
0.009700090624392033,
-0.014909680932760239,
0.0024008196778595448,
-0.14184144139289856,
-0.06018901988863945,
0.1616199016571045,
0.08348485082387924,
-0.05453746020793915,
-0.06360895931720734,
-0.009739404544234276,
-0.23820413649082184,
-0.03940710052847862,
0.01827516034245491,
0.0325266532599926,
-0.04852284863591194,
-0.10872388631105423,
0.012235024012625217,
-0.0842791423201561,
-0.09574863314628601,
0.027356302365660667,
0.18169791996479034,
0.04481480270624161,
-0.01931162364780903,
0.0012340915855020285,
0.10985821485519409,
0.04442868009209633,
-0.15406188368797302,
-0.020956316962838173,
0.02970775030553341,
-0.08481309562921524,
-0.030598800629377365,
-0.058162953704595566,
-0.02172182872891426,
-0.012638137675821781,
0.1534905880689621,
-0.06043855845928192,
0.07088601589202881,
0.02599525824189186,
0.029730727896094322,
-0.0810236781835556,
0.17902982234954834,
-0.0607791505753994,
-0.020991405472159386,
-0.03820949047803879,
0.0905412957072258,
-0.014042054302990437,
-0.015603918582201004,
-0.05345284938812256,
0.029743924736976624,
0.10240885615348816,
0.03042498603463173,
-0.044495441019535065,
0.0125502310693264,
-0.06606783717870712,
-0.03720971941947937,
0.002470082836225629,
-0.09314548224210739,
0.024731416255235672,
0.014495326206088066,
-0.0756351426243782,
0.012057560496032238,
0.011873951181769371,
0.023980876430869102,
0.004099992103874683,
0.12720035016536713,
-0.08721178770065308,
-0.011342187412083149,
-0.09289082884788513,
-0.09630489349365234,
0.0428781732916832,
-0.045369457453489304,
0.021855201572179794,
-0.06366661190986633,
-0.1112789586186409,
-0.03863737732172012,
0.06643463671207428,
-0.02941703237593174,
-0.09481772780418396,
-0.03511073812842369,
-0.07511546462774277,
0.04888851195573807,
-0.028941882774233818,
0.1552225947380066,
-0.06176183372735977,
0.11005501449108124,
0.0440429225564003,
0.04159769043326378,
0.019921813160181046,
0.07125658541917801,
-0.057336531579494476,
0.05581037327647209,
-0.1609039455652237,
0.06168751418590546,
-0.10437486320734024,
0.05819828435778618,
-0.118341363966465,
-0.13405539095401764,
-0.024346785619854927,
0.005019388161599636,
0.09117297828197479,
0.08253475278615952,
-0.154488205909729,
-0.10734343528747559,
0.14664246141910553,
-0.0804898664355278,
-0.12556040287017822,
0.12404067814350128,
-0.02132086455821991,
-0.006276966538280249,
0.04778927564620972,
0.1425875872373581,
0.13320323824882507,
-0.09843039512634277,
-0.025909652933478355,
-0.05626966804265976,
0.13581839203834534,
0.007672473322600126,
0.11345253139734268,
-0.018757013604044914,
-0.0015742797404527664,
0.001264277147129178,
-0.04560812562704086,
0.0627140924334526,
-0.10087335854768753,
-0.08740951120853424,
-0.02978150174021721,
-0.09409864991903305,
0.026875047013163567,
0.06558574736118317,
0.047243695706129074,
-0.0906592607498169,
-0.12535460293293,
0.08100669831037521,
0.11300306767225266,
-0.08946329355239868,
0.029436655342578888,
-0.06987020373344421,
0.04996612295508385,
-0.045877985656261444,
-0.03452954441308975,
-0.18537333607673645,
-0.03799540922045708,
0.018761415034532547,
-0.07466097176074982,
0.028815707191824913,
-0.02301493100821972,
0.07094451785087585,
0.05909452959895134,
-0.05932380259037018,
-0.06636368483304977,
-0.1090708076953888,
-0.010268169455230236,
-0.06939458847045898,
-0.20474563539028168,
-0.09125620126724243,
-0.01898857019841671,
0.16551831364631653,
-0.2228218913078308,
0.009309627115726471,
0.017079036682844162,
0.11731935292482376,
0.04147075116634369,
-0.05286053940653801,
-0.015307708643376827,
0.08341224491596222,
-0.02703934721648693,
-0.06215248256921768,
0.01779187098145485,
0.026275064796209335,
-0.11382617801427841,
-0.00755554111674428,
-0.09154440462589264,
0.12153994292020798,
0.0979890301823616,
-0.02212679572403431,
-0.07634757459163666,
-0.038817696273326874,
-0.08547632396221161,
-0.06364262104034424,
-0.025728890672326088,
-0.005110404919832945,
0.15752555429935455,
0.02102724462747574,
0.12489300966262817,
-0.08730889856815338,
-0.046102967113256454,
0.044609300792217255,
-0.0009982399642467499,
-0.005376819521188736,
0.1178484708070755,
0.06755764782428741,
-0.037882000207901,
0.11706814914941788,
0.09844833612442017,
-0.09995128959417343,
0.17157122492790222,
-0.08704449981451035,
-0.13084816932678223,
-0.01709200069308281,
0.005730083677917719,
0.03541664779186249,
0.12874819338321686,
-0.12895502150058746,
0.011525780893862247,
0.020877618342638016,
0.03798924759030342,
0.02830931358039379,
-0.20221808552742004,
-0.014276504516601562,
0.047751110047101974,
-0.05035634711384773,
-0.0653502345085144,
-0.00684671476483345,
-0.007465332746505737,
0.07675734907388687,
0.005544704385101795,
-0.033988021314144135,
-0.0019759046845138073,
-0.010207257233560085,
-0.07275336980819702,
0.19207800924777985,
-0.0883944183588028,
-0.1377883404493332,
-0.1635645627975464,
-0.032787520438432693,
-0.030771471560001373,
-0.008374322205781937,
0.04733490198850632,
-0.09941604733467102,
-0.031938426196575165,
-0.04140273854136467,
0.059575412422418594,
-0.051465217024087906,
0.03174624964594841,
0.011217008344829082,
0.017747294157743454,
0.09941120445728302,
-0.11016868054866791,
0.032485395669937134,
-0.006393399089574814,
-0.044038232415914536,
0.010272146202623844,
0.022785136476159096,
0.09901895374059677,
0.16356149315834045,
0.03464280441403389,
0.018669961020350456,
-0.02611931413412094,
0.1892198771238327,
-0.11911128461360931,
-0.0388646237552166,
0.11148424446582794,
-0.00031249088351614773,
0.0357314795255661,
0.09997227787971497,
0.06831644475460052,
-0.0807323306798935,
0.03031104989349842,
0.06951064616441727,
-0.03498527780175209,
-0.24936194717884064,
-0.03046390600502491,
-0.07649210095405579,
-0.00905338954180479,
0.1141095906496048,
0.028044596314430237,
0.03330465406179428,
0.047057051211595535,
-0.016069116070866585,
0.025654278695583344,
-0.008573214523494244,
0.06867857277393341,
0.07894030958414078,
0.05308857560157776,
0.11849802732467651,
-0.025642002001404762,
-0.027242962270975113,
0.03426285833120346,
0.018298489972949028,
0.2571081817150116,
0.00932314433157444,
0.17327366769313812,
0.06671775877475739,
0.14504048228263855,
0.01747782714664936,
0.09272903949022293,
0.015124903060495853,
-0.032333679497241974,
0.025528447702527046,
-0.05727880075573921,
-0.024143612012267113,
0.027577323839068413,
0.04790481925010681,
0.05229175463318825,
-0.1446441262960434,
-0.031126873567700386,
-0.0023640701547265053,
0.33825916051864624,
0.07235103845596313,
-0.3222335875034332,
-0.1211196556687355,
0.005512550473213196,
-0.060894858092069626,
-0.0516824908554554,
0.0371813140809536,
0.08370260894298553,
-0.08352804183959961,
0.0677676722407341,
-0.08031757175922394,
0.10896272212266922,
-0.035332825034856796,
-0.012817610055208206,
0.11341603100299835,
0.07650456577539444,
-0.010643473826348782,
0.05934401974081993,
-0.2238064855337143,
0.28050753474235535,
-0.015044738538563251,
0.08435001969337463,
-0.0183691568672657,
0.03242990002036095,
0.03988990560173988,
-0.010950770229101181,
0.05801093578338623,
-0.008868022821843624,
-0.09666489064693451,
-0.19921281933784485,
-0.08414053171873093,
0.023030148819088936,
0.11584606766700745,
-0.060859158635139465,
0.12664280831813812,
-0.03199676051735878,
-0.0023818649351596832,
0.06256870180368423,
-0.07741459459066391,
-0.11683440208435059,
-0.10413677245378494,
0.027749886736273766,
0.01278045866638422,
0.07855351269245148,
-0.10911156237125397,
-0.1264280378818512,
-0.056786440312862396,
0.14171993732452393,
-0.05656027793884277,
-0.01803501881659031,
-0.1338413804769516,
0.0893213301897049,
0.18081359565258026,
-0.05658606067299843,
0.07052233815193176,
0.017515098676085472,
0.13970869779586792,
0.040926527231931686,
-0.016246240586042404,
0.09087185561656952,
-0.07816442847251892,
-0.19560682773590088,
-0.03989299759268761,
0.17235404253005981,
0.04165728762745857,
0.06540104001760483,
-0.029599817469716072,
0.03608039766550064,
-0.007077720481902361,
-0.08666131645441055,
0.0335923433303833,
-0.041549649089574814,
0.017011985182762146,
0.04894476756453514,
-0.03825443983078003,
0.05500050261616707,
-0.04515531659126282,
-0.053492650389671326,
0.12954196333885193,
0.27011266350746155,
-0.073316790163517,
0.006535755470395088,
0.028585292398929596,
-0.04612274095416069,
-0.13589301705360413,
0.04632306843996048,
0.15584684908390045,
0.039094652980566025,
0.02075476199388504,
-0.23253946006298065,
0.0922793447971344,
0.09876561164855957,
-0.023934725672006607,
0.11736978590488434,
-0.2981918752193451,
-0.12498784065246582,
0.10171455889940262,
0.09998016059398651,
-0.04140745848417282,
-0.15839126706123352,
-0.06065985560417175,
-0.017833229154348373,
-0.15946142375469208,
0.11817256361246109,
-0.03318183869123459,
0.12573140859603882,
-0.0028493234422057867,
0.05877922102808952,
0.02041255496442318,
-0.04560243338346481,
0.15262529253959656,
0.0022728031035512686,
0.05273371562361717,
0.005586881656199694,
0.05104037746787071,
0.05452125892043114,
-0.0545622818171978,
0.0024076870176941156,
-0.06547357141971588,
0.016562165692448616,
-0.1374496966600418,
-0.038052164018154144,
-0.09113647788763046,
0.02805369719862938,
-0.05291842669248581,
-0.038219209760427475,
-0.025243088603019714,
0.048543818295001984,
0.05054793134331703,
-0.0046269213780760765,
0.16571921110153198,
-0.027118833735585213,
0.16177299618721008,
0.08135627955198288,
0.09182845056056976,
-0.014916365034878254,
-0.12425260990858078,
-0.002957363147288561,
-0.015618907287716866,
0.06274784356355667,
-0.13748466968536377,
0.03828088194131851,
0.15002432465553284,
0.06284210085868835,
0.1344301998615265,
0.07234227657318115,
-0.06012492999434471,
0.01910983957350254,
0.08219300955533981,
-0.07305283099412918,
-0.10427974164485931,
-0.033776164054870605,
0.03589130565524101,
-0.15431059896945953,
0.021073944866657257,
0.0977819412946701,
-0.06873248517513275,
-0.007829244248569012,
0.019389456138014793,
0.0014318636385723948,
-0.07013431936502457,
0.2266255021095276,
0.04212769865989685,
0.07919131219387054,
-0.09301169961690903,
0.08434594422578812,
0.04847823828458786,
-0.1717386394739151,
-0.012576752342283726,
0.06483426690101624,
-0.03991999849677086,
-0.00278618186712265,
0.006942487321794033,
0.05526464805006981,
-0.04808276146650314,
-0.0580565445125103,
-0.12190885841846466,
-0.14673352241516113,
0.08514336496591568,
0.09226450324058533,
0.051192618906497955,
0.04395449161529541,
-0.046806417405605316,
0.06040133535861969,
-0.11497675627470016,
0.10098332166671753,
0.102720707654953,
0.09402848780155182,
-0.16456350684165955,
0.12095978856086731,
0.00973015371710062,
0.013950426131486893,
0.000172546278918162,
-0.008803175762295723,
-0.08708664029836655,
0.020372062921524048,
-0.1289411336183548,
-0.02990860864520073,
-0.03575316444039345,
-0.0008369623683393002,
-0.0026160809211432934,
-0.05750546231865883,
-0.08741205930709839,
0.02785545587539673,
-0.12181130051612854,
-0.046923622488975525,
-0.002175724832341075,
0.07363421469926834,
-0.0983419418334961,
-0.009815910831093788,
0.06929003447294235,
-0.12025405466556549,
0.07404894381761551,
0.05288161709904671,
0.03634052351117134,
0.0467413067817688,
-0.07462860643863678,
0.009723659604787827,
0.047020990401506424,
-0.003757581114768982,
0.032735176384449005,
-0.17343811690807343,
-0.004192395135760307,
-0.020404627546668053,
0.051562462002038956,
-0.011431091465055943,
0.00004634398646885529,
-0.13271664083003998,
-0.060885097831487656,
-0.023672327399253845,
-0.051522962749004364,
-0.05376942828297615,
0.037526536732912064,
0.0800737664103508,
0.05924265831708908,
0.17588970065116882,
-0.06595960259437561,
0.026782894507050514,
-0.23297423124313354,
0.011859222315251827,
-0.027488354593515396,
-0.08351118117570877,
-0.07158919423818588,
-0.026590419933199883,
0.07003674656152725,
-0.0569167286157608,
0.09067479521036148,
-0.06168311834335327,
0.05571764335036278,
0.04166356846690178,
-0.06961360573768616,
0.03593989089131355,
0.05023195967078209,
0.2602899372577667,
0.057157374918460846,
-0.01766636222600937,
0.07514581829309464,
0.005865528713911772,
0.05592220276594162,
0.1410685032606125,
0.17307227849960327,
0.15784719586372375,
-0.008294107392430305,
0.07938459515571594,
0.04945861175656319,
-0.09329362213611603,
-0.1477026492357254,
0.10238417237997055,
-0.0194714292883873,
0.13619332015514374,
0.010230720043182373,
0.22855985164642334,
0.09818139672279358,
-0.20044921338558197,
0.06478603184223175,
-0.04036901891231537,
-0.08967254310846329,
-0.09452119469642639,
-0.03358372300863266,
-0.06961905211210251,
-0.18689893186092377,
0.0155038395896554,
-0.13162146508693695,
0.05440216511487961,
0.07088592648506165,
0.02698536403477192,
0.020197898149490356,
0.15846379101276398,
0.027597591280937195,
-0.0015616263262927532,
0.09784016758203506,
0.004246093332767487,
-0.015947148203849792,
-0.050197482109069824,
-0.09614071995019913,
0.03917231783270836,
-0.029200708493590355,
0.04476179555058479,
-0.07099905610084534,
-0.09952433407306671,
0.07010775059461594,
0.028540441766381264,
-0.10197585076093674,
0.02245042659342289,
-0.002468041842803359,
0.08159825950860977,
0.056281737983226776,
0.0053314026445150375,
0.01590125449001789,
-0.01782231591641903,
0.24871192872524261,
-0.09407331794500351,
-0.03532826527953148,
-0.14846590161323547,
0.2166300266981125,
0.015565052628517151,
-0.023080628365278244,
0.03846324607729912,
-0.07398435473442078,
-0.015292292460799217,
0.14895199239253998,
0.11886914074420929,
-0.009199229069054127,
-0.03461827337741852,
0.009269658476114273,
-0.014279942959547043,
-0.05212651193141937,
0.08668165653944016,
0.113979771733284,
0.08445252478122711,
-0.07116875052452087,
-0.05256807804107666,
-0.06061243638396263,
-0.04455895349383354,
-0.002889742609113455,
0.08568234741687775,
0.027486279606819153,
-0.02496875822544098,
-0.03198527172207832,
0.11462178081274033,
-0.07284537702798843,
-0.10941428691148758,
0.032124344259500504,
-0.1718515306711197,
-0.19381816685199738,
-0.06048591434955597,
0.05234741419553757,
0.011458391323685646,
0.0522964745759964,
-0.014455161057412624,
-0.042973797768354416,
0.10091017186641693,
-0.0009100152528844774,
-0.04274166002869606,
-0.1334211826324463,
0.10025620460510254,
-0.04826074466109276,
0.20418564975261688,
-0.04659649357199669,
0.03524238243699074,
0.10863275080919266,
0.0703139379620552,
-0.07415467500686646,
0.01496867649257183,
0.07680103927850723,
-0.1484028398990631,
0.027659138664603233,
0.19015368819236755,
-0.049077484756708145,
0.12150320410728455,
0.020416399464011192,
-0.14259018003940582,
0.006163208279758692,
-0.07588865607976913,
-0.05060949549078941,
-0.0665249451994896,
-0.03082767315208912,
-0.057799115777015686,
0.12590138614177704,
0.22096726298332214,
-0.06999412924051285,
-0.026571422815322876,
-0.061393145471811295,
0.03417360782623291,
0.0748000517487526,
0.12566320598125458,
-0.027284495532512665,
-0.2854473888874054,
0.014649897813796997,
0.015640508383512497,
-0.018728939816355705,
-0.24853301048278809,
-0.08875339478254318,
0.04340396448969841,
-0.06172145530581474,
-0.04219021648168564,
0.10956098139286041,
0.058977484703063965,
0.04419853910803795,
-0.05391436070203781,
-0.0755433440208435,
-0.06738006323575974,
0.17677174508571625,
-0.1698075830936432,
-0.0741862803697586
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilhubert-ft-keyword-spotting
This model is a fine-tuned version of [ntu-spml/distilhubert](https://huggingface.co/ntu-spml/distilhubert) on the superb dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1163
- Accuracy: 0.9706
## Model description
More information needed
## Intended uses & limitations
More information needed
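As a rough illustration only (the card itself does not document usage), the checkpoint can be tried for keyword spotting with the `audio-classification` pipeline; the model id matches this repository, while the audio path below is a placeholder:

```python
from transformers import pipeline

# Sketch, not an official example: load this fine-tuned checkpoint.
keyword_spotter = pipeline(
    "audio-classification",
    model="anton-l/distilhubert-ft-keyword-spotting",
)

# "command.wav" is a hypothetical 16 kHz mono recording of a spoken command.
predictions = keyword_spotter("command.wav")
print(predictions)  # list of {"label": ..., "score": ...} dictionaries
```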
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 256
- eval_batch_size: 32
- seed: 0
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5.0
- mixed_precision_training: Native AMP
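The snippet below is a best-effort reconstruction of these settings as `TrainingArguments`; the `output_dir` value is a placeholder, and the Adam betas/epsilon listed above are the library defaults:

```python
from transformers import TrainingArguments

# Hedged sketch mirroring the hyperparameter list above.
training_args = TrainingArguments(
    output_dir="./distilhubert-ft-keyword-spotting",  # placeholder path
    learning_rate=3e-5,
    per_device_train_batch_size=256,
    per_device_eval_batch_size=32,
    seed=0,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=5.0,
    fp16=True,  # "Native AMP"; requires a CUDA device
)
```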
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.8176 | 1.0 | 200 | 0.7718 | 0.8116 |
| 0.2364 | 2.0 | 400 | 0.2107 | 0.9662 |
| 0.1198 | 3.0 | 600 | 0.1374 | 0.9678 |
| 0.0891 | 4.0 | 800 | 0.1163 | 0.9706 |
| 0.085 | 5.0 | 1000 | 0.1180 | 0.9690 |
### Framework versions
- Transformers 4.12.0.dev0
- Pytorch 1.9.1+cu111
- Datasets 1.14.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["audio-classification", "generated_from_trainer"], "datasets": ["superb"], "metrics": ["accuracy"], "model-index": [{"name": "distilhubert-ft-keyword-spotting", "results": []}]}
|
audio-classification
|
anton-l/distilhubert-ft-keyword-spotting
|
[
"transformers",
"pytorch",
"tensorboard",
"hubert",
"audio-classification",
"generated_from_trainer",
"dataset:superb",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #hubert #audio-classification #generated_from_trainer #dataset-superb #license-apache-2.0 #endpoints_compatible #region-us
|
distilhubert-ft-keyword-spotting
================================
This model is a fine-tuned version of ntu-spml/distilhubert on the superb dataset.
It achieves the following results on the evaluation set:
* Loss: 0.1163
* Accuracy: 0.9706
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 3e-05
* train\_batch\_size: 256
* eval\_batch\_size: 32
* seed: 0
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.1
* num\_epochs: 5.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.12.0.dev0
* Pytorch 1.9.1+cu111
* Datasets 1.14.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 256\n* eval\\_batch\\_size: 32\n* seed: 0\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #hubert #audio-classification #generated_from_trainer #dataset-superb #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 256\n* eval\\_batch\\_size: 32\n* seed: 0\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
55,
131,
4,
37
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #hubert #audio-classification #generated_from_trainer #dataset-superb #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 256\n* eval\\_batch\\_size: 32\n* seed: 0\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.12.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
-0.11923036724328995,
0.1360771358013153,
-0.003427231917157769,
0.07167264819145203,
0.1276320219039917,
-0.0034213669132441282,
0.10394416749477386,
0.13054011762142181,
-0.1366923451423645,
0.06726245582103729,
0.09749910980463028,
0.13749314844608307,
0.031050803139805794,
0.13717973232269287,
-0.05294375494122505,
-0.27001991868019104,
0.02947348542511463,
0.048104822635650635,
-0.07651591300964355,
0.11918129771947861,
0.08413724601268768,
-0.12541857361793518,
0.054160889238119125,
0.020763462409377098,
-0.1705528348684311,
0.00977055449038744,
0.005778281483799219,
-0.08134301006793976,
0.0895116850733757,
0.006761232856661081,
0.09330276399850845,
0.05161244049668312,
0.07257416844367981,
-0.18618930876255035,
0.011628223583102226,
0.07747601717710495,
0.023696858435869217,
0.08807402104139328,
0.0774696096777916,
-0.020175674930214882,
0.11118193715810776,
-0.07444114983081818,
0.06964635103940964,
0.028016667813062668,
-0.10999990999698639,
-0.27503225207328796,
-0.11185569316148758,
0.051246318966150284,
0.06993179023265839,
0.07701359689235687,
-0.0013772493693977594,
0.11466077715158463,
-0.06870738416910172,
0.09966608136892319,
0.2682819366455078,
-0.2716176211833954,
-0.050367072224617004,
-0.006691863294690847,
0.05059563368558884,
0.09015698730945587,
-0.09637804329395294,
-0.029649510979652405,
0.03861213102936745,
0.04428645595908165,
0.1492113620042801,
-0.0202056635171175,
-0.06341196596622467,
-0.013544936664402485,
-0.145993173122406,
-0.06042854115366936,
0.13193821907043457,
0.01825256273150444,
-0.054364003241062164,
-0.05043242499232292,
-0.05346071720123291,
-0.24022747576236725,
-0.05238747224211693,
0.015159635804593563,
0.04662906005978584,
-0.05193604528903961,
-0.07535941153764725,
-0.0016038893954828382,
-0.0588737353682518,
-0.07634390145540237,
-0.026660343632102013,
0.1675741821527481,
0.05963495373725891,
0.0125435721129179,
-0.029172293841838837,
0.09516318142414093,
-0.010711747221648693,
-0.15435820817947388,
-0.010118145495653152,
0.022042740136384964,
-0.01989728771150112,
-0.029002517461776733,
-0.04742996022105217,
-0.0456966906785965,
0.0020246393978595734,
0.15297989547252655,
-0.12932707369327545,
0.08294203132390976,
-0.019232453778386116,
0.04174342378973961,
-0.0738682672381401,
0.15597915649414062,
-0.03503415361046791,
-0.003741578198969364,
0.017468402162194252,
0.07345332950353622,
0.0380062572658062,
-0.022070184350013733,
-0.08132375031709671,
0.03238774463534355,
0.0969230905175209,
0.030490504577755928,
-0.04290861263871193,
0.03233286738395691,
-0.05178414657711983,
-0.03106873482465744,
0.04467931389808655,
-0.11092473566532135,
0.035247769206762314,
0.0179817546159029,
-0.05553354322910309,
0.0017323625506833196,
0.016521023586392403,
0.0031806407496333122,
-0.04683883488178253,
0.12763473391532898,
-0.07177864015102386,
0.020504912361502647,
-0.0802607387304306,
-0.11226313561201096,
0.04290587455034256,
-0.12627895176410675,
0.012272605672478676,
-0.0654631108045578,
-0.10613968223333359,
-0.016902312636375427,
0.053008075803518295,
-0.023196857422590256,
-0.07381225377321243,
-0.033710967749357224,
-0.08531130850315094,
0.03778955712914467,
-0.041969556361436844,
0.10344177484512329,
-0.08041033893823624,
0.1010860726237297,
0.021767280995845795,
0.06516795605421066,
-0.025264067575335503,
0.07852613180875778,
-0.07118897140026093,
0.03448539599776268,
-0.22508803009986877,
0.05388011410832405,
-0.09882035851478577,
0.023277953267097473,
-0.09789062291383743,
-0.12816616892814636,
0.02332419343292713,
-0.005324422847479582,
0.09100030362606049,
0.08046462386846542,
-0.16216182708740234,
-0.0891554206609726,
0.17001184821128845,
-0.08913113176822662,
-0.08087305724620819,
0.11553585529327393,
-0.027013318613171577,
-0.055768225342035294,
0.058534134179353714,
0.20272395014762878,
0.10405220836400986,
-0.1142456904053688,
-0.013853238895535469,
-0.03475380316376686,
0.07986068725585938,
-0.05367753654718399,
0.0760195255279541,
0.0013988647842779756,
0.04817970469594002,
0.009951156564056873,
-0.018342988565564156,
0.03688068315386772,
-0.08595270663499832,
-0.07611145079135895,
-0.0479389950633049,
-0.07959616184234619,
0.022970059886574745,
0.05796483904123306,
0.03959663584828377,
-0.10611837357282639,
-0.10268234461545944,
0.08063442260026932,
0.09804078191518784,
-0.08537377417087555,
0.06571608781814575,
-0.08269110321998596,
0.0973278135061264,
-0.04764663800597191,
-0.02941088192164898,
-0.21439795196056366,
0.021384909749031067,
0.022655244916677475,
-0.04021996259689331,
0.031327586621046066,
-0.040459077805280685,
0.050613537430763245,
0.072276771068573,
-0.053792692720890045,
-0.05684583634138107,
-0.04969983920454979,
0.0057630822993814945,
-0.08209031075239182,
-0.22400586307048798,
-0.051621753722429276,
-0.034001901745796204,
0.04622337967157364,
-0.14299984276294708,
0.014759927056729794,
0.05517954006791115,
0.10168308764696121,
0.0444093681871891,
-0.043889109045267105,
-0.00048448724555782974,
0.10985586047172546,
-0.025997303426265717,
-0.06759949773550034,
0.05195210874080658,
0.03213381767272949,
-0.06416554749011993,
-0.0004182496923021972,
-0.11076714098453522,
0.1609768122434616,
0.12859001755714417,
-0.012820803560316563,
-0.05773162096738815,
0.010327068157494068,
-0.06551375985145569,
-0.037022583186626434,
-0.042281340807676315,
0.05622519925236702,
0.19259977340698242,
0.008529837243258953,
0.1419408768415451,
-0.09775366634130478,
-0.03493078425526619,
0.05180807784199715,
-0.011619732715189457,
0.011093724519014359,
0.10984858870506287,
0.07186645269393921,
-0.051655180752277374,
0.13811011612415314,
0.11785471439361572,
-0.09139134734869003,
0.15503954887390137,
-0.09153882414102554,
-0.09785103052854538,
-0.014237331226468086,
-0.02353449910879135,
0.009902404621243477,
0.12107834219932556,
-0.12227744609117508,
-0.002435592468827963,
0.026091933250427246,
0.04684828221797943,
0.02003578282892704,
-0.20679442584514618,
0.0021613806020468473,
0.02935490570962429,
-0.07068014144897461,
-0.07582588493824005,
-0.008293467573821545,
0.019542036578059196,
0.09462164342403412,
-0.01119663380086422,
-0.058127276599407196,
0.016588551923632622,
0.008676059544086456,
-0.06069643795490265,
0.1685124933719635,
-0.1182912215590477,
-0.15480169653892517,
-0.11428365856409073,
-0.071433886885643,
-0.046063896268606186,
-0.012765849009156227,
0.07571211457252502,
-0.08382657915353775,
-0.04056743159890175,
-0.06312883645296097,
0.00005326309474185109,
-0.017036685720086098,
0.032197896391153336,
0.019689567387104034,
-0.0005247191293165088,
0.08247920870780945,
-0.10725520551204681,
-0.0046055433340370655,
-0.025288982316851616,
-0.018793385475873947,
0.030910221859812737,
0.0656571313738823,
0.0873681977391243,
0.14959411323070526,
-0.005927828140556812,
0.026635667309165,
-0.030983708798885345,
0.19868507981300354,
-0.0865512415766716,
-0.023769641295075417,
0.1282912790775299,
-0.02023126929998398,
0.06931769847869873,
0.10783521831035614,
0.05917299911379814,
-0.07328657805919647,
-0.017673416063189507,
0.033929891884326935,
-0.05355406925082207,
-0.24587683379650116,
-0.04451165348291397,
-0.06519156694412231,
-0.006500094197690487,
0.09109115600585938,
0.027776474133133888,
0.009135770611464977,
0.03205808252096176,
0.00730674434453249,
0.02424166351556778,
-0.027511607855558395,
0.06901650875806808,
0.10559432953596115,
0.0459282286465168,
0.1281713992357254,
-0.05107373371720314,
-0.008389228954911232,
0.04879161715507507,
0.020835554227232933,
0.23298174142837524,
-0.0038861343637108803,
0.1492561548948288,
0.06507305055856705,
0.16389290988445282,
0.035308193415403366,
0.07034074515104294,
-0.012098893523216248,
-0.011693784035742283,
0.006001926958560944,
-0.05030406266450882,
-0.04074716195464134,
-0.007291084621101618,
-0.008644366636872292,
0.023669766262173653,
-0.12635061144828796,
-0.003413318656384945,
0.01648569293320179,
0.3162553310394287,
0.03672517463564873,
-0.31623393297195435,
-0.09606019407510757,
0.0033087809570133686,
-0.04153365269303322,
-0.032823264598846436,
0.050307098776102066,
0.10257252305746078,
-0.05956191197037697,
0.08070125430822372,
-0.06424814462661743,
0.09971913695335388,
-0.0597662553191185,
0.01230466179549694,
0.0822654440999031,
0.09636598080396652,
0.002064208500087261,
0.03675577789545059,
-0.26183921098709106,
0.24518780410289764,
0.012554673478007317,
0.08108922839164734,
-0.0377112552523613,
0.015045767650008202,
0.029986709356307983,
0.057743463665246964,
0.07660852372646332,
-0.007669078651815653,
-0.1440252661705017,
-0.16951827704906464,
-0.10589689761400223,
0.006455192342400551,
0.0987296923995018,
0.023762153461575508,
0.09645748883485794,
-0.01896512508392334,
-0.007799538783729076,
0.04887203499674797,
-0.10869965702295303,
-0.07365972548723221,
-0.09687995910644531,
0.0140445651486516,
0.04406669735908508,
-0.007766437251120806,
-0.08065018057823181,
-0.1174827292561531,
-0.06131480634212494,
0.13777969777584076,
-0.04528127238154411,
-0.05189963057637215,
-0.1289936602115631,
0.03966059908270836,
0.11707358062267303,
-0.06976621598005295,
0.06489365547895432,
0.00801766011863947,
0.10313278436660767,
0.030007269233465195,
-0.08286206424236298,
0.0958537831902504,
-0.06733914464712143,
-0.1752505600452423,
-0.038682255893945694,
0.15736818313598633,
0.03710421919822693,
0.07235069572925568,
-0.032202936708927155,
0.042818162590265274,
0.014696058817207813,
-0.08085725456476212,
0.031213678419589996,
-0.00003320681571494788,
0.08546099066734314,
0.01633266545832157,
-0.04350058734416962,
0.010294746607542038,
-0.03926674276590347,
-0.026320146396756172,
0.1636674404144287,
0.22306862473487854,
-0.08579371869564056,
0.10056145489215851,
0.04042693227529526,
-0.06714688986539841,
-0.1972190886735916,
0.029720153659582138,
0.09526151418685913,
0.008568013086915016,
0.006835756823420525,
-0.2240232229232788,
0.07564335316419601,
0.0774819478392601,
-0.015350226312875748,
0.11703552305698395,
-0.3188321888446808,
-0.1181938648223877,
0.11519090831279755,
0.11238354444503784,
0.05523740500211716,
-0.146937295794487,
-0.03521525114774704,
-0.004649181384593248,
-0.12135383486747742,
0.13369907438755035,
-0.09742327779531479,
0.13019414246082306,
-0.027222009375691414,
0.062213871628046036,
0.01438821665942669,
-0.05273505672812462,
0.10163401812314987,
0.010678291320800781,
0.07496565580368042,
-0.028631580993533134,
0.02974827028810978,
0.07505440711975098,
-0.05633258447051048,
0.014521858654916286,
-0.0821998342871666,
0.03337906301021576,
-0.09802982211112976,
-0.019346443936228752,
-0.08185099065303802,
0.007567993830889463,
-0.03518836200237274,
-0.026328498497605324,
-0.04370947554707527,
0.026335841044783592,
0.0749911293387413,
-0.02744789980351925,
0.20313037931919098,
0.001365669071674347,
0.12782837450504303,
0.15089693665504456,
0.08885617554187775,
-0.08113635331392288,
-0.1272822916507721,
-0.0006322195986285806,
-0.029091181233525276,
0.06280242651700974,
-0.15413406491279602,
0.049599647521972656,
0.1432599425315857,
0.038613684475421906,
0.09615454077720642,
0.06964613497257233,
-0.04774457588791847,
0.019544154405593872,
0.05971740186214447,
-0.14498308300971985,
-0.09781412780284882,
0.0013107666745781898,
0.002598719671368599,
-0.08684820681810379,
0.04576360806822777,
0.10056072473526001,
-0.0625116303563118,
-0.014869775623083115,
0.00873126182705164,
0.02542460896074772,
-0.05009792000055313,
0.21351215243339539,
0.03681330755352974,
0.0580902025103569,
-0.1281321793794632,
0.10518425703048706,
0.04221102595329285,
-0.1537836194038391,
0.033106155693531036,
0.08850117772817612,
-0.08031059801578522,
-0.012491201050579548,
0.07040325552225113,
0.10995610058307648,
-0.024818699806928635,
-0.06408239156007767,
-0.13211099803447723,
-0.13771606981754303,
0.10643484443426132,
0.16476239264011383,
0.0648331418633461,
0.023744851350784302,
-0.0584777407348156,
0.021324753761291504,
-0.1223858892917633,
0.09633927792310715,
0.05319135636091232,
0.07955837994813919,
-0.16140957176685333,
0.13170252740383148,
0.015876585617661476,
0.03487108647823334,
-0.01885499618947506,
0.015261669643223286,
-0.09577780216932297,
0.02350503019988537,
-0.14056630432605743,
0.006387854926288128,
-0.03267337381839752,
0.005616456735879183,
-0.023393183946609497,
-0.03970981761813164,
-0.0765305832028389,
0.048101846128702164,
-0.11606989055871964,
-0.02282080613076687,
-0.004217080771923065,
0.043240293860435486,
-0.11358336359262466,
-0.006636822130531073,
0.02416333183646202,
-0.09520962089300156,
0.06887873262166977,
0.06356662511825562,
-0.006974547170102596,
0.038731057196855545,
-0.05085628107190132,
-0.034415990114212036,
0.07418449968099594,
0.005572163034230471,
0.0600985512137413,
-0.14279432594776154,
-0.007278710603713989,
0.007177014369517565,
0.027247583493590355,
0.004146704915910959,
0.08181378245353699,
-0.12410932779312134,
-0.017834626138210297,
-0.024487560614943504,
-0.049869686365127563,
-0.0618613176047802,
0.04777555167675018,
0.1358834058046341,
0.03533449396491051,
0.18374978005886078,
-0.0769963189959526,
0.0250498466193676,
-0.1966249793767929,
0.003745034337043762,
-0.00650372076779604,
-0.13168686628341675,
-0.08693519234657288,
-0.023056987673044205,
0.075290746986866,
-0.055460456758737564,
0.12486490607261658,
-0.0057816277258098125,
0.018968814983963966,
0.04272744432091713,
-0.06270969659090042,
-0.028694365173578262,
0.046559568494558334,
0.2006354033946991,
0.02652428112924099,
-0.046569086611270905,
0.08062317967414856,
0.007288898807018995,
0.0767074003815651,
0.1501111090183258,
0.21119707822799683,
0.14996623992919922,
0.05148673430085182,
0.07793474197387695,
0.031121648848056793,
-0.0754869356751442,
-0.18426008522510529,
0.10097932815551758,
-0.047304704785346985,
0.15187229216098785,
0.004487709142267704,
0.22372941672801971,
0.07922160625457764,
-0.17483538389205933,
0.08711861819028854,
-0.037167370319366455,
-0.08968193829059601,
-0.12500421702861786,
-0.052387792617082596,
-0.07971449941396713,
-0.1711496263742447,
0.00593865942209959,
-0.13276252150535583,
0.0704062432050705,
0.06437172740697861,
0.015483388677239418,
0.017558805644512177,
0.14492356777191162,
0.017051860690116882,
0.008383405394852161,
0.08092105388641357,
0.013570936396718025,
-0.04615040868520737,
-0.03931952640414238,
-0.07939648628234863,
0.04958929866552353,
-0.030827170237898827,
0.052572283893823624,
-0.0295394416898489,
-0.05621611326932907,
0.07182791829109192,
-0.015324228443205357,
-0.08911046385765076,
0.028367839753627777,
0.00929174479097128,
0.09106852859258652,
0.0785461813211441,
0.010112782940268517,
0.0024528796784579754,
-0.006677132565528154,
0.2042115330696106,
-0.07994701713323593,
-0.0502486489713192,
-0.11278284341096878,
0.23673520982265472,
0.045869529247283936,
-0.02395777218043804,
0.05937601998448372,
-0.06071292981505394,
-0.03696010261774063,
0.16827651858329773,
0.1649693101644516,
-0.017349952831864357,
-0.017508940771222115,
-0.013164237141609192,
-0.010126007720828056,
-0.04061266779899597,
0.11070483177900314,
0.14204975962638855,
0.07115945219993591,
-0.06613090634346008,
-0.06752955168485641,
-0.06183783337473869,
-0.021118510514497757,
-0.03854776918888092,
0.0742148831486702,
0.021473394706845284,
-0.02485801838338375,
-0.03126630187034607,
0.05671374872326851,
-0.08822689205408096,
-0.10489924997091293,
0.03904462978243828,
-0.19047309458255768,
-0.1727735549211502,
-0.02232060581445694,
0.09565478563308716,
0.005695033818483353,
0.026056237518787384,
-0.00440544867888093,
-0.02166624739766121,
0.09645558893680573,
-0.016933375969529152,
-0.06357920169830322,
-0.08870051056146622,
0.08816719055175781,
-0.12607917189598083,
0.18409691751003265,
-0.034459494054317474,
0.05712800845503807,
0.10064418613910675,
0.07435926049947739,
-0.08620448410511017,
0.046539321541786194,
0.04167252779006958,
-0.1347496509552002,
-0.0035335903521627188,
0.1721571832895279,
-0.049251195043325424,
0.08509082347154617,
0.021458815783262253,
-0.08982940763235092,
-0.008914426900446415,
-0.0857376828789711,
-0.058893658220767975,
-0.022516610100865364,
-0.05278031900525093,
-0.05918436869978905,
0.11749996989965439,
0.17613928020000458,
-0.03710755333304405,
-0.0009544879430904984,
-0.07113808393478394,
0.024640778079628944,
0.07021304965019226,
0.04467160254716873,
-0.028790492564439774,
-0.2561860680580139,
0.019438566640019417,
0.05490388721227646,
-0.0127910440787673,
-0.2249448001384735,
-0.08243299275636673,
0.020092839375138283,
-0.04949593171477318,
-0.11103582382202148,
0.08564409613609314,
0.03596685826778412,
0.04858633875846863,
-0.06208707392215729,
-0.04031803086400032,
-0.06365787982940674,
0.15690642595291138,
-0.16469420492649078,
-0.0646822452545166
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# hubert-base-ft-keyword-spotting
This model is a fine-tuned version of [facebook/hubert-base-ls960](https://huggingface.co/facebook/hubert-base-ls960) on the superb dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0774
- Accuracy: 0.9819
## Model description
More information needed
## Intended uses & limitations
More information needed
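Although the card leaves this section empty, a minimal inference sketch (assuming a 16 kHz mono input; the dummy waveform below is not real data) would look like:

```python
import numpy as np
import torch
from transformers import AutoFeatureExtractor, AutoModelForAudioClassification

model_id = "anton-l/hubert-base-ft-keyword-spotting"
feature_extractor = AutoFeatureExtractor.from_pretrained(model_id)
model = AutoModelForAudioClassification.from_pretrained(model_id)

# One second of silence at 16 kHz stands in for a real recording.
waveform = np.zeros(16000, dtype=np.float32)
inputs = feature_extractor(waveform, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits
predicted_id = int(logits.argmax(dim=-1))
print(model.config.id2label[predicted_id])
```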
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a quick sanity check on the batch size follows the list):
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 0
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5.0
- mixed_precision_training: Native AMP
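As a back-of-the-envelope check (assuming single-device training), the effective batch size is 32 samples per device × 4 gradient-accumulation steps = 128, matching `total_train_batch_size`, and with 1995 optimizer steps in total (see the results table below) the 0.1 warmup ratio corresponds to roughly 200 warmup steps.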
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.0422 | 1.0 | 399 | 0.8999 | 0.6918 |
| 0.3296 | 2.0 | 798 | 0.1505 | 0.9778 |
| 0.2088 | 3.0 | 1197 | 0.0901 | 0.9816 |
| 0.202 | 4.0 | 1596 | 0.0848 | 0.9813 |
| 0.1535 | 5.0 | 1995 | 0.0774 | 0.9819 |
### Framework versions
- Transformers 4.12.0.dev0
- Pytorch 1.9.1+cu111
- Datasets 1.14.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["audio-classification", "generated_from_trainer"], "datasets": ["superb"], "metrics": ["accuracy"], "model-index": [{"name": "hubert-base-ft-keyword-spotting", "results": []}]}
|
audio-classification
|
anton-l/hubert-base-ft-keyword-spotting
|
[
"transformers",
"pytorch",
"tensorboard",
"hubert",
"audio-classification",
"generated_from_trainer",
"dataset:superb",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #hubert #audio-classification #generated_from_trainer #dataset-superb #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
hubert-base-ft-keyword-spotting
===============================
This model is a fine-tuned version of facebook/hubert-base-ls960 on the superb dataset.
It achieves the following results on the evaluation set:
* Loss: 0.0774
* Accuracy: 0.9819
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 3e-05
* train\_batch\_size: 32
* eval\_batch\_size: 32
* seed: 0
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.1
* num\_epochs: 5.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.12.0.dev0
* Pytorch 1.9.1+cu111
* Datasets 1.14.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #hubert #audio-classification #generated_from_trainer #dataset-superb #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
59,
159,
4,
37
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #hubert #audio-classification #generated_from_trainer #dataset-superb #license-apache-2.0 #endpoints_compatible #has_space #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.12.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
-0.13547730445861816,
0.09770535677671432,
-0.0019511941354721785,
0.05823710933327675,
0.13565237820148468,
0.008398162201046944,
0.09706829488277435,
0.12149316817522049,
-0.09401124715805054,
0.08690274506807327,
0.0865107998251915,
0.08005515486001968,
0.04701423645019531,
0.10026959329843521,
-0.02306104078888893,
-0.3251911401748657,
0.01391033548861742,
0.028849992901086807,
-0.17643329501152039,
0.11318033933639526,
0.10885980725288391,
-0.10511209070682526,
0.03533075749874115,
0.03957663103938103,
-0.13592708110809326,
0.014277695678174496,
-0.01642421819269657,
-0.06772814691066742,
0.09578478336334229,
0.034157805144786835,
0.11175654828548431,
0.022090962156653404,
0.09395875781774521,
-0.21648962795734406,
0.014479033648967743,
0.07718644291162491,
0.038992736488580704,
0.08489358425140381,
0.10668163746595383,
-0.0070786550641059875,
0.13689526915550232,
-0.06451224535703659,
0.0696779415011406,
0.05056716501712799,
-0.11683299392461777,
-0.29799628257751465,
-0.10655258595943451,
0.04161407798528671,
0.097892165184021,
0.07850004732608795,
-0.0229227002710104,
0.07110382616519928,
-0.07149412482976913,
0.07423267513513565,
0.23440729081630707,
-0.23766760528087616,
-0.09374526888132095,
0.0003831015492323786,
0.07563744485378265,
0.04456326737999916,
-0.12359217554330826,
-0.03283858299255371,
0.05392521247267723,
0.03556104376912117,
0.109772689640522,
0.008856512606143951,
-0.015649428591132164,
0.00861862301826477,
-0.14997927844524384,
-0.054617445915937424,
0.15032874047756195,
0.08207832276821136,
-0.052119895815849304,
-0.05501104146242142,
-0.01311182975769043,
-0.22959336638450623,
-0.04371371492743492,
0.02605183608829975,
0.029947424307465553,
-0.0524652861058712,
-0.10939846187829971,
0.023261437192559242,
-0.08097215741872787,
-0.09908415377140045,
0.022029047831892967,
0.17641231417655945,
0.04385710507631302,
-0.01609937846660614,
0.0036586960777640343,
0.11429964005947113,
0.043116651475429535,
-0.15796248614788055,
-0.01606818288564682,
0.02755173109471798,
-0.09170562773942947,
-0.03233915567398071,
-0.05293168127536774,
-0.0036663233768194914,
-0.01079458836466074,
0.1451500803232193,
-0.05592067539691925,
0.06429550051689148,
0.03562990948557854,
0.035311173647642136,
-0.07485052198171616,
0.1645890176296234,
-0.0693059116601944,
-0.014694075100123882,
-0.03835892304778099,
0.09418438374996185,
-0.011782321147620678,
-0.012927855364978313,
-0.05006546527147293,
0.02719099074602127,
0.09037867933511734,
0.028642283752560616,
-0.0308640468865633,
0.0085268784314394,
-0.059603795409202576,
-0.02943253517150879,
0.008234372362494469,
-0.09511468559503555,
0.028477095067501068,
0.01892995834350586,
-0.08146019279956818,
0.0005250871181488037,
0.006327974144369364,
0.02454792708158493,
0.005206872243434191,
0.14200226962566376,
-0.08794185519218445,
-0.006898289546370506,
-0.09713630378246307,
-0.09688442945480347,
0.04632780700922012,
-0.05382585525512695,
0.0165867917239666,
-0.06296966969966888,
-0.11311034858226776,
-0.04001607373356819,
0.0640755146741867,
-0.03207596391439438,
-0.0950702652335167,
-0.028784872964024544,
-0.07875492423772812,
0.051608260720968246,
-0.03102804161608219,
0.16468873620033264,
-0.06437292695045471,
0.1123596727848053,
0.03350721299648285,
0.042284078896045685,
0.01311112754046917,
0.06929044425487518,
-0.06377854943275452,
0.05123518034815788,
-0.14831127226352692,
0.04594182223081589,
-0.10070975124835968,
0.05747224763035774,
-0.12005876749753952,
-0.13650035858154297,
-0.028480850160121918,
0.0030032657086849213,
0.09433094412088394,
0.08152009546756744,
-0.17015552520751953,
-0.10150780528783798,
0.1367214322090149,
-0.08231616765260696,
-0.12067407369613647,
0.11542054265737534,
-0.019106363877654076,
-0.003980756271630526,
0.03653542324900627,
0.13695785403251648,
0.12183668464422226,
-0.08866846561431885,
-0.028917772695422173,
-0.054930198937654495,
0.13691432774066925,
0.009076716378331184,
0.11313446611166,
-0.011090029031038284,
0.005572529509663582,
-0.005140483845025301,
-0.035209160298109055,
0.056551218032836914,
-0.10209139436483383,
-0.09263606369495392,
-0.04024570435285568,
-0.09396092593669891,
0.018256142735481262,
0.07255015522241592,
0.042761173099279404,
-0.0914420336484909,
-0.12489441782236099,
0.07581453770399094,
0.11374253779649734,
-0.08267559111118317,
0.026179946959018707,
-0.06647324562072754,
0.05728127434849739,
-0.03987666219472885,
-0.03518863022327423,
-0.18485042452812195,
-0.0295279361307621,
0.018459122627973557,
-0.07027266919612885,
0.027214858680963516,
-0.022611606866121292,
0.06714002043008804,
0.06153528392314911,
-0.05628955364227295,
-0.07299662381410599,
-0.10687915980815887,
-0.010501994751393795,
-0.06906342506408691,
-0.2135680466890335,
-0.10415568947792053,
-0.02211231179535389,
0.12716317176818848,
-0.21974553167819977,
0.008169523440301418,
0.008369682356715202,
0.123633474111557,
0.03656257688999176,
-0.05284770205616951,
-0.015157963149249554,
0.09191982448101044,
-0.026101229712367058,
-0.06741832941770554,
0.02156991697847843,
0.019955195486545563,
-0.10557428002357483,
-0.007816553115844727,
-0.09114263206720352,
0.1327369511127472,
0.09770161658525467,
-0.024091335013508797,
-0.0777164176106453,
-0.02962985634803772,
-0.08803173899650574,
-0.060711245983839035,
-0.0447043851017952,
0.004624620079994202,
0.14944830536842346,
0.02298792451620102,
0.1127612516283989,
-0.09091280400753021,
-0.05553164705634117,
0.043315865099430084,
0.0030786984134465456,
-0.006529092788696289,
0.11604870110750198,
0.08664966374635696,
-0.05451234057545662,
0.11771570891141891,
0.10775371640920639,
-0.1034204512834549,
0.17139483988285065,
-0.09005745500326157,
-0.1335512399673462,
-0.016504161059856415,
-0.00041800172766670585,
0.03220289200544357,
0.14087706804275513,
-0.12039349228143692,
0.014380578882992268,
0.017955070361495018,
0.03545031696557999,
0.028117068111896515,
-0.2059955596923828,
-0.022725043818354607,
0.03772830590605736,
-0.05047266185283661,
-0.08376432210206985,
0.00024216750171035528,
-0.0009020306752063334,
0.08588716387748718,
-0.004541510250419378,
-0.026804208755493164,
-0.009569281712174416,
-0.011891010217368603,
-0.06991422921419144,
0.1883455514907837,
-0.08483865112066269,
-0.12742821872234344,
-0.15153318643569946,
-0.013300531543791294,
-0.016478722915053368,
-0.011973676271736622,
0.04625262692570686,
-0.10016515105962753,
-0.03161530941724777,
-0.04611853510141373,
0.04663940891623497,
-0.050577033311128616,
0.03225308656692505,
-0.008165203966200352,
0.02466168813407421,
0.10507585853338242,
-0.11175898462533951,
0.03716789558529854,
-0.003654361702501774,
-0.05320555716753006,
0.009978999383747578,
0.04368038475513458,
0.09328369051218033,
0.15448395907878876,
0.03220409154891968,
0.00939586665481329,
-0.025196397677063942,
0.1872366964817047,
-0.11221407353878021,
-0.032369427382946014,
0.12099659442901611,
0.0012847883626818657,
0.04011662304401398,
0.104166679084301,
0.06968937814235687,
-0.07493962347507477,
0.026438025757670403,
0.06796976923942566,
-0.0333409309387207,
-0.24909906089305878,
-0.029310686513781548,
-0.07687897235155106,
-0.01442793570458889,
0.11765415221452713,
0.031985871493816376,
-0.001779391197487712,
0.04979143291711807,
-0.024774441495537758,
0.016230760142207146,
-0.032052841037511826,
0.06532014906406403,
0.06455844640731812,
0.05606384575366974,
0.11944291740655899,
-0.029358919709920883,
-0.02929956279695034,
0.031201764941215515,
0.020221246406435966,
0.2544774115085602,
0.00020953835337422788,
0.13987305760383606,
0.061134446412324905,
0.15533454716205597,
0.016430296003818512,
0.09593246132135391,
0.017817342653870583,
-0.031508516520261765,
0.022822383791208267,
-0.05775199830532074,
-0.01772884465754032,
0.034289076924324036,
0.05059587210416794,
0.050751715898513794,
-0.1396106779575348,
-0.034950848668813705,
-0.0011015040799975395,
0.3363879919052124,
0.07012175768613815,
-0.30731287598609924,
-0.11683222651481628,
0.011590301059186459,
-0.06832737475633621,
-0.04776758328080177,
0.03406572341918945,
0.1075514405965805,
-0.0775936022400856,
0.06549853086471558,
-0.07728951424360275,
0.10897934436798096,
-0.026704126968979836,
-0.015024884603917599,
0.11968642473220825,
0.0779503732919693,
-0.008482472039759159,
0.060358867049217224,
-0.22613857686519623,
0.2841654419898987,
-0.014079066924750805,
0.08428753167390823,
-0.018977485597133636,
0.031212978065013885,
0.04558946564793587,
-0.016310056671500206,
0.054997533559799194,
-0.004585327114909887,
-0.12575654685497284,
-0.20116271078586578,
-0.07490245252847672,
0.02628246136009693,
0.11545204371213913,
-0.053812962025403976,
0.12001758068799973,
-0.030469786375761032,
-0.0017956196097657084,
0.06573370099067688,
-0.08102092146873474,
-0.12346061319112778,
-0.09837896376848221,
0.027204131707549095,
0.005536279641091824,
0.06614799797534943,
-0.1162794828414917,
-0.1250070035457611,
-0.055658239871263504,
0.12445752322673798,
-0.061770129948854446,
-0.009933867491781712,
-0.13405168056488037,
0.09327196329832077,
0.18813592195510864,
-0.055306680500507355,
0.06980512291193008,
0.020292723551392555,
0.13360300660133362,
0.04293577000498772,
-0.020326819270849228,
0.09086785465478897,
-0.08355193585157394,
-0.1999897062778473,
-0.049350351095199585,
0.15991561114788055,
0.04621845483779907,
0.06873735040426254,
-0.032882269471883774,
0.03997649624943733,
-0.01180968340486288,
-0.0867651030421257,
0.043716154992580414,
-0.039404045790433884,
0.023523764684796333,
0.04970114305615425,
-0.042386494576931,
0.048824459314346313,
-0.036629606038331985,
-0.058341119438409805,
0.12474780529737473,
0.26330849528312683,
-0.07673020660877228,
0.006132456008344889,
0.011684724129736423,
-0.05187893286347389,
-0.14443981647491455,
0.05079047381877899,
0.15401725471019745,
0.03826010227203369,
0.017408357933163643,
-0.2412528097629547,
0.09564408659934998,
0.09671267122030258,
-0.02746802754700184,
0.12787367403507233,
-0.3009921610355377,
-0.12159433960914612,
0.09853371232748032,
0.09957126528024673,
-0.037088844925165176,
-0.15762977302074432,
-0.05286731198430061,
-0.010372128337621689,
-0.1415116786956787,
0.12489530444145203,
-0.03671939671039581,
0.12399575859308243,
-0.0055728028528392315,
0.05881623551249504,
0.019882623106241226,
-0.0475931242108345,
0.1504230499267578,
-0.005437162239104509,
0.0636485144495964,
0.007693106308579445,
0.05347311496734619,
0.05086482688784599,
-0.048954084515571594,
-0.005687350407242775,
-0.06812074780464172,
0.018200505524873734,
-0.13331599533557892,
-0.02473974972963333,
-0.09021201729774475,
0.03810718655586243,
-0.053462374955415726,
-0.03685648366808891,
-0.025831004604697227,
0.06033214554190636,
0.05051615834236145,
-0.005442235618829727,
0.16866379976272583,
-0.029232485219836235,
0.17630299925804138,
0.07398451119661331,
0.09111662954092026,
-0.004928876180201769,
-0.11222181469202042,
0.0014795190654695034,
-0.013206494972109795,
0.06581500172615051,
-0.14694394171237946,
0.0343051441013813,
0.14670290052890778,
0.06394441425800323,
0.12084683030843735,
0.0733567550778389,
-0.05772795155644417,
0.01643860712647438,
0.09078820794820786,
-0.07723691314458847,
-0.09308789670467377,
-0.024602899327874184,
0.06156957894563675,
-0.1572369635105133,
0.017255954444408417,
0.08785615116357803,
-0.0686180368065834,
-0.013888437300920486,
0.018000248819589615,
0.009368463419377804,
-0.07139191031455994,
0.2248278260231018,
0.044499147683382034,
0.08246057480573654,
-0.0888657420873642,
0.08660475164651871,
0.04635119438171387,
-0.17108191549777985,
-0.010313833132386208,
0.07123633474111557,
-0.03094221092760563,
-0.005014460068196058,
0.01357412338256836,
0.051049038767814636,
-0.04389060288667679,
-0.05488763004541397,
-0.11819764971733093,
-0.1431370973587036,
0.08371615409851074,
0.11129000782966614,
0.044547680765390396,
0.042232368141412735,
-0.04464082792401314,
0.061407893896102905,
-0.10880789905786514,
0.09823600947856903,
0.10063989460468292,
0.10018672794103622,
-0.16799920797348022,
0.13376958668231964,
0.005859185475856066,
0.010193089954555035,
0.0005340859643183649,
0.004302299581468105,
-0.09505602717399597,
0.015359542332589626,
-0.11902308464050293,
-0.03266561031341553,
-0.04485993832349777,
-0.002027304144576192,
-0.003772970987483859,
-0.059539008885622025,
-0.0906536728143692,
0.02364654839038849,
-0.12140177935361862,
-0.04572378844022751,
-0.006870661862194538,
0.07738132029771805,
-0.09759648889303207,
-0.004356429446488619,
0.07020726054906845,
-0.1164507195353508,
0.06964399665594101,
0.049603916704654694,
0.03361240774393082,
0.04202912002801895,
-0.08603210002183914,
0.010332527570426464,
0.037796784192323685,
0.0010550699662417173,
0.03174912557005882,
-0.16902919113636017,
0.00040280522080138326,
-0.027139928191900253,
0.05153440311551094,
-0.012445756234228611,
0.0036959261633455753,
-0.13481424748897552,
-0.06046035513281822,
-0.019834870472550392,
-0.053612153977155685,
-0.05232766643166542,
0.03149038180708885,
0.08609642088413239,
0.05323014408349991,
0.17437082529067993,
-0.06822766363620758,
0.027061166241765022,
-0.24067828059196472,
0.014675309881567955,
-0.02698744833469391,
-0.08805613964796066,
-0.07299728691577911,
-0.024630291387438774,
0.07502003014087677,
-0.058485329151153564,
0.10029537975788116,
-0.05097408965229988,
0.045793529599905014,
0.043069709092378616,
-0.08176640421152115,
0.04032163694500923,
0.053083136677742004,
0.26795274019241333,
0.053242143243551254,
-0.019761091098189354,
0.07195828855037689,
0.000355486263288185,
0.04640376940369606,
0.1406039148569107,
0.18132174015045166,
0.154890775680542,
0.010503550991415977,
0.07907944917678833,
0.054413266479969025,
-0.10798580944538116,
-0.12879517674446106,
0.11260814219713211,
-0.015347566455602646,
0.13270439207553864,
0.0017842046217992902,
0.22105851769447327,
0.09594425559043884,
-0.2067144513130188,
0.0603698305785656,
-0.05096447840332985,
-0.08250399678945541,
-0.09722957760095596,
-0.0338590107858181,
-0.06427222490310669,
-0.19126373529434204,
0.017918026074767113,
-0.13203725218772888,
0.06449846923351288,
0.06634289026260376,
0.015976836904883385,
0.02242577262222767,
0.14505544304847717,
0.04023243859410286,
0.010486013256013393,
0.10360842943191528,
0.014082865789532661,
-0.013647329062223434,
-0.03928234055638313,
-0.09936989843845367,
0.037844467908144,
-0.033593032509088516,
0.04051986709237099,
-0.07277504354715347,
-0.10284781455993652,
0.07681581377983093,
0.03029690869152546,
-0.09881535172462463,
0.02291577309370041,
-0.0075530605390667915,
0.0794256180524826,
0.06712453812360764,
0.00930712278932333,
0.025167372077703476,
-0.01999060995876789,
0.2594761550426483,
-0.10355346649885178,
-0.030572477728128433,
-0.14537549018859863,
0.22680674493312836,
0.011556806042790413,
-0.03032642789185047,
0.039280641824007034,
-0.07201515883207321,
-0.019581329077482224,
0.13537205755710602,
0.121879443526268,
-0.009367755614221096,
-0.030225509777665138,
0.00558538967743516,
-0.01727975346148014,
-0.0630258172750473,
0.091521255671978,
0.1216929480433464,
0.07750377804040909,
-0.07836303859949112,
-0.06415534764528275,
-0.0642075389623642,
-0.04228653013706207,
-0.0033509740605950356,
0.0824432224035263,
0.02891598269343376,
-0.02790153957903385,
-0.029919937252998352,
0.11422906070947647,
-0.08774719387292862,
-0.10086073726415634,
0.030560573562979698,
-0.16619057953357697,
-0.19445361196994781,
-0.05607016384601593,
0.06135573610663414,
0.014103330671787262,
0.047592535614967346,
-0.017458034679293633,
-0.03745556250214577,
0.09138274937868118,
-0.0015311464667320251,
-0.03436814621090889,
-0.13985684514045715,
0.09750009328126907,
-0.06270711123943329,
0.20640447735786438,
-0.04723117873072624,
0.03192834556102753,
0.1098254844546318,
0.06846129894256592,
-0.08042153716087341,
0.015745660290122032,
0.0769394040107727,
-0.15634626150131226,
0.023485135287046432,
0.20261235535144806,
-0.043866537511348724,
0.1315942108631134,
0.023058194667100906,
-0.13109634816646576,
0.009915119037032127,
-0.07670993357896805,
-0.04479464143514633,
-0.0655486062169075,
-0.023327386006712914,
-0.05596771836280823,
0.1328561007976532,
0.22113193571567535,
-0.0724187046289444,
-0.02754143439233303,
-0.06368187069892883,
0.0320252850651741,
0.06921549886465073,
0.12188533693552017,
-0.028851250186562538,
-0.28119438886642456,
0.018736088648438454,
0.018530355766415596,
-0.014561586081981659,
-0.23371431231498718,
-0.08473161607980728,
0.04968029260635376,
-0.06854333728551865,
-0.0464494451880455,
0.1085667535662651,
0.05651363730430603,
0.04396509379148483,
-0.05657532066106796,
-0.08459972590208054,
-0.06156960874795914,
0.17679159343242645,
-0.16763333976268768,
-0.07042577117681503
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# sew-mid-100k-ft-common-language
This model is a fine-tuned version of [asapp/sew-mid-100k](https://huggingface.co/asapp/sew-mid-100k) on the common_language dataset.
It achieves the following results on the evaluation set:
- Loss: 2.1189
- Accuracy: 0.3842
## Model description
More information needed
## Intended uses & limitations
More information needed
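Although usage is not documented here, the checkpoint can be probed as a spoken language identifier via the `audio-classification` pipeline; given the modest reported accuracy (~0.38), predictions should be treated as rough. The model id matches this repository and the file path is a placeholder:

```python
from transformers import pipeline

language_id = pipeline(
    "audio-classification",
    model="anton-l/sew-mid-100k-ft-common-language",
)

# "clip.wav" is a hypothetical 16 kHz mono speech recording.
print(language_id("clip.wav", top_k=5))
```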
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 4
- seed: 0
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 3.608 | 1.0 | 173 | 3.7266 | 0.0540 |
| 3.1298 | 2.0 | 346 | 3.2180 | 0.1654 |
| 2.8481 | 3.0 | 519 | 2.9270 | 0.2019 |
| 2.648 | 4.0 | 692 | 2.6991 | 0.2619 |
| 2.5 | 5.0 | 865 | 2.5236 | 0.3004 |
| 2.2578 | 6.0 | 1038 | 2.4019 | 0.3212 |
| 2.2782 | 7.0 | 1211 | 2.1698 | 0.3658 |
| 2.1665 | 8.0 | 1384 | 2.1976 | 0.3631 |
| 2.1626 | 9.0 | 1557 | 2.1473 | 0.3791 |
| 2.1514 | 10.0 | 1730 | 2.1189 | 0.3842 |
### Framework versions
- Transformers 4.12.0.dev0
- Pytorch 1.9.1+cu111
- Datasets 1.14.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["audio-classification", "generated_from_trainer"], "datasets": ["common_language"], "metrics": ["accuracy"], "model-index": [{"name": "sew-mid-100k-ft-common-language", "results": []}]}
|
audio-classification
|
anton-l/sew-mid-100k-ft-common-language
|
[
"transformers",
"pytorch",
"tensorboard",
"sew",
"audio-classification",
"generated_from_trainer",
"dataset:common_language",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #sew #audio-classification #generated_from_trainer #dataset-common_language #license-apache-2.0 #endpoints_compatible #region-us
|
sew-mid-100k-ft-common-language
===============================
This model is a fine-tuned version of asapp/sew-mid-100k on the common\_language dataset.
It achieves the following results on the evaluation set:
* Loss: 2.1189
* Accuracy: 0.3842
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 3e-05
* train\_batch\_size: 32
* eval\_batch\_size: 4
* seed: 0
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.1
* num\_epochs: 10.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.12.0.dev0
* Pytorch 1.9.1+cu111
* Datasets 1.14.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 4\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 10.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #sew #audio-classification #generated_from_trainer #dataset-common_language #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 4\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 10.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
57,
160,
4,
37
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #sew #audio-classification #generated_from_trainer #dataset-common_language #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 4\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 10.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.12.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
-0.14161452651023865,
0.09353943914175034,
-0.0021780505776405334,
0.058706317096948624,
0.1335739940404892,
0.004155070520937443,
0.09658361971378326,
0.12251022458076477,
-0.1058545634150505,
0.08189885318279266,
0.09108421951532364,
0.08789347112178802,
0.0471930168569088,
0.09641586989164352,
-0.02123018354177475,
-0.3222353160381317,
0.01592879928648472,
0.03368976339697838,
-0.1771252304315567,
0.11805285513401031,
0.1124933660030365,
-0.09272513538599014,
0.03194079548120499,
0.04833695665001869,
-0.13967078924179077,
0.009410535916686058,
-0.01646442338824272,
-0.06268562376499176,
0.10094373673200607,
0.04714994505047798,
0.10648838430643082,
0.02459714002907276,
0.07841099053621292,
-0.23177945613861084,
0.014288623817265034,
0.06546465307474136,
0.04017005115747452,
0.07862341403961182,
0.10058523714542389,
-0.014408866874873638,
0.11553090810775757,
-0.06233343109488487,
0.07090769708156586,
0.059098031371831894,
-0.10951443016529083,
-0.2978116273880005,
-0.10083801299333572,
0.03797898069024086,
0.10413108021020889,
0.09414423257112503,
-0.024957064539194107,
0.06639979034662247,
-0.06678102165460587,
0.08461122959852219,
0.2274307906627655,
-0.23187190294265747,
-0.08573613315820694,
-0.012087088078260422,
0.07269884645938873,
0.04902789741754532,
-0.12240875512361526,
-0.027131536975502968,
0.05802086740732193,
0.039260074496269226,
0.10352834314107895,
0.011562815867364407,
-0.010127393528819084,
0.0006098762387409806,
-0.14188340306282043,
-0.06433653831481934,
0.16656793653964996,
0.08748461306095123,
-0.055690035223960876,
-0.059791479259729385,
-0.010407515801489353,
-0.24258272349834442,
-0.04216481000185013,
0.020074723288416862,
0.03131477162241936,
-0.04597165808081627,
-0.1087484359741211,
0.008783658035099506,
-0.08819707483053207,
-0.10133930295705795,
0.02887623757123947,
0.17999260127544403,
0.045824915170669556,
-0.01988626830279827,
-0.002380771329626441,
0.10682455450296402,
0.03370344638824463,
-0.14859327673912048,
-0.02262462116777897,
0.03628452122211456,
-0.09027760475873947,
-0.026118727400898933,
-0.059268008917570114,
-0.029572540894150734,
-0.011000974103808403,
0.15891027450561523,
-0.05950210988521576,
0.07202092558145523,
0.03747645765542984,
0.031960684806108475,
-0.08255111426115036,
0.17805932462215424,
-0.05798957496881485,
-0.014412648044526577,
-0.038705501705408096,
0.0870937779545784,
-0.012492929585278034,
-0.019444134086370468,
-0.052053868770599365,
0.02477322891354561,
0.104557566344738,
0.02904529869556427,
-0.03935583680868149,
0.009057958610355854,
-0.06761830300092697,
-0.03520214557647705,
0.0006158276228234172,
-0.09961185604333878,
0.019581710919737816,
0.010422898456454277,
-0.07194128632545471,
0.009725208394229412,
0.016146153211593628,
0.028564853593707085,
0.0020643225871026516,
0.12508894503116608,
-0.08507222682237625,
-0.01310693845152855,
-0.09304369240999222,
-0.08946873992681503,
0.042071107774972916,
-0.04558078944683075,
0.020491862669587135,
-0.06392797827720642,
-0.12592211365699768,
-0.040519583970308304,
0.0640743225812912,
-0.025390466675162315,
-0.10149842500686646,
-0.03701580688357353,
-0.07429121434688568,
0.049652423709630966,
-0.029695844277739525,
0.15364769101142883,
-0.06331638246774673,
0.10915397107601166,
0.044805314391851425,
0.03468872234225273,
0.013955684378743172,
0.07006214559078217,
-0.05833623558282852,
0.05664025992155075,
-0.15082868933677673,
0.0628385841846466,
-0.1006990596652031,
0.06099905073642731,
-0.12216784805059433,
-0.13059474527835846,
-0.02948722057044506,
0.007336983922868967,
0.09178736060857773,
0.08391477912664413,
-0.16584739089012146,
-0.10891204327344894,
0.14694972336292267,
-0.08438698947429657,
-0.12645657360553741,
0.1290578991174698,
-0.018365323543548584,
-0.0036937352269887924,
0.048261914402246475,
0.14817702770233154,
0.14090652763843536,
-0.08999133110046387,
-0.025500858202576637,
-0.05224880203604698,
0.13221527636051178,
0.0008088327595032752,
0.11530458927154541,
-0.026900723576545715,
0.003235303796827793,
0.0026212171651422977,
-0.04572639241814613,
0.06286939233541489,
-0.0972437784075737,
-0.08986382186412811,
-0.035364456474781036,
-0.09304220974445343,
0.025706879794597626,
0.06404543668031693,
0.04543063044548035,
-0.08957193791866302,
-0.12264877557754517,
0.0851946473121643,
0.11534937471151352,
-0.09096714854240417,
0.03323347866535187,
-0.07001040875911713,
0.050841882824897766,
-0.04576954245567322,
-0.032595884054899216,
-0.18471819162368774,
-0.04203784093260765,
0.014798782765865326,
-0.0803261399269104,
0.03090372495353222,
-0.02727533131837845,
0.07103331387042999,
0.055049456655979156,
-0.05119052901864052,
-0.06855393946170807,
-0.11386014521121979,
-0.009707847610116005,
-0.06981519609689713,
-0.20340198278427124,
-0.09218709915876389,
-0.017828192561864853,
0.1615101844072342,
-0.2146931141614914,
0.01486684288829565,
0.019816486164927483,
0.11561082303524017,
0.03611462190747261,
-0.054795097559690475,
-0.014611250720918179,
0.08253207057714462,
-0.024455836042761803,
-0.05930415540933609,
0.018916763365268707,
0.027456264942884445,
-0.12380877137184143,
-0.003953562118113041,
-0.09685465693473816,
0.12177461385726929,
0.09541333466768265,
-0.024847829714417458,
-0.07527842372655869,
-0.03804675117135048,
-0.08602894842624664,
-0.06708022952079773,
-0.030114123597741127,
-0.013208835385739803,
0.1612839698791504,
0.025057503953576088,
0.12333667278289795,
-0.08740261197090149,
-0.048083726316690445,
0.04447732865810394,
-0.004826645366847515,
-0.0034764432348310947,
0.1176019012928009,
0.06648111343383789,
-0.028866436332464218,
0.11861809343099594,
0.10310982167720795,
-0.10421689599752426,
0.17412087321281433,
-0.09047079086303711,
-0.13045138120651245,
-0.014192236587405205,
0.006290174555033445,
0.03202245756983757,
0.12636791169643402,
-0.1351790428161621,
0.015250383876264095,
0.020023001357913017,
0.04074689373373985,
0.0286357793956995,
-0.20503120124340057,
-0.01456177793443203,
0.05095342546701431,
-0.05571611970663071,
-0.06376148015260696,
-0.007525672670453787,
-0.010164109990000725,
0.07088664174079895,
0.007476708386093378,
-0.031805671751499176,
0.0030630072578787804,
-0.010671627707779408,
-0.07789346575737,
0.19467756152153015,
-0.08904826641082764,
-0.13448037207126617,
-0.1691529005765915,
-0.021627428010106087,
-0.028753239661455154,
-0.010384717024862766,
0.05094277858734131,
-0.1021234542131424,
-0.03354312852025032,
-0.044119417667388916,
0.056724365800619125,
-0.05524815246462822,
0.03361985832452774,
0.011966879479587078,
0.015758901834487915,
0.10236359387636185,
-0.10980677604675293,
0.03340444341301918,
-0.0032879915088415146,
-0.04201044514775276,
0.0031800116412341595,
0.02287176065146923,
0.10243132710456848,
0.16567865014076233,
0.03719739988446236,
0.017856325954198837,
-0.02423040010035038,
0.19636832177639008,
-0.11891329288482666,
-0.0424172580242157,
0.12137430161237717,
-0.0014448651345446706,
0.03967021033167839,
0.10131259262561798,
0.06811533123254776,
-0.07807987928390503,
0.03069266490638256,
0.07399731129407883,
-0.037513285875320435,
-0.24828459322452545,
-0.02772809937596321,
-0.07366157323122025,
-0.00429856451228261,
0.10852578282356262,
0.03041308932006359,
0.033435188233852386,
0.049345292150974274,
-0.017417507246136665,
0.02595793642103672,
-0.006326018366962671,
0.06888706982135773,
0.07871988415718079,
0.05505744740366936,
0.11810313910245895,
-0.025859903544187546,
-0.02512500435113907,
0.03328034281730652,
0.02270684391260147,
0.25169652700424194,
0.012833036482334137,
0.18522022664546967,
0.06853961944580078,
0.1474994719028473,
0.017358778044581413,
0.08712030947208405,
0.015653356909751892,
-0.030498644337058067,
0.02850235067307949,
-0.055079374462366104,
-0.026016125455498695,
0.02829020284116268,
0.04466407001018524,
0.053285062313079834,
-0.1462470442056656,
-0.027910206466913223,
-0.0006047018687240779,
0.33898061513900757,
0.06558208912611008,
-0.32218724489212036,
-0.12267231196165085,
0.007586234714835882,
-0.06524556130170822,
-0.04908004403114319,
0.034149035811424255,
0.07875054329633713,
-0.08633384108543396,
0.064796581864357,
-0.07872200012207031,
0.11138436943292618,
-0.0364004485309124,
-0.009380958043038845,
0.11364749073982239,
0.07690874487161636,
-0.008284131996333599,
0.06177399680018425,
-0.22242209315299988,
0.28091099858283997,
-0.014602270908653736,
0.08714938163757324,
-0.014384381473064423,
0.03147711977362633,
0.0360088087618351,
-0.007383888121694326,
0.0614435076713562,
-0.009188837371766567,
-0.09763643145561218,
-0.19565854966640472,
-0.0827634334564209,
0.02583116479218006,
0.11522186547517776,
-0.06394173949956894,
0.12329185009002686,
-0.033969443291425705,
0.0004706428153440356,
0.06290140748023987,
-0.07912294566631317,
-0.11630642414093018,
-0.10515028238296509,
0.028838183730840683,
0.0148983895778656,
0.08397485315799713,
-0.1098080724477768,
-0.12564460933208466,
-0.05926520749926567,
0.1496640294790268,
-0.0649622306227684,
-0.022619109600782394,
-0.13722479343414307,
0.08532343804836273,
0.17997528612613678,
-0.05859210714697838,
0.07385377585887909,
0.017011888325214386,
0.14322343468666077,
0.04013330116868019,
-0.015151642262935638,
0.08776340633630753,
-0.07778827100992203,
-0.19751952588558197,
-0.035813361406326294,
0.16927103698253632,
0.03860597312450409,
0.06288646161556244,
-0.028863895684480667,
0.032977525144815445,
-0.010736113414168358,
-0.08929566293954849,
0.03477973863482475,
-0.03594609722495079,
0.009610111825168133,
0.0471857488155365,
-0.03079964593052864,
0.052382323890924454,
-0.04434476047754288,
-0.05895334854722023,
0.13144783675670624,
0.2728806436061859,
-0.07192923873662949,
0.0056881411001086235,
0.030736178159713745,
-0.04802672564983368,
-0.13551364839076996,
0.03822328522801399,
0.1572658121585846,
0.04000643268227577,
0.02488001622259617,
-0.22834055125713348,
0.08883760124444962,
0.10428717732429504,
-0.025298651307821274,
0.10324934124946594,
-0.30815935134887695,
-0.12917177379131317,
0.09939131140708923,
0.1036774218082428,
-0.03840792179107666,
-0.15895096957683563,
-0.065183125436306,
-0.023119552060961723,
-0.15766951441764832,
0.11568957567214966,
-0.04234318435192108,
0.1241767406463623,
-0.0017044759588316083,
0.06453042477369308,
0.019393058493733406,
-0.045710306614637375,
0.15652652084827423,
0.004728466738015413,
0.05074405297636986,
0.005283234175294638,
0.06638195365667343,
0.046427324414253235,
-0.05570485070347786,
0.007357024122029543,
-0.06780386716127396,
0.016742326319217682,
-0.13369391858577728,
-0.037253424525260925,
-0.08957850933074951,
0.029181137681007385,
-0.053858816623687744,
-0.03486628457903862,
-0.022859565913677216,
0.04028855636715889,
0.048876240849494934,
-0.005118933506309986,
0.17282618582248688,
-0.028155086562037468,
0.1576841026544571,
0.07555077224969864,
0.09420399367809296,
-0.010054443031549454,
-0.13879892230033875,
-0.003106849966570735,
-0.017358899116516113,
0.059352681040763855,
-0.13992777466773987,
0.041501715779304504,
0.14820709824562073,
0.06268470734357834,
0.13630123436450958,
0.07255050539970398,
-0.06433375924825668,
0.020049354061484337,
0.0815921500325203,
-0.07005810737609863,
-0.11189959943294525,
-0.03366350382566452,
0.037394892424345016,
-0.16151590645313263,
0.025652553886175156,
0.0993729680776596,
-0.06786132603883743,
-0.00790323130786419,
0.018995169550180435,
0.003757922677323222,
-0.0742795392870903,
0.22890859842300415,
0.04467715695500374,
0.08043520897626877,
-0.09119150042533875,
0.08589403331279755,
0.044460147619247437,
-0.16994518041610718,
-0.008431035093963146,
0.06694109737873077,
-0.04070369899272919,
-0.005426664371043444,
0.011124160140752792,
0.053968992084264755,
-0.060874857008457184,
-0.06437116116285324,
-0.12700442969799042,
-0.1444469392299652,
0.08426422625780106,
0.10054760426282883,
0.05192504823207855,
0.045279067009687424,
-0.04250626266002655,
0.05756225809454918,
-0.11162727326154709,
0.1013684794306755,
0.10523894429206848,
0.09400773048400879,
-0.16751597821712494,
0.12146608531475067,
0.013671964406967163,
0.019824862480163574,
0.00176167581230402,
-0.009052883833646774,
-0.08537508547306061,
0.024164527654647827,
-0.12039517611265182,
-0.02845098450779915,
-0.03689064830541611,
-0.0016498379409313202,
0.00023360228806268424,
-0.05471161752939224,
-0.08242025971412659,
0.029584411531686783,
-0.11979416757822037,
-0.05189770832657814,
-0.0034273487981408834,
0.07097933441400528,
-0.10020746290683746,
-0.0078083183616399765,
0.07423964887857437,
-0.11783882975578308,
0.0698641687631607,
0.049813151359558105,
0.03890666738152504,
0.050515271723270416,
-0.081105537712574,
0.016978401690721512,
0.046520527452230453,
-0.005120843183249235,
0.027039308100938797,
-0.1652427315711975,
-0.009670712053775787,
-0.01945348270237446,
0.05090012028813362,
-0.012315123341977596,
0.0030884877778589725,
-0.13668453693389893,
-0.05776654928922653,
-0.02007148042321205,
-0.050590258091688156,
-0.0567496120929718,
0.03531118109822273,
0.07415780425071716,
0.06501850485801697,
0.1774347573518753,
-0.06430355459451675,
0.028456391766667366,
-0.22944721579551697,
0.015637779608368874,
-0.025986546650528908,
-0.08149635791778564,
-0.07342968136072159,
-0.028755242004990578,
0.06752409040927887,
-0.05933716893196106,
0.08476950228214264,
-0.06480946391820908,
0.06558587402105331,
0.0429382249712944,
-0.07360684126615524,
0.04147739335894585,
0.049617018550634384,
0.2632196247577667,
0.05467973276972771,
-0.013706155121326447,
0.07911419123411179,
0.004656882490962744,
0.051473088562488556,
0.13132523000240326,
0.16510312259197235,
0.1524379402399063,
-0.01193341612815857,
0.08704213052988052,
0.04735548049211502,
-0.08953163772821426,
-0.14619788527488708,
0.09621293842792511,
-0.01934598572552204,
0.13190729916095734,
0.0115238968282938,
0.22885851562023163,
0.10736313462257385,
-0.1974201202392578,
0.06618361175060272,
-0.0330854132771492,
-0.09325768053531647,
-0.09982369840145111,
-0.039001885801553726,
-0.06965995579957962,
-0.18521958589553833,
0.015243506990373135,
-0.13163934648036957,
0.054866332560777664,
0.07656972855329514,
0.02369334176182747,
0.023072315379977226,
0.15967901051044464,
0.03111250139772892,
-0.001875458750873804,
0.09081314504146576,
-0.0001028549813781865,
-0.01655622199177742,
-0.04785942658782005,
-0.09737473726272583,
0.04349267855286598,
-0.02851445972919464,
0.04512649402022362,
-0.0717630609869957,
-0.10022854059934616,
0.07106223702430725,
0.02663276344537735,
-0.10191847383975983,
0.021091219037771225,
-0.0029204708989709616,
0.07931744307279587,
0.05306888744235039,
0.008105084300041199,
0.01434324961155653,
-0.017862269654870033,
0.2529340982437134,
-0.09451369196176529,
-0.034867435693740845,
-0.14933963119983673,
0.20953738689422607,
0.01962951384484768,
-0.027088738977909088,
0.0363524854183197,
-0.07961182296276093,
-0.010246438905596733,
0.15091988444328308,
0.10938912630081177,
-0.0061972737312316895,
-0.031877052038908005,
0.006371483206748962,
-0.01427130401134491,
-0.051151569932699203,
0.08275499939918518,
0.11244682967662811,
0.08017571270465851,
-0.07364534586668015,
-0.04638553783297539,
-0.06222943589091301,
-0.046476125717163086,
-0.008504943922162056,
0.082515187561512,
0.021635059267282486,
-0.019873814657330513,
-0.03004581853747368,
0.11818798631429672,
-0.07554899901151657,
-0.10683570057153702,
0.028881821781396866,
-0.161584734916687,
-0.189999520778656,
-0.05940304324030876,
0.05594980716705322,
0.011637788265943527,
0.05053997039794922,
-0.014141584746539593,
-0.04106611758470535,
0.10237234085798264,
0.0006354512879624963,
-0.047931134700775146,
-0.13523593544960022,
0.09553980827331543,
-0.045181386172771454,
0.2037406712770462,
-0.045724768191576004,
0.034523237496614456,
0.10586614906787872,
0.07220327109098434,
-0.07514195144176483,
0.0204366073012352,
0.07470084726810455,
-0.13896892964839935,
0.02740560472011566,
0.193024143576622,
-0.04684806242585182,
0.1212342157959938,
0.01828812249004841,
-0.1472698450088501,
0.0035401422064751387,
-0.07588241249322891,
-0.04939865693449974,
-0.06294495612382889,
-0.029205381870269775,
-0.060865432024002075,
0.12426088750362396,
0.21579419076442719,
-0.07081108540296555,
-0.028577808290719986,
-0.0622774139046669,
0.033444080501794815,
0.07086998969316483,
0.11818382889032364,
-0.02754232846200466,
-0.28433600068092346,
0.016702638939023018,
0.022711096331477165,
-0.019593575969338417,
-0.24537116289138794,
-0.09267447888851166,
0.04433909058570862,
-0.06079292297363281,
-0.03774823248386383,
0.10828510671854019,
0.06535787135362625,
0.04121329262852669,
-0.053521778434515,
-0.07444304972887039,
-0.06479794532060623,
0.17723548412322998,
-0.17000211775302887,
-0.07969309389591217
] |
null | null |
transformers
|
# sew-mid-100k-ft-keyword-spotting
This model is a fine-tuned version of [asapp/sew-mid-100k](https://huggingface.co/asapp/sew-mid-100k) on the superb dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0975
- Accuracy: 0.9757
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 0
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5.0
- mixed_precision_training: Native AMP
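As a rough sketch only, the hyperparameters above translate to `TrainingArguments` approximately as follows; the `output_dir` value is a placeholder, and the effective batch size of 128 is the product of 32 examples per device and 4 gradient-accumulation steps.

```python
from transformers import TrainingArguments

# Approximate mapping of the hyperparameters listed above (placeholder output_dir).
training_args = TrainingArguments(
    output_dir="sew-mid-100k-ft-keyword-spotting",
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,   # 32 x 4 = 128 effective train batch size
    num_train_epochs=5.0,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=0,
    fp16=True,                       # native AMP mixed-precision training
)
```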
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.5999 | 1.0 | 399 | 0.2262 | 0.9635 |
| 0.4271 | 2.0 | 798 | 0.1230 | 0.9697 |
| 0.3778 | 3.0 | 1197 | 0.1052 | 0.9731 |
| 0.3227 | 4.0 | 1596 | 0.0975 | 0.9757 |
| 0.3081 | 5.0 | 1995 | 0.0962 | 0.9753 |
### Framework versions
- Transformers 4.12.0.dev0
- Pytorch 1.9.1+cu111
- Datasets 1.14.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["audio-classification", "generated_from_trainer"], "datasets": ["superb"], "metrics": ["accuracy"], "model-index": [{"name": "sew-mid-100k-ft-keyword-spotting", "results": []}]}
|
audio-classification
|
anton-l/sew-mid-100k-ft-keyword-spotting
|
[
"transformers",
"pytorch",
"tensorboard",
"sew",
"audio-classification",
"generated_from_trainer",
"dataset:superb",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #sew #audio-classification #generated_from_trainer #dataset-superb #license-apache-2.0 #endpoints_compatible #region-us
|
sew-mid-100k-ft-keyword-spotting
================================
This model is a fine-tuned version of asapp/sew-mid-100k on the superb dataset.
It achieves the following results on the evaluation set:
* Loss: 0.0975
* Accuracy: 0.9757
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 3e-05
* train\_batch\_size: 32
* eval\_batch\_size: 32
* seed: 0
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.1
* num\_epochs: 5.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.12.0.dev0
* Pytorch 1.9.1+cu111
* Datasets 1.14.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #sew #audio-classification #generated_from_trainer #dataset-superb #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
55,
159,
4,
37
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #sew #audio-classification #generated_from_trainer #dataset-superb #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.12.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
-0.1464269757270813,
0.10994111001491547,
-0.0018083736067637801,
0.06266514956951141,
0.13755157589912415,
0.019578304141759872,
0.10784547030925751,
0.11849314719438553,
-0.11424553394317627,
0.06876502186059952,
0.09433490037918091,
0.09563075751066208,
0.035231925547122955,
0.10164856910705566,
-0.02110489271581173,
-0.30432066321372986,
0.016261553391814232,
0.025769973173737526,
-0.19370701909065247,
0.11918850988149643,
0.09979657828807831,
-0.1066652312874794,
0.049889061599969864,
0.046470124274492264,
-0.15093128383159637,
0.003791776718571782,
-0.018422815948724747,
-0.06537961959838867,
0.09929554164409637,
0.03477106988430023,
0.1061200350522995,
0.028334317728877068,
0.09027854353189468,
-0.19818969070911407,
0.013743946328759193,
0.07143158465623856,
0.030628902837634087,
0.08437518775463104,
0.08031569421291351,
-0.009000181220471859,
0.11263219267129898,
-0.06406868249177933,
0.07652657479047775,
0.05730175971984863,
-0.10893771052360535,
-0.3039189875125885,
-0.1081233099102974,
0.05624908208847046,
0.10268917679786682,
0.08344059437513351,
-0.012895042076706886,
0.07726631313562393,
-0.05920243263244629,
0.07952208071947098,
0.22976645827293396,
-0.2438025325536728,
-0.08705593645572662,
-0.012450048699975014,
0.06232363358139992,
0.052894704043865204,
-0.10887572914361954,
-0.017085224390029907,
0.047940097749233246,
0.03593296557664871,
0.1126600056886673,
0.009360568597912788,
0.003246760694310069,
0.010080952197313309,
-0.15006785094738007,
-0.06081346794962883,
0.1705900877714157,
0.0785042867064476,
-0.06759367138147354,
-0.055311888456344604,
-0.033884353935718536,
-0.24061831831932068,
-0.03973468765616417,
0.009351140819489956,
0.0358612947165966,
-0.05429903045296669,
-0.11823917925357819,
-0.008573133498430252,
-0.07506714016199112,
-0.1198205053806305,
0.019483687356114388,
0.21477214992046356,
0.047026392072439194,
0.0007775588892400265,
-0.009573346003890038,
0.12312144041061401,
0.0446581207215786,
-0.15857745707035065,
-0.03085954487323761,
0.03415955975651741,
-0.0849141925573349,
-0.022322863340377808,
-0.05919112637639046,
-0.012209699489176273,
-0.01070309802889824,
0.18365943431854248,
-0.047365691512823105,
0.06253450363874435,
0.046245794743299484,
0.028654132038354874,
-0.08723856508731842,
0.17759904265403748,
-0.04961608722805977,
-0.03625934571027756,
-0.019531041383743286,
0.09872780740261078,
0.022381996735930443,
-0.020654115825891495,
-0.06926199048757553,
0.020955165848135948,
0.0906134843826294,
0.0266744252294302,
-0.02340075746178627,
0.015307905152440071,
-0.06175671145319939,
-0.04064645245671272,
0.046736329793930054,
-0.09389056265354156,
0.018227016553282738,
0.014437536709010601,
-0.08061983436346054,
-0.010785163380205631,
0.018533673137426376,
0.03144121170043945,
0.013718261383473873,
0.13885943591594696,
-0.09162049740552902,
-0.019052328541874886,
-0.08815682679414749,
-0.07767848670482635,
0.038576602935791016,
-0.06256139278411865,
0.01685298979282379,
-0.06211911514401436,
-0.15292638540267944,
-0.017739158123731613,
0.05380076915025711,
-0.0227347519248724,
-0.10102114081382751,
-0.02621162123978138,
-0.08506011217832565,
0.049161456525325775,
-0.023966237902641296,
0.146957665681839,
-0.06709154695272446,
0.10594196617603302,
0.06980077922344208,
0.03733820468187332,
0.0015149449463933706,
0.06026165187358856,
-0.0585356280207634,
0.05257480964064598,
-0.16438491642475128,
0.05485754832625389,
-0.09079053997993469,
0.04962814599275589,
-0.11335024237632751,
-0.12586964666843414,
-0.015940196812152863,
0.0028341093566268682,
0.08281243592500687,
0.09614536166191101,
-0.15613064169883728,
-0.11698903888463974,
0.1289089024066925,
-0.08804436028003693,
-0.14050647616386414,
0.12056902050971985,
-0.01388632319867611,
-0.03143748641014099,
0.04613873362541199,
0.1092613935470581,
0.13638301193714142,
-0.08844972401857376,
-0.030518708750605583,
-0.04398062080144882,
0.12390368431806564,
-0.007461344823241234,
0.1280994415283203,
-0.019739173352718353,
0.014448489062488079,
0.004078418482095003,
-0.07227665930986404,
0.04636680707335472,
-0.09432988613843918,
-0.09782451391220093,
-0.045237865298986435,
-0.09512579441070557,
0.04058755934238434,
0.06113651394844055,
0.04155460745096207,
-0.08524098992347717,
-0.13326036930084229,
0.08618728816509247,
0.13030138611793518,
-0.07525157928466797,
0.032826680690050125,
-0.08290081471204758,
0.07118090242147446,
-0.05754417926073074,
-0.029442811384797096,
-0.184231698513031,
-0.042490892112255096,
0.009564791806042194,
-0.06429357826709747,
0.021985922008752823,
-0.023671023547649384,
0.06951891630887985,
0.07808148115873337,
-0.05672823637723923,
-0.07715020328760147,
-0.12330424785614014,
-0.014376583509147167,
-0.0692916139960289,
-0.2171511948108673,
-0.11067353934049606,
-0.017607925459742546,
0.1522606611251831,
-0.19080151617527008,
0.021112294867634773,
0.03142603486776352,
0.12488521635532379,
0.039387915283441544,
-0.057715851813554764,
-0.022244738414883614,
0.08314739167690277,
-0.031023986637592316,
-0.06515644490718842,
0.022364327684044838,
0.023122640326619148,
-0.12509147822856903,
-0.03908077999949455,
-0.09337125718593597,
0.14696559309959412,
0.10752394795417786,
-0.027972489595413208,
-0.06222788617014885,
-0.010374566540122032,
-0.09032108634710312,
-0.056693460792303085,
-0.033127348870038986,
-0.0007370691746473312,
0.14825664460659027,
0.027717387303709984,
0.12648119032382965,
-0.08357325941324234,
-0.054647937417030334,
0.04404725879430771,
0.00969831645488739,
-0.0037756473757326603,
0.09450046718120575,
0.07274428755044937,
-0.03837912902235985,
0.1238618865609169,
0.12332966923713684,
-0.10551611334085464,
0.17079050838947296,
-0.09027518332004547,
-0.10961487889289856,
-0.02041601575911045,
-0.01050730049610138,
0.029071591794490814,
0.13948993384838104,
-0.14354415237903595,
0.0027415051590651274,
0.018826685845851898,
0.033057354390621185,
0.018339386209845543,
-0.2066897600889206,
-0.012514290399849415,
0.053374066948890686,
-0.057057932019233704,
-0.06585050374269485,
-0.005320119671523571,
-0.017341405153274536,
0.0701657235622406,
0.01503005065023899,
-0.025362437590956688,
0.011952458880841732,
-0.0018418962135910988,
-0.06743919849395752,
0.18920806050300598,
-0.08566047996282578,
-0.11122943460941315,
-0.17404192686080933,
-0.011089412495493889,
-0.06060481816530228,
-0.0028678737580776215,
0.0468791201710701,
-0.09976111352443695,
-0.029060324653983116,
-0.033019449561834335,
0.058157358318567276,
-0.042938366532325745,
0.04461151733994484,
0.01927625760436058,
0.00961911678314209,
0.10416611284017563,
-0.11275308579206467,
0.0333067961037159,
-0.0066653285175561905,
-0.058286167681217194,
-0.01609293557703495,
0.053504474461078644,
0.10967440158128738,
0.1512359082698822,
0.03627030923962593,
0.018809093162417412,
-0.014382483437657356,
0.21256867051124573,
-0.1198185384273529,
-0.04057130590081215,
0.14727407693862915,
-0.011954415589571,
0.03966299071907997,
0.0853683352470398,
0.06960460543632507,
-0.07424885779619217,
0.011636219918727875,
0.06619174778461456,
-0.03753375634551048,
-0.23579610884189606,
-0.02390606328845024,
-0.06607483327388763,
-0.008079448714852333,
0.09337250143289566,
0.03459911420941353,
0.034714024513959885,
0.05805654078722,
-0.03214646875858307,
0.03497590497136116,
-0.02574063651263714,
0.0743037760257721,
0.07525385171175003,
0.06483859568834305,
0.12500828504562378,
-0.03263494744896889,
-0.0231313593685627,
0.034628547728061676,
0.019416851922869682,
0.2238251268863678,
0.004354848992079496,
0.18281488120555878,
0.0631437599658966,
0.15417763590812683,
0.009238218888640404,
0.07373785972595215,
0.02381855435669422,
-0.03138655424118042,
0.02166666090488434,
-0.054926082491874695,
-0.034996870905160904,
0.027450159192085266,
0.019767286255955696,
0.07454438507556915,
-0.1435571312904358,
-0.0045725698582828045,
0.012126132845878601,
0.3414156138896942,
0.05392611399292946,
-0.3301005959510803,
-0.1300644874572754,
0.015453940257430077,
-0.04765717312693596,
-0.03385189548134804,
0.023940501734614372,
0.08748676627874374,
-0.08023349195718765,
0.08451274037361145,
-0.08045424520969391,
0.10055956244468689,
-0.04114833474159241,
-0.0001990644377656281,
0.1234954372048378,
0.09264204651117325,
-0.002726704115048051,
0.05086581036448479,
-0.20230431854724884,
0.25897860527038574,
-0.0023435174953192472,
0.0984252318739891,
-0.021313590928912163,
0.03241772577166557,
0.043078530579805374,
0.03756396099925041,
0.058471061289310455,
-0.015261317603290081,
-0.08598168194293976,
-0.1846642643213272,
-0.07931381464004517,
0.026480015367269516,
0.110808826982975,
-0.08315783739089966,
0.12155381590127945,
-0.040648624300956726,
-0.011229258030653,
0.06378662586212158,
-0.06031901389360428,
-0.09569784998893738,
-0.09519190341234207,
0.02719051204621792,
0.003565124236047268,
0.05980692803859711,
-0.10925030708312988,
-0.12245293706655502,
-0.037564557045698166,
0.14666876196861267,
-0.09929556399583817,
-0.028423702344298363,
-0.1449442207813263,
0.07892604917287827,
0.1632673144340515,
-0.06128093600273132,
0.07197172939777374,
0.012437884695827961,
0.1445455104112625,
0.04363071545958519,
-0.034690145403146744,
0.0778936892747879,
-0.08352470397949219,
-0.2320113629102707,
-0.03653079271316528,
0.1549474447965622,
0.032365549355745316,
0.06035429239273071,
-0.04091804474592209,
0.04305504262447357,
-0.006937440950423479,
-0.08744975179433823,
0.04853403568267822,
-0.0271549541503191,
0.036820217967033386,
0.03599386289715767,
-0.0061542256735265255,
0.06263281404972076,
-0.030773311853408813,
-0.04111802577972412,
0.10848525911569595,
0.2801148593425751,
-0.0747540220618248,
0.0008054068312048912,
0.02673356793820858,
-0.04435303062200546,
-0.14812131226062775,
0.03951307386159897,
0.13520021736621857,
0.026895347982645035,
0.011573509313166142,
-0.23228754103183746,
0.09286826848983765,
0.09786577522754669,
-0.03292730078101158,
0.11580195277929306,
-0.31126686930656433,
-0.1349477618932724,
0.09541318565607071,
0.11111727356910706,
-0.0006576825398951769,
-0.16250771284103394,
-0.06066960468888283,
-0.02472636103630066,
-0.1549576073884964,
0.11005887389183044,
-0.06962379813194275,
0.11379067599773407,
-0.011772969737648964,
0.047324761748313904,
0.012975776568055153,
-0.04307277873158455,
0.16009780764579773,
0.004308432340621948,
0.06144694611430168,
0.0054453653283417225,
0.04528903588652611,
0.05718120560050011,
-0.05359211564064026,
-0.007668434176594019,
-0.08473356813192368,
0.02405661903321743,
-0.1173342764377594,
-0.028284870088100433,
-0.08604230731725693,
0.03563946858048439,
-0.05194873362779617,
-0.0368841215968132,
-0.02129359170794487,
0.03295491635799408,
0.030272454023361206,
-0.016104530543088913,
0.20898275077342987,
-0.002804997144266963,
0.15548712015151978,
0.08774248510599136,
0.09389036893844604,
-0.0026995742227882147,
-0.1409894973039627,
-0.007553845643997192,
-0.020151477307081223,
0.07235373556613922,
-0.15346446633338928,
0.033064860850572586,
0.12972527742385864,
0.0683557540178299,
0.11582747846841812,
0.07854235172271729,
-0.057542115449905396,
0.024261649698019028,
0.08633685111999512,
-0.10169712454080582,
-0.11724915355443954,
-0.04020141065120697,
-0.0005057934322394431,
-0.15788915753364563,
0.05411819741129875,
0.10345172882080078,
-0.06399746239185333,
-0.00916366558521986,
0.018600407987833023,
0.007945315912365913,
-0.07069684565067291,
0.2244824916124344,
0.05232546478509903,
0.08143594115972519,
-0.10311371833086014,
0.09738276153802872,
0.03246884047985077,
-0.14515291154384613,
-0.0035498347133398056,
0.06106853857636452,
-0.04760425165295601,
-0.003586351405829191,
0.016622023656964302,
0.07342609018087387,
-0.06846145540475845,
-0.06611054390668869,
-0.14479996263980865,
-0.1292276680469513,
0.07824185490608215,
0.11302752792835236,
0.06158265098929405,
0.04745763540267944,
-0.04694749042391777,
0.07017318159341812,
-0.10690976679325104,
0.1078057512640953,
0.08124115318059921,
0.09763695299625397,
-0.193048357963562,
0.13539594411849976,
0.011167665012180805,
0.027364252135157585,
-0.003857402130961418,
0.0019331977237015963,
-0.07981457561254501,
0.012926025316119194,
-0.14321567118167877,
-0.0374131016433239,
-0.03209518641233444,
0.002793906955048442,
-0.005388165824115276,
-0.06349879503250122,
-0.0900554209947586,
0.034737151116132736,
-0.11643563210964203,
-0.053204987198114395,
0.005676671862602234,
0.05097246170043945,
-0.11726521700620651,
0.004101961385458708,
0.06799830496311188,
-0.12601794302463531,
0.07097887992858887,
0.056316476315259933,
0.04335663467645645,
0.05370311811566353,
-0.03773367777466774,
0.010395047254860401,
0.0440339595079422,
-0.005411056336015463,
0.03648409619927406,
-0.14186008274555206,
-0.009993326850235462,
-0.03179171681404114,
0.04783330485224724,
-0.014511458575725555,
0.04365864396095276,
-0.13397669792175293,
-0.046908196061849594,
-0.010760299861431122,
-0.04527980089187622,
-0.06119379773736,
0.03296994790434837,
0.10706072300672531,
0.052312493324279785,
0.18877756595611572,
-0.05719342082738876,
0.016604384407401085,
-0.23556803166866302,
0.012940630316734314,
-0.017856761813163757,
-0.09992820769548416,
-0.10356677323579788,
-0.02509521134197712,
0.0685088261961937,
-0.06093328818678856,
0.0805496945977211,
-0.06601201742887497,
0.08361563086509705,
0.04645484313368797,
-0.04868502914905548,
0.04488325119018555,
0.05707133188843727,
0.24383489787578583,
0.057254016399383545,
-0.015704089775681496,
0.08232821524143219,
0.009731649421155453,
0.061697863042354584,
0.10007490962743759,
0.17960001528263092,
0.13245141506195068,
-0.01395723968744278,
0.09724531322717667,
0.057756923139095306,
-0.09485961496829987,
-0.17509937286376953,
0.07567312568426132,
-0.03173026815056801,
0.13343878090381622,
0.009696600027382374,
0.20287679135799408,
0.0980091392993927,
-0.1917829066514969,
0.05222088843584061,
-0.03395633399486542,
-0.08054788410663605,
-0.11181190609931946,
-0.026456793770194054,
-0.0641666129231453,
-0.19032472372055054,
0.018283661454916,
-0.12516652047634125,
0.043299559503793716,
0.07653743028640747,
0.00892125628888607,
0.016471220180392265,
0.16403447091579437,
0.03430855646729469,
0.015776250511407852,
0.0722745731472969,
0.011206933297216892,
-0.02532045915722847,
-0.028662120923399925,
-0.1045890748500824,
0.047107767313718796,
-0.034628067165613174,
0.04865638539195061,
-0.07606767863035202,
-0.11346542835235596,
0.08154606074094772,
0.03189607337117195,
-0.10997439920902252,
0.027651989832520485,
-0.0007702362490817904,
0.07966376096010208,
0.04247044399380684,
0.0018305692356079817,
0.02501695044338703,
-0.013564562425017357,
0.2534756064414978,
-0.10219138860702515,
-0.023395564407110214,
-0.14502592384815216,
0.21266688406467438,
0.01614692620933056,
-0.027031663805246353,
0.051626693457365036,
-0.08708295971155167,
-0.02961106412112713,
0.14008016884326935,
0.1227518618106842,
-0.017994295805692673,
-0.032297033816576004,
0.014549345709383488,
-0.01723831333220005,
-0.057307612150907516,
0.08452970534563065,
0.10470159351825714,
0.07849235087633133,
-0.06700693815946579,
-0.05120976269245148,
-0.04791531339287758,
-0.046962056308984756,
0.011231965385377407,
0.08979637920856476,
0.01428937166929245,
-0.017067313194274902,
-0.03166581690311432,
0.09615612030029297,
-0.06993148475885391,
-0.12841175496578217,
0.06317836791276932,
-0.1607201248407364,
-0.19134798645973206,
-0.056111808866262436,
0.07591629773378372,
0.009659224189817905,
0.05833805352449417,
0.00008136886754073203,
-0.04780058190226555,
0.10492429882287979,
-0.0040206206031143665,
-0.05554988235235214,
-0.14359921216964722,
0.11073422431945801,
-0.05559925362467766,
0.21132038533687592,
-0.05486885830760002,
0.038824066519737244,
0.11112584173679352,
0.06203469634056091,
-0.08611651510000229,
0.006658870726823807,
0.07015678286552429,
-0.1438428908586502,
0.017221765592694283,
0.1934671849012375,
-0.044964805245399475,
0.11131928116083145,
0.019035063683986664,
-0.12650837004184723,
-0.0039225067012012005,
-0.06805355101823807,
-0.05542083829641342,
-0.05216469243168831,
-0.032696161419153214,
-0.06142223998904228,
0.1252828687429428,
0.2121463119983673,
-0.06416624784469604,
-0.018748300150036812,
-0.06294357776641846,
0.044840916991233826,
0.08375304937362671,
0.1065390408039093,
-0.01800258457660675,
-0.2813463509082794,
0.012181305326521397,
0.03340345621109009,
-0.027641309425234795,
-0.2458919882774353,
-0.0956142470240593,
0.05172860994935036,
-0.056906357407569885,
-0.04148150607943535,
0.09242048859596252,
0.06424882262945175,
0.042315199971199036,
-0.05902902036905289,
-0.04632768779993057,
-0.07756847143173218,
0.16434630751609802,
-0.17739298939704895,
-0.07899942249059677
] |
null | null |
transformers
|
# wav2vec2-base-finetuned-ks
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the superb dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0952
- Accuracy: 0.9823
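Below is a minimal usage sketch with the generic audio-classification classes; the checkpoint id is the one of this repository, and the zero-filled `waveform` is a placeholder standing in for one second of 16 kHz mono audio.

```python
import numpy as np
import torch
from transformers import AutoFeatureExtractor, AutoModelForAudioClassification

model_id = "anton-l/wav2vec2-base-finetuned-ks"
feature_extractor = AutoFeatureExtractor.from_pretrained(model_id)
model = AutoModelForAudioClassification.from_pretrained(model_id)

# Placeholder input: one second of silence sampled at 16 kHz.
waveform = np.zeros(16000, dtype=np.float32)
inputs = feature_extractor(waveform, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_id])
```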
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7908 | 1.0 | 399 | 0.6776 | 0.9009 |
| 0.3202 | 2.0 | 798 | 0.2061 | 0.9763 |
| 0.221 | 3.0 | 1197 | 0.1257 | 0.9785 |
| 0.1773 | 4.0 | 1596 | 0.0990 | 0.9813 |
| 0.1729 | 5.0 | 1995 | 0.0952 | 0.9823 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.9.0+cu111
- Datasets 1.14.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["superb"], "metrics": ["accuracy"], "model-index": [{"name": "wav2vec2-base-finetuned-ks", "results": []}]}
|
audio-classification
|
anton-l/wav2vec2-base-finetuned-ks
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"audio-classification",
"generated_from_trainer",
"dataset:superb",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #audio-classification #generated_from_trainer #dataset-superb #license-apache-2.0 #endpoints_compatible #region-us
|
wav2vec2-base-finetuned-ks
==========================
This model is a fine-tuned version of facebook/wav2vec2-base on the superb dataset.
It achieves the following results on the evaluation set:
* Loss: 0.0952
* Accuracy: 0.9823
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 3e-05
* train\_batch\_size: 32
* eval\_batch\_size: 32
* seed: 42
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.1
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.11.3
* Pytorch 1.9.0+cu111
* Datasets 1.14.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #audio-classification #generated_from_trainer #dataset-superb #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
58,
144,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #audio-classification #generated_from_trainer #dataset-superb #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
-0.13292020559310913,
0.14160001277923584,
-0.0018329831073060632,
0.0931263267993927,
0.16549243032932281,
0.037518419325351715,
0.10775992274284363,
0.10338520258665085,
-0.09774249792098999,
0.070259690284729,
0.09477230906486511,
0.11478443443775177,
0.03648606687784195,
0.14456753432750702,
-0.029113860800862312,
-0.24611984193325043,
0.00006973755807848647,
0.016581479460000992,
-0.1403423696756363,
0.13810475170612335,
0.06667745858430862,
-0.13379345834255219,
0.054184019565582275,
-0.00635681813582778,
-0.18558838963508606,
-0.022431455552577972,
-0.018084309995174408,
-0.04898759350180626,
0.12131547182798386,
0.00795680470764637,
0.1198899894952774,
0.03560185804963112,
0.12173403054475784,
-0.1706175059080124,
0.006817157380282879,
0.07318899780511856,
0.023359138518571854,
0.09576616436243057,
0.0780167356133461,
0.026451710611581802,
0.06501960009336472,
-0.09281643480062485,
0.047738607972860336,
0.00830062199383974,
-0.11237085610628128,
-0.21465857326984406,
-0.10655181109905243,
0.03365665674209595,
0.07098014652729034,
0.08557802438735962,
-0.0048933508805930614,
0.09147105365991592,
-0.0839894711971283,
0.07987122982740402,
0.24894408881664276,
-0.27100035548210144,
-0.06460247933864594,
0.02699064463376999,
0.028757179155945778,
0.08171640336513519,
-0.09776592254638672,
-0.02605043351650238,
0.03966725990176201,
0.02893788367509842,
0.13583973050117493,
-0.005073961336165667,
-0.06360664963722229,
-0.0006149651599116623,
-0.15552100539207458,
-0.04894380643963814,
0.10895698517560959,
0.04796505719423294,
-0.03849773481488228,
-0.02687789313495159,
-0.06427160650491714,
-0.26469072699546814,
-0.03211429342627525,
0.018178865313529968,
0.049147989600896835,
-0.06656530499458313,
-0.09567420929670334,
0.032505203038454056,
-0.06348748505115509,
-0.08981190621852875,
-0.020518222823739052,
0.13886307179927826,
0.040248844772577286,
0.00980477686971426,
0.014332949183881283,
0.12186679244041443,
0.02594848722219467,
-0.15454748272895813,
0.030811334028840065,
0.03594719618558884,
-0.08581025898456573,
-0.019499722868204117,
-0.05002789571881294,
0.013920841738581657,
0.0005195726989768445,
0.1265384405851364,
-0.048790257424116135,
0.04242115095257759,
0.04751765355467796,
0.03483309969305992,
-0.07795310020446777,
0.17208485305309296,
-0.09691937267780304,
-0.0787753313779831,
-0.013025807216763496,
0.10625309497117996,
0.02564896270632744,
-0.022318532690405846,
-0.08674163371324539,
0.0027338021900504827,
0.09912948310375214,
0.019877895712852478,
-0.032478466629981995,
0.03727514669299126,
-0.0597708635032177,
-0.036195769906044006,
0.04880330339074135,
-0.10287640988826752,
0.025419307872653008,
0.018888715654611588,
-0.07994107156991959,
-0.004594522062689066,
0.003815456759184599,
0.005951030645519495,
-0.013379806652665138,
0.15933679044246674,
-0.08844994008541107,
0.005107209552079439,
-0.08093132823705673,
-0.10048282891511917,
0.03155042603611946,
-0.13040292263031006,
0.001629500649869442,
-0.04764130711555481,
-0.12573857605457306,
-0.028310954570770264,
0.04925200343132019,
-0.044799741357564926,
-0.09510018676519394,
-0.039117347449064255,
-0.09045092761516571,
0.03406654670834541,
-0.038321781903505325,
0.15222682058811188,
-0.07190466672182083,
0.11589734256267548,
0.02073569968342781,
0.07605139911174774,
-0.018475214019417763,
0.06823822110891342,
-0.06286998838186264,
0.03530207276344299,
-0.17542406916618347,
0.048550114035606384,
-0.07252666354179382,
0.03147392347455025,
-0.09428621828556061,
-0.11988306790590286,
0.016208214685320854,
-0.01418651919811964,
0.09411520510911942,
0.0815465971827507,
-0.20322053134441376,
-0.08628984540700912,
0.14704719185829163,
-0.0691567063331604,
-0.10032885521650314,
0.12060417979955673,
-0.018311670050024986,
-0.052919864654541016,
0.04810034856200218,
0.13344456255435944,
0.09078602492809296,
-0.07927141338586807,
-0.031591713428497314,
-0.02702341601252556,
0.10210859775543213,
-0.05036536976695061,
0.1045633852481842,
0.007727033458650112,
0.05349799618124962,
-0.0017135718371719122,
-0.06696438044309616,
0.058994684368371964,
-0.10763576626777649,
-0.09418841451406479,
-0.031619008630514145,
-0.09981882572174072,
0.05381780117750168,
0.06348719447851181,
0.03434073179960251,
-0.06783189624547958,
-0.10224369168281555,
0.03994397446513176,
0.10522231459617615,
-0.08200085908174515,
0.023860538378357887,
-0.05835887789726257,
0.07971024513244629,
-0.07267294824123383,
-0.02983718551695347,
-0.2080259919166565,
-0.020700113847851753,
0.014697542414069176,
-0.024985911324620247,
0.021821536123752594,
0.0022535433527082205,
0.05149742215871811,
0.07807795703411102,
-0.0538502037525177,
-0.050404101610183716,
-0.06399203091859818,
-0.011759346351027489,
-0.090988889336586,
-0.23129360377788544,
-0.08775651454925537,
-0.01577221043407917,
0.12926799058914185,
-0.17993015050888062,
-0.001313139102421701,
-0.009761056862771511,
0.08804244548082352,
0.017931664362549782,
-0.040343109518289566,
-0.025030381977558136,
0.10472200065851212,
-0.01747937873005867,
-0.07146696001291275,
0.05322963371872902,
0.007373636122792959,
-0.08069445937871933,
-0.04629332944750786,
-0.08600083738565445,
0.14349742233753204,
0.12117039412260056,
-0.030158255249261856,
-0.07532261312007904,
0.01325940527021885,
-0.07070422172546387,
-0.04942438006401062,
-0.045102789998054504,
0.02586052194237709,
0.15610451996326447,
0.022991109639406204,
0.12337073683738708,
-0.07957956939935684,
-0.04224734753370285,
0.039676517248153687,
0.014547384344041348,
0.03415398672223091,
0.10551232844591141,
0.13445276021957397,
-0.06218412518501282,
0.13705140352249146,
0.12576191127300262,
-0.10223805159330368,
0.13552524149417877,
-0.0705857053399086,
-0.10582143813371658,
-0.023788919672369957,
-0.030727334320545197,
0.00687813526019454,
0.127915620803833,
-0.10845499485731125,
0.003984188660979271,
0.02868187241256237,
0.024717675521969795,
0.011600133962929249,
-0.21443983912467957,
-0.02552958019077778,
0.02860027737915516,
-0.06430180370807648,
-0.061211246997117996,
-0.03593241423368454,
0.00794040784239769,
0.10571783035993576,
-0.004082084633409977,
-0.06759358942508698,
0.006228316575288773,
-0.007001960184425116,
-0.05257995054125786,
0.18201258778572083,
-0.07404231280088425,
-0.13304027915000916,
-0.11889450252056122,
-0.04070528596639633,
-0.033981118351221085,
-0.005738738924264908,
0.03100387565791607,
-0.08265175670385361,
-0.030733700841665268,
-0.03989104554057121,
0.01304609701037407,
0.02374662086367607,
0.0511506050825119,
0.04111725464463234,
-0.00675467774271965,
0.0796830803155899,
-0.09786087274551392,
0.027146562933921814,
-0.03553948178887367,
-0.05345839262008667,
0.026173656806349754,
0.08488006144762039,
0.10802116245031357,
0.16659393906593323,
0.0014764037914574146,
0.0004920163191854954,
-0.007931952364742756,
0.2170514315366745,
-0.10055727511644363,
-0.0216209776699543,
0.1088079884648323,
-0.030032062903046608,
0.05288190394639969,
0.1364239752292633,
0.0710596814751625,
-0.07430049777030945,
-0.010828717611730099,
0.03849329799413681,
-0.046824559569358826,
-0.22997456789016724,
-0.05233605206012726,
-0.059868697077035904,
-0.01441357471048832,
0.09937149286270142,
0.011568338610231876,
-0.022911356762051582,
0.047638773918151855,
0.010264892131090164,
0.052456486970186234,
-0.04713256284594536,
0.025542594492435455,
0.0598980113863945,
0.06615585088729858,
0.1298701912164688,
-0.032876789569854736,
-0.037626978009939194,
0.03415627405047417,
0.011057757772505283,
0.24818089604377747,
-0.038264982402324677,
0.16837985813617706,
0.04474649205803871,
0.170525461435318,
0.017248107120394707,
0.1054636612534523,
-0.003945503383874893,
-0.020789125934243202,
-0.0025144813116639853,
-0.043624743819236755,
-0.0411246195435524,
-0.020886577665805817,
0.0040092323906719685,
0.03653200343251228,
-0.10624366253614426,
0.008685152046382427,
0.021520745009183884,
0.29092007875442505,
0.06230011582374573,
-0.32969796657562256,
-0.08380039036273956,
-0.013241667300462723,
-0.012605752795934677,
-0.013900610618293285,
0.01384660042822361,
0.1466597020626068,
-0.07100638747215271,
0.055764783173799515,
-0.06871270388364792,
0.08253752440214157,
-0.06897427886724472,
0.014512726105749607,
0.1339932680130005,
0.09802761673927307,
0.007960557006299496,
0.045701831579208374,
-0.213328555226326,
0.26092132925987244,
-0.003653140040114522,
0.0637589693069458,
-0.051918722689151764,
0.010090604424476624,
0.02418627217411995,
0.014957455918192863,
0.05973535031080246,
-0.005103067494928837,
-0.036056749522686005,
-0.19903509318828583,
-0.11426419764757156,
0.025423817336559296,
0.10057216137647629,
-0.03690071031451225,
0.11156649887561798,
-0.0254500862210989,
-0.024696646258234978,
0.05436837300658226,
-0.04970276728272438,
-0.03209666907787323,
-0.10017082095146179,
0.017021119594573975,
0.007039558608084917,
-0.0005832458846271038,
-0.06110069528222084,
-0.13480393588542938,
-0.10638206452131271,
0.12917093932628632,
-0.0581447072327137,
-0.028455976396799088,
-0.11842731386423111,
0.09706339240074158,
0.16303275525569916,
-0.0687461644411087,
0.058539409190416336,
0.02827715314924717,
0.08935243636369705,
0.049660809338092804,
-0.05511658266186714,
0.09063214063644409,
-0.06916581839323044,
-0.20799772441387177,
-0.054640863090753555,
0.13886019587516785,
0.03506830707192421,
0.07654088735580444,
-0.040191397070884705,
0.02908887155354023,
0.011422655545175076,
-0.092044398188591,
0.033741869032382965,
-0.015682676807045937,
0.06242487579584122,
0.04975373297929764,
-0.025037085637450218,
-0.002378764096647501,
-0.03424438461661339,
-0.03002510964870453,
0.15335385501384735,
0.25227776169776917,
-0.08939989656209946,
0.039835125207901,
0.040562260895967484,
-0.051181379705667496,
-0.19120685756206512,
0.02315894514322281,
0.12818355858325958,
0.028779488056898117,
0.02853696048259735,
-0.19499287009239197,
0.11416357755661011,
0.1061670109629631,
-0.016890812665224075,
0.12122273445129395,
-0.33249932527542114,
-0.11869871616363525,
0.071055106818676,
0.11508453637361526,
0.040975432842969894,
-0.14761246740818024,
-0.034756824374198914,
-0.008562436327338219,
-0.14913833141326904,
0.1426139771938324,
-0.06365017592906952,
0.13499951362609863,
-0.013826049864292145,
0.03285965695977211,
0.009593771770596504,
-0.059723421931266785,
0.12208682298660278,
0.032633356750011444,
0.06923440098762512,
-0.017099764198064804,
-0.004678396042436361,
0.06220027804374695,
-0.03386465460062027,
-0.01900186948478222,
-0.06563428789377213,
0.025435203686356544,
-0.06956332176923752,
-0.014411259442567825,
-0.09622009843587875,
-0.005425030365586281,
-0.04526868090033531,
-0.035851746797561646,
-0.01416696049273014,
0.04650221765041351,
0.06428273022174835,
-0.022842999547719955,
0.12802590429782867,
0.023706860840320587,
0.1181001216173172,
0.11218971759080887,
0.0690365880727768,
-0.029963932931423187,
-0.11019422113895416,
-0.01724945940077305,
-0.008795138448476791,
0.054483819752931595,
-0.1120414137840271,
0.033793095499277115,
0.1562238186597824,
0.05355849117040634,
0.10294023156166077,
0.07426108419895172,
-0.027624722570180893,
0.00689343037083745,
0.0730409026145935,
-0.13190236687660217,
-0.08675222098827362,
0.01554748136550188,
-0.0648866519331932,
-0.13057450950145721,
0.03750118613243103,
0.09502464532852173,
-0.04548424109816551,
-0.01104484312236309,
0.008736752904951572,
0.017169442027807236,
-0.052932728081941605,
0.2547827959060669,
0.04437951371073723,
0.07482743263244629,
-0.12810125946998596,
0.08098051697015762,
0.05552076920866966,
-0.12393363565206528,
-0.0034358715638518333,
0.06316099315881729,
-0.08783425390720367,
0.0019990436267107725,
0.07458862662315369,
0.0902676209807396,
-0.034967828541994095,
-0.037645403295755386,
-0.1223113089799881,
-0.1206040307879448,
0.0855058953166008,
0.12254413217306137,
0.07556243985891342,
0.04095177352428436,
-0.04644985496997833,
0.020168792456388474,
-0.12327691912651062,
0.09695801138877869,
0.07169844210147858,
0.0902491956949234,
-0.16760647296905518,
0.15539398789405823,
0.005004244856536388,
0.03793412819504738,
-0.019615888595581055,
0.041450705379247665,
-0.09690234810113907,
0.005571108311414719,
-0.10723336786031723,
-0.010697109624743462,
-0.028895145282149315,
-0.0012602935312315822,
-0.01884867623448372,
-0.05177999287843704,
-0.07047083973884583,
0.012623945251107216,
-0.1073126569390297,
-0.041373111307621,
0.004870655480772257,
0.03427305817604065,
-0.11206936091184616,
-0.008738573640584946,
0.0362890399992466,
-0.09346852451562881,
0.08410666882991791,
0.049379296600818634,
0.033011049032211304,
0.040843039751052856,
-0.09664437919855118,
-0.029433876276016235,
0.0567774772644043,
0.01121549867093563,
0.07482219487428665,
-0.122516930103302,
-0.003157929051667452,
-0.022086860612034798,
0.041604891419410706,
0.0031665179412811995,
0.06099652498960495,
-0.13201090693473816,
-0.0001684325688984245,
-0.038521163165569305,
-0.03583001345396042,
-0.058185040950775146,
0.02683623693883419,
0.115218386054039,
0.03582606092095375,
0.18294359743595123,
-0.06119827181100845,
0.037050068378448486,
-0.23674063384532928,
-0.006754583213478327,
-0.02096453495323658,
-0.10034748166799545,
-0.10809691995382309,
-0.03341108188033104,
0.08883825689554214,
-0.055328890681266785,
0.10882990807294846,
-0.02505972981452942,
0.057604264467954636,
0.01399767491966486,
-0.03091868944466114,
-0.013478203676640987,
0.05319414287805557,
0.1997898519039154,
0.0451301634311676,
-0.03954211249947548,
0.07007326930761337,
0.024823879823088646,
0.06960996985435486,
0.12446337193250656,
0.20899802446365356,
0.12230919301509857,
0.03280170261859894,
0.06342235207557678,
0.054754793643951416,
-0.11570703238248825,
-0.1714906394481659,
0.10045171529054642,
-0.08666711300611496,
0.13059264421463013,
0.001142563996836543,
0.22971314191818237,
0.026666270568966866,
-0.1918465495109558,
0.05315720662474632,
-0.05553538724780083,
-0.0941031351685524,
-0.08995404094457626,
-0.028302229940891266,
-0.06767082214355469,
-0.15559491515159607,
0.006863724440336227,
-0.12367382645606995,
0.043995898216962814,
0.14839079976081848,
0.012371557764708996,
0.008751148357987404,
0.1487138420343399,
0.036547936499118805,
0.0244801864027977,
0.06322386860847473,
0.03640993312001228,
-0.02130054496228695,
-0.0440017431974411,
-0.08056679368019104,
0.025472430512309074,
-0.004263410810381174,
0.06287835538387299,
-0.0692351758480072,
-0.08084067702293396,
0.07981601357460022,
0.020473266020417213,
-0.09825381636619568,
0.025646550580859184,
-0.011982720345258713,
0.08827077597379684,
0.05108199268579483,
-0.0014061963884159923,
0.030826549977064133,
-0.024424074217677116,
0.22400827705860138,
-0.0840471088886261,
-0.06312545388936996,
-0.12254367023706436,
0.18153779208660126,
0.01475516613572836,
-0.024593044072389603,
0.06235533580183983,
-0.07229908555746078,
-0.02978495880961418,
0.16486109793186188,
0.1832755208015442,
-0.08759883046150208,
-0.028204375877976418,
0.019853051751852036,
-0.009702462702989578,
-0.04147450253367424,
0.11832468211650848,
0.14073973894119263,
0.08964929729700089,
-0.09124607592821121,
-0.05313711613416672,
-0.06259522587060928,
-0.027866194024682045,
0.005020329728722572,
0.046736203134059906,
0.02942916937172413,
-0.004128186963498592,
-0.03976082056760788,
0.06479977816343307,
-0.07806272804737091,
-0.15297932922840118,
0.06298880279064178,
-0.22240372002124786,
-0.21016526222229004,
-0.03656960651278496,
0.10652728378772736,
0.01658814586699009,
0.042924072593450546,
-0.011456522159278393,
-0.021539276465773582,
0.08077035844326019,
-0.019200367853045464,
-0.055885184556245804,
-0.0863572210073471,
0.07654043287038803,
-0.08507269620895386,
0.18166372179985046,
-0.043767549097537994,
0.07412503659725189,
0.10073723644018173,
0.0767650455236435,
-0.09291588515043259,
0.028109094128012657,
0.05662601813673973,
-0.13177745044231415,
0.02617710642516613,
0.15247361361980438,
-0.05041183531284332,
0.07381783425807953,
0.027949152514338493,
-0.0953359305858612,
-0.004148508887737989,
-0.04894779995083809,
-0.021633213385939598,
-0.0351218543946743,
-0.056788213551044464,
-0.05089694261550903,
0.14820516109466553,
0.21430453658103943,
-0.04651148244738579,
0.008726864121854305,
-0.06240411475300789,
0.012476162053644657,
0.05527716875076294,
0.10025676339864731,
-0.047190338373184204,
-0.26516327261924744,
0.02242155559360981,
0.017601054161787033,
-0.005422630812972784,
-0.19425992667675018,
-0.07272659987211227,
0.025318069383502007,
-0.07186102867126465,
-0.10080267488956451,
0.0932963490486145,
0.030455147847533226,
0.060340765863657,
-0.055405061691999435,
-0.031728919595479965,
-0.0777798742055893,
0.15451651811599731,
-0.1897014081478119,
-0.09212898463010788
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-base-ft-keyword-spotting
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the superb dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0824
- Accuracy: 0.9826
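A minimal inference sketch with the Transformers `pipeline` API, assuming 16 kHz mono input and the repository id recorded for this card (`anton-l/wav2vec2-base-ft-keyword-spotting`); the silent placeholder array is only there to keep the example self-contained:

```python
import numpy as np
from transformers import pipeline

# Load the fine-tuned keyword-spotting checkpoint (repository id assumed from this card).
classifier = pipeline("audio-classification", model="anton-l/wav2vec2-base-ft-keyword-spotting")

# Placeholder input: one second of silence at 16 kHz; replace with a real recording
# (a filename such as "speech.wav" also works if ffmpeg is installed).
waveform = np.zeros(16000, dtype=np.float32)
print(classifier(waveform))  # list of {"label": ..., "score": ...}, highest score first
```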
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an approximate `TrainingArguments` sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 0
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5.0
- mixed_precision_training: Native AMP
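The hyperparameters above map approximately onto `transformers.TrainingArguments` as sketched below; this is an illustrative reconstruction rather than the original training script, and `output_dir` plus the single-GPU assumption behind the effective batch size of 128 are assumptions:

```python
from transformers import TrainingArguments

# Hedged reconstruction of the listed hyperparameters; not the original script.
training_args = TrainingArguments(
    output_dir="wav2vec2-base-ft-keyword-spotting",  # assumed output path
    learning_rate=3e-5,
    per_device_train_batch_size=32,  # 32 x 4 accumulation steps = 128 effective (single GPU assumed)
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,
    num_train_epochs=5.0,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=0,
    fp16=True,  # Native AMP; requires a CUDA device
)
```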
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.8972 | 1.0 | 399 | 0.7023 | 0.8174 |
| 0.3274 | 2.0 | 798 | 0.1634 | 0.9773 |
| 0.1993 | 3.0 | 1197 | 0.1048 | 0.9788 |
| 0.1777 | 4.0 | 1596 | 0.0824 | 0.9826 |
| 0.1527 | 5.0 | 1995 | 0.0812 | 0.9810 |
### Framework versions
- Transformers 4.12.0.dev0
- Pytorch 1.9.1+cu111
- Datasets 1.14.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["audio-classification", "generated_from_trainer"], "datasets": ["superb"], "metrics": ["accuracy"], "model-index": [{"name": "wav2vec2-base-ft-keyword-spotting", "results": []}]}
|
audio-classification
|
anton-l/wav2vec2-base-ft-keyword-spotting
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"audio-classification",
"generated_from_trainer",
"dataset:superb",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #audio-classification #generated_from_trainer #dataset-superb #license-apache-2.0 #endpoints_compatible #region-us
|
wav2vec2-base-ft-keyword-spotting
=================================
This model is a fine-tuned version of facebook/wav2vec2-base on the superb dataset.
It achieves the following results on the evaluation set:
* Loss: 0.0824
* Accuracy: 0.9826
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 3e-05
* train\_batch\_size: 32
* eval\_batch\_size: 32
* seed: 0
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.1
* num\_epochs: 5.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.12.0.dev0
* Pytorch 1.9.1+cu111
* Datasets 1.14.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #audio-classification #generated_from_trainer #dataset-superb #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
58,
159,
4,
37
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #audio-classification #generated_from_trainer #dataset-superb #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.12.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
-0.131468266248703,
0.09674689918756485,
-0.0018097981810569763,
0.0574999637901783,
0.148492231965065,
0.011044739745557308,
0.1009833812713623,
0.11945847421884537,
-0.1072453185915947,
0.07640687376260757,
0.0957227274775505,
0.08147742599248886,
0.04722527414560318,
0.1107378751039505,
-0.020090430974960327,
-0.31812188029289246,
0.01401514932513237,
0.03227485343813896,
-0.19269488751888275,
0.11679298430681229,
0.10836882144212723,
-0.10637814551591873,
0.04136242717504501,
0.04154438152909279,
-0.15582993626594543,
0.012559392489492893,
-0.01067893672734499,
-0.06854795664548874,
0.09509400278329849,
0.04838189110159874,
0.11052601039409637,
0.02097649686038494,
0.09288404881954193,
-0.2115459144115448,
0.012724477797746658,
0.08155570179224014,
0.0386321097612381,
0.08968117833137512,
0.09844505786895752,
0.006567643955349922,
0.11310631036758423,
-0.06664655357599258,
0.07232260704040527,
0.054727859795093536,
-0.11834400147199631,
-0.31107616424560547,
-0.10493624210357666,
0.040726032108068466,
0.10061302781105042,
0.08195213228464127,
-0.017954329028725624,
0.08286172151565552,
-0.05947168171405792,
0.07848113030195236,
0.21961601078510284,
-0.24016280472278595,
-0.0934762954711914,
-0.0028684469871222973,
0.06603454798460007,
0.05101194977760315,
-0.11668600887060165,
-0.023279549553990364,
0.055630799382925034,
0.028923509642481804,
0.11290236562490463,
0.014679335057735443,
-0.011927626095712185,
0.006240608170628548,
-0.14563621580600739,
-0.0560280941426754,
0.1653508096933365,
0.0902206152677536,
-0.053430262953042984,
-0.054068662226200104,
-0.02008729986846447,
-0.23115356266498566,
-0.039472389966249466,
0.013896851800382137,
0.03585680201649666,
-0.053723614662885666,
-0.12140991538763046,
0.01332120131701231,
-0.08111404627561569,
-0.10614287108182907,
0.0173047948628664,
0.18684493005275726,
0.039457570761442184,
-0.008230200037360191,
-0.005488723050802946,
0.11489645391702652,
0.03372802957892418,
-0.15485134720802307,
-0.019223805516958237,
0.026819860562682152,
-0.0964246615767479,
-0.02984756976366043,
-0.05930789187550545,
-0.02117312327027321,
-0.013758133165538311,
0.15951356291770935,
-0.05220888927578926,
0.06894028186798096,
0.030820615589618683,
0.030423082411289215,
-0.08018633723258972,
0.17476056516170502,
-0.06625702232122421,
-0.01861782930791378,
-0.04016067832708359,
0.08825132250785828,
-0.0038083407562226057,
-0.01262732595205307,
-0.05328800529241562,
0.027960294857621193,
0.08485347777605057,
0.026473864912986755,
-0.027551837265491486,
0.01894286461174488,
-0.06021492928266525,
-0.03842290863394737,
0.01898340880870819,
-0.09200578927993774,
0.024458840489387512,
0.009371689520776272,
-0.08606850355863571,
0.012466753832995892,
0.014919646084308624,
0.022141845896840096,
0.009537425823509693,
0.1318759024143219,
-0.09523110091686249,
-0.014038185589015484,
-0.10032736510038376,
-0.09495346248149872,
0.04177911579608917,
-0.042474377900362015,
0.01395627111196518,
-0.0661507174372673,
-0.12787918746471405,
-0.033336054533720016,
0.06675736606121063,
-0.03314261510968208,
-0.09173412621021271,
-0.02067066729068756,
-0.07081113755702972,
0.05348401516675949,
-0.030300891026854515,
0.16288110613822937,
-0.06053708866238594,
0.1067788228392601,
0.05521401762962341,
0.04791015759110451,
0.01071803830564022,
0.06348145753145218,
-0.06048721820116043,
0.050811175256967545,
-0.16330140829086304,
0.04581940174102783,
-0.09859197586774826,
0.06355596333742142,
-0.12549400329589844,
-0.12921424210071564,
-0.02578824944794178,
0.0038973174523562193,
0.0901017114520073,
0.08173922449350357,
-0.15455470979213715,
-0.10604076832532883,
0.14566798508167267,
-0.08644609898328781,
-0.13169676065444946,
0.11373354494571686,
-0.011790902353823185,
-0.0074006663635373116,
0.04560348764061928,
0.12302835285663605,
0.12457768619060516,
-0.08336082100868225,
-0.025778548792004585,
-0.05543006956577301,
0.1306692510843277,
0.002534095197916031,
0.1163242980837822,
-0.015343175269663334,
0.005383976269513369,
0.0029497998766601086,
-0.05598892271518707,
0.06173677369952202,
-0.10496803373098373,
-0.08977906405925751,
-0.044904373586177826,
-0.09518440812826157,
0.028359202668070793,
0.06615979224443436,
0.04858701676130295,
-0.0848282054066658,
-0.1297425925731659,
0.08940334618091583,
0.11994349956512451,
-0.08356047421693802,
0.027072420343756676,
-0.07551788538694382,
0.0398702472448349,
-0.053178489208221436,
-0.027789371088147163,
-0.1848340779542923,
-0.044754259288311005,
0.015030596405267715,
-0.08210726827383041,
0.01967899687588215,
-0.02680351585149765,
0.07264859229326248,
0.06153271719813347,
-0.06482593715190887,
-0.06876977533102036,
-0.113701231777668,
-0.010277947410941124,
-0.06363833695650101,
-0.21567966043949127,
-0.10305555164813995,
-0.019618907943367958,
0.13934361934661865,
-0.2206350415945053,
0.013566563837230206,
0.014109628275036812,
0.1287156045436859,
0.047078002244234085,
-0.05495821312069893,
-0.020258313044905663,
0.08165138959884644,
-0.02633345127105713,
-0.06915091723203659,
0.020078791305422783,
0.019605126231908798,
-0.11496659368276596,
-0.01977040432393551,
-0.08656170219182968,
0.13540013134479523,
0.10254412144422531,
-0.025522340089082718,
-0.07431081682443619,
-0.02389167994260788,
-0.08736021816730499,
-0.06297191977500916,
-0.03683941438794136,
-0.00976602267473936,
0.12342598289251328,
0.024048352614045143,
0.12247291207313538,
-0.08295346051454544,
-0.05197149142622948,
0.04602936655282974,
0.0020585639867931604,
-0.007076033391058445,
0.1015702486038208,
0.0895761251449585,
-0.056781698018312454,
0.11927705258131027,
0.11184205114841461,
-0.101168192923069,
0.1746359020471573,
-0.09023594111204147,
-0.12696848809719086,
-0.020187746733427048,
-0.0018960274755954742,
0.03651463985443115,
0.12982779741287231,
-0.12187846004962921,
0.018549123778939247,
0.021912051364779472,
0.035873860120773315,
0.023686053231358528,
-0.20088522136211395,
-0.013579201884567738,
0.04493183270096779,
-0.05309048667550087,
-0.062309619039297104,
-0.011106963269412518,
-0.012272546999156475,
0.07921015471220016,
0.00737251341342926,
-0.013368056155741215,
0.004020070191472769,
-0.006478775292634964,
-0.06992094218730927,
0.19316793978214264,
-0.08672529458999634,
-0.13158531486988068,
-0.16589689254760742,
-0.02006872370839119,
-0.03165794536471367,
-0.008405836299061775,
0.039986420422792435,
-0.10539189726114273,
-0.03839407488703728,
-0.03761319816112518,
0.051332227885723114,
-0.03812796622514725,
0.04009632021188736,
0.02424706146121025,
0.019557904452085495,
0.11340451240539551,
-0.11122440546751022,
0.0389040932059288,
-0.0005371253937482834,
-0.05533565208315849,
0.005736387334764004,
0.04435636103153229,
0.10324444621801376,
0.15591461956501007,
0.033136941492557526,
0.014235093258321285,
-0.01773500256240368,
0.1936393678188324,
-0.11901555955410004,
-0.0305774062871933,
0.12874267995357513,
0.002595841186121106,
0.03759562596678734,
0.09463635087013245,
0.07187493145465851,
-0.07948959618806839,
0.030374303460121155,
0.07011786103248596,
-0.034259699285030365,
-0.24541491270065308,
-0.02620360627770424,
-0.06666672974824905,
-0.01225943211466074,
0.11451153457164764,
0.030291253700852394,
0.0184988621622324,
0.05400968715548515,
-0.0220261812210083,
0.028238261118531227,
-0.02770189382135868,
0.06962812691926956,
0.0779251828789711,
0.06301385164260864,
0.12685222923755646,
-0.03555408492684364,
-0.02581190876662731,
0.037578292191028595,
0.016217824071645737,
0.2563018798828125,
0.0083649642765522,
0.1663147658109665,
0.06048247590661049,
0.13999882340431213,
0.01652994379401207,
0.09194610267877579,
0.02107285149395466,
-0.0383346863090992,
0.0221103485673666,
-0.056810684502124786,
-0.01568056084215641,
0.029096029698848724,
0.029231086373329163,
0.05315038189291954,
-0.1343388855457306,
-0.026298463344573975,
0.006758738774806261,
0.33427777886390686,
0.06321214884519577,
-0.3263874053955078,
-0.1269351691007614,
0.007772033102810383,
-0.06100921332836151,
-0.047534823417663574,
0.024945342913269997,
0.09568993002176285,
-0.08320356160402298,
0.07773645967245102,
-0.08541535586118698,
0.10495734214782715,
-0.02522108145058155,
-0.007136163301765919,
0.12590977549552917,
0.08802493661642075,
-0.008698862977325916,
0.05497579276561737,
-0.2217557430267334,
0.2752665579319,
-0.012110348790884018,
0.08762598782777786,
-0.017367709428071976,
0.035184454172849655,
0.037986546754837036,
0.00479429867118597,
0.0466056689620018,
-0.01192309521138668,
-0.11134377866983414,
-0.19761799275875092,
-0.07028927654027939,
0.02854912541806698,
0.11514364928007126,
-0.058561429381370544,
0.12564626336097717,
-0.03481495380401611,
-0.005660113412886858,
0.07013016194105148,
-0.07473059743642807,
-0.12035327404737473,
-0.10158300399780273,
0.02895430289208889,
0.0050812410190701485,
0.08032777905464172,
-0.11240161210298538,
-0.12368812412023544,
-0.046564072370529175,
0.13963989913463593,
-0.07218823581933975,
-0.0156157948076725,
-0.13758808374404907,
0.10204187780618668,
0.1815129518508911,
-0.05607108399271965,
0.07464448362588882,
0.01668453775346279,
0.14482775330543518,
0.05276317521929741,
-0.026453828439116478,
0.09534456580877304,
-0.08317249268293381,
-0.21183960139751434,
-0.042983606457710266,
0.1466093510389328,
0.034587640315294266,
0.059201035648584366,
-0.03474728763103485,
0.0363280363380909,
-0.002036310965195298,
-0.08986105024814606,
0.04748071730136871,
-0.030046362429857254,
0.009110745042562485,
0.0488242544233799,
-0.027767624706029892,
0.043538112193346024,
-0.03282879665493965,
-0.06023416668176651,
0.11787354201078415,
0.2698827385902405,
-0.07556456327438354,
-0.0017613805830478668,
0.029859747737646103,
-0.04898058623075485,
-0.1383172571659088,
0.05376376584172249,
0.1489458829164505,
0.027808303013443947,
0.026221036911010742,
-0.23874594271183014,
0.10393450409173965,
0.10826633125543594,
-0.027467478066682816,
0.12760791182518005,
-0.29933449625968933,
-0.1275200992822647,
0.08962573856115341,
0.10674095153808594,
-0.03336738422513008,
-0.16174213588237762,
-0.060910094529390335,
-0.023556523025035858,
-0.15364718437194824,
0.13194429874420166,
-0.0525091327726841,
0.12031769007444382,
-0.007043286692351103,
0.04834480956196785,
0.01495286263525486,
-0.0469634085893631,
0.16004009544849396,
-0.011015371419489384,
0.06143678352236748,
0.0034992117434740067,
0.06087980046868324,
0.06618247181177139,
-0.04810290411114693,
-0.004900273401290178,
-0.0641217976808548,
0.019473224878311157,
-0.12397908419370651,
-0.031106775626540184,
-0.09001844376325607,
0.025675883516669273,
-0.0516522116959095,
-0.03980284556746483,
-0.02100451849400997,
0.05311572551727295,
0.04262199252843857,
-0.010313677601516247,
0.17667417228221893,
-0.0158319603651762,
0.16981835663318634,
0.09940935671329498,
0.08421741425991058,
-0.004400661215186119,
-0.11968811601400375,
-0.006357287988066673,
-0.017718281596899033,
0.07166600227355957,
-0.15371441841125488,
0.03823569416999817,
0.140893816947937,
0.05780614912509918,
0.12396255135536194,
0.07162956893444061,
-0.05852249637246132,
0.019416220486164093,
0.09183741360902786,
-0.07916074991226196,
-0.11160781979560852,
-0.02597864530980587,
0.03783278912305832,
-0.16133883595466614,
0.03107544779777527,
0.10213366150856018,
-0.06468947231769562,
-0.0032447336707264185,
0.020887553691864014,
0.009865459986031055,
-0.06627735495567322,
0.2335187792778015,
0.04554164782166481,
0.08339706063270569,
-0.09117581695318222,
0.08996524661779404,
0.04668007418513298,
-0.16835640370845795,
-0.010067339986562729,
0.0672922432422638,
-0.03951650485396385,
-0.0011758047621697187,
0.011060002259910107,
0.05028030648827553,
-0.06428028643131256,
-0.06473339349031448,
-0.13049380481243134,
-0.14014947414398193,
0.08366822451353073,
0.09679309278726578,
0.044776152819395065,
0.04000459983944893,
-0.03814185410737991,
0.0647992342710495,
-0.11124547570943832,
0.09751516580581665,
0.09393899142742157,
0.10284321010112762,
-0.1762203723192215,
0.13334251940250397,
0.010971855372190475,
0.010499494150280952,
-0.001174698700197041,
0.002777945250272751,
-0.09282765537500381,
0.013655568473041058,
-0.12941698729991913,
-0.03800063952803612,
-0.03492569178342819,
-0.0042169722728431225,
-0.0014697769656777382,
-0.0603262335062027,
-0.09370569884777069,
0.02600676380097866,
-0.12047865241765976,
-0.049163661897182465,
0.00477133272215724,
0.0658988207578659,
-0.1022486761212349,
0.0013265330344438553,
0.06877284497022629,
-0.11672186851501465,
0.0687146782875061,
0.0455733984708786,
0.04327644407749176,
0.048764459788799286,
-0.08093523979187012,
0.016025660559535027,
0.04320647194981575,
-0.0065246326848864555,
0.03129274398088455,
-0.15777373313903809,
-0.00348460441455245,
-0.02645762450993061,
0.056914497166872025,
-0.011782671324908733,
0.011909043416380882,
-0.13852453231811523,
-0.05270683765411377,
-0.020150532945990562,
-0.04698541387915611,
-0.05838606879115105,
0.03222348913550377,
0.08202109485864639,
0.054022081196308136,
0.17788107693195343,
-0.06922733038663864,
0.01855849102139473,
-0.24360091984272003,
0.015819700434803963,
-0.02562696672976017,
-0.09074470400810242,
-0.07154259830713272,
-0.018828682601451874,
0.06868311762809753,
-0.06148828938603401,
0.0936252623796463,
-0.06387684494256973,
0.06062899902462959,
0.0497715137898922,
-0.08199567347764969,
0.024863116443157196,
0.05047878995537758,
0.26527920365333557,
0.0601995550096035,
-0.013259157538414001,
0.0726260393857956,
0.010659011080861092,
0.05226875841617584,
0.12260841578245163,
0.1774415820837021,
0.14841316640377045,
-0.01797599345445633,
0.08698145300149918,
0.04973680153489113,
-0.10040933638811111,
-0.14192277193069458,
0.10809405893087387,
-0.03162848949432373,
0.14463013410568237,
0.006080768071115017,
0.2042887806892395,
0.10048840194940567,
-0.19547368586063385,
0.05145850032567978,
-0.04416133463382721,
-0.08391691744327545,
-0.09571576118469238,
-0.033715713769197464,
-0.06687873601913452,
-0.20347605645656586,
0.014225547201931477,
-0.1307881474494934,
0.05977410078048706,
0.07362943142652512,
0.018261658027768135,
0.017606595531105995,
0.16044972836971283,
0.043269991874694824,
0.0055989851243793964,
0.08992359787225723,
0.005744335241615772,
-0.01950557716190815,
-0.04006386920809746,
-0.10093411803245544,
0.037801653146743774,
-0.020969780161976814,
0.043270740658044815,
-0.06712181866168976,
-0.10720827430486679,
0.07810528576374054,
0.03840314596891403,
-0.10243899375200272,
0.02359403669834137,
0.001944802817888558,
0.07223876565694809,
0.04678385704755783,
0.001067367964424193,
0.0208407212048769,
-0.022261781617999077,
0.26122811436653137,
-0.1032160297036171,
-0.030587127432227135,
-0.15085485577583313,
0.20872275531291962,
0.015998585149645805,
-0.022826412692666054,
0.04028152674436569,
-0.07902440428733826,
-0.027944296598434448,
0.13937370479106903,
0.11404145509004593,
-0.004175654146820307,
-0.03426516428589821,
0.00776559766381979,
-0.017766334116458893,
-0.058225374668836594,
0.09921792149543762,
0.11398406326770782,
0.06925970315933228,
-0.07206937670707703,
-0.04822290688753128,
-0.052508458495140076,
-0.046234626322984695,
0.0004938420606777072,
0.07401447743177414,
0.028072968125343323,
-0.018038157373666763,
-0.03332243114709854,
0.11065897345542908,
-0.06784655153751373,
-0.11192528903484344,
0.04269380122423172,
-0.1631719022989273,
-0.19402335584163666,
-0.059450291097164154,
0.06801042705774307,
0.009869780391454697,
0.051925770938396454,
-0.012469436042010784,
-0.040254734456539154,
0.09650168567895889,
-0.0003694577026180923,
-0.04161086678504944,
-0.14902934432029724,
0.10331355780363083,
-0.06070569157600403,
0.2064337283372879,
-0.0486503429710865,
0.027578221634030342,
0.11104299128055573,
0.05684174224734306,
-0.0906338319182396,
0.014344524592161179,
0.06868449598550797,
-0.14950385689735413,
0.02726217545568943,
0.19172197580337524,
-0.04425995051860809,
0.1290593147277832,
0.014987969771027565,
-0.14102894067764282,
-0.0011990458006039262,
-0.06399548053741455,
-0.04150570556521416,
-0.06014283746480942,
-0.029967129230499268,
-0.05859997496008873,
0.1273331642150879,
0.22712397575378418,
-0.07315037399530411,
-0.02115788124501705,
-0.06478550285100937,
0.045057736337184906,
0.07299971580505371,
0.11654549092054367,
-0.029585711658000946,
-0.28537940979003906,
0.019632957875728607,
0.030740467831492424,
-0.015141326934099197,
-0.23948688805103302,
-0.08746934682130814,
0.045584458857774734,
-0.06449227035045624,
-0.04005696251988411,
0.10657888650894165,
0.07022026926279068,
0.04828299209475517,
-0.05773157253861427,
-0.08454901725053787,
-0.06788038462400436,
0.17352542281150818,
-0.17059504985809326,
-0.07648508250713348
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-base-keyword-spotting
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the superb dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0746
- Accuracy: 0.9843
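A minimal inference sketch using the lower-level API, assuming a recent Transformers release, 16 kHz mono input, and the repository id recorded for this card (`anton-l/wav2vec2-base-keyword-spotting`); the silent placeholder array stands in for real speech:

```python
import numpy as np
import torch
from transformers import AutoFeatureExtractor, AutoModelForAudioClassification

model_id = "anton-l/wav2vec2-base-keyword-spotting"  # repository id from this card
feature_extractor = AutoFeatureExtractor.from_pretrained(model_id)
model = AutoModelForAudioClassification.from_pretrained(model_id)

# Placeholder input: one second of silence at 16 kHz; replace with a real recording.
waveform = np.zeros(16000, dtype=np.float32)
inputs = feature_extractor(waveform, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = int(logits.argmax(dim=-1))
print(model.config.id2label[predicted_id])  # predicted keyword label
```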
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 0
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.8279 | 1.0 | 399 | 0.6792 | 0.8558 |
| 0.2961 | 2.0 | 798 | 0.1383 | 0.9798 |
| 0.2069 | 3.0 | 1197 | 0.0972 | 0.9809 |
| 0.1757 | 4.0 | 1596 | 0.0843 | 0.9825 |
| 0.1607 | 5.0 | 1995 | 0.0746 | 0.9843 |
### Framework versions
- Transformers 4.11.0.dev0
- Pytorch 1.9.1+cu111
- Datasets 1.12.1
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["audio-classification", "generated_from_trainer"], "datasets": ["superb"], "metrics": ["accuracy"], "model-index": [{"name": "wav2vec2-base-keyword-spotting", "results": []}]}
|
audio-classification
|
anton-l/wav2vec2-base-keyword-spotting
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"audio-classification",
"generated_from_trainer",
"dataset:superb",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #audio-classification #generated_from_trainer #dataset-superb #license-apache-2.0 #endpoints_compatible #region-us
|
wav2vec2-base-keyword-spotting
==============================
This model is a fine-tuned version of facebook/wav2vec2-base on the superb dataset.
It achieves the following results on the evaluation set:
* Loss: 0.0746
* Accuracy: 0.9843
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 3e-05
* train\_batch\_size: 32
* eval\_batch\_size: 32
* seed: 0
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.1
* num\_epochs: 5.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.11.0.dev0
* Pytorch 1.9.1+cu111
* Datasets 1.12.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #audio-classification #generated_from_trainer #dataset-superb #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
58,
159,
4,
37
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #audio-classification #generated_from_trainer #dataset-superb #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
-0.131146639585495,
0.09553849697113037,
-0.0017406520200893283,
0.05712660029530525,
0.14965024590492249,
0.009924981743097305,
0.10162191838026047,
0.1193154826760292,
-0.10567817091941833,
0.0765218734741211,
0.09589580446481705,
0.0814262181520462,
0.047931279987096786,
0.11014098674058914,
-0.01986263133585453,
-0.31815698742866516,
0.014941944740712643,
0.03178611397743225,
-0.19274182617664337,
0.11699850112199783,
0.10869798809289932,
-0.10560557246208191,
0.04157564416527748,
0.042175695300102234,
-0.15432153642177582,
0.012538744136691093,
-0.010556303896009922,
-0.06811061501502991,
0.09591779112815857,
0.04967812821269035,
0.11155230551958084,
0.020164620131254196,
0.09255200624465942,
-0.2138490080833435,
0.012992102652788162,
0.08262265473604202,
0.03832229599356651,
0.08902127295732498,
0.09781774133443832,
0.005540352314710617,
0.11341596394777298,
-0.06654510647058487,
0.07222943007946014,
0.05641777813434601,
-0.11949792504310608,
-0.31166207790374756,
-0.10370936244726181,
0.040778957307338715,
0.10003913938999176,
0.08316849917173386,
-0.01877075619995594,
0.0831739529967308,
-0.05845385417342186,
0.07937702536582947,
0.22227726876735687,
-0.2368718534708023,
-0.09288786351680756,
-0.0036094910465180874,
0.06404095143079758,
0.049385666847229004,
-0.11652734875679016,
-0.024688681587576866,
0.05519124120473862,
0.028650492429733276,
0.11290338635444641,
0.014492087066173553,
-0.012303796596825123,
0.0051570311188697815,
-0.1452418863773346,
-0.057711537927389145,
0.16433562338352203,
0.0904216468334198,
-0.05296161770820618,
-0.05391585826873779,
-0.020649490877985954,
-0.23148666322231293,
-0.03891010209918022,
0.01454820204526186,
0.035033270716667175,
-0.052824053913354874,
-0.12199490517377853,
0.015398065559566021,
-0.08122637122869492,
-0.10557282716035843,
0.0172891728579998,
0.18658490478992462,
0.03911834582686424,
-0.008783157914876938,
-0.005956664215773344,
0.11523842811584473,
0.03494776412844658,
-0.15544258058071136,
-0.019365018233656883,
0.02797088772058487,
-0.09711520373821259,
-0.03022371232509613,
-0.05873570963740349,
-0.022621633484959602,
-0.01440280769020319,
0.16042454540729523,
-0.05138028785586357,
0.06926571577787399,
0.03135775402188301,
0.030452989041805267,
-0.07969488203525543,
0.1755112111568451,
-0.0674121081829071,
-0.016397953033447266,
-0.04045824706554413,
0.08644254505634308,
-0.004259804263710976,
-0.012763986364006996,
-0.05245109647512436,
0.029330549761652946,
0.08539826422929764,
0.026720255613327026,
-0.026932144537568092,
0.019297847524285316,
-0.06012565642595291,
-0.03812386095523834,
0.01938825286924839,
-0.09178312867879868,
0.024146176874637604,
0.00966875534504652,
-0.08708475530147552,
0.011927510611712933,
0.015465758740901947,
0.02139437198638916,
0.008724655024707317,
0.13184450566768646,
-0.09604257345199585,
-0.012995735742151737,
-0.09974203258752823,
-0.0967763289809227,
0.04223324730992317,
-0.04280447959899902,
0.01407079678028822,
-0.06716860830783844,
-0.12788182497024536,
-0.034321628510951996,
0.06725012511014938,
-0.03211604431271553,
-0.09045057743787766,
-0.021337978541851044,
-0.07080250233411789,
0.05343867465853691,
-0.029932232573628426,
0.1609705090522766,
-0.06081632524728775,
0.10674257576465607,
0.05410373955965042,
0.047527480870485306,
0.011751305311918259,
0.06412922590970993,
-0.06222229823470116,
0.051536623388528824,
-0.16358259320259094,
0.04657385125756264,
-0.09967871755361557,
0.0652831494808197,
-0.1266571581363678,
-0.1292496919631958,
-0.02532307244837284,
0.003937961533665657,
0.09119057655334473,
0.08174586296081543,
-0.15366792678833008,
-0.10645613819360733,
0.14610901474952698,
-0.08717475086450577,
-0.1321558952331543,
0.11470432579517365,
-0.012181094847619534,
-0.0061158607713878155,
0.04547356069087982,
0.12364852428436279,
0.12476550042629242,
-0.08404682576656342,
-0.02699551358819008,
-0.0564306378364563,
0.12967348098754883,
0.003016140777617693,
0.11565293371677399,
-0.01559250894933939,
0.005762850400060415,
0.0026673644315451384,
-0.0546804778277874,
0.06352508068084717,
-0.10564173012971878,
-0.08929242193698883,
-0.04519428685307503,
-0.09463407099246979,
0.02639700099825859,
0.06700262427330017,
0.049068424850702286,
-0.08439823240041733,
-0.12948822975158691,
0.08975720405578613,
0.11930419504642487,
-0.08420424163341522,
0.027152834460139275,
-0.07507837563753128,
0.03822082653641701,
-0.05294805020093918,
-0.026806728914380074,
-0.18491506576538086,
-0.04543256759643555,
0.014999481849372387,
-0.08376818895339966,
0.019197043031454086,
-0.028537603095173836,
0.07240885496139526,
0.059657931327819824,
-0.06362129002809525,
-0.06771063804626465,
-0.11275292932987213,
-0.00983731634914875,
-0.0626165047287941,
-0.21443510055541992,
-0.10365030914545059,
-0.019775932654738426,
0.13999325037002563,
-0.22081980109214783,
0.014420515857636929,
0.013876647688448429,
0.12759846448898315,
0.04716971516609192,
-0.05532298982143402,
-0.019802970811724663,
0.08142542093992233,
-0.026415793225169182,
-0.07014694064855576,
0.01918518729507923,
0.019691303372383118,
-0.11590113490819931,
-0.01887752301990986,
-0.0875759869813919,
0.1348492056131363,
0.10172855854034424,
-0.026543911546468735,
-0.07562941312789917,
-0.023636534810066223,
-0.08721207082271576,
-0.06330832093954086,
-0.03818916529417038,
-0.011174933984875679,
0.12328620254993439,
0.025280997157096863,
0.1223151758313179,
-0.08216099441051483,
-0.051399413496255875,
0.047502949833869934,
0.001004174118861556,
-0.0068093594163656235,
0.10316276550292969,
0.08936676383018494,
-0.05805462226271629,
0.11938060075044632,
0.11017217487096786,
-0.10058743506669998,
0.1735946387052536,
-0.09038649499416351,
-0.12671826779842377,
-0.020161505788564682,
-0.00224503711797297,
0.03685278818011284,
0.12945392727851868,
-0.12277191877365112,
0.018489085137844086,
0.021629193797707558,
0.036801356822252274,
0.023120321333408356,
-0.20099280774593353,
-0.014259831048548222,
0.04630705714225769,
-0.053329650312662125,
-0.06458857655525208,
-0.011781695298850536,
-0.011996546760201454,
0.07795562595129013,
0.007338597904890776,
-0.012624471448361874,
0.0034505194053053856,
-0.006759533658623695,
-0.07119101285934448,
0.19512809813022614,
-0.08600317686796188,
-0.13156093657016754,
-0.1664983630180359,
-0.020186815410852432,
-0.0314442478120327,
-0.007923951372504234,
0.04123306646943092,
-0.10553011298179626,
-0.038917843252420425,
-0.03916694223880768,
0.05275632441043854,
-0.037378210574388504,
0.03839869052171707,
0.0229334756731987,
0.01886414922773838,
0.11286496371030807,
-0.11264189332723618,
0.038381267338991165,
0.00010553374886512756,
-0.055755723267793655,
0.0066122389398515224,
0.042106274515390396,
0.10359275341033936,
0.1578151285648346,
0.03341074660420418,
0.014361206442117691,
-0.01786409690976143,
0.19435550272464752,
-0.1194228082895279,
-0.029949968680739403,
0.12722328305244446,
0.002986985258758068,
0.03845946863293648,
0.09558086842298508,
0.0721735879778862,
-0.07907876372337341,
0.031203852966427803,
0.07086651772260666,
-0.03562864288687706,
-0.24395444989204407,
-0.025993268936872482,
-0.06658034771680832,
-0.010823293589055538,
0.11631263792514801,
0.030166279524564743,
0.020128343254327774,
0.05461302772164345,
-0.02243995852768421,
0.027039365842938423,
-0.02707660011947155,
0.07049073278903961,
0.0776251032948494,
0.06311672925949097,
0.12571150064468384,
-0.03551212325692177,
-0.026009198278188705,
0.037049759179353714,
0.015363316051661968,
0.25755515694618225,
0.008583485148847103,
0.17002518475055695,
0.060419704765081406,
0.13838866353034973,
0.015972550958395004,
0.09292317926883698,
0.021020609885454178,
-0.039026547223329544,
0.0227290578186512,
-0.056548822671175,
-0.015903327614068985,
0.028983991593122482,
0.027969835326075554,
0.052929703146219254,
-0.1343483328819275,
-0.02673507109284401,
0.006529357749968767,
0.33474162220954895,
0.06321343779563904,
-0.32561546564102173,
-0.12771910429000854,
0.007103435695171356,
-0.061527542769908905,
-0.0478101521730423,
0.024790583178400993,
0.09531018137931824,
-0.08367288112640381,
0.07550378143787384,
-0.08499711751937866,
0.10668276995420456,
-0.02532212622463703,
-0.007846123538911343,
0.12479804456233978,
0.08654459565877914,
-0.010374835692346096,
0.054613299667835236,
-0.22185581922531128,
0.27743926644325256,
-0.012950212694704533,
0.08782455325126648,
-0.018166758120059967,
0.03472523391246796,
0.03776603192090988,
0.003902279306203127,
0.046568043529987335,
-0.010983404703438282,
-0.11041350662708282,
-0.20039066672325134,
-0.07006517052650452,
0.028382064774632454,
0.11363761872053146,
-0.05813052877783775,
0.12544545531272888,
-0.034830644726753235,
-0.004741883836686611,
0.07080167531967163,
-0.07199237495660782,
-0.1198287233710289,
-0.10129889845848083,
0.028268149122595787,
0.005522324703633785,
0.08129891008138657,
-0.11239681392908096,
-0.12310992181301117,
-0.04655962064862251,
0.13645882904529572,
-0.0708429291844368,
-0.01539171114563942,
-0.13732358813285828,
0.1022072434425354,
0.18143227696418762,
-0.05746716260910034,
0.07413627952337265,
0.016352767124772072,
0.14317366480827332,
0.05158984661102295,
-0.02618732489645481,
0.09609320759773254,
-0.0825045257806778,
-0.21144913136959076,
-0.043317265808582306,
0.14521360397338867,
0.035435691475868225,
0.05939819663763046,
-0.03469390794634819,
0.036450836807489395,
-0.002909614471718669,
-0.08985010534524918,
0.04683413356542587,
-0.03170083090662956,
0.006936286576092243,
0.04940233379602432,
-0.02609170414507389,
0.044819511473178864,
-0.03441588580608368,
-0.06063546612858772,
0.11724326014518738,
0.27120640873908997,
-0.07582992315292358,
-0.0037676789797842503,
0.030069034546613693,
-0.048280250281095505,
-0.13761621713638306,
0.05390883609652519,
0.14842748641967773,
0.026800410822033882,
0.02566410042345524,
-0.23872917890548706,
0.10479123890399933,
0.10912895202636719,
-0.027892207726836205,
0.12844520807266235,
-0.30061548948287964,
-0.1286226361989975,
0.09173950552940369,
0.10757840424776077,
-0.032394327223300934,
-0.1625761091709137,
-0.0602591298520565,
-0.02377849817276001,
-0.1564292460680008,
0.13289430737495422,
-0.05063936114311218,
0.12047059088945389,
-0.006374917458742857,
0.048692651093006134,
0.015164673328399658,
-0.04736795648932457,
0.15930522978305817,
-0.008951135911047459,
0.06273982673883438,
0.0030879713594913483,
0.0599055215716362,
0.06618790328502655,
-0.04870780184864998,
-0.003314474131911993,
-0.06295205652713776,
0.019131537526845932,
-0.12412464618682861,
-0.03103448450565338,
-0.0899997130036354,
0.026301562786102295,
-0.051604263484478,
-0.040239814668893814,
-0.021371616050601006,
0.053135085850954056,
0.04403255879878998,
-0.010727905668318272,
0.1789998859167099,
-0.01690857671201229,
0.17034070193767548,
0.10016792267560959,
0.08121853321790695,
-0.005979036446660757,
-0.11899497359991074,
-0.005734962411224842,
-0.016719859093427658,
0.07176894694566727,
-0.15353941917419434,
0.03957771137356758,
0.14097023010253906,
0.05932067334651947,
0.1257493942975998,
0.07146710157394409,
-0.057694606482982635,
0.0193649772554636,
0.09056425839662552,
-0.07691656053066254,
-0.11369530856609344,
-0.026305513456463814,
0.038311224430799484,
-0.16108693182468414,
0.031039737164974213,
0.1013755276799202,
-0.06547880917787552,
-0.0025984016247093678,
0.020728930830955505,
0.009816775098443031,
-0.06625336408615112,
0.23346763849258423,
0.04682109132409096,
0.0840088501572609,
-0.09135805815458298,
0.08992914855480194,
0.04791201651096344,
-0.16958071291446686,
-0.010606157593429089,
0.067203089594841,
-0.039284124970436096,
-0.0013644021237269044,
0.01032058522105217,
0.050516437739133835,
-0.06672915071249008,
-0.06386429816484451,
-0.13158759474754333,
-0.1403900682926178,
0.08390096575021744,
0.09972643852233887,
0.04465137794613838,
0.041379664093256,
-0.03794006258249283,
0.06460724025964737,
-0.11209308356046677,
0.09700959920883179,
0.09504247456789017,
0.1025223508477211,
-0.17545634508132935,
0.1334843933582306,
0.011180936358869076,
0.010257748886942863,
-0.0007267320179380476,
0.0009400531416758895,
-0.09445895999670029,
0.014275469817221165,
-0.12984547019004822,
-0.03786598891019821,
-0.03711129352450371,
-0.00436067022383213,
-0.000775595661252737,
-0.05955265834927559,
-0.09290719777345657,
0.025787485763430595,
-0.12034667283296585,
-0.04921281710267067,
0.004829149227589369,
0.0653463751077652,
-0.10318274796009064,
0.0011419699294492602,
0.06953393667936325,
-0.11647472530603409,
0.06865660101175308,
0.04595337435603142,
0.04241902753710747,
0.04696165770292282,
-0.08272500336170197,
0.01773781143128872,
0.042519424110651016,
-0.006985923275351524,
0.03094164840877056,
-0.15823994576931,
-0.0036166030913591385,
-0.02748984657227993,
0.055752210319042206,
-0.012217124924063683,
0.011312220245599747,
-0.1392277330160141,
-0.05121684446930885,
-0.01967180147767067,
-0.04761205241084099,
-0.05836019292473793,
0.03231131657958031,
0.08130752295255661,
0.05372024327516556,
0.17711201310157776,
-0.06940103322267532,
0.020053310319781303,
-0.24372172355651855,
0.015108305960893631,
-0.024557407945394516,
-0.09040450304746628,
-0.06961788982152939,
-0.019804732874035835,
0.06793557107448578,
-0.06119868531823158,
0.09327080100774765,
-0.06366412341594696,
0.0592326857149601,
0.04987586662173271,
-0.08162178099155426,
0.024555491283535957,
0.04951244220137596,
0.2661017179489136,
0.05927712097764015,
-0.014430120587348938,
0.07229438424110413,
0.011005637235939503,
0.05208523944020271,
0.12423763424158096,
0.1774904727935791,
0.14887236058712006,
-0.019473090767860413,
0.08658059686422348,
0.04877844825387001,
-0.10098767280578613,
-0.13992224633693695,
0.10632093250751495,
-0.03030981682240963,
0.1445702314376831,
0.007237004116177559,
0.20383155345916748,
0.10040038824081421,
-0.19476807117462158,
0.05212974175810814,
-0.04340146481990814,
-0.0835166722536087,
-0.09538645297288895,
-0.035538915544748306,
-0.06639759987592697,
-0.2040030062198639,
0.014059806242585182,
-0.1306103765964508,
0.05978861451148987,
0.07327847182750702,
0.018825793638825417,
0.017673173919320107,
0.16151325404644012,
0.04278961941599846,
0.004749804735183716,
0.08942785859107971,
0.005355472676455975,
-0.02044554241001606,
-0.042022235691547394,
-0.09974472224712372,
0.03851104900240898,
-0.021794697269797325,
0.04316575080156326,
-0.06607624888420105,
-0.10638630390167236,
0.0779743492603302,
0.03800412639975548,
-0.10288617014884949,
0.02316512167453766,
0.0026522090192884207,
0.07120753079652786,
0.048683542758226395,
0.0012564826756715775,
0.022159701213240623,
-0.02239968813955784,
0.26184147596359253,
-0.10245542228221893,
-0.031533993780612946,
-0.1513042002916336,
0.21080061793327332,
0.016040191054344177,
-0.0233886931091547,
0.03938978910446167,
-0.07952483743429184,
-0.026911932975053787,
0.1387140303850174,
0.11392160505056381,
-0.0024358611553907394,
-0.03343122825026512,
0.0068269106559455395,
-0.01791452243924141,
-0.057121068239212036,
0.0981622114777565,
0.11248847097158432,
0.0706176608800888,
-0.07215622812509537,
-0.047265272587537766,
-0.05307149887084961,
-0.047301165759563446,
0.00020286280778236687,
0.07419230043888092,
0.02901800163090229,
-0.018191656097769737,
-0.03405202180147171,
0.11145389825105667,
-0.07016230374574661,
-0.11142954230308533,
0.043509360402822495,
-0.16301392018795013,
-0.1916457861661911,
-0.06031017377972603,
0.06549113988876343,
0.010642167180776596,
0.051890112459659576,
-0.012120726518332958,
-0.03956082463264465,
0.09740635752677917,
0.0005566579638980329,
-0.042464423924684525,
-0.1494586318731308,
0.10285580903291702,
-0.06062379106879234,
0.20660313963890076,
-0.04904315620660782,
0.026067344471812248,
0.11133231967687607,
0.057710129767656326,
-0.09080378711223602,
0.015182914212346077,
0.0682859718799591,
-0.1483708918094635,
0.025745421648025513,
0.19061386585235596,
-0.04357847571372986,
0.12894847989082336,
0.013957370072603226,
-0.14369967579841614,
-0.000559928419534117,
-0.06240564584732056,
-0.040389932692050934,
-0.061087969690561295,
-0.0313262976706028,
-0.05782740190625191,
0.12725940346717834,
0.22766102850437164,
-0.07319771498441696,
-0.021348660811781883,
-0.06501992046833038,
0.04352978244423866,
0.07323102653026581,
0.11592020094394684,
-0.030314305797219276,
-0.2855053246021271,
0.020897775888442993,
0.03124786540865898,
-0.015399588271975517,
-0.24048399925231934,
-0.0878898873925209,
0.04587149992585182,
-0.06598217785358429,
-0.03694431111216545,
0.10593309998512268,
0.07038582116365433,
0.05028035119175911,
-0.057970330119132996,
-0.08530087023973465,
-0.06697632372379303,
0.17490001022815704,
-0.16951578855514526,
-0.07729749381542206
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-base-lang-id
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the anton-l/common_language dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9836
- Accuracy: 0.7945
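A minimal language-identification sketch with the `pipeline` API, assuming 16 kHz mono input and the repository id recorded for this card (`anton-l/wav2vec2-base-lang-id`); `top_k` controls how many candidate languages are returned:

```python
import numpy as np
from transformers import pipeline

# Spoken-language identification with the fine-tuned checkpoint (id taken from this card).
lang_id = pipeline("audio-classification", model="anton-l/wav2vec2-base-lang-id")

# Placeholder input: two seconds of silence at 16 kHz; replace with real speech.
speech = np.zeros(2 * 16000, dtype=np.float32)

# Print the five highest-scoring language labels with their scores.
for prediction in lang_id(speech, top_k=5):
    print(f"{prediction['label']}: {prediction['score']:.3f}")
```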
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 32
- eval_batch_size: 4
- seed: 0
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.9568 | 1.0 | 173 | 3.2866 | 0.1146 |
| 1.9243 | 2.0 | 346 | 2.1241 | 0.3840 |
| 1.2923 | 3.0 | 519 | 1.5498 | 0.5489 |
| 0.8659 | 4.0 | 692 | 1.4953 | 0.6126 |
| 0.5539 | 5.0 | 865 | 1.2431 | 0.6926 |
| 0.4101 | 6.0 | 1038 | 1.1443 | 0.7232 |
| 0.2945 | 7.0 | 1211 | 1.0870 | 0.7544 |
| 0.1552 | 8.0 | 1384 | 1.1080 | 0.7661 |
| 0.0968 | 9.0 | 1557 | 0.9836 | 0.7945 |
| 0.0623 | 10.0 | 1730 | 1.0252 | 0.7993 |
### Framework versions
- Transformers 4.11.0.dev0
- Pytorch 1.9.1+cu111
- Datasets 1.12.1
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["audio-classification", "generated_from_trainer"], "datasets": ["common_language"], "metrics": ["accuracy"], "model-index": [{"name": "wav2vec2-base-lang-id", "results": []}]}
|
audio-classification
|
anton-l/wav2vec2-base-lang-id
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"audio-classification",
"generated_from_trainer",
"dataset:common_language",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #audio-classification #generated_from_trainer #dataset-common_language #license-apache-2.0 #endpoints_compatible #region-us
|
wav2vec2-base-lang-id
=====================
This model is a fine-tuned version of facebook/wav2vec2-base on the anton-l/common\_language dataset.
It achieves the following results on the evaluation set:
* Loss: 0.9836
* Accuracy: 0.7945
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0003
* train\_batch\_size: 32
* eval\_batch\_size: 4
* seed: 0
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.1
* num\_epochs: 10.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.11.0.dev0
* Pytorch 1.9.1+cu111
* Datasets 1.12.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 4\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 10.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #audio-classification #generated_from_trainer #dataset-common_language #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 4\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 10.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
60,
159,
4,
37
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #audio-classification #generated_from_trainer #dataset-common_language #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 4\n* seed: 0\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 10.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.1+cu111\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
-0.13796932995319366,
0.08837594836950302,
-0.002310005249455571,
0.04870469495654106,
0.13987183570861816,
0.0017145684687420726,
0.09324874728918076,
0.13313263654708862,
-0.10557238012552261,
0.07396221160888672,
0.0899132564663887,
0.09111474454402924,
0.057126350700855255,
0.09441149234771729,
-0.024979662150144577,
-0.3204045295715332,
0.012677767314016819,
0.0320541188120842,
-0.17387503385543823,
0.1169586330652237,
0.11458119004964828,
-0.099046990275383,
0.03298747539520264,
0.04374368116259575,
-0.13371331989765167,
0.015532617457211018,
-0.014826705679297447,
-0.06953268498182297,
0.10290279239416122,
0.04319638013839722,
0.11376781761646271,
0.013602702878415585,
0.08591626584529877,
-0.24381384253501892,
0.015060484409332275,
0.07762012630701065,
0.04282446950674057,
0.07999052107334137,
0.11921443045139313,
-0.015925530344247818,
0.1416296362876892,
-0.06677228957414627,
0.069569893181324,
0.05722960829734802,
-0.11437393724918365,
-0.32690349221229553,
-0.09614520519971848,
0.027777528390288353,
0.0977698564529419,
0.09614411741495132,
-0.023227179422974586,
0.06869713217020035,
-0.05930902063846588,
0.08259144425392151,
0.22324837744235992,
-0.2170248180627823,
-0.08975612372159958,
-0.00627273740246892,
0.061881911009550095,
0.04484618455171585,
-0.12626095116138458,
-0.03290681168437004,
0.055170051753520966,
0.03715714439749718,
0.10468048602342606,
0.013918745331466198,
-0.02449934370815754,
0.01056602317839861,
-0.14064918458461761,
-0.06601972877979279,
0.14937251806259155,
0.08749447762966156,
-0.05089978128671646,
-0.0543166808784008,
0.00044278401765041053,
-0.2246398627758026,
-0.04492189362645149,
0.03359397128224373,
0.028053678572177887,
-0.03965241461992264,
-0.11293182522058487,
0.01987743377685547,
-0.09550028294324875,
-0.09862978011369705,
0.02610151283442974,
0.1603625863790512,
0.04682941362261772,
-0.01724625937640667,
-0.004832873120903969,
0.10837491601705551,
0.05123424530029297,
-0.14628833532333374,
-0.009957652539014816,
0.03649716451764107,
-0.09066701680421829,
-0.027852725237607956,
-0.04758994281291962,
-0.030224688351154327,
-0.010031341575086117,
0.11631593853235245,
-0.04908638447523117,
0.07565757632255554,
0.023938270285725594,
0.03507537022233009,
-0.08360842615365982,
0.17166998982429504,
-0.06700112670660019,
0.007577488664537668,
-0.046725280582904816,
0.08595055341720581,
-0.030581103637814522,
-0.014303418807685375,
-0.045750051736831665,
0.02896479144692421,
0.0988580659031868,
0.03340386599302292,
-0.033867694437503815,
0.009689314290881157,
-0.06818307936191559,
-0.029590502381324768,
-0.020208753645420074,
-0.0995837077498436,
0.025356054306030273,
0.016563039273023605,
-0.08283674716949463,
0.01503810379654169,
0.01077877078205347,
0.027142565697431564,
-0.0008859350346028805,
0.12365862727165222,
-0.07645305246114731,
-0.0035972220357507467,
-0.10258886963129044,
-0.10297297686338425,
0.044663798063993454,
-0.036554522812366486,
0.017133038491010666,
-0.06540556252002716,
-0.11526533216238022,
-0.04455997049808502,
0.07714904099702835,
-0.02812029793858528,
-0.08333008736371994,
-0.04009604454040527,
-0.07311511039733887,
0.0500597320497036,
-0.024765755981206894,
0.17616817355155945,
-0.06139063835144043,
0.11942098289728165,
0.027603069320321083,
0.0331469364464283,
0.018263211473822594,
0.07245023548603058,
-0.05765175819396973,
0.05056120082736015,
-0.13046465814113617,
0.05272798612713814,
-0.09838259220123291,
0.07026958465576172,
-0.13413770496845245,
-0.13295301795005798,
-0.02958904579281807,
0.00578316580504179,
0.09869305044412613,
0.0715266615152359,
-0.17498156428337097,
-0.09849400073289871,
0.152212455868721,
-0.08226149529218674,
-0.12108661234378815,
0.11914146691560745,
-0.024040862917900085,
0.014319200068712234,
0.04110926017165184,
0.14233705401420593,
0.12293751537799835,
-0.09621001034975052,
-0.011289463378489017,
-0.05276960879564285,
0.1321718841791153,
0.00584487384185195,
0.11596924066543579,
-0.03247328847646713,
0.0018866769969463348,
-0.004999453201889992,
-0.028548359870910645,
0.06811469048261642,
-0.10296902805566788,
-0.09155284613370895,
-0.03188905119895935,
-0.0892563983798027,
0.004847842734307051,
0.07195297628641129,
0.042694251984357834,
-0.09230347722768784,
-0.11972132325172424,
0.06196059286594391,
0.11849141865968704,
-0.09596113860607147,
0.027245933189988136,
-0.067889504134655,
0.039006102830171585,
-0.04733525589108467,
-0.026063470169901848,
-0.17677657306194305,
-0.028960956260561943,
0.017150988802313805,
-0.06775730848312378,
0.0201248936355114,
-0.01821703277528286,
0.075403593480587,
0.04466494172811508,
-0.05224437266588211,
-0.06958729773759842,
-0.11023727059364319,
-0.009180156514048576,
-0.06529010087251663,
-0.21586833894252777,
-0.08510712534189224,
-0.021862614899873734,
0.1466454267501831,
-0.23544932901859283,
0.008134410716593266,
0.02323336899280548,
0.10622001439332962,
0.04040190204977989,
-0.05380212515592575,
-0.013650523498654366,
0.07890421897172928,
-0.022235289216041565,
-0.06832532584667206,
0.02282886952161789,
0.019702672958374023,
-0.1158057227730751,
0.011609614826738834,
-0.08667862415313721,
0.12005451321601868,
0.09757228195667267,
-0.043381836265325546,
-0.07511549443006516,
-0.05027619004249573,
-0.08308827877044678,
-0.06664329767227173,
-0.0367189384996891,
-0.003502103267237544,
0.1629343032836914,
0.026162616908550262,
0.11797668039798737,
-0.09038397669792175,
-0.04344893619418144,
0.04509410262107849,
-0.005430174525827169,
-0.008405043743550777,
0.12862904369831085,
0.07675713300704956,
-0.04481511935591698,
0.11248289048671722,
0.10562735050916672,
-0.09247861057519913,
0.17083114385604858,
-0.09059559553861618,
-0.1417737901210785,
-0.00861765630543232,
0.017147254198789597,
0.03644153103232384,
0.131103977560997,
-0.12295925617218018,
0.018396243453025818,
0.022967714816331863,
0.042898159474134445,
0.03147721290588379,
-0.21354421973228455,
-0.020082518458366394,
0.04668577015399933,
-0.057951703667640686,
-0.07795005291700363,
-0.009363514371216297,
-0.0031459438614547253,
0.0821339413523674,
-0.0021221379283815622,
-0.028975266963243484,
-0.003900354029610753,
-0.0182492658495903,
-0.07958003878593445,
0.19720236957073212,
-0.09391454607248306,
-0.12798278033733368,
-0.15670068562030792,
-0.023713160306215286,
-0.00373174250125885,
-0.013230572454631329,
0.04231487587094307,
-0.113588847219944,
-0.03423409163951874,
-0.048183873295784,
0.048885591328144073,
-0.07507862150669098,
0.02432910166680813,
0.0036141511518508196,
0.03270141780376434,
0.10309196263551712,
-0.10917824506759644,
0.03226642310619354,
-0.00259365513920784,
-0.05219193547964096,
0.024131398648023605,
0.022154996171593666,
0.09485536068677902,
0.16219566762447357,
0.03296168893575668,
0.016716431826353073,
-0.03149569779634476,
0.16529132425785065,
-0.10955026745796204,
-0.04142827168107033,
0.11605630815029144,
0.005179135594516993,
0.04035216197371483,
0.101664699614048,
0.06795209646224976,
-0.08186844736337662,
0.03659861907362938,
0.07221388816833496,
-0.03089173510670662,
-0.25771889090538025,
-0.01943138986825943,
-0.07060442119836807,
-0.017986033111810684,
0.11970078945159912,
0.02793109603226185,
0.007387799676507711,
0.04801519960165024,
-0.01586766168475151,
0.015836238861083984,
-0.010857517831027508,
0.06607034802436829,
0.05960535258054733,
0.04863227903842926,
0.11194843798875809,
-0.03246898949146271,
-0.023794660344719887,
0.03555215522646904,
0.02128755860030651,
0.2761599123477936,
0.004045558627694845,
0.1677752584218979,
0.07208429276943207,
0.1493280827999115,
0.012630782090127468,
0.09058127552270889,
0.013645502738654613,
-0.03027331829071045,
0.023854516446590424,
-0.05871427431702614,
-0.022763006389141083,
0.03515060618519783,
0.04520629346370697,
0.044665444642305374,
-0.14598684012889862,
-0.030610481277108192,
-0.002219863934442401,
0.3404091000556946,
0.0611143484711647,
-0.3084312677383423,
-0.10659011453390121,
0.004288546275347471,
-0.06812748312950134,
-0.05812470242381096,
0.030100025236606598,
0.09981391578912735,
-0.0927056148648262,
0.05372413247823715,
-0.08203819394111633,
0.11515038460493088,
-0.03960203751921654,
-0.01692540943622589,
0.10760980099439621,
0.07625866681337357,
-0.00997809786349535,
0.07606714963912964,
-0.25362253189086914,
0.30383360385894775,
-0.01771255023777485,
0.08771906048059464,
-0.025843381881713867,
0.03155158832669258,
0.03806038573384285,
-0.03646518662571907,
0.05296064913272858,
-0.0075733778066933155,
-0.12171794474124908,
-0.2110813409090042,
-0.07478120923042297,
0.03217145800590515,
0.11975069344043732,
-0.04863034188747406,
0.1168055310845375,
-0.033701714128255844,
0.004292212426662445,
0.06855370104312897,
-0.08619045466184616,
-0.12142505496740341,
-0.10483994334936142,
0.022123876959085464,
0.015509583987295628,
0.07833077013492584,
-0.12078799307346344,
-0.12104664742946625,
-0.05719992145895958,
0.1382887214422226,
-0.052989114075899124,
-0.014127636328339577,
-0.1331322342157364,
0.09757228195667267,
0.1828068345785141,
-0.05783150717616081,
0.07065165787935257,
0.019116871058940887,
0.13524237275123596,
0.04269351437687874,
-0.010273340158164501,
0.09144613891839981,
-0.08450832962989807,
-0.18796353042125702,
-0.04860922694206238,
0.15837730467319489,
0.04631673917174339,
0.06302191317081451,
-0.027658184990286827,
0.024942828342318535,
-0.01921713352203369,
-0.0858868807554245,
0.04119439423084259,
-0.01951039955019951,
0.018426505848765373,
0.05397392436861992,
-0.046502068638801575,
0.04447655379772186,
-0.05179277062416077,
-0.07670291513204575,
0.13626323640346527,
0.27436092495918274,
-0.06731510162353516,
-0.00028107044636271894,
0.020024800673127174,
-0.05636226385831833,
-0.13167476654052734,
0.057764891535043716,
0.1590389907360077,
0.03795474022626877,
0.02117173932492733,
-0.24498113989830017,
0.08128705620765686,
0.09648154675960541,
-0.022306514903903008,
0.10670188814401627,
-0.3066181540489197,
-0.11998794227838516,
0.1034051924943924,
0.11350519210100174,
-0.04806331545114517,
-0.15694016218185425,
-0.058477502316236496,
-0.0203995443880558,
-0.15310411155223846,
0.1125616803765297,
-0.03247576579451561,
0.11927872151136398,
-0.003891953034326434,
0.06778062880039215,
0.023863080888986588,
-0.051194749772548676,
0.15680469572544098,
-0.013327471911907196,
0.06381916999816895,
0.004134550224989653,
0.07208189368247986,
0.04622277989983559,
-0.047184109687805176,
-0.00043943821219727397,
-0.06574391573667526,
0.019019825384020805,
-0.14399521052837372,
-0.031763408333063126,
-0.08968111127614975,
0.027996577322483063,
-0.04926740378141403,
-0.04138398915529251,
-0.021690264344215393,
0.05271027609705925,
0.05352063849568367,
-0.0023379854392260313,
0.1535867601633072,
-0.0377592109143734,
0.16012655198574066,
0.05461645498871803,
0.08809760957956314,
-0.0036442826967686415,
-0.10869430750608444,
-0.004381980746984482,
-0.010466402396559715,
0.06041127070784569,
-0.14249898493289948,
0.038424838334321976,
0.15257719159126282,
0.05289172753691673,
0.14125946164131165,
0.06732124835252762,
-0.0698796883225441,
0.014633859507739544,
0.08414485305547714,
-0.056753452867269516,
-0.09001703560352325,
-0.024544255807995796,
0.0844033807516098,
-0.15995104610919952,
0.008821079507470131,
0.08766460418701172,
-0.06183895096182823,
-0.009398354217410088,
0.013270409777760506,
0.0073997266590595245,
-0.07305272668600082,
0.22809556126594543,
0.0429798886179924,
0.08003725856542587,
-0.08527979999780655,
0.09081782400608063,
0.05070243403315544,
-0.17968082427978516,
-0.004743820056319237,
0.054443880915641785,
-0.02886364422738552,
-0.005507813300937414,
0.029737120494246483,
0.0651964619755745,
-0.03538290038704872,
-0.066461481153965,
-0.115996815264225,
-0.14423543214797974,
0.08250075578689575,
0.08987189829349518,
0.04364857077598572,
0.042521022260189056,
-0.03863224387168884,
0.04947874695062637,
-0.11521477997303009,
0.09113224595785141,
0.10713248699903488,
0.08762598037719727,
-0.15922832489013672,
0.1369444727897644,
0.00868837907910347,
0.008975621312856674,
0.010708649642765522,
-0.007601126097142696,
-0.08785899728536606,
0.03114314191043377,
-0.1128155067563057,
-0.026033764705061913,
-0.04570610821247101,
-0.004013286903500557,
0.008816461078822613,
-0.05347530543804169,
-0.07513200491666794,
0.02364598959684372,
-0.12697993218898773,
-0.04820771887898445,
-0.0003143065550830215,
0.0813666582107544,
-0.09451647847890854,
-0.018887639045715332,
0.06888219714164734,
-0.11161048710346222,
0.06836561113595963,
0.047912560403347015,
0.036658719182014465,
0.04564351215958595,
-0.1094951406121254,
0.02240907959640026,
0.0452384352684021,
-0.002745239296928048,
0.02249974198639393,
-0.16811992228031158,
-0.0005932401982136071,
-0.016647392883896828,
0.045446351170539856,
-0.011491112411022186,
0.0013319944264367223,
-0.13915970921516418,
-0.06906094402074814,
-0.02173205278813839,
-0.05741189420223236,
-0.052138037979602814,
0.04545363783836365,
0.06449376046657562,
0.062169600278139114,
0.17109745740890503,
-0.06917081028223038,
0.032950084656476974,
-0.2312527447938919,
0.01787721924483776,
-0.03359284996986389,
-0.07890728116035461,
-0.0552436038851738,
-0.02935967966914177,
0.0721505656838417,
-0.06767727434635162,
0.08441103994846344,
-0.05513383820652962,
0.04784834384918213,
0.0364268496632576,
-0.10017922520637512,
0.046846311539411545,
0.045086298137903214,
0.2749527096748352,
0.05132455378770828,
-0.0116311339661479,
0.07333049178123474,
-0.000036100791476201266,
0.0484592542052269,
0.16147170960903168,
0.15969476103782654,
0.17445240914821625,
0.0032389936968684196,
0.08515892177820206,
0.04957573115825653,
-0.10683107376098633,
-0.11466927081346512,
0.11347309499979019,
-0.013424336910247803,
0.13616108894348145,
0.0054407245479524136,
0.23697668313980103,
0.10583849996328354,
-0.20544052124023438,
0.05961812660098076,
-0.03623422235250473,
-0.08792899549007416,
-0.09401232749223709,
-0.043517809361219406,
-0.06471042335033417,
-0.18567077815532684,
0.018026424571871758,
-0.12627604603767395,
0.06628318876028061,
0.06420877575874329,
0.028650907799601555,
0.018468191847205162,
0.14830684661865234,
0.03829778730869293,
-0.006855770945549011,
0.10524117201566696,
0.0051079620607197285,
-0.014665567316114902,
-0.05365879833698273,
-0.09856677800416946,
0.042004916816949844,
-0.03445442020893097,
0.042748428881168365,
-0.06952649354934692,
-0.10009396821260452,
0.07219330221414566,
0.023966718465089798,
-0.10028965771198273,
0.018910948187112808,
-0.010259737260639668,
0.07857465744018555,
0.07157541066408157,
0.015275600366294384,
0.019147515296936035,
-0.019044071435928345,
0.2625255584716797,
-0.09722757339477539,
-0.038330402225255966,
-0.1433386504650116,
0.2205168753862381,
0.016529861837625504,
-0.03272928670048714,
0.02953553944826126,
-0.07128532230854034,
-0.003493847092613578,
0.15233953297138214,
0.10973650962114334,
-0.005627343896776438,
-0.03243276849389076,
0.0016583919059485197,
-0.01752379536628723,
-0.06052275747060776,
0.0947629064321518,
0.11598309874534607,
0.060705605894327164,
-0.08149716258049011,
-0.057945042848587036,
-0.06560888886451721,
-0.03591018170118332,
-0.010748832486569881,
0.07675860822200775,
0.025756599381566048,
-0.02433266118168831,
-0.024022327736020088,
0.12376157939434052,
-0.06488808244466782,
-0.09739609062671661,
0.01906471885740757,
-0.15585313737392426,
-0.18717236816883087,
-0.05821096897125244,
0.049818459898233414,
0.017019089311361313,
0.044603995978832245,
-0.023778382688760757,
-0.02662540413439274,
0.09904997050762177,
0.004295985214412212,
-0.03905632346868515,
-0.14165249466896057,
0.09681130945682526,
-0.06582546979188919,
0.20265044271945953,
-0.0448349229991436,
0.02993122860789299,
0.11239264160394669,
0.07220131158828735,
-0.0738816186785698,
0.028827430680394173,
0.0708124116063118,
-0.13360847532749176,
0.03155412897467613,
0.19257324934005737,
-0.046348925679922104,
0.13074062764644623,
0.024144135415554047,
-0.12358918786048889,
0.012738782912492752,
-0.0944015383720398,
-0.044299568980932236,
-0.06416837126016617,
-0.024422088637948036,
-0.058612752705812454,
0.12543074786663055,
0.22178798913955688,
-0.07043767720460892,
-0.024874793365597725,
-0.06999990344047546,
0.031291112303733826,
0.06436032056808472,
0.11341764777898788,
-0.03470560163259506,
-0.2779785096645355,
0.01684478111565113,
0.018415626138448715,
-0.011574962176382542,
-0.23687121272087097,
-0.08633106201887131,
0.05089086666703224,
-0.07174354046583176,
-0.04314631596207619,
0.10730203986167908,
0.06623422354459763,
0.04753733426332474,
-0.057067614048719406,
-0.07664386183023453,
-0.05753280222415924,
0.18240323662757874,
-0.16319629549980164,
-0.06797380745410919
] |
null | null |
transformers
|
# Model Card for wav2vec2-base-superb-sv
# Model Details
## Model Description
- **Developed by:** Shu-wen Yang et al.
- **Shared by:** Anton Lozhkov
- **Model type:** Wav2Vec2 with an XVector head
- **Language(s) (NLP):** English
- **License:** Apache 2.0
- **Related Models:**
- **Parent Model:** wav2vec2-large-lv60
- **Resources for more information:**
- [GitHub Repo](https://github.com/s3prl/s3prl/tree/master/s3prl/downstream/sv_voxceleb1)
  - [Associated Paper](https://arxiv.org/abs/2105.01051)
# Uses
## Direct Use
This is a ported version of
[S3PRL's Wav2Vec2 for the SUPERB Speaker Verification task](https://github.com/s3prl/s3prl/tree/master/s3prl/downstream/sv_voxceleb1).
The base model is [wav2vec2-large-lv60](https://huggingface.co/facebook/wav2vec2-large-lv60), which is pretrained on 16kHz
sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
For more information, refer to [SUPERB: Speech processing Universal PERformance Benchmark](https://arxiv.org/abs/2105.01051).
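If your recordings are not already at 16kHz, they can be resampled first. The following is a minimal sketch using `torchaudio`; the file path and variable names are placeholders, not part of this repository:
```python
import torchaudio

# Placeholder path: replace with your own recording.
waveform, sampling_rate = torchaudio.load("utterance.wav")

# Resample to the 16kHz rate the model was pretrained on, if needed.
if sampling_rate != 16_000:
    resampler = torchaudio.transforms.Resample(sampling_rate, 16_000)
    waveform = resampler(waveform)
```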
## Out-of-Scope Use
The model should not be used to intentionally create hostile or alienating environments for people.
# Bias, Risks, and Limitations
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.
## Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
# Training Details
## Training Data
See the [superb dataset card](https://huggingface.co/datasets/superb)
## Training Procedure
### Preprocessing
More information needed
### Speeds, Sizes, Times
More information needed
# Evaluation
## Testing Data, Factors & Metrics
### Testing Data
See the [superb dataset card](https://huggingface.co/datasets/superb)
### Factors
### Metrics
More information needed
## Results
More information needed
# Model Examination
More information needed
# Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** More information needed
- **Hours used:** More information needed
- **Cloud Provider:** More information needed
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
# Technical Specifications [optional]
## Model Architecture and Objective
More information needed
## Compute Infrastructure
More information needed
### Hardware
More information needed
### Software
More information needed
# Citation
**BibTeX:**
```
@misc{https://doi.org/10.48550/arxiv.2006.11477,
doi = {10.48550/ARXIV.2006.11477},
url = {https://arxiv.org/abs/2006.11477},
author = {Baevski, Alexei and Zhou, Henry and Mohamed, Abdelrahman and Auli, Michael},
keywords = {Computation and Language (cs.CL), Machine Learning (cs.LG), Sound (cs.SD), Audio and Speech Processing (eess.AS), FOS: Computer and information sciences, FOS: Computer and information sciences, FOS: Electrical engineering, electronic engineering, information engineering, FOS: Electrical engineering, electronic engineering, information engineering},
title = {wav2vec 2.0: A Framework for Self-Supervised Learning of Speech Representations},
publisher = {arXiv},
  year = {2020},
}

@misc{https://doi.org/10.48550/arxiv.2105.01051,
doi = {10.48550/ARXIV.2105.01051},
url = {https://arxiv.org/abs/2105.01051},
author = {Yang, Shu-wen and Chi, Po-Han and Chuang, Yung-Sung and Lai, Cheng-I Jeff and Lakhotia, Kushal and Lin, Yist Y. and Liu, Andy T. and Shi, Jiatong and Chang, Xuankai and Lin, Guan-Ting and Huang, Tzu-Hsien and Tseng, Wei-Cheng and Lee, Ko-tik and Liu, Da-Rong and Huang, Zili and Dong, Shuyan and Li, Shang-Wen and Watanabe, Shinji and Mohamed, Abdelrahman and Lee, Hung-yi},
keywords = {Computation and Language (cs.CL), Sound (cs.SD), Audio and Speech Processing (eess.AS), FOS: Computer and information sciences, FOS: Computer and information sciences, FOS: Electrical engineering, electronic engineering, information engineering, FOS: Electrical engineering, electronic engineering, information engineering},
title = {SUPERB: Speech processing Universal PERformance Benchmark},
publisher = {arXiv},
year = {2021},
}
```
# Glossary [optional]
More information needed
# More Information [optional]
More information needed
# Model Card Authors [optional]
Anton Lozhkov in collaboration with Ezi Ozoani and the Hugging Face team
# Model Card Contact
More information needed
# How to Get Started with the Model
Use the code below to get started with the model.
<details>
<summary> Click to expand </summary>
```python
from transformers import AutoProcessor, AutoModelForAudioXVector
processor = AutoProcessor.from_pretrained("anton-l/wav2vec2-base-superb-sv")
model = AutoModelForAudioXVector.from_pretrained("anton-l/wav2vec2-base-superb-sv")
```
</details>
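Beyond loading the processor and model, a typical speaker-verification flow extracts x-vector embeddings for two utterances and compares them with cosine similarity. The sketch below is illustrative, not an official recipe: the demo dataset, the use of `AutoFeatureExtractor`, and the 0.86 decision threshold are assumptions you should adapt to your own data.
```python
import torch
from datasets import load_dataset
from transformers import AutoFeatureExtractor, AutoModelForAudioXVector

# Two 16kHz utterances from a small demo dataset (illustrative choice only).
dataset = load_dataset("hf-internal-testing/librispeech_asr_demo", "clean", split="validation")
audio = [dataset[0]["audio"]["array"], dataset[1]["audio"]["array"]]

feature_extractor = AutoFeatureExtractor.from_pretrained("anton-l/wav2vec2-base-superb-sv")
model = AutoModelForAudioXVector.from_pretrained("anton-l/wav2vec2-base-superb-sv")

inputs = feature_extractor(audio, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    embeddings = model(**inputs).embeddings

# L2-normalize the x-vectors and score the pair with cosine similarity.
embeddings = torch.nn.functional.normalize(embeddings, dim=-1)
similarity = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1], dim=-1)

threshold = 0.86  # assumed decision threshold; tune on held-out data
print("same speaker" if similarity >= threshold else "different speakers")
```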
|
{"language": "en", "license": "apache-2.0", "tags": ["speech", "audio", "wav2vec2", "audio-classification"], "datasets": ["superb"]}
|
audio-classification
|
anton-l/wav2vec2-base-superb-sv
|
[
"transformers",
"pytorch",
"wav2vec2",
"audio-xvector",
"speech",
"audio",
"audio-classification",
"en",
"dataset:superb",
"arxiv:2105.01051",
"arxiv:1910.09700",
"arxiv:2006.11477",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2105.01051",
"1910.09700",
"2006.11477"
] |
[
"en"
] |
TAGS
#transformers #pytorch #wav2vec2 #audio-xvector #speech #audio #audio-classification #en #dataset-superb #arxiv-2105.01051 #arxiv-1910.09700 #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
# Model Card for wav2vec2-base-superb-sv
# Model Details
## Model Description
- Developed by: Shu-wen Yang et al.
- Shared by: Anton Lozhkov
- Model type: Wav2Vec2 with an XVector head
- Language(s) (NLP): English
- License: Apache 2.0
- Related Models:
- Parent Model: wav2vec2-large-lv60
- Resources for more information:
- GitHub Repo
- Associated Paper
# Uses
## Direct Use
This is a ported version of
S3PRL's Wav2Vec2 for the SUPERB Speaker Verification task.
The base model is wav2vec2-large-lv60, which is pretrained on 16kHz
sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.
For more information refer to SUPERB: Speech processing Universal PERformance Benchmark
## Out-of-Scope Use
The model should not be used to intentionally create hostile or alienating environments for people.
# Bias, Risks, and Limitations
Significant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.
## Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
# Training Details
## Training Data
See the superb dataset card
## Training Procedure
### Preprocessing
More information needed
### Speeds, Sizes, Times
More information needed
# Evaluation
## Testing Data, Factors & Metrics
### Testing Data
See the superb dataset card
### Factors
### Metrics
More information needed
## Results
More information needed
# Model Examination
More information needed
# Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type: More information needed
- Hours used: More information needed
- Cloud Provider: More information needed
- Compute Region: More information needed
- Carbon Emitted: More information needed
# Technical Specifications [optional]
## Model Architecture and Objective
More information needed
## Compute Infrastructure
More information needed
### Hardware
More information needed
### Software
More information needed
BibTeX:
# Glossary [optional]
More information needed
# More Information [optional]
More information needed
# Model Card Authors [optional]
Anton Lozhkov in collaboration with Ezi Ozoani and the Hugging Face team
# Model Card Contact
More information needed
# How to Get Started with the Model
Use the code below to get started with the model.
<details>
<summary> Click to expand </summary>
</details>
|
[
"# Model Card for wav2vec2-base-superb-sv",
"# Model Details",
"## Model Description\n \n \n- Developed by: Shu-wen Yang et al.\n- Shared by: Anton Lozhkov\n- Model type: Wav2Vec2 with an XVector head\n- Language(s) (NLP): English\n- License: Apache 2.0\n- Related Models:\n - Parent Model: wav2vec2-large-lv60\n- Resources for more information: \n - GitHub Repo\n - Associated Paper",
"# Uses",
"## Direct Use\n \nThis is a ported version of \nS3PRL's Wav2Vec2 for the SUPERB Speaker Verification task.\n\nThe base model is wav2vec2-large-lv60, which is pretrained on 16kHz \nsampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. \n\nFor more information refer to SUPERB: Speech processing Universal PERformance Benchmark",
"## Out-of-Scope Use\n \nThe model should not be used to intentionally create hostile or alienating environments for people.",
"# Bias, Risks, and Limitations\n \nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.",
"## Recommendations\n \nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"# Training Details",
"## Training Data\n \nSee the superb dataset card",
"## Training Procedure",
"### Preprocessing\n \nMore information needed",
"### Speeds, Sizes, Times\n \nMore information needed",
"# Evaluation",
"## Testing Data, Factors & Metrics",
"### Testing Data\n \nSee the superb dataset card",
"### Factors",
"### Metrics\n \nMore information needed",
"## Results \n \nMore information needed",
"# Model Examination\n \nMore information needed",
"# Environmental Impact\n \n \nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n \n- Hardware Type: More information needed\n- Hours used: More information needed\n- Cloud Provider: More information needed\n- Compute Region: More information needed\n- Carbon Emitted: More information needed",
"# Technical Specifications [optional]",
"## Model Architecture and Objective\n \nMore information needed",
"## Compute Infrastructure\n \nMore information needed",
"### Hardware\n \nMore information needed",
"### Software\nMore information needed\n \nBibTeX:",
"# Glossary [optional]\nMore information needed",
"# More Information [optional]\n \nMore information needed",
"# Model Card Authors [optional]\n \n \nAnton Lozhkov in collaboration with Ezi Ozoani and the Hugging Face team",
"# Model Card Contact\n \nMore information needed",
"# How to Get Started with the Model\n \nUse the code below to get started with the model.\n \n<details>\n<summary> Click to expand </summary>\n\n\n</details>"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #audio-xvector #speech #audio #audio-classification #en #dataset-superb #arxiv-2105.01051 #arxiv-1910.09700 #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"# Model Card for wav2vec2-base-superb-sv",
"# Model Details",
"## Model Description\n \n \n- Developed by: Shu-wen Yang et al.\n- Shared by: Anton Lozhkov\n- Model type: Wav2Vec2 with an XVector head\n- Language(s) (NLP): English\n- License: Apache 2.0\n- Related Models:\n - Parent Model: wav2vec2-large-lv60\n- Resources for more information: \n - GitHub Repo\n - Associated Paper",
"# Uses",
"## Direct Use\n \nThis is a ported version of \nS3PRL's Wav2Vec2 for the SUPERB Speaker Verification task.\n\nThe base model is wav2vec2-large-lv60, which is pretrained on 16kHz \nsampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. \n\nFor more information refer to SUPERB: Speech processing Universal PERformance Benchmark",
"## Out-of-Scope Use\n \nThe model should not be used to intentionally create hostile or alienating environments for people.",
"# Bias, Risks, and Limitations\n \nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.",
"## Recommendations\n \nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"# Training Details",
"## Training Data\n \nSee the superb dataset card",
"## Training Procedure",
"### Preprocessing\n \nMore information needed",
"### Speeds, Sizes, Times\n \nMore information needed",
"# Evaluation",
"## Testing Data, Factors & Metrics",
"### Testing Data\n \nSee the superb dataset card",
"### Factors",
"### Metrics\n \nMore information needed",
"## Results \n \nMore information needed",
"# Model Examination\n \nMore information needed",
"# Environmental Impact\n \n \nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n \n- Hardware Type: More information needed\n- Hours used: More information needed\n- Cloud Provider: More information needed\n- Compute Region: More information needed\n- Carbon Emitted: More information needed",
"# Technical Specifications [optional]",
"## Model Architecture and Objective\n \nMore information needed",
"## Compute Infrastructure\n \nMore information needed",
"### Hardware\n \nMore information needed",
"### Software\nMore information needed\n \nBibTeX:",
"# Glossary [optional]\nMore information needed",
"# More Information [optional]\n \nMore information needed",
"# Model Card Authors [optional]\n \n \nAnton Lozhkov in collaboration with Ezi Ozoani and the Hugging Face team",
"# Model Card Contact\n \nMore information needed",
"# How to Get Started with the Model\n \nUse the code below to get started with the model.\n \n<details>\n<summary> Click to expand </summary>\n\n\n</details>"
] |
[
93,
15,
3,
91,
3,
98,
28,
87,
41,
3,
9,
4,
8,
12,
3,
11,
11,
4,
8,
5,
8,
68,
9,
10,
8,
6,
11,
11,
10,
27,
7,
41
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #audio-xvector #speech #audio #audio-classification #en #dataset-superb #arxiv-2105.01051 #arxiv-1910.09700 #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# Model Card for wav2vec2-base-superb-sv# Model Details## Model Description\n \n \n- Developed by: Shu-wen Yang et al.\n- Shared by: Anton Lozhkov\n- Model type: Wav2Vec2 with an XVector head\n- Language(s) (NLP): English\n- License: Apache 2.0\n- Related Models:\n - Parent Model: wav2vec2-large-lv60\n- Resources for more information: \n - GitHub Repo\n - Associated Paper# Uses## Direct Use\n \nThis is a ported version of \nS3PRL's Wav2Vec2 for the SUPERB Speaker Verification task.\n\nThe base model is wav2vec2-large-lv60, which is pretrained on 16kHz \nsampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. \n\nFor more information refer to SUPERB: Speech processing Universal PERformance Benchmark## Out-of-Scope Use\n \nThe model should not be used to intentionally create hostile or alienating environments for people.# Bias, Risks, and Limitations\n \nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.## Recommendations\n \nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.# Training Details## Training Data\n \nSee the superb dataset card## Training Procedure### Preprocessing\n \nMore information needed### Speeds, Sizes, Times\n \nMore information needed# Evaluation"
] |
[
-0.0795808881521225,
0.13047201931476593,
-0.005528058856725693,
-0.007876221090555191,
0.10476787388324738,
-0.02524806559085846,
0.04749766364693642,
0.08203325420618057,
-0.05444943904876709,
0.12089361995458603,
0.012087037786841393,
-0.012852263636887074,
0.10395054519176483,
0.07120783627033234,
0.07128898054361343,
-0.19283653795719147,
0.06366246938705444,
-0.08455077558755875,
0.04962926357984543,
0.08075285702943802,
0.11691729724407196,
-0.10073541849851608,
0.05608636140823364,
-0.013523699715733528,
-0.029340045526623726,
0.03771989792585373,
-0.023761337623000145,
-0.05432458221912384,
0.05210691690444946,
0.06066041812300682,
0.04738341271877289,
0.044496044516563416,
0.11837057769298553,
-0.2886888086795807,
0.020650682970881462,
0.08944142609834671,
0.04713203012943268,
0.05478651449084282,
0.07943431288003922,
-0.007660206872969866,
0.02444702945649624,
-0.09123662859201431,
0.04455587640404701,
0.07642940431833267,
-0.0999118834733963,
-0.13873963057994843,
-0.07954934984445572,
0.16559748351573944,
0.06422721594572067,
0.03013775870203972,
-0.042780984193086624,
0.08115028589963913,
-0.03498360142111778,
0.039233703166246414,
0.194609135389328,
-0.133217453956604,
-0.014155619777739048,
0.032564304769039154,
0.0043941885232925415,
0.05434723198413849,
-0.09764993190765381,
0.04014533385634422,
0.03380525857210159,
0.0002917821111623198,
0.03905796632170677,
-0.012380952946841717,
0.004278087988495827,
-0.014654426835477352,
-0.13252201676368713,
-0.05991208180785179,
0.08320972323417664,
0.04135238006711006,
-0.12510976195335388,
-0.14174990355968475,
-0.029636047780513763,
-0.03670789673924446,
-0.006809160578995943,
-0.07600701600313187,
0.049898888915777206,
-0.01129845716059208,
0.03363078832626343,
-0.05759035423398018,
-0.08347705006599426,
-0.08770047873258591,
0.020704513415694237,
0.13194218277931213,
0.03151555731892586,
0.03093820810317993,
0.027658307924866676,
0.08917079865932465,
-0.051745835691690445,
-0.08185284584760666,
-0.08071177452802658,
-0.06762093305587769,
-0.11430585384368896,
-0.03580395132303238,
-0.010467366315424442,
-0.04001353681087494,
0.010218098759651184,
0.1176559254527092,
-0.03877211734652519,
0.05436289310455322,
0.0013168258592486382,
0.02027720771729946,
0.09202226996421814,
0.06472498923540115,
-0.06843072921037674,
-0.02913564257323742,
-0.019781187176704407,
0.0008081009145826101,
0.022264886647462845,
-0.02873975783586502,
-0.00010540127550484613,
-0.03446388617157936,
0.021488796919584274,
0.09381312876939774,
0.01724160648882389,
0.02954632230103016,
-0.02818874828517437,
-0.0270366370677948,
0.1054891049861908,
-0.15092910826206207,
0.0123068206012249,
0.039283495396375656,
-0.01985103078186512,
0.07836591452360153,
0.01940727047622204,
0.008037985302507877,
-0.11800865828990936,
0.0835837572813034,
-0.02205631695687771,
0.004155614413321018,
-0.06211468577384949,
-0.038856107741594315,
0.03769184648990631,
-0.00857471115887165,
-0.0397975817322731,
-0.08760649710893631,
-0.12837667763233185,
-0.0594562366604805,
-0.0030567485373467207,
-0.03563956916332245,
-0.03312939032912254,
-0.04131694883108139,
-0.0016060492489486933,
0.00025324340094812214,
-0.048035528510808945,
0.04239349812269211,
-0.023140188306570053,
0.03393629193305969,
-0.019873280078172684,
0.02446400187909603,
0.04868147149682045,
0.028210150077939034,
-0.06677667051553726,
0.01851079612970352,
-0.09870481491088867,
0.10389328747987747,
-0.08025112748146057,
-0.03212656453251839,
-0.12844806909561157,
-0.056428976356983185,
-0.05303853377699852,
0.05368934944272041,
0.02874496392905712,
0.09939420223236084,
-0.16038143634796143,
-0.035898059606552124,
0.21848037838935852,
-0.1547565907239914,
-0.0031418560538440943,
0.12728248536586761,
-0.013314031064510345,
0.013616645708680153,
0.13402609527111053,
0.09155787527561188,
0.05684446915984154,
-0.14516307413578033,
-0.06918002665042877,
-0.033531565219163895,
-0.03345249220728874,
0.141781747341156,
0.06907844543457031,
-0.06511729210615158,
0.07134630531072617,
0.014460175298154354,
-0.06181497871875763,
-0.07072687894105911,
-0.011347010731697083,
-0.051305141299963,
-0.018095141276717186,
-0.04767390713095665,
0.03812455013394356,
-0.028605183586478233,
-0.06900027394294739,
-0.032753217965364456,
-0.1251000612974167,
0.058440808206796646,
0.1297048032283783,
-0.03716525435447693,
0.051525358110666275,
-0.11669142544269562,
0.039214327931404114,
0.02928205020725727,
-0.0020259495358914137,
-0.1455410122871399,
0.06427045911550522,
0.0017658349825069308,
-0.07066057622432709,
0.1005316749215126,
0.0442652627825737,
0.01748291216790676,
0.07512523978948593,
-0.050701066851615906,
-0.013192334212362766,
-0.030831392854452133,
0.010905897244811058,
-0.006201517768204212,
-0.1272953301668167,
-0.028739195317029953,
-0.022318365052342415,
0.038052912801504135,
-0.1615709811449051,
0.02325817570090294,
0.08065511286258698,
0.09167736768722534,
0.027803579345345497,
-0.07443230599164963,
0.035414520651102066,
0.008225394412875175,
0.013431044295430183,
-0.01771491952240467,
0.019246894866228104,
-0.009088949300348759,
0.00931637641042471,
0.041589733213186264,
-0.14846371114253998,
-0.09937452524900436,
0.08265198022127151,
-0.019077293574810028,
-0.1008271649479866,
-0.0031865600030869246,
-0.005902240984141827,
-0.003053411142900586,
-0.1230812668800354,
-0.0877111479640007,
0.17272162437438965,
0.014989932999014854,
0.06118541583418846,
-0.0819067731499672,
-0.03808971494436264,
0.010352566838264465,
-0.04807138070464134,
-0.002575425198301673,
0.01975788176059723,
0.04353870078921318,
-0.09366507828235626,
0.07901837676763535,
0.005883481819182634,
0.015896355733275414,
0.147483229637146,
0.004757444374263287,
-0.11478745937347412,
-0.006676278077065945,
-0.005594434682279825,
-0.018007805570960045,
0.08001723885536194,
-0.04807350039482117,
0.005521818995475769,
0.025042571127414703,
0.05638531222939491,
0.005788533948361874,
-0.0960378348827362,
0.054133206605911255,
0.04237016290426254,
-0.059126097708940506,
-0.09059654176235199,
-0.03739851713180542,
-0.02327919378876686,
0.08218398690223694,
-0.005541044753044844,
0.08332627266645432,
-0.0339062437415123,
-0.06577395647764206,
-0.13547545671463013,
0.08821571618318558,
-0.07397297024726868,
-0.21978813409805298,
-0.1461334526538849,
-0.009761554189026356,
-0.01081145741045475,
0.015193219296634197,
0.030348781496286392,
-0.06806878000497818,
-0.07797753810882568,
-0.10796234011650085,
0.09327349811792374,
-0.029520193114876747,
-0.04369377717375755,
0.010387221351265907,
0.0766945332288742,
0.05194096267223358,
-0.09997635334730148,
0.020271558314561844,
-0.012713470496237278,
-0.09966059774160385,
-0.04774767905473709,
0.09765809774398804,
0.058824412524700165,
0.09089828282594681,
0.036526601761579514,
-0.04011793062090874,
-0.021866511553525925,
0.1826951950788498,
-0.10086192190647125,
0.05297670513391495,
0.17908531427383423,
-0.10493979603052139,
0.04582693800330162,
0.14483730494976044,
0.030604034662246704,
-0.054924607276916504,
0.01104893907904625,
0.033050816506147385,
-0.055052272975444794,
-0.23058786988258362,
-0.056041959673166275,
-0.024424266070127487,
-0.009473882615566254,
0.04815547168254852,
0.05750683322548866,
-0.01556481048464775,
0.05478028953075409,
-0.10595526546239853,
-0.04395386576652527,
0.07364438474178314,
0.054687101393938065,
0.014703674241900444,
-0.008255092427134514,
0.07425179332494736,
-0.051216524094343185,
0.006758611649274826,
0.0700993463397026,
0.021844547241926193,
0.16766275465488434,
0.011891983449459076,
0.12898606061935425,
0.07157474756240845,
0.05626831576228142,
0.06523580104112625,
0.0494413748383522,
-0.02484273724257946,
0.007691556122153997,
-0.02033226005733013,
-0.07761871069669724,
-0.06923291087150574,
0.05813110992312431,
0.0652921199798584,
0.007694825064390898,
-0.024568505585193634,
-0.05684540048241615,
0.05127652734518051,
0.24248522520065308,
0.04128441959619522,
-0.11482103168964386,
-0.0748315304517746,
0.059193991124629974,
-0.07580750435590744,
-0.04714185744524002,
0.022993676364421844,
0.12748977541923523,
-0.15402637422084808,
0.08280695229768753,
0.05971837788820267,
0.0874868705868721,
-0.08791401237249374,
0.01330493949353695,
-0.044667571783065796,
0.07125566154718399,
-0.02294756844639778,
0.08673742413520813,
-0.1607070118188858,
0.06315620243549347,
-0.008727899752557278,
0.09319647401571274,
-0.033672209829092026,
0.05109013617038727,
-0.0005298035102896392,
0.048303086310625076,
0.16928307712078094,
0.015676112845540047,
-0.13376086950302124,
-0.1279630959033966,
-0.0689830556511879,
0.017545711249113083,
0.09158138185739517,
-0.03622284159064293,
0.07931384444236755,
-0.03515257686376572,
0.003605230711400509,
-0.03134454786777496,
-0.013341298326849937,
-0.16999387741088867,
-0.13757312297821045,
0.01731976680457592,
-0.06953496485948563,
0.07129655033349991,
-0.06726980209350586,
-0.03374149277806282,
-0.08147914707660675,
0.12192924320697784,
-0.24687904119491577,
-0.07330509275197983,
-0.08959265798330307,
0.001272983499802649,
0.06717802584171295,
-0.04591086506843567,
-0.0031940084882080555,
0.04210939630866051,
0.1857859492301941,
-0.0068710786290466785,
-0.05564646050333977,
0.021535446867346764,
-0.07117016613483429,
-0.2164442092180252,
-0.04018431901931763,
0.18184460699558258,
0.06010965630412102,
0.08389124274253845,
0.010661349631845951,
0.04802219942212105,
0.07429561764001846,
-0.10903206467628479,
0.04560946300625801,
0.2035343497991562,
-0.026450444012880325,
0.083845354616642,
-0.04431363195180893,
-0.16338497400283813,
-0.09403505176305771,
-0.06489214301109314,
0.1576482057571411,
0.1972285509109497,
-0.049404602497816086,
0.2034960687160492,
0.18408773839473724,
-0.1465623825788498,
-0.2905844449996948,
-0.024474339559674263,
0.059049904346466064,
0.029234249144792557,
0.08277355134487152,
-0.21177275478839874,
0.06543049216270447,
0.0649527832865715,
-0.012619164772331715,
0.012429817579686642,
-0.1816025823354721,
-0.13414601981639862,
0.08545488864183426,
0.03293134644627571,
-0.035819850862026215,
-0.09774007648229599,
-0.068932443857193,
-0.06458601355552673,
-0.04969102889299393,
0.11251523345708847,
-0.08115237951278687,
0.06374036520719528,
0.050724323838949203,
0.010005938820540905,
0.0556604377925396,
-0.019759200513362885,
0.12794992327690125,
-0.00920046865940094,
0.015845024958252907,
-0.09488558024168015,
0.0400366336107254,
-0.008301351219415665,
-0.03448646143078804,
0.08665746450424194,
-0.011652277782559395,
0.009920138865709305,
-0.022267647087574005,
-0.06292123347520828,
-0.04759431257843971,
0.032574061304330826,
-0.060664866119623184,
-0.04928363487124443,
-0.052573494613170624,
0.058565907180309296,
0.07960504293441772,
-0.02448832429945469,
0.0009721870883367956,
-0.12123218178749084,
-0.0028638301882892847,
0.22736617922782898,
0.1852743774652481,
0.028860541060566902,
-0.12344976514577866,
-0.015804030001163483,
-0.03675803914666176,
0.06208372488617897,
-0.10024242103099823,
0.07589219510555267,
0.053711239248514175,
0.04197365418076515,
0.09805314242839813,
-0.0020957968663424253,
-0.13583102822303772,
0.009484891779720783,
0.04954740032553673,
-0.08106735348701477,
-0.20720195770263672,
0.0067691197618842125,
0.032034773379564285,
-0.10840411484241486,
-0.036223527044057846,
0.1387733817100525,
-0.061052896082401276,
-0.017533140257000923,
0.004462617915123701,
0.09832289069890976,
-0.053714409470558167,
0.12459170073270798,
0.0415799580514431,
0.06483876705169678,
-0.07592705637216568,
0.10630578547716141,
0.06862740218639374,
-0.044248513877391815,
0.03260581195354462,
0.034866590052843094,
-0.06275257468223572,
-0.03692202642560005,
-0.06629964709281921,
-0.010653159581124783,
0.06955907493829727,
-0.09486372768878937,
0.013917307369410992,
-0.11670882254838943,
0.02915792167186737,
0.03167247772216797,
0.01754877157509327,
0.023677054792642593,
0.0061240969225764275,
0.05300547927618027,
-0.08360124379396439,
0.07219120860099792,
-0.004029209725558758,
0.040264006704092026,
-0.11405625939369202,
0.1296660602092743,
0.04240534082055092,
0.011882917955517769,
-0.022194303572177887,
-0.04469941556453705,
-0.08486124128103256,
0.04243570938706398,
-0.13327741622924805,
0.0288661178201437,
-0.08230890333652496,
-0.01447861548513174,
-0.00037289102328941226,
-0.031560350209474564,
-0.008483308367431164,
0.05389665812253952,
-0.043814197182655334,
-0.011616655625402927,
-0.02903732843697071,
0.08586657792329788,
-0.14709465205669403,
0.05320323631167412,
0.10031643509864807,
-0.08841674774885178,
0.08980923891067505,
0.01831301487982273,
-0.050096169114112854,
0.09652013331651688,
-0.17168866097927094,
-0.038412660360336304,
0.004718582611531019,
0.05364661291241646,
-0.029440758749842644,
-0.10618533939123154,
-0.0025285875890403986,
-0.0060656326822936535,
-0.030400656163692474,
-0.009032707661390305,
0.07054919749498367,
-0.06618339568376541,
0.033040646463632584,
0.01634281501173973,
-0.033671293407678604,
-0.04932555556297302,
0.012622874230146408,
0.07252941280603409,
0.058670759201049805,
0.11327467858791351,
-0.07401710003614426,
0.04497494921088219,
-0.15932194888591766,
0.01948736608028412,
0.0163860060274601,
0.034841787070035934,
-0.05369994789361954,
-0.044418055564165115,
0.07632550597190857,
-0.025253476575016975,
0.1661485731601715,
-0.058780185878276825,
-0.025299420580267906,
0.06762631237506866,
-0.07205254584550858,
-0.045531436800956726,
0.06141137704253197,
0.09127141535282135,
0.04121936112642288,
-0.014725670218467712,
-0.050717320293188095,
-0.07101496309041977,
-0.022348299622535706,
-0.024914659559726715,
0.10278008133172989,
0.1675695776939392,
0.08894289284944534,
0.010563538409769535,
0.05776490643620491,
-0.04934944957494736,
-0.01472738292068243,
0.08164988458156586,
-0.06516627967357635,
-0.008461232297122478,
-0.057223279029130936,
0.12761344015598297,
0.1272292137145996,
-0.12846973538398743,
0.10014248639345169,
-0.05932798981666565,
-0.07342212647199631,
-0.1397232860326767,
-0.17310881614685059,
-0.03255234286189079,
-0.053324535489082336,
0.014880230650305748,
-0.09366469085216522,
0.06491399556398392,
0.09095924347639084,
0.02038838528096676,
-0.017325369641184807,
0.08465942740440369,
-0.08668697625398636,
-0.08167541027069092,
0.03217360004782677,
-0.004934423137456179,
0.013911701738834381,
0.019619369879364967,
0.011435071006417274,
0.09971432387828827,
0.07434548437595367,
0.09208465367555618,
0.06480833142995834,
0.004190548788756132,
0.013197820633649826,
-0.04334414005279541,
-0.04599367082118988,
0.02518848329782486,
-0.026358673349022865,
0.021199509501457214,
0.1358676254749298,
0.08515099436044693,
-0.010715650394558907,
-0.00267625181004405,
0.20557768642902374,
-0.052736517041921616,
-0.08313717693090439,
-0.16400164365768433,
0.0728345587849617,
0.03700602054595947,
0.06630372256040573,
0.04640381410717964,
-0.11728862673044205,
-0.017709091305732727,
0.1509246826171875,
0.18335014581680298,
-0.04086364805698395,
0.041284170001745224,
-0.036448296159505844,
0.011156905442476273,
-0.0106308963149786,
0.08445195108652115,
0.027622587978839874,
0.18483297526836395,
-0.03878430277109146,
0.12562914192676544,
0.002725862665101886,
-0.045449454337358475,
-0.02789863757789135,
0.12926171720027924,
-0.06358959525823593,
0.029209643602371216,
-0.07546338438987732,
0.11853817105293274,
-0.09744329005479813,
-0.2936784327030182,
-0.0345129631459713,
-0.04604975879192352,
-0.10103097558021545,
0.025732342153787613,
0.034325648099184036,
0.07472007721662521,
0.02323834039270878,
0.02855844795703888,
-0.0023703621700406075,
0.1814890056848526,
0.03479483723640442,
-0.04095151275396347,
-0.042215850204229355,
0.07698163390159607,
-0.03289089351892471,
0.17144396901130676,
0.021171987056732178,
0.09912583976984024,
0.03325488790869713,
0.014813395217061043,
-0.11662252992391586,
0.06320320069789886,
0.015576031990349293,
-0.11020801961421967,
0.002309053670614958,
0.21324904263019562,
-0.021683240309357643,
0.1265954226255417,
0.07414072006940842,
-0.05443761870265007,
0.015064838342368603,
0.020432759076356888,
-0.028605107218027115,
-0.06603740155696869,
0.08743449300527573,
-0.09980916976928711,
0.11105548590421677,
0.11540583521127701,
-0.04400411620736122,
-0.001999183092266321,
-0.07020483165979385,
0.031469035893678665,
-0.02181416191160679,
0.03795691579580307,
0.0244244784116745,
-0.20515599846839905,
0.02755122259259224,
-0.01023374404758215,
0.09223803877830505,
-0.21226470172405243,
-0.06277282536029816,
0.027485214173793793,
-0.004835283383727074,
-0.021451633423566818,
0.08822868019342422,
0.06585577130317688,
0.017222946509718895,
-0.04364621639251709,
-0.11133727431297302,
0.024931970983743668,
0.119237020611763,
-0.08498764783143997,
-0.060168005526065826
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Chuvash
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Chuvash using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "cv", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-chuvash")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-chuvash")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Chuvash test data of Common Voice.
```python
import torch
import torchaudio
import urllib.request
import tarfile
import pandas as pd
from tqdm.auto import tqdm
from datasets import load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
# Download the raw data instead of using HF datasets to save disk space
data_url = "https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-6.1-2020-12-11/cv.tar.gz"
filestream = urllib.request.urlopen(data_url)
data_file = tarfile.open(fileobj=filestream, mode="r|gz")
data_file.extractall()
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-chuvash")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-chuvash")
model.to("cuda")
cv_test = pd.read_csv("cv-corpus-6.1-2020-12-11/cv/test.tsv", sep='\t')
clips_path = "cv-corpus-6.1-2020-12-11/cv/clips/"
def clean_sentence(sent):
sent = sent.lower()
# replace non-alpha characters with space
sent = "".join(ch if ch.isalpha() else " " for ch in sent)
# remove repeated spaces
sent = " ".join(sent.split())
return sent
targets = []
preds = []
for i, row in tqdm(cv_test.iterrows(), total=cv_test.shape[0]):
row["sentence"] = clean_sentence(row["sentence"])
speech_array, sampling_rate = torchaudio.load(clips_path + row["path"])
resampler = torchaudio.transforms.Resample(sampling_rate, 16_000)
row["speech"] = resampler(speech_array).squeeze().numpy()
inputs = processor(row["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
targets.append(row["sentence"])
preds.append(processor.batch_decode(pred_ids)[0])
print("WER: {:2f}".format(100 * wer.compute(predictions=preds, references=targets)))
```
**Test Result**: 40.01 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
The script used for training can be found [here](github.com).
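As a minimal sketch (not the original training script), the corresponding splits can be loaded with the same `common_voice`/`cv` configuration used in the snippets above; merging train and validation into one split string is an assumption about how they were combined:
```python
from datasets import load_dataset

# Chuvash (cv) Common Voice splits; the "train+validation" merge is an assumption.
common_voice_train = load_dataset("common_voice", "cv", split="train+validation")
common_voice_test = load_dataset("common_voice", "cv", split="test")

print(common_voice_train)
print(common_voice_test)
```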
|
{"language": "cv", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Chuvash XLSR Wav2Vec2 Large 53 by Anton Lozhkov", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice cv", "type": "common_voice", "args": "cv"}, "metrics": [{"type": "wer", "value": 40.01, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anton-l/wav2vec2-large-xlsr-53-chuvash
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"cv",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"cv"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #cv #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Chuvash
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Chuvash using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Chuvash test data of Common Voice.
Test Result: 40.01 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
The script used for training can be found here
|
[
"# Wav2Vec2-Large-XLSR-53-Chuvash\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Chuvash using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Chuvash test data of Common Voice.\n\n\n\nTest Result: 40.01 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training.\n\nThe script used for training can be found here"
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #cv #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Chuvash\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Chuvash using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Chuvash test data of Common Voice.\n\n\n\nTest Result: 40.01 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training.\n\nThe script used for training can be found here"
] |
[
81,
67,
20,
29,
32
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #cv #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Chuvash\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Chuvash using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Chuvash test data of Common Voice.\n\n\n\nTest Result: 40.01 %## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training.\n\nThe script used for training can be found here"
] |
[
-0.1645575314760208,
0.028605109080672264,
-0.0006975739961490035,
-0.03175687417387962,
0.11596263200044632,
-0.04041123762726784,
0.17484810948371887,
0.10000625997781754,
-0.046234000474214554,
-0.003109295852482319,
0.04415027052164078,
-0.01896660216152668,
0.06809443235397339,
0.13745103776454926,
0.041356973350048065,
-0.20099636912345886,
-0.019856838509440422,
0.014988286420702934,
0.06287942081689835,
0.12760809063911438,
0.09287988394498825,
-0.08102356642484665,
-0.000355348369339481,
0.08048886805772781,
-0.15552233159542084,
0.016822464764118195,
0.01327897422015667,
-0.10811056196689606,
0.1331731081008911,
0.05681609734892845,
0.07817240804433823,
0.040214575827121735,
0.10825573652982712,
-0.2174430787563324,
0.03451072797179222,
0.05273228883743286,
0.027870502322912216,
0.04710307717323303,
0.03646549582481384,
-0.031178181990981102,
0.1318422108888626,
0.07872849702835083,
-0.026070179417729378,
0.09715816378593445,
-0.06120280176401138,
-0.1745460480451584,
-0.014225401915609837,
-0.00317950127646327,
0.0785917118191719,
0.1466905176639557,
-0.06450653821229935,
0.07518570125102997,
-0.1361297070980072,
0.08762790262699127,
0.11780995875597,
-0.12412360310554504,
-0.004730944521725178,
0.09267986565828323,
0.07542523741722107,
0.04719556123018265,
-0.07612621039152145,
0.021175215020775795,
0.046089403331279755,
0.0136746009811759,
0.047290533781051636,
0.004853719845414162,
-0.22120216488838196,
-0.019207408651709557,
-0.12280811369419098,
-0.0484195277094841,
0.19788527488708496,
-0.02073686756193638,
-0.07199439406394958,
-0.12707637250423431,
-0.01383914053440094,
-0.042126793414354324,
-0.005920011084526777,
-0.029835721477866173,
-0.02028079703450203,
0.04243052750825882,
-0.07565531134605408,
-0.019460031762719154,
-0.1293574422597885,
-0.15038172900676727,
0.04680190235376358,
-0.0008612772799097002,
0.008216354995965958,
0.009654941968619823,
-0.16117286682128906,
0.11035989969968796,
0.0006118185119703412,
-0.06868235021829605,
-0.00936130154877901,
0.01167434360831976,
-0.04586958885192871,
-0.00649094907566905,
-0.07318467646837234,
-0.13059277832508087,
0.023178117349743843,
0.022096190601587296,
0.09628914296627045,
0.017518045380711555,
-0.04483181610703468,
0.049031686037778854,
0.035950373858213425,
0.09297767281532288,
-0.05664661154150963,
-0.0016785500338301063,
0.04854479432106018,
0.06176653504371643,
-0.05053221434354782,
-0.022279275581240654,
-0.0727727860212326,
-0.05791844427585602,
0.0686429813504219,
0.07862409949302673,
-0.011376004666090012,
0.0029988177120685577,
-0.040147315710783005,
-0.03661903738975525,
0.05335579067468643,
-0.09052330255508423,
-0.04619919881224632,
0.0924575999379158,
-0.025295788422226906,
0.08590733259916306,
0.08019012212753296,
0.06069303676486015,
-0.07876306772232056,
-0.03168267384171486,
0.00019826380594167858,
0.07371380925178528,
-0.030386336147785187,
-0.08954175561666489,
0.03849421441555023,
-0.010651578195393085,
-0.015553497709333897,
-0.11085641384124756,
-0.12029168754816055,
-0.07015924900770187,
-0.0018675540341064334,
0.04455560818314552,
0.024310601875185966,
-0.10434486716985703,
-0.0006569710676558316,
-0.03320933133363724,
-0.05181072652339935,
0.13272923231124878,
-0.04829581826925278,
0.06603624671697617,
0.015500875189900398,
0.060519713908433914,
0.09411462396383286,
0.07385364919900894,
-0.07858783006668091,
-0.07400708645582199,
0.05885855853557587,
0.12016509473323822,
-0.009334702044725418,
-0.024725906550884247,
-0.09993916749954224,
-0.08275611698627472,
-0.0834023654460907,
0.05440767481923103,
0.059026654809713364,
0.1365373730659485,
-0.28026852011680603,
-0.09094398468732834,
0.2398305982351303,
-0.11100054532289505,
-0.030519142746925354,
0.17401152849197388,
-0.021018315106630325,
0.09473849833011627,
0.13536123931407928,
0.15490412712097168,
0.12188848853111267,
-0.221685528755188,
0.030199773609638214,
0.0019272604258731008,
0.0021029221825301647,
-0.07347310334444046,
0.08747036755084991,
-0.04593448340892792,
0.011905130930244923,
0.031030485406517982,
-0.07185010612010956,
0.10506466031074524,
-0.048442542552948,
-0.06153573468327522,
-0.014466122724115849,
-0.06934312731027603,
0.029569968581199646,
0.03635886311531067,
0.01887696236371994,
-0.005857761949300766,
-0.07823799550533295,
-0.0001193461794173345,
0.12456464767456055,
-0.13335303962230682,
0.05326736345887184,
-0.1043829619884491,
0.046802010387182236,
-0.042578622698783875,
0.0009633382433094084,
-0.14732380211353302,
0.15840326249599457,
-0.021004894748330116,
0.06746654957532883,
0.015834031626582146,
0.12917406857013702,
0.010249806568026543,
0.038125187158584595,
-0.018142061308026314,
-0.0041974252089858055,
-0.020878881216049194,
-0.038566432893276215,
-0.03386785835027695,
-0.07617618888616562,
-0.05931108444929123,
-0.07122215628623962,
0.14749811589717865,
-0.1950833946466446,
0.012441643513739109,
0.0423622652888298,
0.012838474474847317,
0.012196183204650879,
-0.036387860774993896,
0.09545134752988815,
0.0744067132472992,
-0.021318629384040833,
-0.013890477828681469,
0.0498831570148468,
0.01640130952000618,
-0.047584809362888336,
0.08574491739273071,
-0.1577712595462799,
-0.0687648206949234,
0.11448937654495239,
-0.035460881888866425,
-0.00005107638207846321,
-0.04188643395900726,
-0.0343293733894825,
-0.013380160555243492,
-0.10785903036594391,
-0.020210696384310722,
0.23951750993728638,
-0.007751191034913063,
0.12847106158733368,
-0.08933559060096741,
-0.005416593514382839,
0.020389720797538757,
-0.06733543425798416,
0.05369934067130089,
0.032523296773433685,
0.015436762012541294,
0.054879967123270035,
0.008252146653831005,
-0.049311812967061996,
-0.05320890247821808,
0.2546238303184509,
-0.025570129975676537,
-0.10107256472110748,
0.02705543488264084,
-0.015745025128126144,
-0.01206472609192133,
0.03624289482831955,
-0.17101293802261353,
-0.06802336126565933,
0.03434241935610771,
0.028736133128404617,
0.06068158894777298,
-0.16793905198574066,
0.010817409493029118,
0.02358829975128174,
-0.11961749196052551,
-0.15877623856067657,
0.05518900975584984,
-0.06157364696264267,
0.042140860110521317,
-0.09519730508327484,
-0.0013554332545027137,
-0.022932955995202065,
-0.046736396849155426,
-0.18993791937828064,
0.14616860449314117,
-0.07895652204751968,
-0.18948934972286224,
-0.12541796267032623,
0.04870598763227463,
0.08203405141830444,
0.02851676568388939,
0.07709576934576035,
-0.13042713701725006,
0.008568788878619671,
-0.043112337589263916,
0.08929098397493362,
0.02777717635035515,
-0.025213129818439484,
-0.0042389328591525555,
0.037710100412368774,
0.054764047265052795,
-0.14457955956459045,
0.024880426004529,
-0.051974106580019,
-0.0508590042591095,
0.01305175106972456,
-0.0006359065300785005,
0.00986916571855545,
0.16959616541862488,
0.052652548998594284,
0.020530376583337784,
-0.04023353010416031,
0.16766077280044556,
-0.06676536798477173,
-0.020579414442181587,
0.23021379113197327,
-0.007344340905547142,
-0.028917476534843445,
0.04345898702740669,
0.010402245447039604,
-0.06237425282597542,
0.021938713267445564,
-0.04014399275183678,
-0.11438208073377609,
-0.23423154652118683,
-0.09810429066419601,
-0.06323648989200592,
-0.03482033684849739,
0.026480816304683685,
-0.013432417996227741,
0.05877877026796341,
0.01976923458278179,
-0.04509079083800316,
-0.05050953850150108,
0.0727217048406601,
0.002926483051851392,
0.026531537994742393,
0.008640180341899395,
0.09516387432813644,
-0.03388624265789986,
-0.0008628367213532329,
0.0010528910206630826,
0.022569436579942703,
0.18438303470611572,
0.02741583064198494,
0.11180540919303894,
0.08533058315515518,
0.09252852201461792,
0.10927698016166687,
0.07963959872722626,
-0.03285357356071472,
-0.015134332701563835,
0.017043940722942352,
-0.06993500143289566,
-0.07798678427934647,
0.04256557673215866,
0.10893025994300842,
-0.05451704189181328,
-0.06307605654001236,
0.01303167175501585,
0.0064733210019767284,
0.17764434218406677,
0.07335890084505081,
-0.2302350252866745,
-0.09408017247915268,
-0.031815532594919205,
-0.03383345529437065,
0.027745317667722702,
0.04387807473540306,
0.15952010452747345,
-0.1370086669921875,
-0.0026436294429004192,
-0.010938326828181744,
0.08916088193655014,
-0.01609175093472004,
0.020170219242572784,
-0.058949291706085205,
0.038831718266010284,
0.0017014789627864957,
0.08539803326129913,
-0.26674866676330566,
0.19001887738704681,
0.005371862556785345,
0.14444509148597717,
-0.0721132829785347,
0.003447524271905422,
0.018200470134615898,
0.003916813991963863,
0.10575126856565475,
0.00536613492295146,
0.015667587518692017,
-0.09033715724945068,
-0.0859580859541893,
0.04938908666372299,
-0.013062609359622002,
-0.0016211746260523796,
0.04666171595454216,
-0.006504712160676718,
-0.0038171003106981516,
0.017953839153051376,
-0.04432528838515282,
-0.17732536792755127,
-0.06830348074436188,
0.008255884051322937,
0.12391621619462967,
0.10786088556051254,
-0.038176100701093674,
-0.0908885970711708,
-0.015685250982642174,
0.01973951794207096,
-0.12645776569843292,
-0.04442713037133217,
-0.04845881834626198,
0.043057531118392944,
0.07237700372934341,
-0.06849221885204315,
0.007192178629338741,
0.08642888814210892,
0.1318657249212265,
-0.03221763297915459,
-0.058765895664691925,
0.03498624637722969,
-0.13712045550346375,
-0.13077408075332642,
-0.02603318728506565,
0.1965552717447281,
0.10389765352010727,
0.06799957901239395,
0.04705042764544487,
0.003608995582908392,
-0.016476981341838837,
-0.025483055040240288,
0.008537710644304752,
0.17116162180900574,
-0.058019865304231644,
-0.011605659499764442,
-0.016882633790373802,
-0.169168621301651,
-0.09959915280342102,
-0.07145610451698303,
0.18087105453014374,
0.07544214278459549,
-0.04834781214594841,
0.1470903605222702,
0.21433094143867493,
-0.08987067639827728,
-0.20350518822669983,
-0.00999844167381525,
0.09345465898513794,
0.11135167628526688,
-0.01057001668959856,
-0.23422160744667053,
0.04740409553050995,
-0.0030196041334420443,
-0.01257489062845707,
-0.014420070685446262,
-0.33365118503570557,
-0.1486912965774536,
0.12210043519735336,
-0.01602904684841633,
0.11409272998571396,
-0.012168190442025661,
-0.02192746289074421,
-0.013865682296454906,
-0.03583771735429764,
0.044786274433135986,
-0.1566142588853836,
0.11201144009828568,
0.02148820273578167,
0.0419260673224926,
0.037176456302404404,
-0.058760203421115875,
0.052723225206136703,
0.09330547600984573,
-0.02886207588016987,
-0.007440662942826748,
0.03311486169695854,
0.02992502972483635,
-0.0017543126596137881,
0.11220131814479828,
-0.11649548262357712,
0.02857091836631298,
-0.08267087489366531,
-0.09705927968025208,
-0.08346870541572571,
0.060766592621803284,
0.020372848957777023,
-0.036684561520814896,
0.010905303992331028,
-0.03139536455273628,
0.022894125431776047,
0.007263938430696726,
-0.07479532063007355,
-0.10743409395217896,
0.0685683861374855,
0.16877448558807373,
0.17554382979869843,
-0.029923010617494583,
-0.028436848893761635,
-0.018568510189652443,
-0.015503482893109322,
0.1293519288301468,
-0.10712555795907974,
0.014584296382963657,
0.06296186149120331,
0.05796283110976219,
0.13183943927288055,
0.020329834893345833,
-0.09590274840593338,
0.08335074037313461,
0.04110291600227356,
-0.060141418129205704,
-0.10044673085212708,
-0.03840978816151619,
-0.009705216623842716,
-0.06744381785392761,
0.00585615960881114,
0.10515663772821426,
-0.08759800344705582,
-0.019514748826622963,
-0.026336006820201874,
0.03392414376139641,
-0.1034470871090889,
0.20936167240142822,
0.023662911728024483,
0.07706334441900253,
-0.09917622059583664,
0.025392873212695122,
0.003379217814654112,
-0.02657458931207657,
0.04405372962355614,
-0.029317332431674004,
-0.10447849333286285,
-0.06999750435352325,
-0.052964549511671066,
0.09003487974405289,
0.048432812094688416,
-0.09844563901424408,
-0.048403654247522354,
-0.0686085894703865,
0.009475416503846645,
0.04845966026186943,
0.03436435014009476,
0.02630031295120716,
-0.1095028817653656,
-0.027662836015224457,
-0.12081533670425415,
0.059923477470874786,
0.07755723595619202,
-0.032493215054273605,
-0.07874885201454163,
0.21963706612586975,
0.08886244148015976,
0.015751440078020096,
-0.025985976681113243,
-0.08195021003484726,
-0.011130275204777718,
0.08777017146348953,
-0.06072629988193512,
-0.011233343742787838,
-0.07645033299922943,
-0.00511866994202137,
-0.0188820268958807,
-0.05227745696902275,
0.009768891148269176,
0.09554500877857208,
-0.06825893372297287,
0.0399259515106678,
-0.04223339259624481,
0.0693788230419159,
-0.10446076840162277,
0.02834981121122837,
0.011072205379605293,
-0.0614192970097065,
0.07387961447238922,
0.12220548093318939,
-0.08349102735519409,
0.10977093130350113,
-0.23811157047748566,
-0.0559510812163353,
0.07266586273908615,
0.04318445175886154,
-0.046118393540382385,
-0.09085244685411453,
0.03795253485441208,
0.05568457394838333,
0.0524824820458889,
-0.01989104598760605,
0.09316478669643402,
-0.05419088155031204,
-0.024627944454550743,
-0.02854428067803383,
0.0284389890730381,
-0.04330297186970711,
0.06256052851676941,
0.06001235917210579,
0.13395404815673828,
0.13859683275222778,
-0.09981004893779755,
0.10785141587257385,
-0.1563066989183426,
0.0051129707135260105,
-0.046949103474617004,
-0.01257951557636261,
-0.15081755816936493,
-0.070075623691082,
0.07072281092405319,
-0.048624977469444275,
0.08204686641693115,
-0.0007879079785197973,
0.03955022618174553,
-0.023271741345524788,
-0.06004452332854271,
0.018504155799746513,
-0.0035369140096008778,
0.22598420083522797,
0.041966717690229416,
0.020104262977838516,
-0.030514482408761978,
0.010276063345372677,
0.05062537267804146,
0.1005585566163063,
0.05019697546958923,
0.1444590985774994,
-0.0004300101427361369,
0.07325591146945953,
0.05944026634097099,
-0.07037761807441711,
-0.079221211373806,
-0.04553869739174843,
-0.13914832472801208,
0.045974090695381165,
-0.08008840680122375,
0.16275428235530853,
0.15300095081329346,
-0.09631993621587753,
0.041891735047101974,
0.017154229804873466,
-0.09001578390598297,
-0.14236630499362946,
-0.13938693702220917,
-0.03516935929656029,
-0.13455994427204132,
0.027846908196806908,
-0.07766058295965195,
0.021850962191820145,
0.040574558079242706,
0.04018754884600639,
-0.028136195614933968,
0.18440821766853333,
0.0811636671423912,
-0.10898032784461975,
0.07956811785697937,
-0.07770790159702301,
-0.009107192046940327,
-0.07631274312734604,
0.027574189007282257,
0.17550894618034363,
-0.014597454108297825,
0.08427499234676361,
0.0031031544785946608,
-0.04746551439166069,
0.057552892714738846,
-0.0765811949968338,
-0.07227044552564621,
-0.011879573576152325,
-0.015500172041356564,
0.07611492276191711,
0.14204753935337067,
0.13004571199417114,
-0.05128578469157219,
0.02545173466205597,
0.12726104259490967,
-0.020764630287885666,
-0.12821651995182037,
-0.15880228579044342,
0.09263575077056885,
0.05439567193388939,
0.002233987208455801,
-0.010081515647470951,
-0.029391907155513763,
-0.004441045690327883,
0.2245681881904602,
0.21345771849155426,
0.053739454597234726,
0.02897634170949459,
-0.0248807892203331,
-0.010044644586741924,
-0.029005704447627068,
0.0810229480266571,
0.09839306026697159,
0.15619537234306335,
-0.014758884906768799,
0.031684741377830505,
-0.05451301857829094,
-0.10339849442243576,
0.00025129818823188543,
0.02477841079235077,
-0.08210831135511398,
-0.06624375283718109,
0.007193280849605799,
0.11420144885778427,
-0.052481602877378464,
-0.10616254061460495,
-0.10180442035198212,
-0.08230258524417877,
-0.06511660665273666,
-0.020586103200912476,
0.03435783460736275,
0.11849697679281235,
0.028878241777420044,
-0.06860417872667313,
0.03424612060189247,
0.1418963521718979,
-0.00804714486002922,
-0.052429184317588806,
-0.05812245234847069,
0.042288970202207565,
-0.09135173261165619,
0.01896427758038044,
-0.039407070726156235,
0.14544157683849335,
0.022082576528191566,
0.1086818054318428,
-0.046618882566690445,
0.11654272675514221,
-0.01415922213345766,
-0.018145663663744926,
0.018408063799142838,
0.12127326428890228,
-0.04441166669130325,
0.08924858272075653,
0.012457145377993584,
-0.11519941687583923,
0.054200030863285065,
-0.11071369051933289,
0.007124509662389755,
-0.09065064042806625,
0.0621214434504509,
-0.017434069886803627,
0.09742093831300735,
0.07396265864372253,
-0.08926063030958176,
-0.053739942610263824,
-0.0463387593626976,
0.046521399170160294,
0.020289137959480286,
-0.02472294121980667,
-0.05001448839902878,
-0.24224887788295746,
-0.01575222611427307,
-0.06441756337881088,
0.011707889847457409,
-0.18095646798610687,
-0.015227461233735085,
-0.01882190816104412,
-0.09628107398748398,
0.0026675767730921507,
0.06802568584680557,
0.08782587945461273,
0.030315294861793518,
-0.0008564204326830804,
0.03172830119729042,
0.053549401462078094,
0.12325192987918854,
-0.1878998577594757,
-0.11424217373132706
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Estonian
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Estonian using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "et", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-estonian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-estonian")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Estonian test data of Common Voice.
```python
import torch
import torchaudio
import urllib.request
import tarfile
import pandas as pd
from tqdm.auto import tqdm
from datasets import load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
# Download the raw data instead of using HF datasets to save disk space
data_url = "https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-6.1-2020-12-11/et.tar.gz"
filestream = urllib.request.urlopen(data_url)
data_file = tarfile.open(fileobj=filestream, mode="r|gz")
data_file.extractall()
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-estonian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-estonian")
model.to("cuda")
cv_test = pd.read_csv("cv-corpus-6.1-2020-12-11/et/test.tsv", sep='\t')
clips_path = "cv-corpus-6.1-2020-12-11/et/clips/"
def clean_sentence(sent):
    sent = sent.lower()
    # normalize apostrophes
    sent = sent.replace("’", "'")
    # replace non-alpha characters with space
    sent = "".join(ch if ch.isalpha() or ch == "'" else " " for ch in sent)
    # remove repeated spaces
    sent = " ".join(sent.split())
    return sent
targets = []
preds = []
for i, row in tqdm(cv_test.iterrows(), total=cv_test.shape[0]):
    row["sentence"] = clean_sentence(row["sentence"])
    speech_array, sampling_rate = torchaudio.load(clips_path + row["path"])
    resampler = torchaudio.transforms.Resample(sampling_rate, 16_000)
    row["speech"] = resampler(speech_array).squeeze().numpy()
    inputs = processor(row["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
    pred_ids = torch.argmax(logits, dim=-1)
    targets.append(row["sentence"])
    preds.append(processor.batch_decode(pred_ids)[0])
print("WER: {:2f}".format(100 * wer.compute(predictions=preds, references=targets)))
```
**Test Result**: 30.74 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
The script used for training can be found [here](github.com)
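For context, below is a minimal, hypothetical sketch of how those two splits could be loaded. The split names and language code come from this card; everything else, including the resampling note, is an assumption rather than the author's actual training script.
```python
# Sketch only: load the Common Voice Estonian splits named above.
from datasets import load_dataset, concatenate_datasets

cv_train = load_dataset("common_voice", "et", split="train")
cv_valid = load_dataset("common_voice", "et", split="validation")

# The card states that both splits were used for training, so they can be concatenated.
train_data = concatenate_datasets([cv_train, cv_valid])

# Common Voice clips are 48 kHz, while XLSR-53 expects 16 kHz input, so the audio
# would be resampled (e.g. torchaudio.transforms.Resample(48_000, 16_000)) as in the
# usage example above before feature extraction.
print(train_data)
```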
|
{"language": "et", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Estonian XLSR Wav2Vec2 Large 53 by Anton Lozhkov", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice et", "type": "common_voice", "args": "et"}, "metrics": [{"type": "wer", "value": 30.74, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anton-l/wav2vec2-large-xlsr-53-estonian
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"et",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"et"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #et #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Estonian
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Estonian using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Estonian test data of Common Voice.
Test Result: 30.74 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
The script used for training can be found here
|
[
"# Wav2Vec2-Large-XLSR-53-Estonian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Estonian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Estonian test data of Common Voice.\n\n\n\nTest Result: 30.74 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training.\n\nThe script used for training can be found here"
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #et #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Estonian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Estonian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Estonian test data of Common Voice.\n\n\n\nTest Result: 30.74 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training.\n\nThe script used for training can be found here"
] |
[
80,
65,
20,
29,
32
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #et #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Estonian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Estonian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Estonian test data of Common Voice.\n\n\n\nTest Result: 30.74 %## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training.\n\nThe script used for training can be found here"
] |
[
-0.15640467405319214,
-0.06527739018201828,
-0.0013329527573660016,
-0.014086098410189152,
0.10975542664527893,
-0.05112604796886444,
0.1949043869972229,
0.10977942496538162,
0.03631587326526642,
-0.011067761108279228,
0.039422616362571716,
-0.017532331869006157,
0.059247445315122604,
0.141755610704422,
0.0576665997505188,
-0.2189827263355255,
0.0040213484317064285,
0.007878935895860195,
0.0708686038851738,
0.10387122631072998,
0.12839771807193756,
-0.0695146694779396,
-0.010548933409154415,
0.08348146826028824,
-0.16928985714912415,
0.06769340485334396,
0.015985390171408653,
-0.11194249242544174,
0.1504763513803482,
0.032223884016275406,
0.0768585130572319,
0.007107018493115902,
0.08953332901000977,
-0.19220292568206787,
0.0212857685983181,
0.03157687932252884,
0.015761416405439377,
0.013920284807682037,
0.041886188089847565,
-0.038817450404167175,
0.10367339104413986,
0.07250402867794037,
-0.021011417731642723,
0.07630234211683273,
-0.04127319157123566,
-0.20834356546401978,
0.010270710103213787,
-0.005625596735626459,
0.1262616515159607,
0.17531736195087433,
-0.07563529908657074,
0.0852903202176094,
-0.16576507687568665,
0.09255988895893097,
0.12349629402160645,
-0.11171215027570724,
0.0009410043712705374,
0.05979853495955467,
0.049155183136463165,
0.08789435774087906,
-0.06248193979263306,
0.01643717661499977,
0.010304628871381283,
0.036071814596652985,
0.03981224074959755,
-0.005228939466178417,
-0.22538280487060547,
-0.039727240800857544,
-0.1271299421787262,
-0.0021456589456647635,
0.2579651474952698,
-0.019686425104737282,
-0.05458168685436249,
-0.1350065916776657,
0.011733679100871086,
-0.01059629675000906,
-0.013220058754086494,
-0.03121761791408062,
-0.019456902518868446,
0.01245823409408331,
0.011807193048298359,
-0.06453488767147064,
-0.13706497848033905,
-0.14842437207698822,
0.042877811938524246,
0.11284510791301727,
0.02839762158691883,
-0.004988469649106264,
-0.12789107859134674,
0.10118100047111511,
-0.09418424218893051,
-0.05699814483523369,
-0.01457937154918909,
0.01598898321390152,
-0.048360537737607956,
0.023359814658761024,
-0.07663776725530624,
-0.21298788487911224,
0.004226693417876959,
0.024558423087000847,
0.1511467844247818,
0.017979905009269714,
-0.0020512351766228676,
0.08510829508304596,
-0.010191711597144604,
0.14755304157733917,
-0.016530601307749748,
-0.0340404137969017,
0.02002543956041336,
0.01502513699233532,
-0.0475725457072258,
-0.012196594849228859,
-0.09683052450418472,
-0.07933064550161362,
0.010362643748521805,
0.06813468039035797,
-0.026409562677145004,
0.01334264688193798,
-0.027900610119104385,
-0.0330352857708931,
0.04368310049176216,
-0.11174972355365753,
-0.05294577777385712,
0.07477479428052902,
-0.03285490348935127,
0.08748731017112732,
0.07177437841892242,
0.04744616150856018,
-0.1199112981557846,
-0.052763551473617554,
0.03494929522275925,
0.0964827910065651,
-0.0295674130320549,
-0.12751014530658722,
0.02024713158607483,
-0.03858834505081177,
-0.0035336024593561888,
-0.1281222254037857,
-0.07157878577709198,
-0.06961512565612793,
-0.014981317333877087,
0.05252208560705185,
0.041109200567007065,
-0.1114407554268837,
-0.003325626254081726,
-0.03325100243091583,
-0.04407143220305443,
0.052223049104213715,
-0.022662609815597534,
0.032842010259628296,
-0.030892785638570786,
0.05434146896004677,
0.07206284254789352,
0.08874022960662842,
-0.1081952452659607,
-0.11325901746749878,
-0.026223046705126762,
0.09949055314064026,
-0.047325994819402695,
-0.026713883504271507,
-0.0980108305811882,
-0.07932113856077194,
-0.0738660916686058,
0.059631772339344025,
0.05874841660261154,
0.1317167580127716,
-0.24852919578552246,
-0.0990331619977951,
0.19528238475322723,
-0.1190844476222992,
-0.029902387410402298,
0.1936328262090683,
0.0022376729175448418,
0.13499696552753448,
0.1249597892165184,
0.25165751576423645,
0.1338529735803604,
-0.2195495367050171,
0.03198090195655823,
0.026143724098801613,
-0.027004661038517952,
-0.10748250782489777,
0.09697938710451126,
-0.05474920943379402,
0.06135554984211922,
0.042763661593198776,
-0.12069417536258698,
0.0913514718413353,
-0.008242030628025532,
-0.0582379475235939,
-0.010148435831069946,
-0.04163334518671036,
0.003934867680072784,
0.04409916698932648,
0.031384676694869995,
-0.05298039689660072,
-0.09067588299512863,
0.06042033061385155,
0.1275678426027298,
-0.14759577810764313,
0.07482634484767914,
-0.08838216215372086,
0.060678280889987946,
-0.025249643251299858,
-0.0071729994378983974,
-0.11465835571289062,
0.15077228844165802,
-0.0198023971170187,
0.011007568798959255,
0.052617397159338,
0.13491860032081604,
0.016760990023612976,
0.015243561007082462,
-0.03418580815196037,
-0.013818013481795788,
-0.03543576970696449,
-0.02723770961165428,
-0.03594982996582985,
-0.08408859372138977,
-0.017216311767697334,
-0.06285245716571808,
0.07342864573001862,
-0.14815054833889008,
0.0253673754632473,
0.04322229325771332,
0.010725711472332478,
0.003443025751039386,
-0.025006158277392387,
0.054767489433288574,
0.09121455997228622,
-0.019924966618418694,
-0.010500418953597546,
0.05453142523765564,
0.0031685209833085537,
-0.04754859581589699,
0.14141802489757538,
-0.13945220410823822,
-0.052128925919532776,
0.10735563188791275,
-0.07169592380523682,
-0.01285148598253727,
0.032285235822200775,
-0.023741796612739563,
-0.014254138804972172,
-0.06499515473842621,
-0.038479626178741455,
0.256712406873703,
-0.0044841761700809,
0.11882401257753372,
-0.08236312121152878,
-0.00756983133032918,
0.02334202267229557,
-0.09731601178646088,
0.04434515908360481,
0.0659109503030777,
-0.005984270013868809,
0.0390697605907917,
0.020403610542416573,
-0.05194627121090889,
-0.08554369956254959,
0.28781744837760925,
-0.018795136362314224,
-0.10334646701812744,
0.030468270182609558,
-0.019375286996364594,
-0.02449415624141693,
0.0639987513422966,
-0.20682941377162933,
-0.07388285547494888,
0.028044510632753372,
0.059101253747940063,
0.086479552090168,
-0.14785867929458618,
-0.012959056533873081,
0.004797962494194508,
-0.11747021228075027,
-0.17608989775180817,
0.10184156894683838,
-0.06464726477861404,
0.04386400431394577,
-0.10526017844676971,
-0.023472348228096962,
-0.02049298770725727,
-0.043504808098077774,
-0.17337830364704132,
0.15803590416908264,
-0.09446080029010773,
-0.2245362251996994,
-0.17785881459712982,
0.05321007966995239,
0.06327711045742035,
-0.0024954229593276978,
0.10574808716773987,
-0.15798301994800568,
-0.006442711688578129,
-0.04354345425963402,
0.14891548454761505,
0.02325328253209591,
-0.05659305304288864,
-0.06641169637441635,
0.04694583639502525,
0.048097241669893265,
-0.15148013830184937,
0.00509518152102828,
-0.05314914882183075,
-0.03168787807226181,
0.010397640988230705,
-0.04304792359471321,
0.007389824837446213,
0.15866555273532867,
0.016696631908416748,
0.013890743255615234,
-0.018268980085849762,
0.15281008183956146,
-0.10970563441514969,
0.0218216460198164,
0.19502674043178558,
0.00429900735616684,
-0.01827985979616642,
0.06899818032979965,
0.027914192527532578,
-0.037612415850162506,
0.0030842397827655077,
-0.03201162442564964,
-0.08788738399744034,
-0.27694129943847656,
-0.10552739351987839,
-0.06668888032436371,
-0.056785788387060165,
-0.005238959100097418,
0.0017445102566853166,
0.03748678043484688,
0.02206607349216938,
-0.021243633702397346,
-0.11234404891729355,
0.08016044646501541,
0.01841859519481659,
0.0984589010477066,
0.0029106605798006058,
0.09143292903900146,
-0.059223324060440063,
0.009811787866055965,
0.010071735829114914,
0.011530651710927486,
0.18317954242229462,
0.029434721916913986,
0.11130474507808685,
0.080681212246418,
0.07545477896928787,
0.07078150659799576,
0.07468364387750626,
-0.025012942031025887,
-0.01851225085556507,
0.02234705537557602,
-0.04905364289879799,
0.01849440671503544,
0.04001422971487045,
0.08026652783155441,
-0.07778547704219818,
-0.051295362412929535,
-0.024234915152192116,
0.044974178075790405,
0.14543727040290833,
0.0657060369849205,
-0.18348966538906097,
-0.11279860883951187,
-0.03603069856762886,
-0.07037360221147537,
0.006841099821031094,
0.029678013175725937,
0.17570184171199799,
-0.14311917126178741,
0.003974823746830225,
-0.0004123339313082397,
0.08355756103992462,
0.07272231578826904,
0.005450133234262466,
-0.06809406727552414,
0.05019354820251465,
-0.001374781597405672,
0.10508212447166443,
-0.2235880047082901,
0.2291380912065506,
-0.009924228303134441,
0.1520824134349823,
-0.05141860246658325,
-0.0015553493285551667,
0.022476527839899063,
0.08067715167999268,
0.13137173652648926,
0.03253057599067688,
0.00343417190015316,
-0.11093293875455856,
-0.08045431971549988,
0.05086333304643631,
-0.001169369206763804,
-0.0129604022949934,
0.025111403316259384,
0.009925451129674911,
0.012791844084858894,
0.014025251381099224,
-0.10550406575202942,
-0.12434441596269608,
-0.038223814219236374,
0.005277431104332209,
0.13173642754554749,
0.06922025233507156,
-0.015077846124768257,
-0.10648123919963837,
-0.07725776731967926,
0.04071711003780365,
-0.1327100545167923,
-0.0501699335873127,
-0.05670139566063881,
-0.028718551620841026,
0.08997832238674164,
-0.09840644150972366,
0.014391430653631687,
0.09041336923837662,
0.12866489589214325,
-0.039691630750894547,
-0.035976022481918335,
0.039692752063274384,
-0.13989032804965973,
-0.07718431204557419,
0.007599761243909597,
0.18380823731422424,
0.11636475473642349,
0.06540732830762863,
0.03319349139928818,
0.010704421438276768,
-0.03457824885845184,
-0.041467588394880295,
0.008350273594260216,
0.10559360682964325,
-0.09184224158525467,
0.00043295297655276954,
-0.026852412149310112,
-0.17528033256530762,
-0.1465052217245102,
-0.07299169152975082,
0.1912626177072525,
0.11153285205364227,
-0.0717901661992073,
0.1358308494091034,
0.18507735431194305,
-0.10752519965171814,
-0.23602059483528137,
0.024938657879829407,
0.09259438514709473,
0.13009381294250488,
0.0011584076564759016,
-0.22080756723880768,
0.06285404413938522,
0.0026222956366837025,
-0.016289211809635162,
-0.06114258989691734,
-0.2922240197658539,
-0.16739130020141602,
0.16860543191432953,
-0.02325439453125,
0.10987367480993271,
0.04196644574403763,
0.022494494915008545,
0.017945153638720512,
0.034498974680900574,
-0.010218869894742966,
-0.10714832693338394,
0.11895018815994263,
0.019702082499861717,
0.07706959545612335,
0.03940248116850853,
-0.045116305351257324,
0.07537924498319626,
0.09930194169282913,
-0.013632559217512608,
-0.012000191025435925,
0.047000396996736526,
0.043282054364681244,
-0.0061211492866277695,
0.17799243330955505,
-0.1006736159324646,
0.013029612600803375,
-0.0876535102725029,
-0.09347167611122131,
-0.09782783687114716,
0.10366430878639221,
0.032628823071718216,
-0.04611974582076073,
-0.007655360270291567,
-0.021206622943282127,
0.011090481653809547,
0.003561887191608548,
-0.029366571456193924,
-0.12458536773920059,
0.020224614068865776,
0.08993428200483322,
0.17119894921779633,
0.001869065104983747,
-0.10542638599872589,
0.011050723493099213,
-0.023715054616332054,
0.11355388164520264,
-0.11241693794727325,
-0.004026005044579506,
0.06791061908006668,
0.03952540084719658,
0.11930003017187119,
0.0008558015688322484,
-0.12598364055156708,
0.0771045908331871,
0.05394365265965462,
-0.023008104413747787,
-0.09763652086257935,
-0.048451993614435196,
-0.055301181972026825,
-0.028993668034672737,
0.020956669002771378,
0.08184629678726196,
-0.12042704224586487,
-0.00718148285523057,
-0.0335262157022953,
0.015978440642356873,
-0.11328299343585968,
0.18300525844097137,
0.06762342154979706,
0.06731794029474258,
-0.08047595620155334,
0.046545930206775665,
-0.028409728780388832,
-0.018435470759868622,
0.06518423557281494,
0.005355507601052523,
-0.09288065880537033,
-0.0771336778998375,
-0.004931358154863119,
0.07492206990718842,
0.04693927988409996,
-0.12500642240047455,
-0.0879853144288063,
-0.06335286796092987,
-0.004481087438762188,
0.056476518511772156,
0.07014051079750061,
-0.020090630277991295,
-0.12190697342157364,
-0.03773781657218933,
-0.10379060357809067,
0.07139226049184799,
0.08533774316310883,
-0.04851730167865753,
-0.07193953543901443,
0.21242056787014008,
0.10619055479764938,
0.01438185852020979,
-0.0322100929915905,
-0.08057526499032974,
0.02556479535996914,
0.0786094143986702,
-0.0035081200767308474,
-0.03842034190893173,
-0.046072158962488174,
0.005281183868646622,
-0.03230645880103111,
-0.08497434109449387,
-0.015590204857289791,
0.09191212058067322,
-0.08812403678894043,
0.0506555549800396,
-0.03680013492703438,
0.06873613595962524,
-0.06996526569128036,
0.013548211194574833,
0.029708515852689743,
-0.06014473736286163,
0.07409240305423737,
0.14724643528461456,
-0.09015660732984543,
0.14274902641773224,
-0.2010861337184906,
-0.015221738256514072,
0.07268206030130386,
0.06256315857172012,
-0.033111464232206345,
-0.08115169405937195,
0.02128073014318943,
0.05900876596570015,
0.09102852642536163,
-0.014255114831030369,
0.060289930552244186,
-0.057348739355802536,
0.013107070699334145,
-0.03566646948456764,
-0.012863481417298317,
-0.05515419319272041,
0.09195947647094727,
0.0464307926595211,
0.1503107249736786,
0.15307803452014923,
-0.10460696369409561,
0.13018496334552765,
-0.11680248379707336,
0.019048620015382767,
-0.04207899793982506,
-0.03741471841931343,
-0.10048764199018478,
-0.05484281852841377,
0.04684494435787201,
-0.06207694858312607,
0.11545167118310928,
0.03026757948100567,
0.014203105121850967,
-0.0033720661886036396,
-0.08097147941589355,
-0.015040975995361805,
-0.007739844731986523,
0.1873563975095749,
0.019135864451527596,
0.024399738758802414,
-0.07094395160675049,
-0.013750404119491577,
0.031049350276589394,
0.10615141689777374,
0.026674792170524597,
0.1606452316045761,
0.011362857185304165,
0.09530653059482574,
0.06806463748216629,
-0.057852622121572495,
-0.12829424440860748,
-0.07172806560993195,
-0.11238262802362442,
0.04110420122742653,
-0.06645677238702774,
0.21659351885318756,
0.17102785408496857,
-0.08653464913368225,
0.06314398348331451,
0.04144924134016037,
-0.11393871158361435,
-0.15623866021633148,
-0.12727859616279602,
-0.040384747087955475,
-0.1354794055223465,
0.02363491803407669,
-0.06981568783521652,
0.034276459366083145,
0.02913835644721985,
0.06455006450414658,
-0.04863175004720688,
0.24462926387786865,
0.05050173029303551,
-0.11800677329301834,
0.08475840836763382,
-0.09225240349769592,
-0.018651721999049187,
-0.08760024607181549,
0.0419553779065609,
0.15191572904586792,
-0.010334568098187447,
0.0726102739572525,
0.003985539078712463,
-0.06898953020572662,
0.02434857003390789,
-0.10838557034730911,
-0.06785982102155685,
-0.03445788472890854,
0.007288267835974693,
0.07206080108880997,
0.11528901755809784,
0.1231997013092041,
-0.06922370940446854,
0.006014020182192326,
0.12403924763202667,
-0.00041639318806119263,
-0.15842294692993164,
-0.12866199016571045,
0.10564806312322617,
0.06911404430866241,
-0.0010894514853134751,
-0.04485097527503967,
-0.027008214965462685,
-0.029123039916157722,
0.23089465498924255,
0.22933627665042877,
0.12842030823230743,
0.02206547185778618,
-0.01979704760015011,
-0.012949679978191853,
-0.035889364778995514,
0.07705527544021606,
0.05009834095835686,
0.14765815436840057,
-0.011700264178216457,
0.0662483498454094,
-0.0830998495221138,
-0.07448338717222214,
-0.008360042236745358,
0.042673345655202866,
-0.055117417126894,
-0.09648644179105759,
-0.032591015100479126,
0.1317519098520279,
-0.020190976560115814,
-0.08395843207836151,
-0.10261515527963638,
-0.0737912580370903,
-0.08229448646306992,
-0.007690264377743006,
0.04858345910906792,
0.08889824897050858,
0.04393848031759262,
-0.06169195845723152,
0.012943489477038383,
0.11159218102693558,
-0.00008676332072354853,
-0.04147842898964882,
-0.09110339730978012,
0.04872715473175049,
-0.11486630886793137,
0.04962391406297684,
-0.022537805140018463,
0.16280938684940338,
0.005012940615415573,
0.08837597817182541,
-0.013212249614298344,
0.1448577642440796,
-0.020548317581415176,
0.03859035298228264,
0.03017394058406353,
0.12277381867170334,
-0.06610823422670364,
0.09346222877502441,
0.00909661129117012,
-0.13715995848178864,
0.06626662611961365,
-0.14740189909934998,
-0.047458846122026443,
-0.07621335238218307,
0.056077949702739716,
-0.03947843983769417,
0.08123338967561722,
0.1038452684879303,
-0.06623739004135132,
-0.0848676934838295,
-0.05848830193281174,
0.03444382920861244,
0.052004121243953705,
-0.038084521889686584,
-0.024029552936553955,
-0.22961172461509705,
-0.011447501368820667,
-0.05210776627063751,
0.00036947871558368206,
-0.16684800386428833,
-0.03214682266116142,
-0.021755358204245567,
-0.08203930407762527,
0.008669005706906319,
0.03674094378948212,
0.07022839039564133,
0.014699136838316917,
0.006124062929302454,
0.03935304284095764,
0.06031470373272896,
0.10964335501194,
-0.16388365626335144,
-0.12157873064279556
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Hungarian
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Hungarian using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "hu", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-hungarian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-hungarian")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Hungarian test data of Common Voice.
```python
import torch
import torchaudio
import urllib.request
import tarfile
import pandas as pd
from tqdm.auto import tqdm
from datasets import load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
# Download the raw data instead of using HF datasets to save disk space
data_url = "https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-6.1-2020-12-11/hu.tar.gz"
filestream = urllib.request.urlopen(data_url)
data_file = tarfile.open(fileobj=filestream, mode="r|gz")
data_file.extractall()
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-hungarian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-hungarian")
model.to("cuda")
cv_test = pd.read_csv("cv-corpus-6.1-2020-12-11/hu/test.tsv", sep='\t')
clips_path = "cv-corpus-6.1-2020-12-11/hu/clips/"
def clean_sentence(sent):
    sent = sent.lower()
    # replace non-alpha characters with space
    sent = "".join(ch if ch.isalpha() else " " for ch in sent)
    # remove repeated spaces
    sent = " ".join(sent.split())
    return sent
targets = []
preds = []
for i, row in tqdm(cv_test.iterrows(), total=cv_test.shape[0]):
    row["sentence"] = clean_sentence(row["sentence"])
    speech_array, sampling_rate = torchaudio.load(clips_path + row["path"])
    resampler = torchaudio.transforms.Resample(sampling_rate, 16_000)
    row["speech"] = resampler(speech_array).squeeze().numpy()
    inputs = processor(row["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
    pred_ids = torch.argmax(logits, dim=-1)
    targets.append(row["sentence"])
    preds.append(processor.batch_decode(pred_ids)[0])
print("WER: {:2f}".format(100 * wer.compute(predictions=preds, references=targets)))
```
**Test Result**: 42.26 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
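As an illustrative sketch only, not the author's training script, the splits named above could be loaded and preprocessed roughly as in the usage example. The checkpoint and dataset names come from this card; the label-encoding step is assumed from the standard XLSR fine-tuning recipe.
```python
# Sketch, assuming the standard XLSR fine-tuning preprocessing; not the original script.
import torchaudio
from datasets import load_dataset, concatenate_datasets
from transformers import Wav2Vec2Processor

processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-hungarian")

train = load_dataset("common_voice", "hu", split="train")
validation = load_dataset("common_voice", "hu", split="validation")
full_train = concatenate_datasets([train, validation])

resampler = torchaudio.transforms.Resample(48_000, 16_000)

def prepare(batch):
    # Read and resample the clip to 16 kHz, then extract input values.
    speech_array, _ = torchaudio.load(batch["path"])
    speech = resampler(speech_array).squeeze().numpy()
    batch["input_values"] = processor(speech, sampling_rate=16_000).input_values[0]
    # Encode the transcription as CTC labels.
    with processor.as_target_processor():
        batch["labels"] = processor(batch["sentence"]).input_ids
    return batch

full_train = full_train.map(prepare)
```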
|
{"language": "hu", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Hungarian XLSR Wav2Vec2 Large 53 by Anton Lozhkov", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice hu", "type": "common_voice", "args": "hu"}, "metrics": [{"type": "wer", "value": 42.26, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anton-l/wav2vec2-large-xlsr-53-hungarian
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"hu",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"hu"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #hu #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Hungarian
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Hungarian using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Hungarian test data of Common Voice.
Test Result: 42.26 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Hungarian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Hungarian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Hungarian test data of Common Voice.\n\n\n\nTest Result: 42.26 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #hu #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Hungarian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Hungarian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Hungarian test data of Common Voice.\n\n\n\nTest Result: 42.26 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
80,
66,
20,
29,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #hu #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Hungarian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Hungarian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Hungarian test data of Common Voice.\n\n\n\nTest Result: 42.26 %## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.15068405866622925,
-0.0003541869518812746,
-0.0014923005364835262,
-0.05062703788280487,
0.09329178929328918,
-0.056753817945718765,
0.22456826269626617,
0.06976883113384247,
0.020232686772942543,
-0.006932293530553579,
0.025764670222997665,
0.004256876651197672,
0.042614519596099854,
0.021809455007314682,
-0.013325524516403675,
-0.22181466221809387,
0.04832635074853897,
-0.022852279245853424,
0.07710292935371399,
0.08705329149961472,
0.10874887555837631,
-0.059073884040117264,
0.0020411000587046146,
0.07185512781143188,
-0.11679794639348984,
0.05514993518590927,
0.018396949395537376,
-0.09991834312677383,
0.15354031324386597,
0.05989062786102295,
0.035098422318696976,
0.05336602032184601,
0.07251540571451187,
-0.1726866215467453,
0.012690826319158077,
0.026682885363698006,
0.042066991329193115,
0.017384300008416176,
0.0503515750169754,
-0.023445744067430496,
0.19014114141464233,
0.06747730076313019,
-0.02984323352575302,
0.06176140531897545,
-0.027062049135565758,
-0.2139882743358612,
-0.01634509302675724,
0.04260380193591118,
0.10718519985675812,
0.1252029985189438,
-0.0694112777709961,
0.11395877599716187,
-0.12878020107746124,
0.0975903645157814,
0.09665349125862122,
-0.1851111650466919,
0.004156747832894325,
0.07089820504188538,
0.051509276032447815,
0.12874162197113037,
-0.05506148189306259,
0.056289192289114,
0.049710169434547424,
0.030866485089063644,
0.0027653577271848917,
-0.044430751353502274,
-0.2880449891090393,
-0.05548974871635437,
-0.15611447393894196,
-0.025372525677084923,
0.21916984021663666,
-0.0036614194978028536,
-0.07036430388689041,
-0.14199155569076538,
-0.0021219621412456036,
0.04606315493583679,
-0.0018051071092486382,
-0.025546357035636902,
-0.0288771390914917,
0.021067865192890167,
0.000009601265446690377,
-0.005474271718412638,
-0.08652748167514801,
-0.13225503265857697,
0.01717284508049488,
0.0873430147767067,
0.03245566785335541,
0.03188170865178108,
-0.07431323826313019,
0.05863901600241661,
-0.06493758410215378,
-0.08780088275671005,
0.009031307883560658,
0.05731329321861267,
-0.08884915709495544,
-0.0022647841833531857,
-0.07007189095020294,
-0.12316858023405075,
0.02712583914399147,
-0.05359983071684837,
0.02222779020667076,
0.0216363538056612,
-0.014381146989762783,
0.06871864944696426,
0.03086196817457676,
0.1235329881310463,
-0.09380172938108444,
-0.0493348054587841,
-0.01672528311610222,
-0.029663918539881706,
-0.03000418283045292,
-0.045103322714567184,
-0.08526235818862915,
-0.06443815678358078,
0.025942830368876457,
0.05778312310576439,
-0.05707105994224548,
0.012906096875667572,
0.026171855628490448,
-0.051742345094680786,
0.015483316965401173,
-0.08772213757038116,
-0.05223174765706062,
0.07535819709300995,
0.005344864446669817,
0.13049164414405823,
-0.01908848248422146,
0.08034803718328476,
-0.07816959172487259,
0.01177131850272417,
0.021901937201619148,
0.03938326984643936,
-0.016159100458025932,
-0.13146336376667023,
0.022291995584964752,
-0.02001834474503994,
0.01219653058797121,
-0.08782441169023514,
-0.06199834123253822,
-0.06452383100986481,
-0.003633210202679038,
0.03261679783463478,
0.000326974899508059,
-0.10640881955623627,
-0.03883012384176254,
-0.009966842830181122,
-0.056983012706041336,
0.08568696677684784,
-0.047773413360118866,
0.0672050192952156,
-0.0723329558968544,
0.018857762217521667,
-0.0045689004473388195,
0.06383475661277771,
-0.11781175434589386,
-0.030253153294324875,
-0.00439074682071805,
0.13011930882930756,
-0.05295977741479874,
0.006714475806802511,
-0.07141955941915512,
-0.05976556986570358,
0.01993563398718834,
0.04504571110010147,
0.03793987259268761,
0.07243503630161285,
-0.26962727308273315,
-0.09701012820005417,
0.1303301900625229,
-0.15252873301506042,
-0.030900971964001656,
0.22372129559516907,
-0.017136894166469574,
0.09547990560531616,
0.15900994837284088,
0.3060392141342163,
0.1745067685842514,
-0.17066746950149536,
-0.024890311062335968,
0.0007697523687966168,
0.00018232263391837478,
-0.04042496532201767,
0.08883831650018692,
-0.037761133164167404,
-0.04085880145430565,
0.019083093851804733,
-0.09572698175907135,
0.07285719364881516,
-0.029318619519472122,
-0.0444173701107502,
-0.0010773689718917012,
-0.10311638563871384,
0.08179035782814026,
0.049400921911001205,
0.029620273038744926,
-0.0604826919734478,
-0.031464844942092896,
0.05373543128371239,
0.15839171409606934,
-0.14620690047740936,
0.03984992578625679,
-0.0884513333439827,
0.11365324258804321,
-0.09045282751321793,
-0.010931620374321938,
-0.10949696600437164,
0.19243711233139038,
0.007218807470053434,
0.07490372657775879,
0.03808765858411789,
0.15454410016536713,
0.013285603374242783,
0.04768383875489235,
-0.028452835977077484,
0.00754693616181612,
-0.0025891587138175964,
-0.008760822005569935,
-0.03140309080481529,
-0.12847520411014557,
-0.0385715551674366,
-0.052320972084999084,
0.1074211597442627,
-0.14806044101715088,
-0.013757012784481049,
0.021827733144164085,
0.025381645187735558,
-0.012917893007397652,
-0.015170673839747906,
0.009236386977136135,
0.10385572165250778,
0.007669136859476566,
-0.002706535393372178,
0.04274042323231697,
0.01579083688557148,
-0.03300946205854416,
0.14466512203216553,
-0.1493203043937683,
-0.04857948049902916,
0.08875907212495804,
-0.06857775151729584,
-0.009505994617938995,
0.049479421228170395,
-0.028830504044890404,
-0.0046859011054039,
-0.0826067104935646,
-0.025206323713064194,
0.3299199044704437,
-0.029527127742767334,
0.10261619091033936,
-0.09725596755743027,
-0.010112183168530464,
0.025808759033679962,
-0.06521802395582199,
0.06001680716872215,
0.09221401065587997,
0.02014191821217537,
0.025709839537739754,
0.026366280391812325,
-0.042375367134809494,
-0.08001888543367386,
0.2873710095882416,
-0.0240466371178627,
-0.12105368077754974,
0.04319554939866066,
-0.0207735076546669,
-0.031689196825027466,
0.10069572180509567,
-0.19069617986679077,
-0.02040592022240162,
0.026240190491080284,
0.06792955845594406,
0.0723075196146965,
-0.11198174208402634,
0.029464777559041977,
0.0025812112726271152,
-0.13790686428546906,
-0.19714178144931793,
0.07203026860952377,
-0.039480213075876236,
0.05212537571787834,
-0.10493141412734985,
-0.053635790944099426,
-0.03243030607700348,
-0.04011036828160286,
-0.1847108155488968,
0.09869913756847382,
-0.04494680091738701,
-0.1432986855506897,
-0.18879525363445282,
-0.009082898497581482,
-0.0038707051426172256,
0.01775476709008217,
0.08661755174398422,
-0.09701335430145264,
0.0012609359109774232,
-0.02038244716823101,
0.15456989407539368,
-0.010078445076942444,
-0.029968785122036934,
-0.02218901924788952,
0.02734452113509178,
0.04627831652760506,
-0.11487191170454025,
0.010323967784643173,
-0.07002595067024231,
-0.036408472806215286,
-0.05602569133043289,
-0.012980980798602104,
0.007902903482317924,
0.1735401451587677,
0.013102401979267597,
0.023828275501728058,
-0.032243430614471436,
0.19890688359737396,
-0.11258821934461594,
-0.0379725843667984,
0.1870175004005432,
-0.05477454140782356,
-0.03967767953872681,
0.13502247631549835,
0.03546992689371109,
-0.02709616906940937,
-0.0188754815608263,
-0.0077780988067388535,
-0.09817568212747574,
-0.19957385957241058,
-0.15913406014442444,
-0.06826537847518921,
-0.06376547366380692,
-0.04607095196843147,
-0.008979854173958302,
0.06054506078362465,
0.050735507160425186,
-0.0659116730093956,
-0.06684377789497375,
0.033386196941137314,
-0.009206476621329784,
0.09062552452087402,
0.0026623939629644156,
0.07834721356630325,
-0.06857987493276596,
-0.049720220267772675,
0.0046538240276277065,
-0.011251352727413177,
0.19063988327980042,
0.047042712569236755,
0.04730355739593506,
0.09152740985155106,
0.09518849849700928,
0.11123447120189667,
0.07302975654602051,
-0.0672827810049057,
-0.019912611693143845,
0.014637227170169353,
-0.08610355854034424,
0.004256484564393759,
0.029365787282586098,
0.09863439947366714,
-0.009851941838860512,
-0.10139761865139008,
-0.0563119500875473,
0.02686525695025921,
0.2570001184940338,
0.07873818278312683,
-0.1904734969139099,
-0.09559958428144455,
-0.0049340082332491875,
-0.05999606102705002,
0.02214180678129196,
0.02742340974509716,
0.21133345365524292,
-0.13161931931972504,
0.025891654193401337,
0.03209240362048149,
0.09368478506803513,
-0.04669538140296936,
0.0388181172311306,
-0.11502110958099365,
0.01625494286417961,
-0.009513547644019127,
0.1267724335193634,
-0.19982267916202545,
0.2186254858970642,
0.003799175377935171,
0.13956080377101898,
-0.08620433509349823,
-0.016905050724744797,
0.01664942502975464,
0.06666827946901321,
0.08508222550153732,
0.03198416531085968,
0.011020670644938946,
-0.1221386194229126,
-0.0995023101568222,
0.056767821311950684,
-0.016340672969818115,
-0.0023751489352434874,
0.038855571299791336,
-0.018752222880721092,
0.010176458396017551,
-0.013815405778586864,
-0.04033156856894493,
-0.09821393340826035,
-0.04739417880773544,
0.006499140989035368,
0.22592061758041382,
0.07971201092004776,
-0.010696178302168846,
-0.0929703339934349,
-0.12092911452054977,
0.048218246549367905,
-0.09842195361852646,
-0.07458031177520752,
-0.042273424565792084,
-0.11614455282688141,
0.1329164057970047,
-0.06765950471162796,
-0.03990699350833893,
0.11891444772481918,
0.12363686412572861,
-0.04873664677143097,
-0.06866839528083801,
0.009317184798419476,
-0.12010596692562103,
-0.1064877063035965,
0.011044507846236229,
0.20637387037277222,
0.13736729323863983,
0.0917714536190033,
0.06499354541301727,
0.01123566273599863,
0.015344199724495411,
-0.08296339213848114,
-0.028288336470723152,
0.09284117817878723,
-0.12505720555782318,
0.006107022054493427,
0.004341185558587313,
-0.1312020719051361,
-0.13769865036010742,
-0.058058954775333405,
0.16359448432922363,
0.09489012509584427,
-0.08502940833568573,
0.1855238974094391,
0.23019620776176453,
-0.07353322207927704,
-0.17026425898075104,
-0.018498437479138374,
0.12151119858026505,
0.12922556698322296,
-0.011156919412314892,
-0.1376911699771881,
0.061585646122694016,
-0.014436174184083939,
-0.032137587666511536,
-0.05568094179034233,
-0.3260500729084015,
-0.13640202581882477,
0.17410491406917572,
-0.03319647163152695,
0.15673691034317017,
0.03435203433036804,
-0.028723381459712982,
-0.00993633083999157,
0.02271435596048832,
0.0038697225973010063,
-0.14798638224601746,
0.09325776249170303,
0.04856918007135391,
0.10203535109758377,
0.06546594202518463,
-0.009756151586771011,
0.11585046350955963,
0.12311487644910812,
-0.016643330454826355,
-0.013036537915468216,
0.06726356595754623,
0.05787144973874092,
0.02086447738111019,
0.11688308417797089,
-0.10273691266775131,
0.036364372819662094,
-0.09858181327581406,
-0.07543067634105682,
-0.0694449320435524,
0.06013251468539238,
0.018632108345627785,
-0.04979374632239342,
0.006186666898429394,
-0.04909944161772728,
0.02857441082596779,
0.016293007880449295,
-0.059140998870134354,
-0.13212552666664124,
0.06255461275577545,
0.07441245764493942,
0.17048275470733643,
-0.10452242195606232,
-0.038385551422834396,
-0.0076396530494093895,
-0.028494330123066902,
0.13192462921142578,
-0.0008220527088269591,
0.0381508506834507,
0.08644039928913116,
0.03701286017894745,
0.11055684089660645,
0.01014847680926323,
-0.09049821645021439,
0.07501132786273956,
0.01854211464524269,
-0.08328473567962646,
-0.06927096843719482,
-0.05058978497982025,
-0.0045751286670565605,
-0.022687256336212158,
0.045038431882858276,
0.12756013870239258,
-0.11126038432121277,
0.00610589561983943,
-0.04553384706377983,
-0.024871090427041054,
-0.12729404866695404,
0.21157525479793549,
-0.015378442592918873,
0.08173365145921707,
-0.1295551061630249,
-0.00960951391607523,
0.010127011686563492,
0.00047225572052411735,
0.04814526066184044,
-0.03623700141906738,
-0.08717925101518631,
-0.030868371948599815,
-0.06557897478342056,
0.0961792841553688,
0.026531999930739403,
-0.16381311416625977,
-0.05464303493499756,
-0.1346137970685959,
-0.00988792348653078,
0.07762180268764496,
0.06676147133111954,
0.020050618797540665,
-0.15056496858596802,
-0.0878472849726677,
-0.14382711052894592,
0.03915724530816078,
0.07399733364582062,
-0.019585425034165382,
-0.1271042674779892,
0.19653770327568054,
0.07525522261857986,
0.017455456778407097,
-0.05976220592856407,
-0.08287636935710907,
0.01358911395072937,
0.08005835115909576,
-0.09510919451713562,
-0.026809751987457275,
-0.05279267206788063,
0.01672835275530815,
-0.007931938394904137,
-0.1019524484872818,
-0.036975543946027756,
0.06687543541193008,
-0.08739875257015228,
0.07942312955856323,
-0.019106637686491013,
0.069512277841568,
-0.06844022125005722,
0.03236919641494751,
0.04024231433868408,
-0.06278501451015472,
0.08226479589939117,
0.12870027124881744,
-0.08787406980991364,
0.12908191978931427,
-0.18001680076122284,
-0.06808017194271088,
0.040304895490407944,
0.06452461332082748,
-0.004772456828504801,
-0.07832280546426773,
0.026475464925169945,
0.0899045541882515,
0.06273458153009415,
0.014317519962787628,
0.05980644375085831,
-0.038153257220983505,
-0.015530601143836975,
-0.03147309646010399,
-0.014165530912578106,
-0.01481047086417675,
0.054363030940294266,
0.04252450168132782,
0.14556452631950378,
0.13242685794830322,
-0.10062093287706375,
0.10431946069002151,
-0.09327445924282074,
0.018007146194577217,
-0.059875644743442535,
-0.010134104639291763,
-0.10226292908191681,
-0.08261839300394058,
0.05082875117659569,
-0.05191277340054512,
0.14009252190589905,
0.014568021520972252,
0.09101743996143341,
-0.026393301784992218,
-0.02407369576394558,
0.05268007516860962,
-0.02925499901175499,
0.2987256944179535,
0.05383407697081566,
0.042713187634944916,
-0.04309830069541931,
0.005322439596056938,
0.022934934124350548,
0.0890846699476242,
-0.012793094851076603,
0.1509014368057251,
-0.01216016337275505,
0.04584508761763573,
0.1020287573337555,
-0.05884413793683052,
-0.045522209256887436,
-0.04306284710764885,
-0.08195371925830841,
0.015181648544967175,
-0.06782703846693039,
0.21700464189052582,
0.1807754784822464,
-0.12013030052185059,
0.0874052420258522,
0.07354247570037842,
-0.07253368198871613,
-0.14957959949970245,
-0.0799909383058548,
-0.04157140105962753,
-0.16160894930362701,
0.013027912937104702,
-0.08119940012693405,
0.003875855589285493,
0.08514367043972015,
0.05568777024745941,
-0.018308434635400772,
0.1443215310573578,
-0.027625981718301773,
-0.08434857428073883,
0.07478319108486176,
-0.09569858014583588,
0.021360117942094803,
-0.10304957628250122,
0.00942909438163042,
0.18496546149253845,
0.0022105728276073933,
0.06168895587325096,
-0.004843547474592924,
-0.04860352352261543,
-0.0028737306129187346,
-0.07578019052743912,
-0.05115792527794838,
-0.010947749018669128,
-0.04667965695261955,
0.09163347631692886,
0.13060669600963593,
0.13356490433216095,
-0.0709439367055893,
-0.028345372527837753,
0.13523583114147186,
-0.049254897981882095,
-0.1425492912530899,
-0.15956266224384308,
0.13398660719394684,
0.03149406984448433,
0.020310407504439354,
0.0026885713450610638,
-0.03839410841464996,
-0.040151115506887436,
0.2560107111930847,
0.23078808188438416,
0.06604945659637451,
0.015181073918938637,
-0.07003825902938843,
-0.012217745184898376,
-0.026870939880609512,
0.08972840011119843,
0.042786527425050735,
0.27093085646629333,
-0.012016909196972847,
0.01639457419514656,
-0.11163981258869171,
-0.044993411749601364,
-0.018160762265324593,
0.10364969819784164,
-0.07277993112802505,
-0.1234196275472641,
0.021909158676862717,
0.16122597455978394,
-0.06094426289200783,
-0.07219263911247253,
-0.16039884090423584,
-0.0838441550731659,
-0.10349374264478683,
0.0012651399010792375,
0.01579270511865616,
0.10832281410694122,
0.013571269810199738,
-0.07349611073732376,
0.029701514169573784,
0.11180850863456726,
0.001805869396775961,
-0.04674871265888214,
-0.0946681797504425,
0.06016245856881142,
-0.1197710707783699,
0.06529609113931656,
-0.036253634840250015,
0.14761023223400116,
0.029876187443733215,
0.10725518316030502,
0.006314773578196764,
0.1413549780845642,
-0.017482940107584,
-0.12198324501514435,
0.03812277317047119,
0.15515033900737762,
-0.05136100947856903,
0.14595261216163635,
0.01643088273704052,
-0.15827780961990356,
0.05462298542261124,
-0.16442950069904327,
0.0011198106221854687,
-0.04709020256996155,
0.04706326127052307,
-0.0231498870998621,
0.07579594105482101,
0.08604373782873154,
-0.06453574448823929,
-0.02556699700653553,
-0.07139094918966293,
0.036848727613687515,
0.037559378892183304,
-0.11025591194629669,
-0.03967252001166344,
-0.2742510735988617,
-0.0241794902831316,
-0.10016123950481415,
-0.04064334183931351,
-0.22481030225753784,
-0.005303285550326109,
-0.01670285314321518,
-0.0742761567234993,
0.01757902465760708,
-0.0018946209456771612,
0.07948997616767883,
-0.0017273351550102234,
0.002103199949488044,
-0.015807604417204857,
0.03067127615213394,
0.1143403872847557,
-0.1720130443572998,
-0.10667433589696884
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Kyrgyz
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Kyrgyz using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "ky", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-kyrgyz")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-kyrgyz")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
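# Greedy decoding: pick the most likely token at each frame; batch_decode collapses repeats and CTC blanks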
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Kyrgyz test data of Common Voice.
```python
import torch
import torchaudio
import urllib.request
import tarfile
import pandas as pd
from tqdm.auto import tqdm
from datasets import load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
# Download the raw data instead of using HF datasets to save disk space
data_url = "https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-6.1-2020-12-11/ky.tar.gz"
filestream = urllib.request.urlopen(data_url)
data_file = tarfile.open(fileobj=filestream, mode="r|gz")
data_file.extractall()
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-kyrgyz")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-kyrgyz")
model.to("cuda")
cv_test = pd.read_csv("cv-corpus-6.1-2020-12-11/ky/test.tsv", sep='\t')
clips_path = "cv-corpus-6.1-2020-12-11/ky/clips/"
def clean_sentence(sent):
    sent = sent.lower()
    # replace non-alpha characters with space
    sent = "".join(ch if ch.isalpha() else " " for ch in sent)
    # remove repeated spaces
    sent = " ".join(sent.split())
    return sent
targets = []
preds = []
for i, row in tqdm(cv_test.iterrows(), total=cv_test.shape[0]):
row["sentence"] = clean_sentence(row["sentence"])
speech_array, sampling_rate = torchaudio.load(clips_path + row["path"])
resampler = torchaudio.transforms.Resample(sampling_rate, 16_000)
row["speech"] = resampler(speech_array).squeeze().numpy()
inputs = processor(row["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
targets.append(row["sentence"])
preds.append(processor.batch_decode(pred_ids)[0])
print("WER: {:2f}".format(100 * wer.compute(predictions=preds, references=targets)))
```
**Test Result**: 31.88 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
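The fine-tuning script itself is not included in this card, but as a rough sketch the same data can be loaded with the `datasets` library (the `"train+validation"` split syntax simply concatenates the two splits):

```python
from datasets import load_dataset

# Illustrative only: load the Kyrgyz train and validation splits as a single dataset
train_dataset = load_dataset("common_voice", "ky", split="train+validation")
print(train_dataset)
```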
|
{"language": "ky", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Kyrgyz XLSR Wav2Vec2 Large 53 by Anton Lozhkov", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice ky", "type": "common_voice", "args": "ky"}, "metrics": [{"type": "wer", "value": 31.88, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anton-l/wav2vec2-large-xlsr-53-kyrgyz
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"ky",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"ky"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ky #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Kyrgyz
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Kyrgyz using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Kyrgyz test data of Common Voice.
Test Result: 31.88 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Kyrgyz\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Kyrgyz using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Kyrgyz test data of Common Voice.\n\n\n\nTest Result: 31.88 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ky #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Kyrgyz\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Kyrgyz using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Kyrgyz test data of Common Voice.\n\n\n\nTest Result: 31.88 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
80,
68,
20,
29,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ky #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Kyrgyz\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Kyrgyz using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Kyrgyz test data of Common Voice.\n\n\n\nTest Result: 31.88 %## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.11639873683452606,
0.022292232140898705,
-0.001322491210885346,
-0.05175675079226494,
0.07366496324539185,
-0.03713135048747063,
0.1622331738471985,
0.11327170580625534,
-0.006075706332921982,
-0.015553894452750683,
0.03378961235284805,
0.015366160310804844,
0.07279398292303085,
0.10495312511920929,
-0.026438983157277107,
-0.21004585921764374,
0.0037316717207431793,
0.015799960121512413,
0.08042062073945999,
0.12131344527006149,
0.08669758588075638,
-0.0925620049238205,
-0.011932102963328362,
0.08636347204446793,
-0.13592150807380676,
0.018334729596972466,
0.008896061219274998,
-0.12667620182037354,
0.12386056035757065,
0.04895125702023506,
0.06181665137410164,
0.07298720628023148,
0.07739914953708649,
-0.2184039056301117,
0.027800802141427994,
0.02926262654364109,
0.04523744061589241,
0.03437686339020729,
0.034368518739938736,
-0.008215961046516895,
0.08299019187688828,
0.022153334692120552,
-0.04866952821612358,
0.05065653473138809,
-0.033134665340185165,
-0.22685323655605316,
-0.028945669531822205,
0.028396714478731155,
0.1330716907978058,
0.1072816401720047,
-0.06931839138269424,
0.10505756735801697,
-0.12105177342891693,
0.08817559480667114,
0.11067710071802139,
-0.18011055886745453,
0.013518295250833035,
0.1051681786775589,
0.03702804818749428,
0.03490950167179108,
-0.07650034874677658,
0.051709145307540894,
0.04018772765994072,
0.004882763605564833,
0.01913250982761383,
-0.04853583872318268,
-0.20094014704227448,
-0.024216029793024063,
-0.1306232064962387,
-0.03292640671133995,
0.21207676827907562,
-0.010351263917982578,
-0.06871603429317474,
-0.1375758945941925,
0.008110336028039455,
0.024329211562871933,
-0.017412537708878517,
-0.05080163851380348,
-0.029958343133330345,
-0.0009472939418628812,
-0.09685496985912323,
-0.025954434648156166,
-0.12423419207334518,
-0.1332576721906662,
-0.02058001421391964,
0.12616728246212006,
0.043001022189855576,
0.0184808149933815,
-0.10938980430364609,
0.10247182846069336,
-0.08112791925668716,
-0.07196351140737534,
-0.01668989472091198,
0.039917703717947006,
-0.06645263731479645,
0.00678594084456563,
-0.06777369230985641,
-0.11995672434568405,
0.06594305485486984,
-0.05423319339752197,
0.016355931758880615,
0.03617134690284729,
0.013344096951186657,
0.03241414576768875,
0.07354675978422165,
0.08967765420675278,
-0.05735597386956215,
-0.03514113277196884,
0.021677710115909576,
-0.011996706016361713,
-0.051206279546022415,
-0.026977233588695526,
-0.05963728576898575,
-0.07861518859863281,
0.061727091670036316,
0.05891463905572891,
-0.01805051974952221,
0.01954537071287632,
-0.011577123776078224,
-0.03933536261320114,
0.0036058768164366484,
-0.06673797219991684,
-0.04261450469493866,
0.07209452986717224,
-0.02550220489501953,
0.07171111553907394,
0.00010961543739540502,
0.0962519720196724,
-0.055141158401966095,
0.021576162427663803,
0.02776721678674221,
0.035106536000967026,
0.004938576836138964,
-0.09609480202198029,
0.012514645233750343,
-0.0412844754755497,
0.011704719625413418,
-0.09060635417699814,
-0.10037311911582947,
-0.08410309255123138,
0.0030476453248411417,
0.04331507533788681,
-0.004718867130577564,
-0.10588699579238892,
-0.030476391315460205,
0.0036511768121272326,
-0.06063356623053551,
0.10880684852600098,
-0.043441809713840485,
0.061580758541822433,
-0.017169605940580368,
0.059277161955833435,
0.04879504069685936,
0.07872753590345383,
-0.09633882343769073,
-0.040704529732465744,
0.060044970363378525,
0.13581669330596924,
-0.03613034263253212,
-0.08665749430656433,
-0.12204431742429733,
-0.08999257534742355,
-0.044157475233078,
0.04708496108651161,
0.06464742124080658,
0.09794137626886368,
-0.2135879546403885,
-0.08536523580551147,
0.13935548067092896,
-0.08802914619445801,
-0.01874174177646637,
0.18256783485412598,
-0.02469693124294281,
0.13653987646102905,
0.15358956158161163,
0.2641619145870209,
0.15799909830093384,
-0.20351473987102509,
-0.027206970378756523,
0.022690117359161377,
-0.00800097081810236,
-0.03395543247461319,
0.06336820125579834,
-0.03395508602261543,
-0.036456309258937836,
0.036372240632772446,
-0.07596949487924576,
0.05521010980010033,
-0.05057220160961151,
-0.055945802479982376,
-0.01061252411454916,
-0.07756676524877548,
0.0955057218670845,
0.030405784025788307,
0.04253164306282997,
-0.013692831620573997,
-0.034377146512269974,
0.05701209232211113,
0.14077825844287872,
-0.14355705678462982,
0.043652646243572235,
-0.1062944233417511,
0.0379224568605423,
-0.09786488115787506,
-0.004912237636744976,
-0.10165376961231232,
0.15697263181209564,
-0.00219980557449162,
-0.01422730553895235,
0.053631458431482315,
0.14874176681041718,
0.011344484984874725,
0.006114019080996513,
-0.05337904021143913,
-0.039310310035943985,
0.020817646756768227,
-0.009688548743724823,
-0.06437612324953079,
-0.10195805877447128,
-0.011218917556107044,
-0.05233464390039444,
0.07444770634174347,
-0.12104534357786179,
-0.010528611950576305,
0.07078859955072403,
-0.005183488596230745,
0.0009314533090218902,
0.011315540410578251,
0.08634684979915619,
0.08981488645076752,
0.00931425392627716,
0.02021811157464981,
0.040327854454517365,
0.0014671289827674627,
-0.05510446056723595,
0.1490132361650467,
-0.12916988134384155,
-0.08959328383207321,
0.09975102543830872,
0.020279429852962494,
0.0021640530321747065,
-0.003736260114237666,
-0.01294103916734457,
-0.03541072830557823,
-0.07892969995737076,
0.012643367052078247,
0.31200292706489563,
-0.011278384365141392,
0.08812806010246277,
-0.08831795305013657,
-0.038612768054008484,
0.034474849700927734,
-0.08622293174266815,
0.04169071465730667,
0.06549348682165146,
-0.038036052137613297,
-0.012452932074666023,
0.015805259346961975,
-0.027877967804670334,
-0.08090447634458542,
0.26709645986557007,
-0.029959188774228096,
-0.08243170380592346,
-0.0009258000063709915,
-0.0413624532520771,
-0.04744165390729904,
0.08482412993907928,
-0.15317854285240173,
-0.03912718966603279,
0.042127057909965515,
0.05015374347567558,
0.06537783890962601,
-0.15461677312850952,
0.030792642384767532,
-0.0036747087724506855,
-0.1224895566701889,
-0.18432898819446564,
0.07935185730457306,
-0.044478293508291245,
0.03759027644991875,
-0.07952506840229034,
-0.004415420349687338,
0.000696474511642009,
-0.050702568143606186,
-0.1550641506910324,
0.13330474495887756,
-0.07123444974422455,
-0.20130500197410583,
-0.116325244307518,
0.008266356773674488,
0.0015315376222133636,
-0.010936489328742027,
0.09629887342453003,
-0.1197485700249672,
0.015154452063143253,
-0.023595748469233513,
0.12151665985584259,
0.003827958134934306,
-0.0037619031500071287,
-0.06124483421444893,
0.01325280126184225,
0.011694015003740788,
-0.13824595510959625,
0.013318482786417007,
-0.07188432663679123,
-0.0423712283372879,
0.04054655507206917,
0.009460587985813618,
0.017815185710787773,
0.18471680581569672,
0.022773828357458115,
0.012442689388990402,
-0.04903603345155716,
0.16488613188266754,
-0.10293643176555634,
-0.011928252875804901,
0.157151460647583,
-0.006006420589983463,
-0.01538509875535965,
0.11209989339113235,
0.016999077051877975,
-0.040755826979875565,
0.014085712842643261,
-0.00890234112739563,
-0.08852444589138031,
-0.22951088845729828,
-0.11389358341693878,
-0.044931672513484955,
-0.05074996501207352,
-0.020016586408019066,
-0.019055919721722603,
0.07746562361717224,
0.04153625667095184,
-0.027591390535235405,
-0.07010508328676224,
0.051255002617836,
-0.012337627820670605,
0.09336839616298676,
0.006867547519505024,
0.09329412877559662,
-0.049066878855228424,
-0.032376501709222794,
0.011129149235785007,
0.0632428377866745,
0.19431355595588684,
0.020834000781178474,
0.05007137358188629,
0.1020493134856224,
0.13986904919147491,
0.11633346974849701,
0.06267831474542618,
-0.05694843456149101,
-0.013931774534285069,
0.0226749237626791,
-0.05921446159482002,
0.005782547406852245,
0.05479954928159714,
0.14021143317222595,
-0.06279280036687851,
-0.0502559095621109,
-0.05024933069944382,
0.01819026842713356,
0.21639764308929443,
0.11343486607074738,
-0.16995500028133392,
-0.09111687541007996,
-0.0211910679936409,
-0.05948201194405556,
0.015791382640600204,
0.030348068103194237,
0.17875924706459045,
-0.14932039380073547,
0.03442365676164627,
-0.0015858926344662905,
0.07075950503349304,
-0.04126018285751343,
0.010653854347765446,
-0.10909518599510193,
0.03866583853960037,
-0.02710779756307602,
0.10556256771087646,
-0.25973430275917053,
0.21938833594322205,
0.015083680860698223,
0.14734873175621033,
-0.08179627358913422,
-0.03787907958030701,
0.010837821289896965,
-0.0010904986411333084,
0.08781338483095169,
0.021750029176473618,
-0.011656825430691242,
-0.13908421993255615,
-0.07751986384391785,
0.08203823119401932,
0.014155570417642593,
0.011401192285120487,
0.05728413164615631,
0.016990195959806442,
-0.003813547547906637,
0.010588112287223339,
-0.07632297277450562,
-0.15987606346607208,
-0.035500336438417435,
-0.0019763652235269547,
0.18606500327587128,
0.10040000081062317,
-0.014227516017854214,
-0.09054563194513321,
-0.06474165618419647,
0.006268779747188091,
-0.07474785298109055,
-0.0698322132229805,
-0.031564440578222275,
-0.06662820279598236,
0.13133648037910461,
-0.0855744257569313,
-0.010370059870183468,
0.09208473563194275,
0.10184815526008606,
-0.032608795911073685,
-0.07009798288345337,
0.02805974893271923,
-0.10987670719623566,
-0.14182297885417938,
0.007745412178337574,
0.18496188521385193,
0.1341387927532196,
0.08342861384153366,
0.08441633731126785,
0.018682774156332016,
-0.016921093687415123,
-0.05786101147532463,
0.030318796634674072,
0.09390115737915039,
-0.08257830142974854,
0.006750490050762892,
0.04652678221464157,
-0.16437488794326782,
-0.15430866181850433,
-0.07143387198448181,
0.1813124567270279,
0.10507832467556,
-0.0669427141547203,
0.19464415311813354,
0.23765327036380768,
-0.09228654205799103,
-0.23095764219760895,
-0.01688624732196331,
0.12817300856113434,
0.09974098205566406,
-0.04600830003619194,
-0.19788198173046112,
0.059614650905132294,
-0.023095574229955673,
-0.033002354204654694,
-0.05383043363690376,
-0.3079702854156494,
-0.16590575873851776,
0.15634891390800476,
-0.014484994113445282,
0.16914160549640656,
0.008414839394390583,
-0.027764908969402313,
-0.015100313350558281,
0.03944804146885872,
0.08745955675840378,
-0.1278759390115738,
0.1238764151930809,
0.009387565776705742,
0.11378759145736694,
0.05127701908349991,
-0.027138901874423027,
0.09019537270069122,
0.11272687464952469,
-0.011643564328551292,
0.007369160186499357,
0.04477831721305847,
0.0451747290790081,
-0.0040222397074103355,
0.11665808409452438,
-0.08379372954368591,
0.045106999576091766,
-0.08978535234928131,
-0.08125773817300797,
-0.09214921295642853,
0.08526119589805603,
0.03878602012991905,
-0.06339505314826965,
-0.015569073148071766,
-0.03477007523179054,
0.02802329696714878,
-0.007822750136256218,
-0.04109668731689453,
-0.08707848191261292,
0.04094895347952843,
0.13739389181137085,
0.1659332662820816,
-0.009613728150725365,
-0.03941537067294121,
-0.009191918186843395,
-0.048528384417295456,
0.11870300769805908,
-0.10401686280965805,
0.025483382865786552,
0.08472642302513123,
0.07233623415231705,
0.11391399055719376,
0.002023896435275674,
-0.11629178375005722,
0.10021251440048218,
0.026174256578087807,
-0.06803538650274277,
-0.13231633603572845,
-0.03775709867477417,
-0.057471998035907745,
-0.033962175250053406,
0.046397674828767776,
0.10527922213077545,
-0.0859009400010109,
-0.022836292162537575,
-0.03796195983886719,
0.023152414709329605,
-0.10739141702651978,
0.25835004448890686,
0.02898661233484745,
0.07654080539941788,
-0.11359401047229767,
0.014581222087144852,
-0.016928786411881447,
-0.007062818389385939,
0.06272398680448532,
-0.07863117009401321,
-0.07626387476921082,
-0.05835874378681183,
-0.012068432755768299,
0.042469002306461334,
0.059902604669332504,
-0.13788388669490814,
-0.013812419958412647,
-0.09009907394647598,
-0.01513959001749754,
0.08085259050130844,
0.049119722098112106,
0.02915729396045208,
-0.09986041486263275,
-0.05894848331809044,
-0.13169848918914795,
0.08023334294557571,
0.08847127109766006,
-0.020423049107193947,
-0.11790375411510468,
0.19814248383045197,
0.06960040330886841,
0.062371548265218735,
-0.054700661450624466,
-0.08929257839918137,
-0.017166508361697197,
0.079742930829525,
-0.08096326887607574,
0.0056233699433505535,
-0.07856778800487518,
-0.00005335498644853942,
-0.0063565275631845,
-0.08903095871210098,
-0.014731946401298046,
0.08596830070018768,
-0.09766455739736557,
0.08036518841981888,
-0.010772771202027798,
0.07863897830247879,
-0.07416515052318573,
0.004159059375524521,
0.03086663968861103,
-0.05404974892735481,
0.08549611270427704,
0.1305740922689438,
-0.09180735051631927,
0.14580607414245605,
-0.22689518332481384,
-0.04419713467359543,
0.0744878426194191,
0.06319799274206161,
-0.046050503849983215,
-0.13658149540424347,
0.018709450960159302,
0.07264143973588943,
0.07284000515937805,
0.005875511094927788,
0.10726634413003922,
-0.052801262587308884,
-0.023808781057596207,
-0.09541614353656769,
0.008293754421174526,
-0.04177398607134819,
0.05070904269814491,
0.06093977764248848,
0.1580365002155304,
0.17398874461650848,
-0.10377778857946396,
0.09931772947311401,
-0.10226573795080185,
0.00520480377599597,
-0.06115180253982544,
-0.048958804458379745,
-0.12262912839651108,
-0.07818305492401123,
0.06812881678342819,
-0.043604929000139236,
0.09710171073675156,
0.006384056061506271,
0.004995588678866625,
-0.0546068511903286,
-0.06221577525138855,
-0.004982227925211191,
-0.014342794194817543,
0.2846316397190094,
0.04351772367954254,
0.015754498541355133,
-0.038809966295957565,
0.028143921867012978,
0.008239852264523506,
0.08434984087944031,
-0.009257585741579533,
0.12457603961229324,
0.024843867868185043,
0.05320445075631142,
0.09498314559459686,
-0.05905169993638992,
-0.04924639314413071,
0.00535291712731123,
-0.16134338080883026,
0.048622459173202515,
-0.06758398562669754,
0.19775427877902985,
0.11590596288442612,
-0.1283402144908905,
0.06825010478496552,
0.02572341449558735,
-0.09791334718465805,
-0.15261052548885345,
-0.15368999540805817,
-0.08031660318374634,
-0.13866199553012848,
0.03331694006919861,
-0.09205363690853119,
0.019756510853767395,
0.06484284996986389,
0.0425352044403553,
-0.01198229193687439,
0.1575929820537567,
0.013856022618710995,
-0.10864768177270889,
0.1081741452217102,
-0.0831444188952446,
-0.046120453625917435,
-0.07292892783880234,
0.04198547080159187,
0.1738739013671875,
-0.015315044671297073,
0.07065074145793915,
0.00774161983281374,
-0.08356958627700806,
0.030218375846743584,
-0.07836206257343292,
-0.0771031603217125,
-0.016557559370994568,
-0.002377320546656847,
0.1288626790046692,
0.09416339546442032,
0.14395733177661896,
-0.07519318908452988,
0.003708204021677375,
0.15271776914596558,
-0.029236722737550735,
-0.139924556016922,
-0.13453111052513123,
0.14375829696655273,
0.030718684196472168,
0.006978493649512529,
0.030655736103653908,
-0.04516177996993065,
-0.004099000245332718,
0.20488853752613068,
0.2163933366537094,
0.0939406231045723,
0.03434308245778084,
-0.05387895181775093,
-0.004220365546643734,
-0.02008582092821598,
0.08270015567541122,
0.06941167265176773,
0.22114022076129913,
-0.032975781708955765,
0.035424500703811646,
-0.07746117562055588,
-0.07082846015691757,
0.011814464814960957,
0.057324763387441635,
-0.06504911929368973,
-0.13579080998897552,
0.001320627867244184,
0.14780956506729126,
-0.09064958244562149,
-0.12929393351078033,
-0.11288730800151825,
-0.09436164051294327,
-0.07679414004087448,
0.00457300990819931,
0.0258086696267128,
0.11524299532175064,
0.023241927847266197,
-0.08097357302904129,
0.06218729168176651,
0.07960807532072067,
0.010730643756687641,
-0.07158166915178299,
-0.03145445138216019,
0.06261254101991653,
-0.07664941996335983,
-0.0007884691585786641,
-0.01599995419383049,
0.20827773213386536,
0.02087983302772045,
0.0763690173625946,
-0.03296176344156265,
0.14529955387115479,
-0.011337601579725742,
-0.07870034128427505,
0.032709211111068726,
0.1606147736310959,
-0.017572226002812386,
0.09613849967718124,
0.0435791052877903,
-0.11268547922372818,
0.05793292447924614,
-0.13347835838794708,
0.011673880741000175,
-0.09813195466995239,
0.05714702978730202,
-0.03438560292124748,
0.08651486039161682,
0.05750695988535881,
-0.07677307724952698,
-0.025287508964538574,
-0.05065835639834404,
0.042112644761800766,
0.006875427905470133,
-0.06536632776260376,
-0.039174120873212814,
-0.23253799974918365,
-0.0034240917302668095,
-0.10125356167554855,
-0.001265914412215352,
-0.18818089365959167,
0.004157416056841612,
-0.022359684109687805,
-0.07442687451839447,
0.01088041067123413,
0.04608947038650513,
0.09664428979158401,
-0.003124828916043043,
-0.0018306694692000747,
-0.03540696203708649,
0.027977706864476204,
0.10145609825849533,
-0.17877520620822906,
-0.10635735094547272
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Latvian
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Latvian using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "lv", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-latvian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-latvian")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
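As an alternative sketch (assuming a reasonably recent version of `transformers` and `ffmpeg` for audio decoding; `sample.wav` is a placeholder for your own 16kHz recording, not a file shipped with this repository), the high-level `pipeline` API can wrap the same model and processor:

```python
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="anton-l/wav2vec2-large-xlsr-53-latvian")
print(asr("sample.wav"))  # "sample.wav" is a placeholder path
```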
## Evaluation
The model can be evaluated as follows on the Latvian test data of Common Voice.
```python
import torch
import torchaudio
import urllib.request
import tarfile
import pandas as pd
from tqdm.auto import tqdm
from datasets import load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
# Download the raw data instead of using HF datasets to save disk space
data_url = "https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-6.1-2020-12-11/lv.tar.gz"
filestream = urllib.request.urlopen(data_url)
data_file = tarfile.open(fileobj=filestream, mode="r|gz")
data_file.extractall()
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-latvian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-latvian")
model.to("cuda")
cv_test = pd.read_csv("cv-corpus-6.1-2020-12-11/lv/test.tsv", sep='\t')
clips_path = "cv-corpus-6.1-2020-12-11/lv/clips/"
def clean_sentence(sent):
    sent = sent.lower()
    # replace non-alpha characters with space
    sent = "".join(ch if ch.isalpha() else " " for ch in sent)
    # remove repeated spaces
    sent = " ".join(sent.split())
    return sent
targets = []
preds = []
for i, row in tqdm(cv_test.iterrows(), total=cv_test.shape[0]):
row["sentence"] = clean_sentence(row["sentence"])
speech_array, sampling_rate = torchaudio.load(clips_path + row["path"])
resampler = torchaudio.transforms.Resample(sampling_rate, 16_000)
row["speech"] = resampler(speech_array).squeeze().numpy()
inputs = processor(row["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
targets.append(row["sentence"])
preds.append(processor.batch_decode(pred_ids)[0])
print("WER: {:2f}".format(100 * wer.compute(predictions=preds, references=targets)))
```
**Test Result**: 26.89 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
|
{"language": "lv", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Latvian XLSR Wav2Vec2 Large 53 by Anton Lozhkov", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice lv", "type": "common_voice", "args": "lv"}, "metrics": [{"type": "wer", "value": 26.89, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anton-l/wav2vec2-large-xlsr-53-latvian
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"lv",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"lv"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #lv #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Latvian
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Latvian using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Latvian test data of Common Voice.
Test Result: 26.89 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Latvian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Latvian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Latvian test data of Common Voice.\n\n\n\nTest Result: 26.89 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #lv #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Latvian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Latvian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Latvian test data of Common Voice.\n\n\n\nTest Result: 26.89 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
80,
66,
20,
28,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #lv #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Latvian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Latvian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Latvian test data of Common Voice.\n\n\n\nTest Result: 26.89 %## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.1595914512872696,
-0.037238623946905136,
-0.0035978329833596945,
-0.0007070983992889524,
0.09660810977220535,
-0.07335890084505081,
0.17609252035617828,
0.09235793352127075,
0.04357181116938591,
-0.007608192507177591,
0.047139544039964676,
-0.015950243920087814,
0.019727300852537155,
0.049656134098768234,
0.0031172004528343678,
-0.19737038016319275,
0.010249322280287743,
-0.012020589783787727,
0.015000974759459496,
0.09291429817676544,
0.11350696533918381,
-0.059186968952417374,
0.014517925679683685,
0.07816825807094574,
-0.10188736766576767,
0.07878998667001724,
0.05036439746618271,
-0.08888677507638931,
0.1260143369436264,
0.07293977588415146,
0.020526621490716934,
0.03896001726388931,
0.07527858763933182,
-0.20554497838020325,
0.010750084184110165,
0.022576143965125084,
0.029269158840179443,
0.016286784783005714,
0.04792294278740883,
-0.05625595524907112,
0.13210059702396393,
0.011688142083585262,
-0.022202221676707268,
0.07527526468038559,
-0.06395550072193146,
-0.24411597847938538,
-0.004125581122934818,
-0.047440312802791595,
0.11999610811471939,
0.12747520208358765,
-0.07141051441431046,
0.08746645599603653,
-0.13847695291042328,
0.10346638411283493,
0.17193396389484406,
-0.18259918689727783,
-0.007304322440177202,
0.08833026140928268,
0.055261172354221344,
0.06784706562757492,
-0.04439732804894447,
0.06950395554304123,
0.032886750996112823,
0.03664914146065712,
0.03506326302886009,
-0.03912780061364174,
-0.22747881710529327,
-0.0027803427074104548,
-0.1600472629070282,
-0.017788361757993698,
0.23481549322605133,
-0.005552610848098993,
-0.05188439413905144,
-0.14802315831184387,
0.008432818576693535,
-0.0025960972998291254,
-0.013708843849599361,
-0.05522511526942253,
-0.00550891412422061,
-0.0035088579170405865,
0.024624047800898552,
-0.05446299910545349,
-0.10669170320034027,
-0.14620627462863922,
0.033045656979084015,
0.09311751276254654,
0.01654202863574028,
0.022226542234420776,
-0.08377687633037567,
0.0628276914358139,
-0.12500853836536407,
-0.05799969658255577,
-0.005737430416047573,
0.02211380936205387,
-0.07859853655099869,
0.03806557506322861,
-0.0672421008348465,
-0.19664175808429718,
0.030634891241788864,
-0.03891054168343544,
0.0661734864115715,
0.0010133894393220544,
0.009478212334215641,
0.061355896294116974,
0.014709755778312683,
0.15232230722904205,
-0.04934382066130638,
-0.01878865249454975,
0.010493508540093899,
-0.024002471938729286,
-0.04718238487839699,
-0.010412958450615406,
-0.08930204808712006,
-0.0790124163031578,
-0.010539054870605469,
0.07838853448629379,
-0.05554831773042679,
0.016694677993655205,
-0.03277036175131798,
-0.03689396753907204,
0.017131377011537552,
-0.09799204021692276,
-0.027331659570336342,
0.06163947656750679,
0.0151578513905406,
0.0874699205160141,
0.07435573637485504,
0.03592522069811821,
-0.09163490682840347,
-0.06492527574300766,
0.042718250304460526,
0.05423823371529579,
-0.008621170185506344,
-0.15181677043437958,
0.02655022405087948,
-0.06916231662034988,
0.009037335403263569,
-0.10371702164411545,
-0.07269531488418579,
-0.07964704185724258,
-0.002623013686388731,
0.020817678421735764,
0.002201840514317155,
-0.12112374603748322,
-0.030407210811972618,
-0.02193242684006691,
-0.03793281689286232,
0.03865142539143562,
-0.04002084955573082,
0.032071296125650406,
-0.03429865092039108,
0.05974831059575081,
0.047347426414489746,
0.07173442840576172,
-0.1099148839712143,
-0.049888551235198975,
0.04028987139463425,
0.13342086970806122,
-0.08281411975622177,
-0.06032510846853256,
-0.08590469509363174,
-0.08248735964298248,
-0.01679224520921707,
0.06112154200673103,
0.06479824334383011,
0.11979536712169647,
-0.2696515619754791,
-0.10863664001226425,
0.18036070466041565,
-0.09980863332748413,
-0.016733581200242043,
0.2031071037054062,
-0.004590654280036688,
0.10703633725643158,
0.16250477731227875,
0.25021418929100037,
0.16985167562961578,
-0.2424904853105545,
0.04214192181825638,
0.03862792253494263,
-0.02571178413927555,
-0.055388014763593674,
0.08514747768640518,
-0.04177260398864746,
-0.0227771308273077,
0.05986442044377327,
-0.06624677032232285,
0.06499715149402618,
-0.028515370562672615,
-0.06103801727294922,
-0.015153043903410435,
-0.07008593529462814,
0.07612734287977219,
0.05063987523317337,
-0.0042498051188886166,
-0.054252028465270996,
-0.05765681341290474,
0.09629950672388077,
0.12078217417001724,
-0.13324834406375885,
0.05528082698583603,
-0.08965139836072922,
0.07628411799669266,
-0.09196693450212479,
-0.010206910781562328,
-0.12031184136867523,
0.165077343583107,
-0.026053156703710556,
0.061378903687000275,
0.05870889872312546,
0.21865986287593842,
0.040909651666879654,
0.014035046100616455,
-0.044781189411878586,
0.013589059934020042,
-0.02355079911649227,
-0.02325916290283203,
-0.04659377783536911,
-0.12921854853630066,
-0.0236680805683136,
-0.07061099261045456,
0.037474580109119415,
-0.1436440646648407,
-0.01256606262177229,
0.025711681693792343,
-0.013131799176335335,
0.044291235506534576,
-0.011424408294260502,
0.04185650870203972,
0.10458212345838547,
0.030864672735333443,
0.006638757884502411,
0.050212759524583817,
-0.017092404887080193,
-0.010114848613739014,
0.15759292244911194,
-0.1022271066904068,
-0.02874893881380558,
0.07647106796503067,
-0.013671083375811577,
-0.002536145271733403,
0.049026649445295334,
-0.032145485281944275,
0.00784164946526289,
-0.06796150654554367,
-0.011375294998288155,
0.2661651074886322,
-0.0026530034374445677,
0.10386435687541962,
-0.10553975403308868,
-0.010225513949990273,
0.04650556668639183,
-0.10323455184698105,
0.03798374533653259,
0.13586638867855072,
0.023637181147933006,
0.0645315870642662,
0.009805434383451939,
-0.061726123094558716,
-0.07758820801973343,
0.30234283208847046,
-0.01589331217110157,
-0.09561073780059814,
0.009039419703185558,
-0.0423835813999176,
-0.022704022005200386,
0.1124846413731575,
-0.18647703528404236,
-0.03294484689831734,
0.03974369168281555,
0.0665266141295433,
0.07802121341228485,
-0.13211621344089508,
0.0017319262260571122,
0.016585946083068848,
-0.15135130286216736,
-0.16253291070461273,
0.12961886823177338,
-0.03187733143568039,
0.051903873682022095,
-0.10578322410583496,
-0.018715549260377884,
-0.050241608172655106,
-0.049995336681604385,
-0.17434121668338776,
0.11876919120550156,
-0.08352813869714737,
-0.22050203382968903,
-0.1787828654050827,
0.003626752644777298,
-0.03220925107598305,
0.015902722254395485,
0.09937900304794312,
-0.1221112608909607,
-0.026744695380330086,
-0.042224712669849396,
0.19629082083702087,
0.0308074913918972,
-0.048597514629364014,
-0.07799725979566574,
0.05072867497801781,
0.07219833135604858,
-0.13957150280475616,
-0.0014004872646182775,
-0.08161326497793198,
-0.014720821753144264,
-0.03594665601849556,
-0.028190216049551964,
0.008796490728855133,
0.14908233284950256,
0.0394991859793663,
0.021283483132719994,
-0.022023752331733704,
0.24169521033763885,
-0.07212936878204346,
-0.02853313274681568,
0.20778819918632507,
0.017170719802379608,
-0.04162980243563652,
0.0893356129527092,
0.04254966229200363,
-0.05785248801112175,
-0.021790577098727226,
0.016869863495230675,
-0.10133077204227448,
-0.24699720740318298,
-0.16164138913154602,
-0.0789884477853775,
-0.044256698340177536,
-0.02133280225098133,
0.0001692005607765168,
-0.00673703895881772,
0.007046972867101431,
-0.016618195921182632,
-0.07049060612916946,
0.059345636516809464,
-0.017540063709020615,
0.13432340323925018,
-0.03133239597082138,
0.07179022580385208,
-0.0638279840350151,
-0.04013874754309654,
0.037726759910583496,
0.000870753894560039,
0.16494318842887878,
0.03350676968693733,
0.031996432691812515,
0.11248930543661118,
0.09124665707349777,
0.06804841011762619,
0.07872723042964935,
-0.044435109943151474,
-0.024089151993393898,
0.021240685135126114,
-0.07473423331975937,
0.01109308935701847,
0.040709737688302994,
0.10707996040582657,
-0.047559160739183426,
-0.07141068577766418,
-0.0448027141392231,
0.029094113036990166,
0.21304260194301605,
0.10775008052587509,
-0.15951849520206451,
-0.08019017428159714,
-0.01818656735122204,
-0.0461154580116272,
-0.017703114077448845,
0.03630299121141434,
0.1643649786710739,
-0.11588768661022186,
0.029540542513132095,
-0.00859837420284748,
0.098922960460186,
0.05435049533843994,
0.026046816259622574,
-0.09287624061107635,
0.040676843374967575,
0.006152466405183077,
0.12315680086612701,
-0.22833968698978424,
0.2254280000925064,
0.0027617409359663725,
0.11054899543523788,
-0.06065216287970543,
-0.016470979899168015,
-0.02357974275946617,
0.0829649418592453,
0.14316877722740173,
0.04103490710258484,
0.07334061712026596,
-0.12016504257917404,
-0.07954172044992447,
0.05938336253166199,
-0.023040303960442543,
0.01190792303532362,
0.029962187632918358,
0.002064557047560811,
-0.007462887559086084,
0.0017185078468173742,
-0.06535544246435165,
-0.11666404455900192,
-0.039875660091638565,
-0.007270846050232649,
0.1430913358926773,
0.033346567302942276,
0.001409625867381692,
-0.10399007052183151,
-0.12099377065896988,
0.0673752874135971,
-0.09020185470581055,
-0.061029158532619476,
-0.05862554535269737,
-0.07370579987764359,
0.1396419256925583,
-0.06923047453165054,
-0.005794007796794176,
0.09998529404401779,
0.13106144964694977,
-0.05580529198050499,
-0.07157920300960541,
0.04197005555033684,
-0.11455019563436508,
-0.10538392513990402,
0.02275848761200905,
0.21055291593074799,
0.11129199713468552,
0.06578805297613144,
0.048069603741168976,
0.01981876976788044,
-0.014944063499569893,
-0.05750242993235588,
0.01383709255605936,
0.13448074460029602,
-0.16028174757957458,
0.01454805489629507,
0.010671639814972878,
-0.1351679116487503,
-0.12551343441009521,
-0.0376780666410923,
0.16930273175239563,
0.13550826907157898,
-0.10428833216428757,
0.21703281998634338,
0.12540331482887268,
-0.10925043374300003,
-0.233701691031456,
-0.00016689300537109375,
0.1278209090232849,
0.15965689718723297,
0.029997337609529495,
-0.147888645529747,
0.038512229919433594,
-0.03461465984582901,
-0.037890639156103134,
-0.05588456615805626,
-0.280737042427063,
-0.17524413764476776,
0.1565665751695633,
-0.05952220782637596,
0.16974489390850067,
0.06651607900857925,
-0.006994814611971378,
-0.02350051887333393,
0.06650234013795853,
0.054876554757356644,
-0.07794560492038727,
0.1097223088145256,
0.013822881504893303,
0.06699920445680618,
0.07345600426197052,
-0.0005667586228810251,
0.11947838217020035,
0.07247511297464371,
0.01950531080365181,
0.01657056249678135,
0.08513499796390533,
0.013209268450737,
-0.010160033591091633,
0.16048631072044373,
-0.08867885172367096,
0.03647711127996445,
-0.10142012685537338,
-0.10133381187915802,
-0.09697183221578598,
0.10025722533464432,
0.039189524948596954,
-0.03716496005654335,
-0.004396815784275532,
-0.05302572622895241,
0.016985418274998665,
-0.013752843253314495,
-0.025300929322838783,
-0.1464073359966278,
0.019263217225670815,
0.13140156865119934,
0.1719847172498703,
-0.06761883944272995,
-0.10085050761699677,
0.031932372599840164,
-0.01680758036673069,
0.1199793890118599,
-0.10642019659280777,
0.03744637966156006,
0.07427404820919037,
0.04577651992440224,
0.11147338151931763,
-0.015531860291957855,
-0.11770304292440414,
0.035164233297109604,
0.0403621569275856,
-0.08171659708023071,
-0.09804674237966537,
-0.0704103484749794,
-0.09459591656923294,
-0.01689947582781315,
0.03275241702795029,
0.13469341397285461,
-0.11317946016788483,
0.002768700709566474,
-0.03481476381421089,
0.015865234658122063,
-0.13746514916419983,
0.18921242654323578,
0.04282451421022415,
0.06992626190185547,
-0.10870853811502457,
-0.0013583314139395952,
-0.043293360620737076,
-0.03132973983883858,
0.052793391048908234,
-0.0371144600212574,
-0.05864689126610756,
-0.05623378977179527,
0.006052836310118437,
0.09746049344539642,
0.05621098726987839,
-0.12227711826562881,
-0.06858427077531815,
-0.11825252324342728,
0.012022910639643669,
0.10541900247335434,
0.08127856254577637,
0.016615519300103188,
-0.1269799768924713,
-0.07938118278980255,
-0.10591860860586166,
0.09232612699270248,
0.10258270800113678,
-0.059018079191446304,
-0.11494209617376328,
0.17111065983772278,
0.07065712660551071,
0.05487899109721184,
-0.057892002165317535,
-0.05948982387781143,
-0.0008786140824668109,
0.0847867801785469,
-0.08421262353658676,
-0.008426151238381863,
-0.031967416405677795,
0.02210898883640766,
-0.024161431938409805,
-0.0783083438873291,
-0.04129517450928688,
0.08198398351669312,
-0.09272844344377518,
0.08603990077972412,
-0.035876158624887466,
0.08153305947780609,
-0.06262539327144623,
0.0003674011677503586,
0.030565839260816574,
-0.054881490767002106,
0.07320592552423477,
0.10208679735660553,
-0.09675892442464828,
0.1568676382303238,
-0.18681320548057556,
-0.011662404052913189,
0.09598476439714432,
0.08766639977693558,
-0.01487947441637516,
-0.09490304440259933,
0.030299803242087364,
0.07769951969385147,
0.07516317814588547,
0.001476524630561471,
0.031440380960702896,
-0.0621781088411808,
0.0011332777794450521,
-0.03056747280061245,
-0.02209431864321232,
-0.0134656373411417,
0.057612400501966476,
0.019253257662057877,
0.1745782196521759,
0.1487189084291458,
-0.11052756756544113,
0.1197173148393631,
-0.12634848058223724,
0.008355856873095036,
-0.05602765083312988,
-0.016048919409513474,
-0.12551584839820862,
-0.06186063215136528,
0.07091132551431656,
-0.04796682670712471,
0.17988111078739166,
0.10690034925937653,
0.05372084304690361,
-0.03804564103484154,
-0.04698669910430908,
0.031770579516887665,
-0.009887556545436382,
0.16167905926704407,
0.008431770838797092,
0.05597942695021629,
-0.06884770095348358,
0.010415623895823956,
0.02113831602036953,
0.15262098610401154,
-0.0312579944729805,
0.14393018186092377,
0.032818980515003204,
0.0652502179145813,
0.1021164059638977,
-0.04832635447382927,
-0.028663814067840576,
-0.02922283299267292,
-0.09517896920442581,
0.05517591908574104,
-0.08582546561956406,
0.15488697588443756,
0.117758609354496,
-0.11815595626831055,
0.11271976679563522,
0.045526910573244095,
-0.09383416175842285,
-0.19292497634887695,
-0.13530172407627106,
-0.0664585530757904,
-0.17320621013641357,
0.013430939987301826,
-0.09750453382730484,
0.01911950670182705,
0.02547875978052616,
0.04919581487774849,
-0.020537184551358223,
0.13131900131702423,
-0.024337491020560265,
-0.11326543241739273,
0.09619591385126114,
-0.08951692283153534,
-0.00725153973326087,
-0.08254116773605347,
0.0859794020652771,
0.18894636631011963,
-0.002870564116165042,
0.03966287523508072,
0.02155487611889839,
-0.06186080351471901,
0.008705055341124535,
-0.09991800785064697,
-0.06724127382040024,
-0.03649319335818291,
0.00021653289149980992,
0.07640533149242401,
0.13612717390060425,
0.12960998713970184,
-0.0800212100148201,
-0.01090230606496334,
0.146021768450737,
-0.030377108603715897,
-0.14581073820590973,
-0.1589205265045166,
0.1299210488796234,
0.040616195648908615,
0.0509989857673645,
-0.026618460193276405,
-0.02476460114121437,
-0.03225674480199814,
0.2564806342124939,
0.2523162066936493,
0.09927789121866226,
0.033233966678380966,
-0.05643356218934059,
-0.0026867459528148174,
-0.028631821274757385,
0.06564582139253616,
0.01979372836649418,
0.25830191373825073,
-0.02373582497239113,
0.005816401448100805,
-0.11341069638729095,
-0.049522917717695236,
0.01288381963968277,
0.07869913429021835,
-0.04716792702674866,
-0.1225273460149765,
0.004535040818154812,
0.14305917918682098,
-0.05083223059773445,
-0.042874839156866074,
-0.1115955263376236,
-0.10161523520946503,
-0.09830302000045776,
-0.021564563736319542,
0.02260870672762394,
0.10673177987337112,
0.03380105644464493,
-0.08467155694961548,
0.015926510095596313,
0.14006748795509338,
-0.01265376340597868,
-0.07195687294006348,
-0.07315544784069061,
0.06411757320165634,
-0.04738378897309303,
0.000542520429007709,
-0.004693659953773022,
0.15787675976753235,
-0.004163271281868219,
0.07539165019989014,
-0.020686520263552666,
0.12809953093528748,
-0.04404875263571739,
-0.04726063087582588,
0.02653643861413002,
0.12143505364656448,
-0.052402347326278687,
0.1201334074139595,
0.009232274256646633,
-0.1294800192117691,
0.04197264462709427,
-0.11962942034006119,
-0.049041520804166794,
-0.057386528700590134,
0.0794312134385109,
-0.05660820007324219,
0.043463386595249176,
0.11543694883584976,
-0.061596114188432693,
-0.056910328567028046,
-0.07105745375156403,
0.07409109175205231,
0.015497244894504547,
-0.06888823956251144,
-0.021866092458367348,
-0.22668354213237762,
-0.004756892565637827,
-0.0239595677703619,
-0.03996649011969566,
-0.20895074307918549,
-0.023404298350214958,
-0.008647708222270012,
-0.07322371006011963,
0.04189220070838928,
0.052550651133060455,
0.056490927934646606,
0.0038794351276010275,
0.0065732127986848354,
-0.010163742117583752,
0.07644103467464447,
0.10583353042602539,
-0.14594943821430206,
-0.11602674424648285
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Lithuanian
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Lithuanian using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
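For example, if your own recording is not at 16kHz, it can be resampled with torchaudio first (a minimal sketch; `my_recording.wav` is a placeholder path):

```python
import torchaudio

speech_array, sampling_rate = torchaudio.load("my_recording.wav")  # placeholder path
if sampling_rate != 16_000:
    # resample to the 16kHz rate expected by the model
    speech_array = torchaudio.transforms.Resample(sampling_rate, 16_000)(speech_array)
speech = speech_array.squeeze().numpy()
```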
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "lt", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-lithuanian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-lithuanian")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Lithuanian test data of Common Voice.
```python
import torch
import torchaudio
import urllib.request
import tarfile
import pandas as pd
from tqdm.auto import tqdm
from datasets import load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
# Download the raw data instead of using HF datasets to save disk space
data_url = "https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-6.1-2020-12-11/lt.tar.gz"
filestream = urllib.request.urlopen(data_url)
data_file = tarfile.open(fileobj=filestream, mode="r|gz")
data_file.extractall()
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-lithuanian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-lithuanian")
model.to("cuda")
cv_test = pd.read_csv("cv-corpus-6.1-2020-12-11/lt/test.tsv", sep='\t')
clips_path = "cv-corpus-6.1-2020-12-11/lt/clips/"
def clean_sentence(sent):
    sent = sent.lower()
    # normalize apostrophes
    sent = sent.replace("’", "'")
    # replace non-alpha characters with space
    sent = "".join(ch if ch.isalpha() or ch == "'" else " " for ch in sent)
    # remove repeated spaces
    sent = " ".join(sent.split())
    return sent
targets = []
preds = []
for i, row in tqdm(cv_test.iterrows(), total=cv_test.shape[0]):
    row["sentence"] = clean_sentence(row["sentence"])
    speech_array, sampling_rate = torchaudio.load(clips_path + row["path"])
    resampler = torchaudio.transforms.Resample(sampling_rate, 16_000)
    row["speech"] = resampler(speech_array).squeeze().numpy()
    inputs = processor(row["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
    pred_ids = torch.argmax(logits, dim=-1)
    targets.append(row["sentence"])
    preds.append(processor.batch_decode(pred_ids)[0])
print("WER: {:.2f}".format(100 * wer.compute(predictions=preds, references=targets)))
```
**Test Result**: 49.00 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
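As a minimal sketch (assuming the same Common Voice release as in the examples above), those splits can be loaded with `datasets`; the fine-tuning hyperparameters themselves are not listed in this card:
```python
from datasets import load_dataset

# Lithuanian Common Voice train + validation, as used for fine-tuning
train_data = load_dataset("common_voice", "lt", split="train+validation")
print(train_data)
```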
|
{"language": "lt", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Lithuanian XLSR Wav2Vec2 Large 53 by Anton Lozhkov", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice lt", "type": "common_voice", "args": "lt"}, "metrics": [{"type": "wer", "value": 49.0, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anton-l/wav2vec2-large-xlsr-53-lithuanian
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"lt",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"lt"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #lt #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Lithuanian
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Lithuanian using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Lithuanian test data of Common Voice.
Test Result: 49.00 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Lithuanian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Lithuanian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Lithuanian test data of Common Voice.\n\n\n\nTest Result: 49.00 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #lt #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Lithuanian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Lithuanian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Lithuanian test data of Common Voice.\n\n\n\nTest Result: 49.00 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
80,
67,
20,
28,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #lt #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Lithuanian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Lithuanian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Lithuanian test data of Common Voice.\n\n\n\nTest Result: 49.00 %## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.14556801319122314,
0.009491553530097008,
-0.0021860250271856785,
-0.0075595797970891,
0.07599619030952454,
-0.04951390624046326,
0.18553446233272552,
0.10328111797571182,
-0.015330201014876366,
-0.01572481170296669,
0.047637440264225006,
-0.010040662251412868,
0.03897843137383461,
0.06993274390697479,
-0.0016566659323871136,
-0.17390072345733643,
0.023387201130390167,
0.0014250517124310136,
0.013648523949086666,
0.10122384876012802,
0.10811330378055573,
-0.06847221404314041,
0.0006962207262404263,
0.07908546924591064,
-0.1225578784942627,
0.061985548585653305,
0.04520368203520775,
-0.10133061558008194,
0.13680407404899597,
0.0762084573507309,
0.039407216012477875,
0.029032833874225616,
0.06649199873209,
-0.20542515814304352,
0.014378935098648071,
0.031646162271499634,
0.03621375188231468,
0.0068938531912863255,
0.03545290231704712,
-0.023187538608908653,
0.11257713288068771,
0.019314415752887726,
-0.011525998823344707,
0.05317351594567299,
-0.06183686852455139,
-0.24365849792957306,
-0.031507913023233414,
0.005634438246488571,
0.11556859314441681,
0.137851744890213,
-0.06294936686754227,
0.12517514824867249,
-0.11362732201814651,
0.09047871083021164,
0.060828566551208496,
-0.15957607328891754,
-0.013502002693712711,
0.07472162693738937,
0.010921631008386612,
0.07882365584373474,
-0.045766640454530716,
0.041352491825819016,
0.030245695263147354,
0.0509776771068573,
-0.0060125552117824554,
-0.045071572065353394,
-0.2347097396850586,
-0.009249472059309483,
-0.14662288129329681,
-0.036850765347480774,
0.24109971523284912,
-0.030501436442136765,
-0.06810609996318817,
-0.1488250344991684,
0.017047831788659096,
0.024721795693039894,
-0.0016845534555613995,
-0.0501253604888916,
-0.019270649179816246,
0.014725151471793652,
0.012898820452392101,
-0.04166385158896446,
-0.12068904936313629,
-0.11541834473609924,
0.017804911360144615,
0.1143762469291687,
0.009893120266497135,
0.030272003263235092,
-0.10277847200632095,
0.05943017080426216,
-0.09087686985731125,
-0.0681501105427742,
0.009554065763950348,
0.028784066438674927,
-0.0761900469660759,
0.016685007140040398,
-0.07572045177221298,
-0.22403521835803986,
0.03506848216056824,
-0.05398629605770111,
0.06068694218993187,
0.00477715115994215,
-0.0031497336458414793,
0.05844265967607498,
0.03231889009475708,
0.1419997215270996,
-0.05722080543637276,
-0.039729777723550797,
0.0058954325504601,
-0.03497782722115517,
-0.039206720888614655,
-0.011896776966750622,
-0.07056606560945511,
-0.0672902762889862,
-0.003953331150114536,
0.05670392885804176,
-0.02210661582648754,
0.001228843815624714,
-0.020889025181531906,
-0.04670247063040733,
-0.032047249376773834,
-0.09098589420318604,
-0.046636588871479034,
0.03941415995359421,
0.012137449346482754,
0.1285507082939148,
0.020579542964696884,
0.042250946164131165,
-0.08386678993701935,
-0.013285067863762379,
0.028283612802624702,
0.03234287351369858,
-0.0070616863667964935,
-0.14159667491912842,
0.028294391930103302,
-0.020396146923303604,
0.014844127930700779,
-0.087751105427742,
-0.08008713275194168,
-0.0902334600687027,
0.011217284016311169,
0.03533690422773361,
-0.005007701925933361,
-0.09119540452957153,
-0.023665370419621468,
-0.005761098116636276,
-0.043177757412195206,
0.05214430019259453,
-0.03312317654490471,
0.058786630630493164,
-0.02516815811395645,
0.0317370779812336,
0.012022488750517368,
0.07423072308301926,
-0.12079188972711563,
-0.05375354364514351,
0.011205418035387993,
0.1333041787147522,
-0.06283493340015411,
-0.047200266271829605,
-0.0925099179148674,
-0.06748495995998383,
-0.0007578951772302389,
0.055550526827573776,
0.06771332025527954,
0.11137548089027405,
-0.27889034152030945,
-0.08332991600036621,
0.14009739458560944,
-0.13186950981616974,
-0.01636406220495701,
0.19460447132587433,
-0.01690012216567993,
0.11212976276874542,
0.15805278718471527,
0.2743956446647644,
0.18306303024291992,
-0.18325845897197723,
0.012335247360169888,
0.0010200453689321876,
-0.005727540235966444,
-0.047308310866355896,
0.0783117413520813,
-0.03602313622832298,
-0.03142842277884483,
0.04270641505718231,
-0.09735425561666489,
0.06789097934961319,
-0.008063791319727898,
-0.06835413724184036,
-0.016127215698361397,
-0.07768680155277252,
0.08782726526260376,
0.03592017665505409,
-0.007747845258563757,
-0.05397728458046913,
-0.05753639340400696,
0.08856746554374695,
0.13521996140480042,
-0.1382739096879959,
0.04045206308364868,
-0.09475739300251007,
0.06646538525819778,
-0.09890057891607285,
-0.010466665029525757,
-0.113939568400383,
0.14668609201908112,
-0.020330727100372314,
0.06311658024787903,
0.06648007035255432,
0.1723303347826004,
0.032423216849565506,
0.0035979091189801693,
-0.038204021751880646,
-0.00866518635302782,
-0.027215560898184776,
-0.02323976717889309,
-0.039553578943014145,
-0.11468643695116043,
-0.012881669215857983,
-0.059692565351724625,
0.0361134335398674,
-0.13858462870121002,
-0.014600221998989582,
0.06436226516962051,
-0.0041788979433476925,
0.027067046612501144,
-0.016213761642575264,
0.05299513414502144,
0.11074080318212509,
0.020334800705313683,
0.007583286613225937,
0.03447623923420906,
-0.0044589960016310215,
-0.025539565831422806,
0.14687873423099518,
-0.11950718611478806,
-0.02137177065014839,
0.09986621141433716,
0.021819615736603737,
-0.005988605320453644,
0.05265866592526436,
-0.011425665579736233,
0.0022681879345327616,
-0.0737844854593277,
0.013535067439079285,
0.3070773184299469,
0.005832407157868147,
0.07231359928846359,
-0.09257720410823822,
-0.011856704950332642,
0.04356647655367851,
-0.07281886041164398,
0.03724685311317444,
0.09740430116653442,
-0.013542943634092808,
0.04097456857562065,
0.03523610532283783,
-0.06489699333906174,
-0.062240127474069595,
0.25496986508369446,
-0.031169187277555466,
-0.09857314079999924,
0.006884527392685413,
-0.028632158413529396,
-0.019857369363307953,
0.15030254423618317,
-0.1778586357831955,
-0.013310788199305534,
0.035224005579948425,
0.06956613063812256,
0.06978646665811539,
-0.1414395272731781,
-0.0015280175721272826,
0.009685203433036804,
-0.1381387859582901,
-0.15733660757541656,
0.0900462344288826,
-0.02771303802728653,
0.04320825636386871,
-0.10372303426265717,
0.028540113940835,
-0.03407905250787735,
-0.04180356487631798,
-0.16277329623699188,
0.12895414233207703,
-0.08611612766981125,
-0.2623233497142792,
-0.1833728849887848,
-0.010151006281375885,
-0.009695015847682953,
-0.00370326591655612,
0.08982337266206741,
-0.1308797001838684,
-0.026387488469481468,
-0.013662700541317463,
0.18120788037776947,
0.003044386859983206,
-0.039584930986166,
-0.07350294291973114,
0.0381927415728569,
0.06399650871753693,
-0.13585825264453888,
0.013593241572380066,
-0.05691054090857506,
0.02146240882575512,
-0.008159912191331387,
-0.0004446617094799876,
0.011240293271839619,
0.1587236374616623,
0.005329471081495285,
0.012287230230867863,
-0.029726658016443253,
0.22840669751167297,
-0.11053241044282913,
-0.03366798534989357,
0.21370287239551544,
-0.017818571999669075,
-0.04356338456273079,
0.1107156053185463,
0.026127025485038757,
-0.06008124351501465,
-0.013629124499857426,
-0.011736489832401276,
-0.0863485038280487,
-0.25722649693489075,
-0.15970444679260254,
-0.06969013065099716,
-0.037599921226501465,
-0.013440465554594994,
-0.002642517676576972,
-0.024848027154803276,
0.02759283222258091,
-0.01973925530910492,
-0.046084340661764145,
0.04074540734291077,
-0.01105236355215311,
0.17969205975532532,
-0.014694781973958015,
0.07075712829828262,
-0.060409385710954666,
-0.03127945587038994,
0.04773418605327606,
0.03662661090493202,
0.12615454196929932,
0.029208553954958916,
0.006720396690070629,
0.0871855616569519,
0.1053682267665863,
0.07828576117753983,
0.06485503166913986,
-0.052260421216487885,
-0.012798063457012177,
0.017000360414385796,
-0.07208923250436783,
-0.009760239161550999,
0.036273177713155746,
0.11256512254476547,
-0.04630730673670769,
-0.05935508385300636,
-0.04487210884690285,
0.02647567167878151,
0.1909482777118683,
0.11305340379476547,
-0.17000846564769745,
-0.10677126049995422,
-0.024104122072458267,
-0.06628623604774475,
-0.008153253234922886,
0.027541374787688255,
0.19419769942760468,
-0.13405360281467438,
0.025939136743545532,
-0.009148120880126953,
0.08538047224283218,
-0.0045632668770849705,
0.017755795270204544,
-0.09108398109674454,
0.020570170134305954,
-0.009352294728159904,
0.11341699957847595,
-0.2439998984336853,
0.2166217416524887,
0.01268725749105215,
0.12389097362756729,
-0.06042031943798065,
-0.018915748223662376,
-0.014112181030213833,
0.046769775450229645,
0.11693715304136276,
0.03486642241477966,
0.0353630967438221,
-0.09145574271678925,
-0.10198387503623962,
0.06755656003952026,
-0.021771378815174103,
0.05170959234237671,
0.048670053482055664,
0.006150598172098398,
0.023037876933813095,
-0.0023099102545529604,
-0.05069324001669884,
-0.09722206741571426,
-0.0563654787838459,
-0.0006589212571270764,
0.1613321304321289,
0.07563045620918274,
-0.010960699990391731,
-0.08931595087051392,
-0.05640975013375282,
0.07198353856801987,
-0.07531510293483734,
-0.06681983172893524,
-0.07882899791002274,
-0.055369142442941666,
0.1516934484243393,
-0.06164234131574631,
-0.017024733126163483,
0.09786596149206161,
0.13168947398662567,
-0.06308657675981522,
-0.05881020799279213,
0.023344233632087708,
-0.10542381554841995,
-0.09613651037216187,
0.003905067453160882,
0.1777064949274063,
0.1057681292295456,
0.08661215007305145,
0.060243818908929825,
0.020255615934729576,
-0.015806743875145912,
-0.06283844262361526,
0.0034333262592554092,
0.08612743020057678,
-0.14218252897262573,
0.01385014969855547,
0.02809925004839897,
-0.16179512441158295,
-0.1442125141620636,
-0.05733601376414299,
0.1916268914937973,
0.10829533636569977,
-0.08343952894210815,
0.2167702168226242,
0.13604658842086792,
-0.11725414544343948,
-0.22323115170001984,
0.021091599017381668,
0.11066225916147232,
0.15119384229183197,
0.0035790917463600636,
-0.16449297964572906,
0.061029914766550064,
-0.020392682403326035,
-0.01914135180413723,
-0.05196552723646164,
-0.29274916648864746,
-0.1548944115638733,
0.1391456574201584,
-0.027497870847582817,
0.14343364536762238,
0.0413411408662796,
-0.004434308968484402,
-0.025199169293045998,
0.04050838202238083,
0.054464712738990784,
-0.07527639716863632,
0.09739388525485992,
0.021586155518889427,
0.13389137387275696,
0.0627618134021759,
-0.014023820869624615,
0.09979245811700821,
0.07036634534597397,
0.009243318811058998,
0.019562065601348877,
0.07165198773145676,
0.05191173404455185,
0.01842072233557701,
0.1484357714653015,
-0.0760493353009224,
0.037584275007247925,
-0.13010096549987793,
-0.09481190145015717,
-0.084775909781456,
0.07180957496166229,
0.030131537467241287,
-0.059002190828323364,
0.002346532652154565,
-0.030124664306640625,
0.023576030507683754,
-0.009646392427384853,
-0.0690639466047287,
-0.11754197627305984,
0.039336398243904114,
0.06751634180545807,
0.17088383436203003,
-0.028455516323447227,
-0.12745681405067444,
0.022688686847686768,
-0.021064219996333122,
0.10365240275859833,
-0.1080380529165268,
0.026826726272702217,
0.07540223747491837,
0.054364509880542755,
0.1015469953417778,
-0.004681740887463093,
-0.11263356357812881,
0.06518279016017914,
0.03825758770108223,
-0.07688841968774796,
-0.09040580689907074,
-0.03709257394075394,
-0.027330739423632622,
-0.03811560571193695,
0.014934948645532131,
0.13869412243366241,
-0.11638464778661728,
-0.008024693466722965,
-0.025666192173957825,
0.003301272401586175,
-0.13590121269226074,
0.22122244536876678,
0.03681100532412529,
0.07683423906564713,
-0.11008601635694504,
0.01908152550458908,
-0.019825680181384087,
-0.061897143721580505,
0.054771121591329575,
-0.0375850610435009,
-0.061082623898983,
-0.04987778887152672,
0.017530476674437523,
0.027638982981443405,
0.03662848100066185,
-0.12492120265960693,
-0.05758162587881088,
-0.11863251030445099,
-0.0008141480502672493,
0.1139931008219719,
0.07914718985557556,
0.021017177030444145,
-0.12492503970861435,
-0.09673918038606644,
-0.11909445375204086,
0.06727712601423264,
0.08741986751556396,
-0.04356890916824341,
-0.11707033216953278,
0.19279330968856812,
0.07069328427314758,
0.02436416782438755,
-0.04785842448472977,
-0.06466402113437653,
-0.01841621845960617,
0.08579734712839127,
-0.08903633058071136,
-0.001238641794770956,
-0.031331952661275864,
0.019162213429808617,
-0.014639806933701038,
-0.07269324362277985,
-0.04178154468536377,
0.07451296597719193,
-0.09109994024038315,
0.08097896724939346,
-0.028263108804821968,
0.05489332973957062,
-0.06465474516153336,
0.0085120415315032,
0.03412749990820885,
-0.053844816982746124,
0.06929846107959747,
0.10640288144350052,
-0.09716040641069412,
0.1371881663799286,
-0.1569768488407135,
-0.0396193265914917,
0.06505002081394196,
0.08087759464979172,
-0.013097390532493591,
-0.1284620314836502,
0.03349194675683975,
0.0845598503947258,
0.06379684805870056,
0.028358155861496925,
0.0906510055065155,
-0.04890158772468567,
-0.011313848197460175,
-0.04387330636382103,
-0.02006845735013485,
-0.01563526876270771,
0.049026742577552795,
0.03665175288915634,
0.15967752039432526,
0.180833101272583,
-0.10330083966255188,
0.12405290454626083,
-0.10388440638780594,
0.0038271620869636536,
-0.07335957884788513,
-0.026045849546790123,
-0.13630948960781097,
-0.04957015812397003,
0.06126018613576889,
-0.03682514652609825,
0.16027848422527313,
0.0648941770195961,
0.04928607866168022,
-0.012522017583251,
-0.06779783219099045,
-0.010950260795652866,
-0.015180906280875206,
0.22314873337745667,
0.021707193925976753,
0.04072016105055809,
-0.04407434165477753,
0.001181591534987092,
0.018365858122706413,
0.1484086513519287,
-0.015330987051129341,
0.14785553514957428,
0.04874982312321663,
0.06264365464448929,
0.08696895837783813,
-0.0217918548732996,
-0.054487671703100204,
-0.03978413715958595,
-0.10105617344379425,
0.04555303230881691,
-0.06828145682811737,
0.2292725294828415,
0.11317598819732666,
-0.13589219748973846,
0.10231108963489532,
0.02401707135140896,
-0.09552885591983795,
-0.1524309515953064,
-0.11987420171499252,
-0.056412793695926666,
-0.1673576831817627,
0.014162108302116394,
-0.1169578954577446,
0.011235160753130913,
0.05068828910589218,
0.04740229621529579,
-0.027700994163751602,
0.1264718621969223,
-0.012799946591258049,
-0.11164242774248123,
0.09028656035661697,
-0.0700608566403389,
-0.0033522360026836395,
-0.07865993678569794,
0.07461700588464737,
0.1474374681711197,
-0.010812772437930107,
0.06534046679735184,
0.019107848405838013,
-0.06145234778523445,
0.005659374874085188,
-0.08708619326353073,
-0.05879529193043709,
-0.029680326581001282,
-0.011291096918284893,
0.10456979274749756,
0.13404573500156403,
0.12916933000087738,
-0.07377075403928757,
-0.013814570382237434,
0.1561828851699829,
-0.027563387528061867,
-0.17483770847320557,
-0.16167324781417847,
0.15562954545021057,
0.011033042334020138,
0.028801530599594116,
-0.009598215110599995,
-0.037234898656606674,
-0.008177417330443859,
0.2589389979839325,
0.22389328479766846,
0.07429884374141693,
0.020591530948877335,
-0.03849652037024498,
-0.004938235506415367,
-0.041327103972435,
0.08882583677768707,
0.023718923330307007,
0.2557540237903595,
-0.034194983541965485,
0.0076463609002530575,
-0.12894046306610107,
-0.06495624035596848,
0.03464685007929802,
0.07412344962358475,
-0.06468133628368378,
-0.13475465774536133,
-0.001751405419781804,
0.17325040698051453,
-0.07494852691888809,
-0.057239044457674026,
-0.10686081647872925,
-0.12338599562644958,
-0.11718113720417023,
-0.021510634571313858,
0.015368360094726086,
0.08383980393409729,
0.0357295498251915,
-0.07471461594104767,
0.017829343676567078,
0.1030300110578537,
0.0033069828059524298,
-0.07621286064386368,
-0.08381126075983047,
0.07639788836240768,
-0.043724048882722855,
0.006090064533054829,
-0.007968684658408165,
0.16479238867759705,
0.0066029950976371765,
0.08866951614618301,
-0.019037805497646332,
0.102923683822155,
-0.024481315165758133,
-0.0697629526257515,
0.022290410473942757,
0.11582022905349731,
-0.045898474752902985,
0.16139468550682068,
0.028853366151452065,
-0.12361183017492294,
0.05048607662320137,
-0.0940641462802887,
-0.006988808512687683,
-0.049550727009773254,
0.05451709032058716,
-0.05543440207839012,
0.07065722346305847,
0.106740802526474,
-0.06683089584112167,
-0.041624702513217926,
-0.06735614687204361,
0.07495014369487762,
0.023748647421598434,
-0.04252903535962105,
-0.01872926764190197,
-0.23384472727775574,
-0.024247273802757263,
-0.06301168352365494,
-0.0425976887345314,
-0.23562592267990112,
-0.029062544927001,
0.005451336968690157,
-0.06570442020893097,
0.02795250527560711,
0.06671452522277832,
0.07055776566267014,
-0.011940405704081059,
-0.008368958719074726,
0.023611558601260185,
0.056374482810497284,
0.1042417511343956,
-0.12977641820907593,
-0.11487966030836105
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Mongolian
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Mongolian using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "mn", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-mongolian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-mongolian")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
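The same steps should also work for a local recording; below is a minimal sketch, where the file path is a placeholder and a mono recording is assumed:
```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-mongolian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-mongolian")

# Load a local file (placeholder path) and resample it to the 16kHz the model expects
speech_array, original_rate = torchaudio.load("my_recording.wav")  # hypothetical mono file
speech = torchaudio.transforms.Resample(original_rate, 16_000)(speech_array).squeeze().numpy()

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
print("Prediction:", processor.batch_decode(torch.argmax(logits, dim=-1)))
```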
## Evaluation
The model can be evaluated as follows on the Mongolian test data of Common Voice.
```python
import torch
import torchaudio
import urllib.request
import tarfile
import pandas as pd
from tqdm.auto import tqdm
from datasets import load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
# Download the raw data instead of using HF datasets to save disk space
data_url = "https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-6.1-2020-12-11/mn.tar.gz"
filestream = urllib.request.urlopen(data_url)
data_file = tarfile.open(fileobj=filestream, mode="r|gz")
data_file.extractall()
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-mongolian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-mongolian")
model.to("cuda")
cv_test = pd.read_csv("cv-corpus-6.1-2020-12-11/mn/test.tsv", sep='\t')
clips_path = "cv-corpus-6.1-2020-12-11/mn/clips/"
def clean_sentence(sent):
    sent = sent.lower()
    # replace non-alpha characters with space
    sent = "".join(ch if ch.isalpha() else " " for ch in sent)
    # remove repeated spaces
    sent = " ".join(sent.split())
    return sent
targets = []
preds = []
for i, row in tqdm(cv_test.iterrows(), total=cv_test.shape[0]):
    row["sentence"] = clean_sentence(row["sentence"])
    speech_array, sampling_rate = torchaudio.load(clips_path + row["path"])
    resampler = torchaudio.transforms.Resample(sampling_rate, 16_000)
    row["speech"] = resampler(speech_array).squeeze().numpy()
    inputs = processor(row["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
    pred_ids = torch.argmax(logits, dim=-1)
    targets.append(row["sentence"])
    preds.append(processor.batch_decode(pred_ids)[0])
print("WER: {:.2f}".format(100 * wer.compute(predictions=preds, references=targets)))
```
**Test Result**: 38.53 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
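As a minimal sketch (assuming the same Common Voice release as in the examples above), those splits can be loaded with `datasets`; the fine-tuning hyperparameters themselves are not listed in this card:
```python
from datasets import load_dataset

# Mongolian Common Voice train + validation, as used for fine-tuning
train_data = load_dataset("common_voice", "mn", split="train+validation")
print(train_data)
```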
|
{"language": "mn", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Mongolian XLSR Wav2Vec2 Large 53 by Anton Lozhkov", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice mn", "type": "common_voice", "args": "mn"}, "metrics": [{"type": "wer", "value": 38.53, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anton-l/wav2vec2-large-xlsr-53-mongolian
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"mn",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"mn"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #mn #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Mongolian
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Mongolian using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Mongolian test data of Common Voice.
Test Result: 38.53 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Mongolian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Mongolian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Mongolian test data of Common Voice.\n\n\n\nTest Result: 38.53 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #mn #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Mongolian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Mongolian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Mongolian test data of Common Voice.\n\n\n\nTest Result: 38.53 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
80,
66,
20,
29,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #mn #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Mongolian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Mongolian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Mongolian test data of Common Voice.\n\n\n\nTest Result: 38.53 %## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.09190315753221512,
0.011353197507560253,
-0.0017423591343685985,
-0.023845916613936424,
0.08809126168489456,
-0.0566471703350544,
0.22228975594043732,
0.09693149477243423,
0.012455065734684467,
-0.03174760937690735,
0.040373243391513824,
-0.04192272573709488,
0.04701686277985573,
0.10908087342977524,
0.04070236533880234,
-0.25653430819511414,
0.011050465516746044,
-0.0036796482745558023,
0.06673639267683029,
0.1062094122171402,
0.11338774859905243,
-0.068121537566185,
-0.010591727681457996,
0.0874713882803917,
-0.12688672542572021,
0.06175201013684273,
0.018369151279330254,
-0.129887193441391,
0.1335223913192749,
0.07167141884565353,
0.05697612836956978,
0.042094796895980835,
0.08279560506343842,
-0.1845259666442871,
0.023968050256371498,
-0.013247577473521233,
-0.004771990701556206,
0.008798232302069664,
0.0678233951330185,
-0.008283223956823349,
0.16698530316352844,
0.05878278240561485,
-0.04687144234776497,
0.06518062949180603,
-0.037103623151779175,
-0.1887156218290329,
0.0053875381126999855,
0.021645141765475273,
0.09571754187345505,
0.13477253913879395,
-0.07821778953075409,
0.1017022654414177,
-0.160126730799675,
0.08999006450176239,
0.08374249935150146,
-0.21498334407806396,
-0.004957947880029678,
0.1333901435136795,
0.0348857156932354,
0.08901406824588776,
-0.10507464408874512,
0.03154150769114494,
0.04403049871325493,
0.013088210485875607,
-0.010151300579309464,
-0.06515725702047348,
-0.277040034532547,
-0.02183578535914421,
-0.12490086257457733,
-0.011933659203350544,
0.2336137741804123,
0.008947372436523438,
-0.07013238221406937,
-0.10016795992851257,
-0.002921311417594552,
-0.024700338020920753,
-0.0078273368999362,
-0.02461744099855423,
-0.009390193969011307,
0.013750896789133549,
-0.05889562889933586,
-0.04080025851726532,
-0.12741492688655853,
-0.11702043563127518,
-0.03663589432835579,
0.05977202579379082,
0.014890080317854881,
0.02128094993531704,
-0.13571776449680328,
0.05964772403240204,
-0.09985119104385376,
-0.06624732911586761,
-0.006083297543227673,
0.015509497374296188,
-0.09003005176782608,
0.020291391760110855,
-0.06340920925140381,
-0.19355396926403046,
0.044070880860090256,
-0.005249631591141224,
0.057140376418828964,
0.06515853852033615,
-0.058288827538490295,
0.05105132982134819,
0.016920102760195732,
0.14527033269405365,
-0.04630941152572632,
-0.03459778055548668,
0.019756736233830452,
-0.023271627724170685,
-0.03927929326891899,
-0.02484320104122162,
-0.08831068128347397,
-0.10593920946121216,
0.021532129496335983,
0.0699680745601654,
-0.045457061380147934,
0.029486147686839104,
0.026989830657839775,
-0.02360931783914566,
-0.013647748157382011,
-0.10304086655378342,
-0.05500860884785652,
0.041964735835790634,
-0.027963466942310333,
0.08604443073272705,
0.013451166450977325,
0.06265804171562195,
-0.025115584954619408,
-0.055786535143852234,
0.03526965156197548,
0.07979897409677505,
-0.021810928359627724,
-0.08322064578533173,
0.022458860650658607,
0.04727870598435402,
0.009629852138459682,
-0.11678681522607803,
-0.08770488947629929,
-0.055663373321294785,
0.008338353596627712,
0.02862340584397316,
-0.002467440441250801,
-0.097501240670681,
-0.018300535157322884,
0.014527643099427223,
-0.07611210644245148,
0.09696183353662491,
-0.04363580048084259,
0.03794821724295616,
0.028661634773015976,
0.06626635789871216,
0.06878966838121414,
0.05943147838115692,
-0.09904960542917252,
-0.026703963056206703,
0.016797548159956932,
0.11896441876888275,
-0.025981809943914413,
-0.04026661440730095,
-0.07933951169252396,
-0.06097106635570526,
-0.05617830902338028,
0.04412113502621651,
0.026543714106082916,
0.10517139732837677,
-0.26148828864097595,
-0.07822127640247345,
0.2300228774547577,
-0.09969491511583328,
0.006737729534506798,
0.2017015814781189,
-0.023895055055618286,
0.09316692501306534,
0.12624210119247437,
0.22948439419269562,
0.09857306629419327,
-0.16913285851478577,
0.01457265205681324,
0.03166905790567398,
-0.008665437810122967,
-0.030014771968126297,
0.09328271448612213,
-0.022714877501130104,
-0.0330803245306015,
0.048508379608392715,
-0.006635760888457298,
0.10136600583791733,
-0.04382336512207985,
-0.07599283754825592,
-0.00949772633612156,
-0.06586773693561554,
0.10040377825498581,
0.04374890401959419,
0.05400637164711952,
-0.03134259954094887,
-0.04652176424860954,
0.04999041184782982,
0.12975220382213593,
-0.12561357021331787,
0.0635082870721817,
-0.09445497393608093,
0.025494063273072243,
-0.11038438230752945,
-0.00784656684845686,
-0.12275302410125732,
0.1560971438884735,
-0.0021539428271353245,
0.03257698193192482,
0.07176613807678223,
0.16619515419006348,
0.026651829481124878,
-0.010780200362205505,
-0.0922313779592514,
-0.005003703758120537,
0.013328327797353268,
-0.040330398827791214,
-0.04591836407780647,
-0.11666989326477051,
0.01719610020518303,
-0.06266618520021439,
0.0931951105594635,
-0.15925537049770355,
-0.035400208085775375,
-0.02059227041900158,
-0.014017367735505104,
-0.014637680724263191,
0.02031632512807846,
0.09790623188018799,
0.08773666620254517,
-0.019415821880102158,
0.030251774936914444,
0.027754688635468483,
-0.0013499552151188254,
-0.11872304975986481,
0.17797203361988068,
-0.13246728479862213,
-0.06520820409059525,
0.11849823594093323,
-0.043485648930072784,
-0.008637942373752594,
0.021909741684794426,
-0.02638421207666397,
-0.03119625337421894,
-0.05340001359581947,
0.04082406312227249,
0.23935645818710327,
-0.045706406235694885,
0.09011273831129074,
-0.07788756489753723,
0.0068613686598837376,
0.022313350811600685,
-0.10772397369146347,
0.023088593035936356,
0.08587614446878433,
-0.001531227957457304,
0.025401411578059196,
0.050410978496074677,
-0.07496790587902069,
-0.09546513855457306,
0.2870890498161316,
-0.003681677393615246,
-0.10188078880310059,
0.006772687193006277,
-0.00626329192891717,
-0.025674402713775635,
0.10473046451807022,
-0.1663181632757187,
-0.03968709334731102,
0.04126434773206711,
0.0690944492816925,
0.0628073513507843,
-0.11723797768354416,
-0.0002700973127502948,
-0.013244579546153545,
-0.1491040140390396,
-0.14722265303134918,
0.08863811939954758,
-0.046504825353622437,
0.06696342676877975,
-0.10936448723077774,
-0.010262926109135151,
0.01380175445228815,
-0.053272899240255356,
-0.1704721599817276,
0.13116271793842316,
-0.08215022087097168,
-0.24205787479877472,
-0.11944402009248734,
-0.043299127370119095,
-0.015156053937971592,
0.024804383516311646,
0.08362110704183578,
-0.172432079911232,
-0.011422621086239815,
-0.031086908653378487,
0.09732142835855484,
-0.018751630559563637,
0.01644846983253956,
-0.06981334090232849,
0.03345280885696411,
0.064335398375988,
-0.11112001538276672,
0.018789876252412796,
-0.053099725395441055,
-0.04359360411763191,
0.04772832244634628,
-0.008981109596788883,
0.019522739574313164,
0.14662247896194458,
0.01680872216820717,
-0.01707221195101738,
-0.02950223721563816,
0.15645553171634674,
-0.09724617004394531,
-0.03301534056663513,
0.15010780096054077,
-0.0043862550519406796,
-0.04188377037644386,
0.13579431176185608,
0.010865295305848122,
-0.04168309271335602,
0.0463617779314518,
-0.0026561361737549305,
-0.115826815366745,
-0.2501778304576874,
-0.08892077207565308,
-0.08677023649215698,
-0.05481743812561035,
-0.04857135936617851,
0.018622415140271187,
0.0867525115609169,
0.022849787026643753,
-0.03389075770974159,
-0.02458789385855198,
0.037296146154403687,
-0.003972796257585287,
0.0755467563867569,
0.033196453005075455,
0.07040969282388687,
-0.07333328574895859,
-0.021093904972076416,
0.0064055416733026505,
0.026560962200164795,
0.18888969719409943,
0.06744097918272018,
0.07317199558019638,
0.07804643362760544,
0.12142790853977203,
0.1482841968536377,
0.0405973382294178,
-0.051843248307704926,
-0.031666453927755356,
0.030048340559005737,
-0.0637696385383606,
-0.001484240172430873,
0.05976090580224991,
0.14677099883556366,
-0.021751515567302704,
-0.03192053362727165,
-0.00724955927580595,
0.03496750071644783,
0.1503351479768753,
0.05744362249970436,
-0.1482129991054535,
-0.08716771751642227,
-0.008455565199255943,
-0.07583710551261902,
0.02710849978029728,
0.048404376953840256,
0.15152011811733246,
-0.1684197634458542,
0.03882509097456932,
-0.016588907688856125,
0.07427829504013062,
0.030961040407419205,
0.014488845132291317,
-0.10706070065498352,
0.022247739136219025,
-0.006471462547779083,
0.0955025926232338,
-0.3519769608974457,
0.24418768286705017,
0.011049909517168999,
0.09155470132827759,
-0.0503934845328331,
-0.012128854170441628,
0.03577863425016403,
0.0017625160980969667,
0.1005624458193779,
0.017696576192975044,
-0.05050358548760414,
-0.09618936479091644,
-0.09751744568347931,
0.06724561750888824,
-0.012285476550459862,
0.06285818666219711,
0.0636506974697113,
0.02247077226638794,
-0.009427732788026333,
0.013735233806073666,
-0.016988644376397133,
-0.1611647754907608,
-0.05472500994801521,
-0.0052218129858374596,
0.15007588267326355,
0.09729781746864319,
-0.005193311721086502,
-0.0743437185883522,
-0.08288202434778214,
0.021100899204611778,
-0.1106816977262497,
-0.07909137010574341,
-0.043305572122335434,
-0.014844289049506187,
0.13407669961452484,
-0.08437715470790863,
-0.0056982748210430145,
0.07267071306705475,
0.11004413664340973,
-0.014814440160989761,
-0.06739646941423416,
0.01790602318942547,
-0.11141771823167801,
-0.12069708108901978,
-0.006436731666326523,
0.16800197958946228,
0.13077571988105774,
0.13047188520431519,
0.04199781268835068,
-0.00757349468767643,
0.002973362570628524,
-0.05790959298610687,
-0.026124371215701103,
0.08023349195718765,
-0.12616032361984253,
-0.02111581154167652,
0.022916577756404877,
-0.1485505849123001,
-0.12637537717819214,
-0.10116548091173172,
0.17307937145233154,
0.09578397870063782,
-0.06449272483587265,
0.21921294927597046,
0.21658937633037567,
-0.09784001857042313,
-0.17176280915737152,
-0.013459415175020695,
0.1053466796875,
0.12356294691562653,
-0.003404395654797554,
-0.19167037308216095,
0.05900217220187187,
-0.01796097867190838,
-0.027638651430606842,
-0.04457294940948486,
-0.28486064076423645,
-0.1650821566581726,
0.14496098458766937,
-0.058017708361148834,
0.1434764713048935,
-0.015314595773816109,
-0.02675657905638218,
-0.027073048055171967,
0.013889603316783905,
0.019436800852417946,
-0.08391489833593369,
0.11869186908006668,
0.019158780574798584,
0.1260984241962433,
0.04617106541991234,
-0.021922463551163673,
0.13219788670539856,
0.09832800924777985,
-0.0114052202552557,
-0.020213093608617783,
0.06654848158359528,
0.05302376672625542,
0.016613561660051346,
0.11774501204490662,
-0.08969864994287491,
0.011843593791127205,
-0.1163916066288948,
-0.10411972552537918,
-0.08111368119716644,
0.06098564714193344,
0.02488335594534874,
-0.08012192696332932,
-0.007293160073459148,
-0.014698619022965431,
0.06189928203821182,
-0.004878931678831577,
0.013024501502513885,
-0.10565156489610672,
0.04813486337661743,
0.08873016387224197,
0.1716083586215973,
0.009652354754507542,
-0.03558381646871567,
-0.010359877720475197,
-0.018102163448929787,
0.10002446174621582,
-0.09011957049369812,
0.018185291439294815,
0.06308651715517044,
0.02407977357506752,
0.12656676769256592,
-0.007663247641175985,
-0.09302042424678802,
0.14601187407970428,
0.04066493734717369,
-0.050393953919410706,
-0.08849916607141495,
0.0010547013953328133,
-0.05220717936754227,
-0.014538104645907879,
-0.022739065811038017,
0.07950093597173691,
-0.10947258770465851,
-0.013951746746897697,
-0.015430757775902748,
0.011980963870882988,
-0.10153956711292267,
0.17977769672870636,
0.0235875453799963,
0.0690438523888588,
-0.0976153165102005,
0.032261092215776443,
0.008909239433705807,
-0.02745656669139862,
0.04584583640098572,
-0.02959536388516426,
-0.0801132544875145,
-0.025870218873023987,
-0.011101976037025452,
0.07339935004711151,
-0.0003277123032603413,
-0.1437099426984787,
-0.05995647609233856,
-0.0829646959900856,
-0.015288673341274261,
0.05938981845974922,
0.04552134871482849,
-0.002186030615121126,
-0.06744587421417236,
-0.03468673676252365,
-0.10575557500123978,
0.059181880205869675,
0.06827625632286072,
-0.028969595208764076,
-0.10057386755943298,
0.15243110060691833,
0.08105934411287308,
0.06872888654470444,
-0.05393684282898903,
-0.08598951250314713,
-0.013211928308010101,
0.09845994412899017,
-0.13828180730342865,
0.014283166266977787,
-0.09925093501806259,
-0.007299127522855997,
-0.03325030952692032,
-0.10096028447151184,
-0.03220983222126961,
0.07046518474817276,
-0.09980461746454239,
0.08605203777551651,
-0.02044651471078396,
0.0727880448102951,
-0.05947161838412285,
0.011710179969668388,
0.023632792755961418,
-0.03650713339447975,
0.07750652730464935,
0.1890401840209961,
-0.07878388464450836,
0.1572858691215515,
-0.22482985258102417,
-0.011592463590204716,
0.04877939447760582,
0.056388773024082184,
-0.020797191187739372,
-0.07998858392238617,
0.02192484773695469,
0.07755034416913986,
0.06548842042684555,
-0.005909075029194355,
0.06377988308668137,
-0.07483913004398346,
-0.04541987180709839,
-0.11617246270179749,
0.004521696362644434,
-0.05653151124715805,
0.06180763244628906,
0.032281070947647095,
0.14241015911102295,
0.14647068083286285,
-0.1393367052078247,
0.09196286648511887,
-0.1068887785077095,
-0.004975020419806242,
-0.07158549875020981,
-0.0007014681468717754,
-0.12844540178775787,
-0.0898728296160698,
0.06739313900470734,
-0.0592215433716774,
0.16917529702186584,
0.024187974631786346,
0.08147362619638443,
-0.03550585359334946,
-0.1286374032497406,
0.049035727977752686,
-0.02443806454539299,
0.25690388679504395,
0.06475333869457245,
0.04247212037444115,
0.005410278216004372,
0.0044077252969145775,
0.027501896023750305,
0.10263626277446747,
0.025983553379774094,
0.13651396334171295,
0.027172040194272995,
0.0976254865527153,
0.04443526640534401,
-0.06504984945058823,
-0.05655040591955185,
-0.04528380557894707,
-0.18038693070411682,
0.006819078233093023,
-0.061121441423892975,
0.178135484457016,
0.18862004578113556,
-0.10859997570514679,
0.08987637609243393,
0.027295902371406555,
-0.08364257961511612,
-0.14386610686779022,
-0.13897553086280823,
-0.08317063003778458,
-0.15531380474567413,
0.04338439926505089,
-0.095174640417099,
0.03133165091276169,
0.07072615623474121,
0.06938821077346802,
-0.0305497907102108,
0.16903431713581085,
0.06054859608411789,
-0.11055532842874527,
0.0581863671541214,
-0.07270856201648712,
-0.02070237696170807,
-0.06427248567342758,
0.028705505654215813,
0.15548917651176453,
-0.04154073819518089,
0.02101273648440838,
0.01289990171790123,
-0.03614503890275955,
0.022748487070202827,
-0.06475324928760529,
-0.05579446256160736,
-0.01458226703107357,
-0.010436274111270905,
0.1358509361743927,
0.12397404760122299,
0.11628337949514389,
-0.06861180067062378,
-0.022724583745002747,
0.12990640103816986,
-0.011323162354528904,
-0.12471030652523041,
-0.1442445069551468,
0.145772784948349,
0.006048707757145166,
-0.0034546037204563618,
-0.004236544016748667,
-0.0400572307407856,
-0.037691473960876465,
0.2424168437719345,
0.21517281234264374,
0.08115749806165695,
0.02841215394437313,
-0.056663427501916885,
-0.009512602351605892,
-0.026117421686649323,
0.08832844346761703,
0.07664259523153305,
0.24494785070419312,
-0.02560313045978546,
-0.004643742926418781,
-0.11758095771074295,
-0.06310644000768661,
0.012265452183783054,
0.08818616718053818,
-0.068210169672966,
-0.09389668703079224,
-0.008215093985199928,
0.13298296928405762,
-0.12215914577245712,
-0.09708841145038605,
-0.11081928759813309,
-0.08011370152235031,
-0.08230537921190262,
0.014754462987184525,
0.01898024044930935,
0.14295729994773865,
-0.01720048114657402,
-0.08349547535181046,
0.05126234516501427,
0.06719779223203659,
0.02066487818956375,
-0.04779916629195213,
-0.06457529962062836,
0.07716672122478485,
-0.004281381145119667,
0.0554845966398716,
-0.007505690213292837,
0.15716150403022766,
0.018351949751377106,
0.09761478751897812,
-0.015511278994381428,
0.16262437403202057,
0.03130755573511124,
-0.06922829151153564,
0.037132687866687775,
0.1696930229663849,
-0.010212907567620277,
0.14093562960624695,
0.025685368105769157,
-0.01217638235539198,
0.06552877277135849,
-0.13861434161663055,
0.02141137234866619,
-0.10235472768545151,
0.0718948170542717,
-0.04979386180639267,
0.08321930468082428,
0.10776592791080475,
-0.07419620454311371,
-0.04940559342503548,
-0.06864555925130844,
0.0586756207048893,
0.002749947365373373,
-0.09418690204620361,
-0.0655861496925354,
-0.2318473756313324,
0.012587416917085648,
-0.05028709024190903,
-0.02368585765361786,
-0.1976454257965088,
-0.040913552045822144,
-0.027923816815018654,
-0.0807121992111206,
0.020231034606695175,
0.031957272440195084,
0.05965031683444977,
0.01805415190756321,
0.003894777037203312,
0.006827380508184433,
0.044777367264032364,
0.10770941525697708,
-0.1930607557296753,
-0.12528489530086517
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Romanian
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Romanian using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "ro", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-romanian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-romanian")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
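The same steps should also work for a local recording; below is a minimal sketch, where the file path is a placeholder and a mono recording is assumed:
```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-romanian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-romanian")

# Load a local file (placeholder path) and resample it to the 16kHz the model expects
speech_array, original_rate = torchaudio.load("my_recording.wav")  # hypothetical mono file
speech = torchaudio.transforms.Resample(original_rate, 16_000)(speech_array).squeeze().numpy()

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
print("Prediction:", processor.batch_decode(torch.argmax(logits, dim=-1)))
```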
## Evaluation
The model can be evaluated as follows on the Romanian test data of Common Voice.
```python
import torch
import torchaudio
import urllib.request
import tarfile
import pandas as pd
from tqdm.auto import tqdm
from datasets import load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
# Download the raw data instead of using HF datasets to save disk space
data_url = "https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-6.1-2020-12-11/ro.tar.gz"
filestream = urllib.request.urlopen(data_url)
data_file = tarfile.open(fileobj=filestream, mode="r|gz")
data_file.extractall()
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-romanian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-romanian")
model.to("cuda")
cv_test = pd.read_csv("cv-corpus-6.1-2020-12-11/ro/test.tsv", sep='\t')
clips_path = "cv-corpus-6.1-2020-12-11/ro/clips/"
def clean_sentence(sent):
    sent = sent.lower()
    # replace non-alpha characters with space
    sent = "".join(ch if ch.isalpha() else " " for ch in sent)
    # remove repeated spaces
    sent = " ".join(sent.split())
    return sent
targets = []
preds = []
for i, row in tqdm(cv_test.iterrows(), total=cv_test.shape[0]):
    row["sentence"] = clean_sentence(row["sentence"])
    speech_array, sampling_rate = torchaudio.load(clips_path + row["path"])
    resampler = torchaudio.transforms.Resample(sampling_rate, 16_000)
    row["speech"] = resampler(speech_array).squeeze().numpy()
    inputs = processor(row["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
    pred_ids = torch.argmax(logits, dim=-1)
    targets.append(row["sentence"])
    preds.append(processor.batch_decode(pred_ids)[0])
print("WER: {:.2f}".format(100 * wer.compute(predictions=preds, references=targets)))
```
**Test Result**: 24.84 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
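As a minimal sketch (assuming the same Common Voice release as in the examples above), those splits can be loaded with `datasets`; the fine-tuning hyperparameters themselves are not listed in this card:
```python
from datasets import load_dataset

# Romanian Common Voice train + validation, as used for fine-tuning
train_data = load_dataset("common_voice", "ro", split="train+validation")
print(train_data)
```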
|
{"language": "ro", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Romanian XLSR Wav2Vec2 Large 53 by Anton Lozhkov", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice ro", "type": "common_voice", "args": "ro"}, "metrics": [{"type": "wer", "value": 24.84, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anton-l/wav2vec2-large-xlsr-53-romanian
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"ro",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"ro"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ro #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Romanian
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Romanian using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Romanian test data of Common Voice.
Test Result: 24.84 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Romanian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Romanian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Romanian test data of Common Voice.\n\n\n\nTest Result: 24.84 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ro #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Romanian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Romanian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Romanian test data of Common Voice.\n\n\n\nTest Result: 24.84 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
80,
65,
20,
28,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ro #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Romanian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Romanian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Romanian test data of Common Voice.\n\n\n\nTest Result: 24.84 %## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.1451476812362671,
0.026251599192619324,
-0.0026156497187912464,
-0.014674813486635685,
0.0875137597322464,
-0.05461389943957329,
0.19055293500423431,
0.0805862694978714,
-0.0015180613845586777,
0.0027262908406555653,
0.04658094048500061,
-0.041926298290491104,
0.028387926518917084,
0.07056168466806412,
0.004258234519511461,
-0.19628486037254333,
0.05222528427839279,
0.005118597764521837,
0.031243355944752693,
0.08442553877830505,
0.10110621899366379,
-0.080299012362957,
-0.004407085012644529,
0.09544508904218674,
-0.12465604394674301,
0.07281389832496643,
0.033474672585725784,
-0.12051547318696976,
0.1540348082780838,
0.0932830348610878,
0.04615047201514244,
0.05217859148979187,
0.10739639401435852,
-0.1592843234539032,
0.014044741168618202,
0.018113888800144196,
0.008132020011544228,
0.021008756011724472,
0.07309368997812271,
-0.030628060922026634,
0.10114236921072006,
0.07455972582101822,
-0.019707757979631424,
0.062287069857120514,
-0.038774121552705765,
-0.18619093298912048,
-0.009408744052052498,
0.039972640573978424,
0.08414143323898315,
0.15530696511268616,
-0.06949734687805176,
0.13650229573249817,
-0.17250363528728485,
0.07255073636770248,
0.06312002241611481,
-0.19516149163246155,
-0.008325531147420406,
0.019568821415305138,
0.01985096000134945,
0.09350980818271637,
-0.03348568081855774,
0.060588348656892776,
0.02470395527780056,
0.029040135443210602,
-0.00040500526665709913,
-0.01329735480248928,
-0.22974810004234314,
-0.043496038764715195,
-0.14177244901657104,
-0.04897737130522728,
0.19665762782096863,
0.01632118411362171,
-0.06146515905857086,
-0.11555434763431549,
-0.014549775049090385,
0.023709068074822426,
-0.027143336832523346,
-0.03529675304889679,
-0.022448118776082993,
0.026187265291810036,
-0.008415567688643932,
-0.02281566523015499,
-0.10546551644802094,
-0.1279989331960678,
0.014209025539457798,
0.12193562835454941,
0.030200133100152016,
0.01843012124300003,
-0.10089260339736938,
0.056149277836084366,
-0.07393550127744675,
-0.061310332268476486,
0.03202128782868385,
0.04646993800997734,
-0.06698990613222122,
0.013433888554573059,
-0.07244226336479187,
-0.1542399823665619,
0.041539695113897324,
-0.015706341713666916,
0.04047355055809021,
0.009880800731480122,
-0.012566027231514454,
0.06394021958112717,
0.014904561452567577,
0.12834987044334412,
-0.09817992895841599,
-0.05264713615179062,
0.025274015963077545,
-0.011803407222032547,
-0.042583487927913666,
-0.023782122880220413,
-0.08409320563077927,
-0.09826479852199554,
0.0027085398323833942,
0.048877451568841934,
-0.024696189910173416,
0.0021049908827990294,
0.009411418810486794,
-0.03472243249416351,
0.003240308491513133,
-0.08700602501630783,
-0.03654315695166588,
0.07015229761600494,
-0.013218632899224758,
0.1633673459291458,
0.02342074364423752,
0.05413374304771423,
-0.11254559457302094,
-0.02195098251104355,
0.021747605875134468,
0.036750927567481995,
-0.010921732522547245,
-0.12150294333696365,
0.017890384420752525,
-0.017468910664319992,
-0.00533634889870882,
-0.08136224001646042,
-0.01969844661653042,
-0.06267249584197998,
0.008769997395575047,
0.011514418758451939,
-0.028446976095438004,
-0.09646162390708923,
0.0006283078691922128,
-0.0012028072960674763,
-0.03945311903953552,
0.010407033376395702,
-0.021804265677928925,
0.05168195068836212,
-0.003472001990303397,
0.032303500920534134,
-0.010106864385306835,
0.08822417259216309,
-0.11155176162719727,
-0.05598064512014389,
-0.0013101797085255384,
0.12417671829462051,
-0.04560587927699089,
-0.009373120032250881,
-0.1001461073756218,
-0.05854398012161255,
-0.030567901208996773,
0.05282869189977646,
0.07084864377975464,
0.10894864052534103,
-0.2355591356754303,
-0.09383676946163177,
0.18932539224624634,
-0.13975653052330017,
-0.0026577957905828953,
0.24520540237426758,
0.0011982216965407133,
0.07784382253885269,
0.1470160037279129,
0.23136967420578003,
0.12417355924844742,
-0.16368907690048218,
0.029121126979589462,
-0.010881505906581879,
-0.0034820756409317255,
-0.04970445856451988,
0.07987731695175171,
-0.07305415719747543,
0.00975455716252327,
0.03446003422141075,
-0.09401314705610275,
0.0971258357167244,
-0.02835526131093502,
-0.05536862835288048,
-0.027348678559064865,
-0.07129905372858047,
0.0636352077126503,
0.05493312329053879,
0.04904559627175331,
-0.020925668999552727,
-0.06908170133829117,
0.0618785098195076,
0.13482800126075745,
-0.1607324331998825,
0.050973132252693176,
-0.11839671432971954,
0.1777021288871765,
-0.09125809371471405,
-0.012440814636647701,
-0.13325099647045135,
0.19557110965251923,
-0.013131646439433098,
0.07926362007856369,
0.0756542906165123,
0.16621695458889008,
0.021527277305722237,
0.009570474736392498,
-0.030183302238583565,
-0.0020533930510282516,
-0.014444662258028984,
-0.0170186348259449,
-0.021624740213155746,
-0.12520574033260345,
-0.008397061377763748,
-0.04673478379845619,
0.11836481839418411,
-0.16811777651309967,
-0.01816866546869278,
0.02808963507413864,
0.03745038062334061,
-0.009196203202009201,
0.005804117303341627,
0.009756220504641533,
0.09700509905815125,
-0.001187682501040399,
0.015273552387952805,
0.051564741879701614,
-0.017975008115172386,
-0.018836816772818565,
0.11240821331739426,
-0.10145383328199387,
-0.016855260357260704,
0.10875595360994339,
-0.08348234742879868,
-0.022677868604660034,
0.0723268911242485,
-0.007822458632290363,
0.00813757162541151,
-0.06736332178115845,
0.010283692739903927,
0.29547783732414246,
-0.011401894502341747,
0.12114959210157394,
-0.08660703152418137,
0.04141237586736679,
0.04819396510720253,
-0.06763384491205215,
0.027288177981972694,
0.08392953127622604,
0.09514303505420685,
0.007426992058753967,
0.028440235182642937,
-0.04884063079953194,
-0.11431464552879333,
0.23818394541740417,
-0.020559901371598244,
-0.13525275886058807,
0.04258335381746292,
-0.013584615662693977,
-0.014785400591790676,
0.09671860188245773,
-0.1725359708070755,
-0.01418462023139,
0.022955013439059258,
0.04353402182459831,
0.08551643788814545,
-0.12640121579170227,
0.025959482416510582,
0.0002458368835505098,
-0.12609419226646423,
-0.1974055916070938,
0.06209803372621536,
-0.05385046824812889,
0.05210069939494133,
-0.0893610417842865,
-0.06112033873796463,
-0.00611067796126008,
-0.015702180564403534,
-0.16377614438533783,
0.10764973610639572,
-0.051544949412345886,
-0.2002895474433899,
-0.18808990716934204,
0.038354746997356415,
-0.019432159140706062,
-0.0012036985717713833,
0.06625868380069733,
-0.12039720267057419,
-0.02974654734134674,
-0.022336238995194435,
0.13631446659564972,
-0.025277866050601006,
-0.03577746823430061,
-0.055995386093854904,
0.04824436455965042,
0.06320837140083313,
-0.11577866226434708,
0.01640002243220806,
-0.0766499936580658,
-0.013840201310813427,
-0.01984528638422489,
-0.03171698749065399,
0.022972440347075462,
0.16384117305278778,
-0.015038497745990753,
0.02225176990032196,
0.004104046616703272,
0.2197033315896988,
-0.11270952969789505,
-0.0378793329000473,
0.1754838228225708,
-0.024468548595905304,
-0.030979590490460396,
0.13248926401138306,
0.04659484326839447,
-0.03422180190682411,
-0.032250117510557175,
-0.0036119783762842417,
-0.06810018420219421,
-0.21508732438087463,
-0.16406908631324768,
-0.07018275558948517,
-0.10565788298845291,
-0.021562781184911728,
0.00725415488705039,
0.03357568755745888,
0.03212757036089897,
-0.04224780946969986,
-0.07258004695177078,
0.024492207914590836,
-0.016520986333489418,
0.18354998528957367,
0.010872443206608295,
0.07437781244516373,
-0.07729087769985199,
-0.06269684433937073,
0.012553364969789982,
0.0017260420136153698,
0.18855156004428864,
0.05492386594414711,
0.09041552990674973,
0.06346385926008224,
0.1192682683467865,
0.13836736977100372,
0.07349372655153275,
-0.04904801771044731,
-0.03484528884291649,
0.017963798716664314,
-0.05151025205850601,
0.0013618589146062732,
0.012636839412152767,
0.14720018208026886,
-0.054784175008535385,
-0.055077288299798965,
-0.04816482588648796,
0.02391601726412773,
0.25844788551330566,
0.06903170049190521,
-0.20592162013053894,
-0.08991813659667969,
-0.03720040246844292,
-0.07426873594522476,
0.0034677181392908096,
0.014206497929990292,
0.13974139094352722,
-0.12365619093179703,
0.011718020774424076,
0.013472124934196472,
0.08712217956781387,
-0.004906340967863798,
0.014668990857899189,
-0.12903177738189697,
0.017418282106518745,
-0.0022910526022315025,
0.10403262823820114,
-0.1986664980649948,
0.24326051771640778,
0.008334293030202389,
0.11616884917020798,
-0.048434700816869736,
0.005878186319023371,
-0.008909732103347778,
0.1273285150527954,
0.08638254553079605,
0.017681146040558815,
0.03897112235426903,
-0.10355373471975327,
-0.09913188219070435,
0.07783462107181549,
-0.03837719187140465,
0.04025671258568764,
0.05954978987574577,
-0.014039909467101097,
-0.0005787078989669681,
0.0018196373712271452,
-0.054792944341897964,
-0.14606726169586182,
-0.08612137287855148,
0.014151564799249172,
0.14894695580005646,
0.07179427891969681,
0.011497028172016144,
-0.11139337718486786,
-0.1147930845618248,
0.0714295357465744,
-0.07390078902244568,
-0.03513514995574951,
-0.06630346924066544,
-0.05326629802584648,
0.14448612928390503,
-0.07219776511192322,
0.006148111540824175,
0.0715658962726593,
0.11746846139431,
-0.04367995634675026,
-0.038238294422626495,
0.03102867491543293,
-0.11419548094272614,
-0.09528349339962006,
0.0004760999872814864,
0.15781576931476593,
0.12083734571933746,
0.08453286439180374,
0.06965988874435425,
0.005932554602622986,
0.0037177165504544973,
-0.04513882100582123,
-0.02765730582177639,
0.09298652410507202,
-0.18776434659957886,
-0.04239070042967796,
-0.002868426963686943,
-0.23393940925598145,
-0.12790770828723907,
-0.04873964190483093,
0.21503184735774994,
0.08529967814683914,
-0.08469695597887039,
0.18548037111759186,
0.24531890451908112,
-0.08176157623529434,
-0.1803063154220581,
0.0036293924786150455,
0.09690997749567032,
0.1383567750453949,
0.0004989510052837431,
-0.17868655920028687,
0.05034087970852852,
-0.03429044410586357,
-0.034605927765369415,
-0.0663936510682106,
-0.2680811583995819,
-0.12828713655471802,
0.18124715983867645,
-0.028715578839182854,
0.1751166433095932,
0.03941960260272026,
0.0006228804122656584,
-0.018699871376156807,
0.03706008195877075,
0.02702639065682888,
-0.0784023329615593,
0.07850410789251328,
0.04145491495728493,
0.10244480520486832,
0.059492725878953934,
-0.011566240340471268,
0.09697356820106506,
0.057551026344299316,
-0.044916242361068726,
0.009405088610947132,
0.06363515555858612,
0.0690951719880104,
0.044499270617961884,
0.11782155185937881,
-0.06425841897726059,
0.011120964772999287,
-0.11583229154348373,
-0.10335300117731094,
-0.059064313769340515,
0.06463029980659485,
0.02046055905520916,
-0.015271595679223537,
0.042774468660354614,
-0.04836130887269974,
0.015830019488930702,
0.011021015234291553,
-0.0734366774559021,
-0.12586259841918945,
0.003635051427409053,
0.0995989516377449,
0.1638689935207367,
-0.05174381658434868,
-0.06206398084759712,
0.014821424148976803,
-0.030826786532998085,
0.12255530804395676,
-0.05022582784295082,
0.04642616584897041,
0.07119150459766388,
0.033606287091970444,
0.09666614234447479,
-0.0007640371914021671,
-0.11363358050584793,
0.08735278993844986,
0.047622598707675934,
-0.0661449208855629,
-0.09578133374452591,
-0.02088993601500988,
-0.07535082846879959,
-0.01782960258424282,
0.026471110060811043,
0.12657557427883148,
-0.11575601249933243,
-0.015836944803595543,
-0.034170400351285934,
-0.021976754069328308,
-0.11879266053438187,
0.1968992054462433,
0.007804433815181255,
0.04579291492700577,
-0.11484041064977646,
0.001623661839403212,
0.02222430892288685,
-0.0670328289270401,
0.03845880180597305,
-0.02910483069717884,
-0.08448929339647293,
-0.06714484840631485,
-0.022581666707992554,
0.043137647211551666,
0.0075750830583274364,
-0.1593252569437027,
-0.09825185686349869,
-0.10854079574346542,
0.0013485056115314364,
0.07746222615242004,
0.058226894587278366,
-0.027520304545760155,
-0.1371447741985321,
-0.1005648821592331,
-0.12784279882907867,
0.040479887276887894,
0.09243243932723999,
-0.05829263851046562,
-0.09267593175172806,
0.1696869134902954,
0.09857270121574402,
-0.003658776869997382,
-0.0523827001452446,
-0.08147266507148743,
0.003391479142010212,
0.0844087153673172,
-0.10890544205904007,
-0.014389127492904663,
-0.03126485273241997,
0.023562707006931305,
-0.01753491722047329,
-0.07944580912590027,
-0.03328289836645126,
0.061779242008924484,
-0.10868941247463226,
0.0785231813788414,
0.0032317519653588533,
0.06676928699016571,
-0.05529617890715599,
0.06258679181337357,
0.03157283738255501,
-0.049678072333335876,
0.09429395198822021,
0.12489373236894608,
-0.09031921625137329,
0.12678301334381104,
-0.21505890786647797,
-0.0691489428281784,
0.020179802551865578,
0.06837047636508942,
-0.0004340113664511591,
-0.06866426020860672,
0.05485377833247185,
0.0946042612195015,
0.07159898430109024,
0.005368292797356844,
0.08163313567638397,
-0.03908194974064827,
0.0000810147394076921,
-0.04665693640708923,
-0.040786683559417725,
-0.023991486057639122,
0.06038520112633705,
0.04669380187988281,
0.13418899476528168,
0.14461550116539001,
-0.09732409566640854,
0.10239651799201965,
-0.08548527210950851,
0.023188883438706398,
-0.0667981505393982,
0.01340470276772976,
-0.14995147287845612,
-0.07048550993204117,
0.05623941496014595,
-0.028577148914337158,
0.14062057435512543,
0.040026258677244186,
0.11386147141456604,
-0.01801426149904728,
-0.07211856544017792,
0.006889369338750839,
-0.010489778593182564,
0.22416098415851593,
0.06335610151290894,
0.07076454907655716,
-0.08262766152620316,
-0.014129101298749447,
0.004136199131608009,
0.13935339450836182,
-0.04844309017062187,
0.14967651665210724,
-0.017635473981499672,
0.07232625037431717,
0.0950506180524826,
-0.058448925614356995,
-0.02971401996910572,
-0.011691359803080559,
-0.132081538438797,
0.03413556143641472,
-0.07764363288879395,
0.2042664736509323,
0.17120574414730072,
-0.05838659778237343,
0.05650279298424721,
0.025753000751137733,
-0.07819607108831406,
-0.14611032605171204,
-0.10394876450300217,
-0.05974908173084259,
-0.1845182478427887,
0.037520091980695724,
-0.0698074921965599,
0.02378573827445507,
0.0675153061747551,
0.049653321504592896,
-0.053356096148490906,
0.10638818889856339,
0.03458084166049957,
-0.10820610821247101,
0.06242427974939346,
-0.07363589853048325,
0.0012855679960921407,
-0.08617605268955231,
0.03208707645535469,
0.1439523845911026,
0.010240159928798676,
0.05256599932909012,
0.014195921830832958,
-0.055923089385032654,
0.02165803499519825,
-0.07717518508434296,
-0.042723797261714935,
-0.025967827066779137,
-0.030753199011087418,
0.08437582105398178,
0.17838962376117706,
0.11417385190725327,
-0.07583623379468918,
-0.028279365971684456,
0.08992603421211243,
-0.020983899012207985,
-0.13647405803203583,
-0.17748615145683289,
0.1098918691277504,
0.029836658388376236,
0.03713664412498474,
-0.01090295147150755,
-0.031333066523075104,
-0.030303191393613815,
0.2679857611656189,
0.22661636769771576,
0.07995722442865372,
0.01945669949054718,
-0.031168194487690926,
-0.011432457715272903,
-0.03237859159708023,
0.07120554894208908,
0.03235044330358505,
0.27167192101478577,
-0.04004308953881264,
-0.007637819740921259,
-0.1349581778049469,
-0.04676533862948418,
-0.01346513256430626,
0.08080799877643585,
-0.06016402691602707,
-0.10935907065868378,
-0.005271032452583313,
0.1597886085510254,
-0.07032445818185806,
-0.0691215991973877,
-0.1211681142449379,
-0.1180364340543747,
-0.09967149049043655,
-0.02255817875266075,
-0.006776245776563883,
0.10229405015707016,
0.020166469737887383,
-0.09097801148891449,
0.015885798260569572,
0.13522349298000336,
-0.00395344290882349,
-0.0676364079117775,
-0.08559912443161011,
0.08086120337247849,
-0.0938572809100151,
0.056122247129678726,
-0.012491599656641483,
0.16533099114894867,
0.024083850905299187,
0.11584725230932236,
-0.008338713087141514,
0.13312670588493347,
-0.028329724445939064,
-0.06536652147769928,
0.04027407988905907,
0.11121749877929688,
-0.03991442173719406,
0.1312757283449173,
0.02052551880478859,
-0.09964890778064728,
0.10654383897781372,
-0.20786798000335693,
-0.004845696501433849,
-0.05735671892762184,
0.08383536338806152,
-0.039098478853702545,
0.07920515537261963,
0.10081277787685394,
-0.06449609249830246,
-0.06079583242535591,
-0.07616705447435379,
0.06191301718354225,
0.038464080542325974,
-0.08777663856744766,
-0.05194691941142082,
-0.275260329246521,
0.013982521370053291,
-0.09990429878234863,
-0.043265968561172485,
-0.2204829603433609,
-0.025695335119962692,
-0.01918877847492695,
-0.06188188120722771,
0.009451345540583134,
0.011048229411244392,
0.10791318118572235,
-0.0007009683758951724,
0.004573167767375708,
0.021393802016973495,
0.032159242779016495,
0.08682005852460861,
-0.13727633655071259,
-0.09381616115570068
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Russian
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Russian using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "ru", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-russian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-russian")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
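The snippet above relies on Common Voice clips, which are distributed at 48 kHz. For your own recordings the only hard requirement is the 16 kHz input rate mentioned above; the sketch below shows one way to meet it with torchaudio (the file name `my_recording.wav` and its sample rate are placeholders, not part of the original example):

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-russian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-russian")

# "my_recording.wav" is a placeholder for any local audio file
speech_array, original_rate = torchaudio.load("my_recording.wav")

# average the channels to mono and resample to the 16 kHz the model expects
speech_array = speech_array.mean(dim=0, keepdim=True)
speech = torchaudio.transforms.Resample(original_rate, 16_000)(speech_array).squeeze().numpy()

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
print(processor.batch_decode(torch.argmax(logits, dim=-1))[0])
```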
## Evaluation
The model can be evaluated as follows on the Russian test data of Common Voice.
```python
import torch
import torchaudio
import urllib.request
import tarfile
import pandas as pd
from tqdm.auto import tqdm
from datasets import load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
# Download the raw data instead of using HF datasets to save disk space
data_url = "https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-6.1-2020-12-11/ru.tar.gz"
filestream = urllib.request.urlopen(data_url)
data_file = tarfile.open(fileobj=filestream, mode="r|gz")
data_file.extractall()
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-russian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-russian")
model.to("cuda")
cv_test = pd.read_csv("cv-corpus-6.1-2020-12-11/ru/test.tsv", sep='\t')
clips_path = "cv-corpus-6.1-2020-12-11/ru/clips/"
def clean_sentence(sent):
sent = sent.lower()
# these letters are considered equivalent in written Russian
sent = sent.replace('ё', 'е')
# replace non-alpha characters with space
sent = "".join(ch if ch.isalpha() else " " for ch in sent)
# remove repeated spaces
sent = " ".join(sent.split())
return sent
targets = []
preds = []
for i, row in tqdm(cv_test.iterrows(), total=cv_test.shape[0]):
row["sentence"] = clean_sentence(row["sentence"])
speech_array, sampling_rate = torchaudio.load(clips_path + row["path"])
resampler = torchaudio.transforms.Resample(sampling_rate, 16_000)
row["speech"] = resampler(speech_array).squeeze().numpy()
inputs = processor(row["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
targets.append(row["sentence"])
preds.append(processor.batch_decode(pred_ids)[0])
# free up some memory
del model
del processor
del cv_test
print("WER: {:.2f}".format(100 * wer.compute(predictions=preds, references=targets)))
```
**Test Result**: 17.39 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
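The exact fine-tuning script and hyperparameters are not reproduced in this card. As a rough sketch of how those two splits might be loaded and merged with the `datasets` library (an illustration only, not the recipe used to produce this checkpoint):

```python
from datasets import load_dataset, concatenate_datasets

# both splits of Common Voice ru went into fine-tuning, so merge them
train_split = load_dataset("common_voice", "ru", split="train")
validation_split = load_dataset("common_voice", "ru", split="validation")
train_data = concatenate_datasets([train_split, validation_split])

print(train_data)                 # number of rows and column names
print(train_data[0]["sentence"])  # one reference transcription
```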
|
{"language": "ru", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Russian XLSR Wav2Vec2 Large 53 by Anton Lozhkov", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice ru", "type": "common_voice", "args": "ru"}, "metrics": [{"type": "wer", "value": 17.39, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anton-l/wav2vec2-large-xlsr-53-russian
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"ru",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"ru"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ru #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-Large-XLSR-53-Russian
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Russian using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Russian test data of Common Voice.
Test Result: 17.39 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Russian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Russian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Russian test data of Common Voice.\n\n\n\n\nTest Result: 17.39 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ru #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Russian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Russian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Russian test data of Common Voice.\n\n\n\n\nTest Result: 17.39 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
84,
64,
20,
27,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ru #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n# Wav2Vec2-Large-XLSR-53-Russian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Russian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Russian test data of Common Voice.\n\n\n\n\nTest Result: 17.39 %## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.12814460694789886,
0.0021636486053466797,
-0.002336677396669984,
-0.03343189135193825,
0.09441069513559341,
-0.04468809440732002,
0.2209973931312561,
0.08075679838657379,
0.004928382113575935,
-0.011842259205877781,
0.06015389412641525,
-0.06628983467817307,
0.02753954380750656,
0.1327437162399292,
0.006786101497709751,
-0.20731230080127716,
0.008022580295801163,
-0.025622926652431488,
0.0278788935393095,
0.11057115346193314,
0.11761642247438431,
-0.08515875041484833,
-0.02004239335656166,
0.07294482737779617,
-0.13509269058704376,
0.06857862323522568,
0.063272625207901,
-0.11744900792837143,
0.13071510195732117,
0.09145831316709518,
0.06373310834169388,
0.04870538413524628,
0.06843923777341843,
-0.196180522441864,
0.025388062000274658,
0.029473530128598213,
-0.005484814755618572,
0.018517229706048965,
0.0696844607591629,
-0.026369262486696243,
0.1339244395494461,
0.038762226700782776,
-0.04632638767361641,
0.07632414996623993,
-0.05905676260590553,
-0.19336169958114624,
-0.01843346655368805,
0.03279702365398407,
0.08015596866607666,
0.1657327562570572,
-0.08329807966947556,
0.10058518499135971,
-0.13293154537677765,
0.10161609202623367,
0.10341940820217133,
-0.21564914286136627,
-0.011970044113695621,
0.09897170215845108,
0.014110244810581207,
0.09983944892883301,
-0.04866480827331543,
0.06995199620723724,
0.039357371628284454,
0.0015164926880970597,
-0.008162721060216427,
-0.01622694358229637,
-0.19924163818359375,
0.0041311983950436115,
-0.15338970720767975,
-0.029899531975388527,
0.2270222157239914,
-0.0004207630699966103,
-0.03546895086765289,
-0.1410640925168991,
0.00018443851149640977,
-0.04636423662304878,
-0.004373874980956316,
-0.03538098186254501,
-0.021977242082357407,
0.016684245318174362,
-0.06148504838347435,
-0.05963209643959999,
-0.13967706263065338,
-0.1399431973695755,
0.006179287098348141,
0.11357957124710083,
0.007157511543482542,
0.041047949343919754,
-0.1129259318113327,
0.0544724240899086,
-0.12642763555049896,
-0.053920917212963104,
0.01201704703271389,
-0.001824475359171629,
-0.09767908602952957,
0.03286954388022423,
-0.11700654774904251,
-0.16411389410495758,
0.04748568683862686,
-0.04276953265070915,
0.03624677285552025,
0.029150931164622307,
0.0027712818700820208,
0.0611838698387146,
0.025961702689528465,
0.13094757497310638,
-0.06053021177649498,
-0.010235668160021305,
-0.013485659845173359,
-0.034917544573545456,
-0.05608631297945976,
-0.016102323308587074,
-0.0943373590707779,
-0.08189745247364044,
0.011712008155882359,
0.054934918880462646,
-0.041326068341732025,
0.00430706050246954,
-0.00569376302883029,
-0.012647349387407303,
-0.007948869839310646,
-0.0951504185795784,
-0.020912043750286102,
0.06313494592905045,
-0.0006145578809082508,
0.0777805745601654,
0.005408245604485273,
0.04879762977361679,
-0.06859404593706131,
-0.014461692422628403,
0.04734952747821808,
0.06582505255937576,
-0.02636861987411976,
-0.1302771270275116,
-0.001969845499843359,
-0.07208667695522308,
0.002788787940517068,
-0.11192014813423157,
-0.06467541307210922,
-0.09377213567495346,
-0.0038842391222715378,
0.023636939004063606,
0.005064983852207661,
-0.08659786731004715,
-0.04042981564998627,
-0.006740994285792112,
-0.060595136135816574,
0.04947077855467796,
-0.045834437012672424,
0.04673594981431961,
-0.013584690168499947,
0.0679268017411232,
0.05479736626148224,
0.0710390955209732,
-0.09076642990112305,
-0.08255719393491745,
0.049544647336006165,
0.15896369516849518,
-0.056485578417778015,
-0.07042839378118515,
-0.08527203649282455,
-0.0656878724694252,
-0.06473594158887863,
0.07387694716453552,
0.06088525801897049,
0.10612965375185013,
-0.26814958453178406,
-0.0867827981710434,
0.183859720826149,
-0.12449613958597183,
0.011048543266952038,
0.19282135367393494,
-0.018819009885191917,
0.08046360313892365,
0.15464305877685547,
0.2505992352962494,
0.11137274652719498,
-0.17331378161907196,
-0.005629383493214846,
-0.0069034406915307045,
-0.037033695727586746,
-0.0322650745511055,
0.05732779949903488,
-0.03277449682354927,
-0.00324450246989727,
0.02552666701376438,
-0.042896632105112076,
0.055680178105831146,
-0.041133105754852295,
-0.07845359295606613,
-0.019808195531368256,
-0.08178102970123291,
0.07364910840988159,
0.027633050456643105,
0.029041895642876625,
-0.05655054748058319,
-0.08282794803380966,
0.028053071349859238,
0.10711738467216492,
-0.1184384673833847,
0.0582706481218338,
-0.11440730094909668,
0.03640615567564964,
-0.05094707012176514,
0.0032693271059542894,
-0.14445877075195312,
0.16608954966068268,
-0.021079247817397118,
0.11489986628293991,
0.05915345251560211,
0.1939288228750229,
0.039827488362789154,
-0.019174272194504738,
-0.05434334650635719,
-0.013628105632960796,
-0.01546215545386076,
-0.028731459751725197,
-0.042322177439928055,
-0.12889397144317627,
-0.012457165867090225,
-0.07194621860980988,
0.06675023585557938,
-0.15219956636428833,
-0.015498862601816654,
0.047554098069667816,
0.007185520604252815,
0.00918701570481062,
0.013363394886255264,
0.03394099697470665,
0.09442871809005737,
-0.007295316085219383,
0.03366005793213844,
0.03849516063928604,
-0.01830269768834114,
-0.058310601860284805,
0.15310733020305634,
-0.0980522558093071,
-0.00565199414268136,
0.1139168068766594,
-0.02249297872185707,
-0.02686494030058384,
0.030314436182379723,
-0.02272230200469494,
0.0006180331693030894,
-0.052700649946928024,
0.006853033322840929,
0.2832493185997009,
-0.012300470843911171,
0.07577076554298401,
-0.08476091921329498,
0.019012993201613426,
0.04136515036225319,
-0.09306599944829941,
0.01696639694273472,
0.08257196843624115,
-0.014534701593220234,
-0.0017732908017933369,
0.04063105583190918,
-0.05877656862139702,
-0.08500580489635468,
0.29746487736701965,
-0.018076863139867783,
-0.09953459352254868,
0.025113461539149284,
-0.04321122169494629,
-0.03966372460126877,
0.11385713517665863,
-0.1348741352558136,
-0.06113605573773384,
0.03146163374185562,
0.05056680738925934,
0.07869718223810196,
-0.13979396224021912,
0.011414282955229282,
-0.0022682086564600468,
-0.14680515229701996,
-0.15372471511363983,
0.07779405266046524,
-0.04374633729457855,
0.05902904272079468,
-0.10271964967250824,
-0.027644941583275795,
-0.007698156405240297,
-0.05616072937846184,
-0.1682441681623459,
0.10708055645227432,
-0.08625870943069458,
-0.21768952906131744,
-0.134488046169281,
0.02197791263461113,
0.008452956564724445,
0.026474589481949806,
0.10765748471021652,
-0.1762600839138031,
-0.0031987500842660666,
-0.036957740783691406,
0.11896952986717224,
0.00586501881480217,
-0.008491214364767075,
-0.072901152074337,
0.05265720933675766,
0.06255556643009186,
-0.10562988370656967,
0.009764657355844975,
-0.07456039637327194,
-0.020933734253048897,
0.043563634157180786,
0.008451989851891994,
0.004210242070257664,
0.14741690456867218,
0.004695053678005934,
0.020465463399887085,
-0.04830782860517502,
0.16803158819675446,
-0.10822389274835587,
-0.03369345888495445,
0.171355739235878,
0.012192939408123493,
-0.040879059582948685,
0.10900861769914627,
0.010225262492895126,
-0.048845719546079636,
0.017238736152648926,
-0.012078340165317059,
-0.08868193626403809,
-0.24887627363204956,
-0.15800096094608307,
-0.07271304726600647,
-0.04470628499984741,
-0.027611015364527702,
-0.0026318165473639965,
0.03947178274393082,
0.014286361634731293,
-0.03925643116235733,
-0.11946801841259003,
0.05461936444044113,
-0.017569663003087044,
0.13235102593898773,
-0.007384854834526777,
0.12141010165214539,
-0.06213419884443283,
-0.015706241130828857,
0.031188949942588806,
-0.01827109232544899,
0.1891227662563324,
0.06265583634376526,
0.051168594509363174,
0.0807846412062645,
0.11461052298545837,
0.12402612715959549,
0.05151113122701645,
0.002237324370071292,
-0.027933120727539062,
0.03174222633242607,
-0.07026343792676926,
-0.025438999757170677,
0.08361120522022247,
0.14990155398845673,
-0.09931566566228867,
-0.04390496388077736,
-0.014014051295816898,
0.03756525367498398,
0.23013094067573547,
0.09718269854784012,
-0.1807461678981781,
-0.11948610097169876,
-0.035650186240673065,
-0.06631042063236237,
0.020808560773730278,
0.04933791980147362,
0.18320871889591217,
-0.12465299665927887,
0.02327127754688263,
0.007059075869619846,
0.08958921581506729,
0.05252130329608917,
0.034934788942337036,
-0.08795804530382156,
0.014379818923771381,
-0.01093603577464819,
0.10664995014667511,
-0.26411908864974976,
0.25201401114463806,
-0.0013088794657960534,
0.10008730739355087,
-0.06020635738968849,
-0.027968252077698708,
-0.008049605414271355,
0.044775646179914474,
0.08722435683012009,
0.010404523462057114,
-0.03556341305375099,
-0.09633318334817886,
-0.08200179040431976,
0.05519210547208786,
0.016324903815984726,
0.04368443787097931,
0.05739358812570572,
0.001984511036425829,
0.0013046114472672343,
0.0176838431507349,
-0.047318775206804276,
-0.16190989315509796,
-0.08523596078157425,
-0.01773078367114067,
0.18108771741390228,
0.10244060307741165,
-0.007313098758459091,
-0.09881492704153061,
-0.07698891311883926,
0.036483459174633026,
-0.062216032296419144,
-0.06102891266345978,
-0.053971584886312485,
-0.014637015759944916,
0.12990862131118774,
-0.06610260903835297,
0.012923979200422764,
0.09917353093624115,
0.1160513386130333,
-0.049452416598796844,
-0.04535050690174103,
0.03529466688632965,
-0.11844035983085632,
-0.11828377842903137,
0.005360331851989031,
0.19573642313480377,
0.1295299232006073,
0.0865357294678688,
0.04169180244207382,
0.03699972480535507,
-0.01371988095343113,
-0.025841979309916496,
0.02689288929104805,
0.09170442074537277,
-0.1116832047700882,
0.034829121083021164,
0.025743728503584862,
-0.1653173267841339,
-0.146038219332695,
-0.06734952330589294,
0.1859729290008545,
0.09404004365205765,
-0.1141270101070404,
0.23008006811141968,
0.1571774184703827,
-0.08404543250799179,
-0.21764469146728516,
-0.003223360748961568,
0.12644192576408386,
0.15280190110206604,
0.0045216623693704605,
-0.20083068311214447,
0.018163856118917465,
-0.054352011531591415,
-0.03310254216194153,
-0.052218180149793625,
-0.23641587793827057,
-0.15578879415988922,
0.14365226030349731,
-0.07915171980857849,
0.14485470950603485,
0.020639747381210327,
-0.005620679352432489,
-0.034025195986032486,
0.0710734874010086,
0.027322378009557724,
-0.08053122460842133,
0.11484722793102264,
0.035223864018917084,
0.09758427739143372,
0.06734699010848999,
-0.01428725104779005,
0.09204475581645966,
0.06136665493249893,
-0.008087213151156902,
-0.002056993544101715,
0.0735396072268486,
0.03828549012541771,
0.016750020906329155,
0.11402276903390884,
-0.07567450404167175,
0.038094744086265564,
-0.08797071874141693,
-0.10421375930309296,
-0.06122982129454613,
0.06805295497179031,
0.04032094404101372,
-0.044771336019039154,
-0.006701442878693342,
-0.028868889436125755,
0.019867345690727234,
-0.009283163584768772,
-0.002409914042800665,
-0.13485246896743774,
0.03369687870144844,
0.1525062769651413,
0.1688527911901474,
-0.019308999180793762,
-0.0725473165512085,
0.02075420320034027,
-0.03389997035264969,
0.12894397974014282,
-0.10904442518949509,
0.028192950412631035,
0.06881938129663467,
0.027257177978754044,
0.1039666160941124,
-0.0042320601642131805,
-0.10607652366161346,
0.06081124767661095,
0.0533427968621254,
-0.10087990760803223,
-0.11641030758619308,
-0.03878220170736313,
-0.04337407276034355,
-0.0019529284909367561,
0.02253655344247818,
0.1454489678144455,
-0.1297266185283661,
0.001882324693724513,
-0.02688150852918625,
0.01702672615647316,
-0.11323960870504379,
0.21340104937553406,
0.023424360901117325,
0.06964968144893646,
-0.09840042144060135,
0.01947776973247528,
-0.01646273024380207,
-0.03322708234190941,
0.06835483759641647,
-0.047792185097932816,
-0.08040598779916763,
-0.06408490240573883,
-0.030516203492879868,
0.04825974628329277,
0.025310711935162544,
-0.14584526419639587,
-0.04862096905708313,
-0.10389167815446854,
0.02294250577688217,
0.07133178412914276,
0.06446883827447891,
-0.00899069756269455,
-0.09658900648355484,
-0.059092625975608826,
-0.08784028887748718,
0.05814056098461151,
0.08868230879306793,
-0.0548233799636364,
-0.1254304200410843,
0.1741558462381363,
0.05371096730232239,
0.03970443457365036,
-0.0599859245121479,
-0.06088707223534584,
-0.007835570722818375,
0.09083300083875656,
-0.09458383172750473,
-0.010828131809830666,
-0.040070511400699615,
-0.003104905830696225,
-0.01077879685908556,
-0.0735357478260994,
-0.03408598154783249,
0.08566265553236008,
-0.1015566810965538,
0.07492678612470627,
-0.03527579456567764,
0.06648224592208862,
-0.0792880579829216,
0.03920087218284607,
-0.005252161528915167,
-0.036800939589738846,
0.08871682733297348,
0.16769951581954956,
-0.08654725551605225,
0.14028549194335938,
-0.18210181593894958,
-0.017488285899162292,
0.06827693432569504,
0.06333998590707779,
-0.028973178938031197,
-0.08584681153297424,
0.030732693150639534,
0.08291958272457123,
0.0706937238574028,
0.0037113958969712257,
0.07446294277906418,
-0.05194208770990372,
0.005782718770205975,
-0.07364468276500702,
0.015995439141988754,
-0.02731134369969368,
0.05065390095114708,
0.04924589768052101,
0.1775275021791458,
0.1886376142501831,
-0.1351625621318817,
0.11624440550804138,
-0.1051056832075119,
0.018367527052760124,
-0.06621924042701721,
-0.03132379427552223,
-0.12793047726154327,
-0.07933685183525085,
0.08307655900716782,
-0.05628899484872818,
0.16766391694545746,
0.07728084176778793,
0.027973975986242294,
-0.013500213623046875,
-0.10522278398275375,
0.025321492925286293,
0.003187893657013774,
0.22733749449253082,
0.04478280246257782,
0.03528029844164848,
-0.04570557922124863,
0.012476126663386822,
0.022056160494685173,
0.12698133289813995,
0.0073097869753837585,
0.16255640983581543,
0.027468804270029068,
0.09441453963518143,
0.08422830700874329,
-0.09241078048944473,
0.017829131335020065,
-0.020219968631863594,
-0.1724608838558197,
0.04362315312027931,
-0.09734044969081879,
0.16291724145412445,
0.1622215062379837,
-0.10104624927043915,
0.08683255314826965,
0.006149441469460726,
-0.07833283394575119,
-0.1511908918619156,
-0.1123242899775505,
-0.07879121601581573,
-0.18817569315433502,
0.02666332758963108,
-0.09382793307304382,
0.023468080908060074,
0.04317621514201164,
0.046289872378110886,
-0.03157256171107292,
0.14382824301719666,
0.00547915231436491,
-0.10550614446401596,
0.09326960891485214,
-0.07774946093559265,
-0.00557619147002697,
-0.059197988361120224,
0.016755975782871246,
0.17294634878635406,
-0.03487605229020119,
0.03149758651852608,
0.015303532592952251,
-0.05653323978185654,
0.025746872648596764,
-0.05815702676773071,
-0.06851554661989212,
-0.01360617857426405,
0.008133472874760628,
0.12029476463794708,
0.13905145227909088,
0.1192576140165329,
-0.07125663757324219,
-0.01824941672384739,
0.15663443505764008,
-0.029683789238333702,
-0.14336489140987396,
-0.1364782601594925,
0.12964385747909546,
0.040249262005090714,
0.04867668077349663,
-0.020759474486112595,
-0.04424188658595085,
-0.03783150017261505,
0.21829575300216675,
0.2964671552181244,
0.06491069495677948,
0.043327756226062775,
-0.053992219269275665,
0.003485419787466526,
-0.01652679219841957,
0.08103300631046295,
0.041138045489788055,
0.2023434340953827,
-0.005120100453495979,
0.016733970493078232,
-0.11320096254348755,
-0.0515969842672348,
-0.002239074558019638,
0.04979187995195389,
-0.04561198502779007,
-0.1317417174577713,
0.009050036780536175,
0.16730549931526184,
-0.05723409727215767,
-0.1044526919722557,
-0.08601497858762741,
-0.12230047583580017,
-0.07887094467878342,
0.017468934878706932,
0.05951165407896042,
0.12167590111494064,
0.05182643607258797,
-0.07693914324045181,
0.02999420091509819,
0.09645330905914307,
0.0007724211900494993,
-0.07015819847583771,
-0.056188419461250305,
0.05144498497247696,
-0.09802600741386414,
-0.012312833219766617,
-0.022876350209116936,
0.1970718502998352,
0.014647887088358402,
0.09481092542409897,
-0.011637795716524124,
0.137028768658638,
-0.009019773453474045,
-0.09971976280212402,
0.018419628962874413,
0.1172923818230629,
-0.043163903057575226,
0.12191641330718994,
0.04539593309164047,
-0.13947749137878418,
0.05521968752145767,
-0.1421528458595276,
0.011195044964551926,
-0.07266984134912491,
0.09407100081443787,
-0.053329985588788986,
0.0644867941737175,
0.11033537238836288,
-0.0756828784942627,
-0.06284534186124802,
-0.04066010192036629,
0.057812321931123734,
0.020675035193562508,
-0.06474611163139343,
-0.03253903612494469,
-0.2511226236820221,
-0.017833450809121132,
-0.05288611352443695,
-0.01747380942106247,
-0.18034619092941284,
-0.014473446644842625,
-0.009521466679871082,
-0.05709516257047653,
0.020784607157111168,
0.049611646682024,
0.08679722994565964,
0.0020873025059700012,
0.007172546349465847,
-0.022721799090504646,
0.07243986427783966,
0.10485874861478806,
-0.15757307410240173,
-0.12488547712564468
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Sakha
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Sakha using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "sah", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-sakha")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-sakha")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
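Sakha is written in Cyrillic with a few letters that Russian does not use (ө, ү, һ, ҕ, ҥ). If you want to double-check that the CTC vocabulary of this checkpoint covers them, a quick inspection looks like this (a sketch added for convenience, not part of the original card):

```python
from transformers import Wav2Vec2Processor

processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-sakha")
vocab = processor.tokenizer.get_vocab()

# print the full character vocabulary and check the Sakha-specific letters
print(sorted(vocab))
for ch in "өүһҕҥ":
    print(ch, ch in vocab)
```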
## Evaluation
The model can be evaluated as follows on the Sakha test data of Common Voice.
```python
import torch
import torchaudio
import urllib.request
import tarfile
import pandas as pd
from tqdm.auto import tqdm
from datasets import load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
# Download the raw data instead of using HF datasets to save disk space
data_url = "https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-6.1-2020-12-11/sah.tar.gz"
filestream = urllib.request.urlopen(data_url)
data_file = tarfile.open(fileobj=filestream, mode="r|gz")
data_file.extractall()
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-sakha")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-sakha")
model.to("cuda")
cv_test = pd.read_csv("cv-corpus-6.1-2020-12-11/sah/test.tsv", sep='\t')
clips_path = "cv-corpus-6.1-2020-12-11/sah/clips/"
def clean_sentence(sent):
sent = sent.lower()
# replace non-alpha characters with space
sent = "".join(ch if ch.isalpha() else " " for ch in sent)
# remove repeated spaces
sent = " ".join(sent.split())
return sent
targets = []
preds = []
for i, row in tqdm(cv_test.iterrows(), total=cv_test.shape[0]):
row["sentence"] = clean_sentence(row["sentence"])
speech_array, sampling_rate = torchaudio.load(clips_path + row["path"])
resampler = torchaudio.transforms.Resample(sampling_rate, 16_000)
row["speech"] = resampler(speech_array).squeeze().numpy()
inputs = processor(row["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
targets.append(row["sentence"])
preds.append(processor.batch_decode(pred_ids)[0])
print("WER: {:.2f}".format(100 * wer.compute(predictions=preds, references=targets)))
```
**Test Result**: 32.23 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
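As with the other models in this series, the training code itself is not part of the card. The snippet below only inspects the size of the two Common Voice Sakha splits that were used; treat it as an illustration rather than the original training setup:

```python
from datasets import load_dataset

# count the clips in the splits that were used for fine-tuning
for name in ("train", "validation"):
    split = load_dataset("common_voice", "sah", split=name)
    print(f"{name}: {split.num_rows} clips")
```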
|
{"language": "sah", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Sakha XLSR Wav2Vec2 Large 53 by Anton Lozhkov", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice sah", "type": "common_voice", "args": "sah"}, "metrics": [{"type": "wer", "value": 32.23, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anton-l/wav2vec2-large-xlsr-53-sakha
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"sah",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"sah"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #sah #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Sakha
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Sakha using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Sakha test data of Common Voice.
Test Result: 32.23 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Sakha\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Sakha using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Sakha test data of Common Voice.\n\n\n\nTest Result: 32.23 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #sah #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Sakha\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Sakha using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Sakha test data of Common Voice.\n\n\n\nTest Result: 32.23 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
80,
65,
20,
28,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #sah #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Sakha\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Sakha using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Sakha test data of Common Voice.\n\n\n\nTest Result: 32.23 %## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.1442311555147171,
-0.02297801338136196,
-0.0020131724886596203,
-0.04146438464522362,
0.0880548357963562,
-0.048915520310401917,
0.20784515142440796,
0.0700676441192627,
0.011963880620896816,
-0.009759482927620411,
0.04019331559538841,
-0.004002187866717577,
0.06671109050512314,
0.12070395797491074,
-0.0046308874152600765,
-0.213908851146698,
0.008862275630235672,
-0.026033611968159676,
0.03185730054974556,
0.11256738752126694,
0.1021348088979721,
-0.05960759148001671,
0.0019941646605730057,
0.08922166377305984,
-0.13555048406124115,
0.03874214366078377,
0.061276666820049286,
-0.12447891384363174,
0.1433722972869873,
0.09134320169687271,
0.06973957270383835,
0.05239059031009674,
0.09263934940099716,
-0.18183784186840057,
0.03333747014403343,
0.029346095398068428,
0.04532986506819725,
0.026305630803108215,
0.02798987552523613,
-0.004419646691530943,
0.1038828045129776,
0.08449151366949081,
-0.04529574140906334,
0.08726302534341812,
-0.0654875710606575,
-0.17101246118545532,
-0.022390490397810936,
0.001137837185524404,
0.1143229603767395,
0.1265156865119934,
-0.07348737865686417,
0.07576646655797958,
-0.12887850403785706,
0.07947412878274918,
0.0823541060090065,
-0.1603952944278717,
-0.012257459573447704,
0.09633081406354904,
0.033980242908000946,
0.10615720599889755,
-0.05505780503153801,
0.06115180253982544,
0.04203550145030022,
0.02244861051440239,
0.026455985382199287,
-0.03764936327934265,
-0.202764093875885,
0.0008330788114108145,
-0.12928368151187897,
-0.038058459758758545,
0.2117694616317749,
-0.00784677267074585,
-0.08147315680980682,
-0.1435210108757019,
-0.030568769201636314,
0.010933692567050457,
-0.010521486401557922,
-0.046205777674913406,
-0.010524870827794075,
0.026324544101953506,
-0.008384499698877335,
-0.02903263084590435,
-0.1149647906422615,
-0.17591474950313568,
0.0003303465782664716,
0.07998577505350113,
0.01377324853092432,
0.03445310890674591,
-0.1485062688589096,
0.04213660582900047,
-0.10715585201978683,
-0.07834027707576752,
-0.0011967499740421772,
0.0005921911215409636,
-0.09793462604284286,
0.04528477415442467,
-0.0865110456943512,
-0.21365076303482056,
0.028035545721650124,
-0.049709197133779526,
0.041017211973667145,
0.037792354822158813,
-0.024778708815574646,
0.037210144102573395,
0.032572172582149506,
0.09614940732717514,
-0.057919830083847046,
0.008314159698784351,
0.017830975353717804,
0.02001003548502922,
-0.0260219257324934,
-0.03752703219652176,
-0.04458732530474663,
-0.05672828480601311,
-0.05278310552239418,
0.033803898841142654,
-0.042526956647634506,
-0.003992473240941763,
-0.02367514744400978,
-0.029349254444241524,
-0.007053544279187918,
-0.09899374842643738,
-0.06578007340431213,
0.060721639543771744,
0.03618566691875458,
0.11238467693328857,
0.05071331188082695,
0.056397270411252975,
-0.05149305984377861,
-0.02246379293501377,
0.020947225391864777,
0.031381167471408844,
-0.019203493371605873,
-0.0636684000492096,
-0.004337147809565067,
-0.0062635778449475765,
-0.020029332488775253,
-0.09750256687402725,
-0.1299208402633667,
-0.07888337224721909,
0.00010055784514406696,
0.03202974051237106,
-0.043074946850538254,
-0.05685873702168465,
-0.009120931848883629,
-0.011276273056864738,
-0.06797085702419281,
0.08641184121370316,
-0.03682165965437889,
0.05924304947257042,
0.04247606545686722,
0.056322190910577774,
0.06015212833881378,
0.08070315420627594,
-0.06207767874002457,
-0.03194150701165199,
-0.028642898425459862,
0.15282364189624786,
-0.05580265447497368,
-0.05602080374956131,
-0.0894809365272522,
-0.056886203587055206,
-0.08001977205276489,
0.08224616944789886,
0.0499211810529232,
0.11726418882608414,
-0.27905774116516113,
-0.09621348232030869,
0.1963556706905365,
-0.14388413727283478,
-0.026244718581438065,
0.20556117594242096,
-0.01250375248491764,
0.09167728573083878,
0.1285790652036667,
0.24568554759025574,
0.07601857930421829,
-0.17366616427898407,
0.025288602337241173,
0.02001269720494747,
0.027754463255405426,
-0.05210959538817406,
0.08884856849908829,
-0.04838636517524719,
-0.045573778450489044,
0.037620462477207184,
-0.0780722051858902,
0.09704864770174026,
-0.023011526092886925,
-0.06925332546234131,
-0.018290869891643524,
-0.09536037594079971,
0.061332378536462784,
0.04110809415578842,
0.032517265528440475,
-0.006797728594392538,
-0.04922184720635414,
0.034439049661159515,
0.13638567924499512,
-0.1150328740477562,
0.03974298760294914,
-0.10388610512018204,
0.05536423996090889,
-0.08043840527534485,
-0.007255423814058304,
-0.13784565031528473,
0.17966149747371674,
-0.035123441368341446,
0.024984123185276985,
0.05590550974011421,
0.17004358768463135,
0.015325970016419888,
0.0014442208921536803,
-0.03860063850879669,
-0.005190475843846798,
0.01070435345172882,
-0.022097548469901085,
-0.02758815325796604,
-0.10264377295970917,
-0.01845797896385193,
-0.058514200150966644,
0.14290377497673035,
-0.1623542755842209,
-0.0026378154288977385,
-0.01905898191034794,
-0.0039006290026009083,
-0.00001041531049850164,
-0.013001012615859509,
0.09610015153884888,
0.07648199796676636,
0.025351261720061302,
0.030620962381362915,
0.023961639031767845,
0.0026719938032329082,
-0.06694728881120682,
0.1548910290002823,
-0.1162634938955307,
-0.03964369744062424,
0.09788599610328674,
-0.024510927498340607,
0.009854898788034916,
0.06579728424549103,
0.002306616399437189,
-0.0198837723582983,
-0.11422649770975113,
0.007252430077642202,
0.28159916400909424,
0.007298030890524387,
0.10349863022565842,
-0.08223295956850052,
0.018004024401307106,
0.03435908257961273,
-0.09017373621463776,
0.06318008899688721,
0.06046745553612709,
0.006505683530122042,
0.038043420761823654,
0.02164490707218647,
-0.029566271230578423,
-0.09656830877065659,
0.22749188542366028,
-0.05371391400694847,
-0.08748070895671844,
0.026862313970923424,
-0.04545890539884567,
-0.04378310590982437,
0.07937340438365936,
-0.13329768180847168,
-0.05152537301182747,
0.04167322441935539,
0.04583251103758812,
0.043962400406599045,
-0.12295237183570862,
0.01441540103405714,
0.018383335322141647,
-0.11986948549747467,
-0.1799575686454773,
0.08064882457256317,
-0.057676441967487335,
0.03559432178735733,
-0.10655325651168823,
-0.009563094936311245,
-0.011332844384014606,
-0.04591458663344383,
-0.16544124484062195,
0.13496708869934082,
-0.04392794147133827,
-0.18284723162651062,
-0.13086055219173431,
0.0033353804610669613,
0.0002832038444466889,
0.017430461943149567,
0.08884147554636002,
-0.14198116958141327,
-0.009843630716204643,
-0.015400470234453678,
0.10217081010341644,
0.0271434523165226,
-0.022709764540195465,
-0.034353215247392654,
-0.020620044320821762,
0.07878381013870239,
-0.11974070221185684,
0.019839368760585785,
-0.06412317603826523,
-0.0026993693318217993,
0.002918811747804284,
0.00280399271287024,
0.005540078971534967,
0.1619795262813568,
0.04688628762960434,
0.012213965877890587,
-0.022442061454057693,
0.18425101041793823,
-0.07142224162817001,
-0.029787326231598854,
0.24826903641223907,
-0.0161226075142622,
-0.0272964034229517,
0.10624190419912338,
0.018004992976784706,
-0.05531412735581398,
-0.014773025177419186,
-0.0014069285243749619,
-0.1060890406370163,
-0.2625454366207123,
-0.10247929394245148,
-0.07916685938835144,
-0.060216546058654785,
-0.04952684044837952,
-0.0033097234554588795,
0.03259952738881111,
0.05160275846719742,
-0.008905578404664993,
-0.05447337403893471,
0.03444620966911316,
-0.02465136907994747,
0.1201922595500946,
-0.006231646053493023,
0.11176438629627228,
-0.0693427175283432,
-0.022069815546274185,
0.018854232504963875,
-0.0005255573778413236,
0.12156771123409271,
0.043563734740018845,
0.08401517570018768,
0.0949959084391594,
0.08919979631900787,
0.13750506937503815,
0.04687618464231491,
-0.04866441711783409,
-0.03536053001880646,
-0.005336626432836056,
-0.060995154082775116,
-0.08790646493434906,
0.07050575315952301,
0.09150759875774384,
-0.03819841891527176,
-0.0326431542634964,
0.007094025611877441,
0.005393005441874266,
0.2130270004272461,
0.0782831534743309,
-0.18478699028491974,
-0.09452173113822937,
0.0031042408663779497,
-0.07251351326704025,
-0.0016771790105849504,
0.06732144206762314,
0.15598954260349274,
-0.1217680275440216,
0.035053178668022156,
-0.009702351875603199,
0.09345247596502304,
0.02301601693034172,
0.03788980469107628,
-0.10913193970918655,
0.024706853553652763,
-0.006727681029587984,
0.07690056413412094,
-0.26439279317855835,
0.21316969394683838,
0.002912299707531929,
0.10507865995168686,
-0.04376429319381714,
0.004654054529964924,
0.023763803765177727,
0.06148335710167885,
0.1152825728058815,
0.022490834817290306,
-0.05058467015624046,
-0.040478795766830444,
-0.052508868277072906,
0.07602669298648834,
-0.04094594344496727,
0.028662225231528282,
0.022317353636026382,
-0.001288370112888515,
0.00420789048075676,
0.004340296145528555,
-0.009236357174813747,
-0.13194748759269714,
-0.04659628868103027,
0.008049029856920242,
0.16892299056053162,
0.11254501342773438,
-0.035006579011678696,
-0.07136959582567215,
-0.0799267590045929,
0.012961545959115028,
-0.048316020518541336,
-0.06343164294958115,
-0.0726727768778801,
-0.042039304971694946,
0.09985784441232681,
-0.06467559933662415,
0.004532868042588234,
0.0904918983578682,
0.10954563319683075,
-0.031545743346214294,
-0.05066665634512901,
0.035085372626781464,
-0.11521901935338974,
-0.10779144614934921,
-0.00518318684771657,
0.16062289476394653,
0.10015932470560074,
0.07859142124652863,
0.03888919577002525,
0.01432468555867672,
-0.010332743637263775,
-0.03764532878994942,
0.021123986691236496,
0.15349818766117096,
-0.10963989794254303,
0.008427749387919903,
0.05664139986038208,
-0.17365646362304688,
-0.09435192495584488,
-0.07107362151145935,
0.14870841801166534,
0.0866280272603035,
-0.05962346866726875,
0.20904293656349182,
0.24318143725395203,
-0.08509108424186707,
-0.20014885067939758,
-0.04733851179480553,
0.09179544448852539,
0.14001168310642242,
0.03299562633037567,
-0.1926104575395584,
0.08507242053747177,
-0.03186165913939476,
-0.0499236136674881,
-0.13367238640785217,
-0.2586500942707062,
-0.14668981730937958,
0.1673964262008667,
-0.044207774102687836,
0.193927600979805,
-0.013892138376832008,
-0.044248636811971664,
-0.03660235553979874,
-0.005077531561255455,
-0.004107523709535599,
-0.12469115853309631,
0.10306746512651443,
0.020070379599928856,
0.1278991848230362,
0.04518784210085869,
-0.02372118830680847,
0.10135632753372192,
0.08889248222112656,
-0.02195282280445099,
0.006659938022494316,
0.09358473867177963,
-0.00039818775258027017,
0.011535326018929482,
0.11539220064878464,
-0.12051395326852798,
0.05370854586362839,
-0.11372338980436325,
-0.0962948277592659,
-0.08623626083135605,
0.053575724363327026,
0.04388522729277611,
-0.04512408375740051,
0.026938222348690033,
-0.0594695582985878,
0.018120914697647095,
-0.005632218439131975,
-0.0329662524163723,
-0.14593924582004547,
0.0845293328166008,
0.12531408667564392,
0.18151859939098358,
-0.007876889780163765,
-0.0931587666273117,
-0.03351594880223274,
-0.03291470184922218,
0.122987762093544,
-0.16719484329223633,
0.029037972912192345,
0.035118214786052704,
0.05882098153233528,
0.10732091963291168,
0.00669343676418066,
-0.09352768957614899,
0.08428633213043213,
0.024771463125944138,
-0.059794969856739044,
-0.129902645945549,
-0.030107200145721436,
0.007695441599935293,
-0.04486613720655441,
0.025032367557287216,
0.12448128312826157,
-0.09500929713249207,
-0.02325463853776455,
-0.02835795469582081,
0.023464055731892586,
-0.12360879778862,
0.20708920061588287,
0.022336885333061218,
0.07960246503353119,
-0.11364082992076874,
0.005944341886788607,
-0.006638666614890099,
-0.042476505041122437,
0.041485242545604706,
-0.009253152646124363,
-0.07725050300359726,
-0.0630931705236435,
-0.03879229351878166,
0.06525420397520065,
0.03078124299645424,
-0.11858980357646942,
-0.05233439803123474,
-0.1278204619884491,
-0.006235051900148392,
0.08338172733783722,
0.043302442878484726,
-0.00964127667248249,
-0.11370514333248138,
-0.06418876349925995,
-0.09457211196422577,
0.06056971102952957,
0.0688893273472786,
-0.004714940208941698,
-0.12801982462406158,
0.12663842737674713,
0.06354142725467682,
0.06227084621787071,
-0.04842125624418259,
-0.05217771604657173,
-0.015845855697989464,
0.09269887208938599,
-0.14085069298744202,
0.0002133604430127889,
-0.051352910697460175,
-0.000993378460407257,
-0.007728095632046461,
-0.08080504089593887,
-0.014683499000966549,
0.0777091309428215,
-0.09088204801082611,
0.08636653423309326,
-0.01057813223451376,
0.08705733716487885,
-0.0781472697854042,
0.042585842311382294,
0.02913658507168293,
-0.04977374151349068,
0.08037569373846054,
0.15517807006835938,
-0.09924858808517456,
0.12219410389661789,
-0.17722409963607788,
-0.03492597118020058,
0.05497582629323006,
0.05490727350115776,
-0.03763686120510101,
-0.10336945950984955,
0.04029396176338196,
0.06860538572072983,
0.05111826956272125,
-0.017928805202245712,
0.07703287899494171,
-0.04334922507405281,
0.018800033256411552,
-0.04671147093176842,
0.019709143787622452,
-0.0458519421517849,
0.02099766582250595,
0.06831689924001694,
0.1724574863910675,
0.12871049344539642,
-0.10042791813611984,
0.10076618939638138,
-0.1313066929578781,
0.015091442503035069,
-0.06287628412246704,
-0.023355785757303238,
-0.15719455480575562,
-0.11918933689594269,
0.06590061634778976,
-0.06018736585974693,
0.12449875473976135,
0.03343474492430687,
0.02012319304049015,
-0.02062845416367054,
-0.07918212562799454,
0.07615330815315247,
-0.012533828616142273,
0.25202322006225586,
0.04496031999588013,
0.04643566161394119,
-0.008331689052283764,
-0.019762106239795685,
-0.0035657763946801424,
0.1670100837945938,
-0.009406247176229954,
0.16558094322681427,
-0.010838987305760384,
0.07113750278949738,
0.07103239744901657,
-0.05490999296307564,
-0.03152257204055786,
-0.018126077950000763,
-0.16939997673034668,
0.021438147872686386,
-0.07429942488670349,
0.14334550499916077,
0.14133161306381226,
-0.09286681562662125,
0.08371427655220032,
-0.016116352751851082,
-0.08166898787021637,
-0.17475363612174988,
-0.11076202988624573,
-0.05576830357313156,
-0.15766417980194092,
0.03172563388943672,
-0.07614360749721527,
0.03163669630885124,
0.11710401624441147,
0.01603851467370987,
0.003519408404827118,
0.16882553696632385,
0.026168329641222954,
-0.09248078614473343,
0.06017936021089554,
-0.08175327628850937,
-0.020173780620098114,
-0.08587522804737091,
0.04418691620230675,
0.19494517147541046,
-0.004621958825737238,
0.034114714711904526,
-0.01094602607190609,
-0.06248580664396286,
0.04533495754003525,
-0.0629214197397232,
-0.06719698011875153,
-0.015455858781933784,
-0.02030242793262005,
0.10088314861059189,
0.13104118406772614,
0.11895470321178436,
-0.06440912187099457,
0.0022940251510590315,
0.1313399225473404,
-0.05536951497197151,
-0.13333183526992798,
-0.16591212153434753,
0.15708962082862854,
0.003389172488823533,
0.009089247323572636,
-0.010916231200098991,
-0.028489883989095688,
-0.009247811511158943,
0.23355649411678314,
0.20163792371749878,
0.03859942406415939,
0.029045691713690758,
-0.04279250279068947,
-0.0072629377245903015,
-0.05991051346063614,
0.0771382749080658,
0.06329496204853058,
0.23287202417850494,
-0.017505789175629616,
0.05389708653092384,
-0.075508251786232,
-0.10336054116487503,
0.011456598527729511,
0.09754438698291779,
-0.07704108953475952,
-0.11309462040662766,
0.03963310271501541,
0.14530393481254578,
-0.06504794210195541,
-0.11216457188129425,
-0.1030961349606514,
-0.07118115574121475,
-0.07626398652791977,
-0.02217649109661579,
0.020318446680903435,
0.13287575542926788,
-0.00314016779884696,
-0.09323270618915558,
0.02788427099585533,
0.15888871252536774,
0.0018397073727101088,
-0.050858911126852036,
-0.04563956707715988,
0.07485585659742355,
-0.06491843611001968,
-0.02297190949320793,
0.0037964079529047012,
0.1571507751941681,
0.004545221105217934,
0.08047473430633545,
-0.02450883947312832,
0.14951373636722565,
-0.01704154908657074,
-0.09187474846839905,
0.0005944381700828671,
0.16069768369197845,
-0.03315560892224312,
0.12973886728286743,
0.008641986176371574,
-0.11572583764791489,
0.051247477531433105,
-0.11637234687805176,
-0.028146883472800255,
-0.0699339285492897,
0.05449196323752403,
-0.05138629302382469,
0.09838923066854477,
0.06098082661628723,
-0.0636998638510704,
-0.052124373614788055,
-0.05714882165193558,
0.07913379371166229,
0.024195022881031036,
-0.06149831414222717,
-0.03512866795063019,
-0.23616959154605865,
-0.01367173157632351,
-0.056238070130348206,
-0.005177945829927921,
-0.23815134167671204,
-0.018354766070842743,
-0.025498053058981895,
-0.061050962656736374,
0.029608367010951042,
0.026993928477168083,
0.11528494954109192,
0.004308025352656841,
0.003147827461361885,
0.01011233776807785,
0.05049003288149834,
0.12295030802488327,
-0.1545412242412567,
-0.11105595529079437
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Slovenian
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Slovenian using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "sl", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-slovenian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-slovenian")
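# Resample the 48 kHz Common Voice clips to the 16 kHz expected by the model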
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Slovenian test data of Common Voice.
```python
import torch
import torchaudio
import urllib.request
import tarfile
import pandas as pd
from tqdm.auto import tqdm
from datasets import load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
# Download the raw data instead of using HF datasets to save disk space
data_url = "https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-6.1-2020-12-11/sl.tar.gz"
filestream = urllib.request.urlopen(data_url)
data_file = tarfile.open(fileobj=filestream, mode="r|gz")
data_file.extractall()
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-slovenian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-slovenian")
model.to("cuda")
cv_test = pd.read_csv("cv-corpus-6.1-2020-12-11/sl/test.tsv", sep='\t')
clips_path = "cv-corpus-6.1-2020-12-11/sl/clips/"
def clean_sentence(sent):
sent = sent.lower()
# replace non-alpha characters with space
sent = "".join(ch if ch.isalpha() else " " for ch in sent)
# remove repeated spaces
sent = " ".join(sent.split())
return sent
targets = []
preds = []
for i, row in tqdm(cv_test.iterrows(), total=cv_test.shape[0]):
row["sentence"] = clean_sentence(row["sentence"])
speech_array, sampling_rate = torchaudio.load(clips_path + row["path"])
resampler = torchaudio.transforms.Resample(sampling_rate, 16_000)
row["speech"] = resampler(speech_array).squeeze().numpy()
inputs = processor(row["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
targets.append(row["sentence"])
preds.append(processor.batch_decode(pred_ids)[0])
print("WER: {:2f}".format(100 * wer.compute(predictions=preds, references=targets)))
```
**Test Result**: 36.04 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
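As a minimal sketch (not taken from the original training script), those two splits can be loaded and merged with the `datasets` library before fine-tuning; the split names follow the usage snippet above, while the actual preprocessing and hyperparameters are not documented in this card.
```python
from datasets import load_dataset, concatenate_datasets

# Load the Slovenian Common Voice splits referenced above
train_split = load_dataset("common_voice", "sl", split="train")
valid_split = load_dataset("common_voice", "sl", split="validation")

# Merge them into a single fine-tuning dataset
train_data = concatenate_datasets([train_split, valid_split])
print(f"Training examples: {len(train_data)}")
```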
|
{"language": "sl", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Slovenian XLSR Wav2Vec2 Large 53 by Anton Lozhkov", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice sl", "type": "common_voice", "args": "sl"}, "metrics": [{"type": "wer", "value": 36.04, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anton-l/wav2vec2-large-xlsr-53-slovenian
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"sl",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"sl"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #sl #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Slovenian
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Slovenian using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Slovenian test data of Common Voice.
Test Result: 36.04 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Slovenian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Slovenian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Slovenian test data of Common Voice.\n\n\n\nTest Result: 36.04 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #sl #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Slovenian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Slovenian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Slovenian test data of Common Voice.\n\n\n\nTest Result: 36.04 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
80,
65,
20,
28,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #sl #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Slovenian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Slovenian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Slovenian test data of Common Voice.\n\n\n\nTest Result: 36.04 %## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.14911650121212006,
-0.028264999389648438,
-0.0022275226656347513,
-0.026105251163244247,
0.06195461377501488,
-0.05675118416547775,
0.17819903790950775,
0.07652779668569565,
0.0380045622587204,
-0.003664452349767089,
0.0558868832886219,
0.06317173689603806,
0.02190851978957653,
0.03926582634449005,
-0.017293274402618408,
-0.22264839708805084,
0.05262000113725662,
-0.03652876988053322,
0.08749731630086899,
0.10144174098968506,
0.12354559451341629,
-0.08677327632904053,
-0.012403175234794617,
0.07929585874080658,
-0.1290990710258484,
0.055105894804000854,
0.03146819397807121,
-0.1159328892827034,
0.12007047235965729,
0.066077820956707,
0.007494235411286354,
0.06779561191797256,
0.09521837532520294,
-0.18844202160835266,
0.01471179723739624,
0.00022063046344555914,
0.0160407442599535,
0.014405440539121628,
0.05453074723482132,
-0.02835802175104618,
0.18340495228767395,
0.03916998207569122,
-0.03976847976446152,
0.03048885054886341,
-0.038806453347206116,
-0.23893392086029053,
-0.0007603796548210084,
-0.03574114292860031,
0.08192566782236099,
0.1077209860086441,
-0.056609902530908585,
0.11946123838424683,
-0.17472213506698608,
0.11374092847108841,
0.05388106405735016,
-0.17770743370056152,
-0.011251650750637054,
0.06859707832336426,
0.0670812651515007,
0.09676392376422882,
-0.06644327193498611,
0.05709485337138176,
0.005782484542578459,
0.024008100852370262,
-0.0002740418422035873,
-0.04320252686738968,
-0.22451701760292053,
-0.02207677625119686,
-0.14122214913368225,
-0.04606647416949272,
0.2537626326084137,
-0.013938335701823235,
-0.06644540280103683,
-0.1326049119234085,
0.025894558057188988,
0.04298535734415054,
-0.0005494639626704156,
-0.02738461084663868,
-0.020684614777565002,
-0.005206749774515629,
-0.014903114177286625,
-0.02212424762547016,
-0.07699791342020035,
-0.14424008131027222,
0.04888129234313965,
0.16782629489898682,
0.02688867785036564,
0.0451086200773716,
-0.07396367937326431,
0.06491278111934662,
-0.06872329860925674,
-0.0726979523897171,
-0.009883476421236992,
0.03724880889058113,
-0.0615350641310215,
0.02771242894232273,
-0.08225422352552414,
-0.15376298129558563,
0.031101545318961143,
-0.02148602157831192,
0.14324457943439484,
0.0038582882843911648,
0.01718215085566044,
0.08876001089811325,
0.03462943434715271,
0.08820775896310806,
-0.05461091920733452,
-0.0358019657433033,
-0.005601316224783659,
-0.06423205137252808,
-0.03602182865142822,
-0.03019324317574501,
-0.11833649128675461,
-0.08655750006437302,
-0.01764957420527935,
0.08017119765281677,
-0.02975086122751236,
0.0003975776780862361,
0.022987911477684975,
-0.040814340114593506,
-0.027848878875374794,
-0.07338369637727737,
-0.02250879816710949,
0.03365260362625122,
-0.013629463501274586,
0.10877437889575958,
-0.020793035626411438,
0.060449812561273575,
-0.10220354795455933,
0.028704438358545303,
0.05723408982157707,
0.053868964314460754,
-0.03704730421304703,
-0.12934119999408722,
0.03873559460043907,
-0.029379690065979958,
0.020956339314579964,
-0.10351289808750153,
-0.0995379164814949,
-0.06670761853456497,
-0.01236720010638237,
-0.006857262458652258,
-0.0005500536062754691,
-0.1294979751110077,
-0.01366649940609932,
0.0029600784182548523,
-0.07396950572729111,
0.08870901167392731,
-0.052221596240997314,
0.06002125144004822,
-0.041039541363716125,
0.04285939782857895,
-0.016603194177150726,
0.05747795104980469,
-0.09077848494052887,
-0.06693139672279358,
0.02063414268195629,
0.12568961083889008,
-0.043596308678388596,
-0.11052724719047546,
-0.0796222910284996,
-0.07310609519481659,
-0.029022514820098877,
0.06116623803973198,
0.051574401557445526,
0.08725205063819885,
-0.29504379630088806,
-0.05853720381855965,
0.14882995188236237,
-0.12710243463516235,
0.02881104126572609,
0.2027456909418106,
-0.009026295505464077,
0.08419686555862427,
0.13550737500190735,
0.2768820822238922,
0.11441625654697418,
-0.1781637966632843,
0.05707656964659691,
0.0042807129211723804,
-0.013465900905430317,
-0.09093283116817474,
0.0890793725848198,
-0.0484142042696476,
-0.00016927291289903224,
0.05824650451540947,
-0.083247609436512,
0.0641014501452446,
-0.034742970019578934,
-0.039107564836740494,
-0.02253882586956024,
-0.0805782675743103,
0.08594352751970291,
0.03305713087320328,
0.01717539317905903,
-0.06997484713792801,
-0.033907607197761536,
0.0687667578458786,
0.12191909551620483,
-0.13437792658805847,
0.04867488145828247,
-0.06551322340965271,
0.11715511232614517,
-0.1452598124742508,
-0.027420824393630028,
-0.11305779218673706,
0.17352502048015594,
-0.005700977519154549,
0.05802902579307556,
0.017877038568258286,
0.19099482893943787,
0.04236816242337227,
0.02943025715649128,
-0.049281202256679535,
0.014672806486487389,
-0.014653644524514675,
-0.007784275803714991,
-0.01701822318136692,
-0.13038016855716705,
-0.04560565948486328,
-0.04480176791548729,
0.008786287158727646,
-0.13003729283809662,
-0.053438734263181686,
0.013651798479259014,
0.04370305687189102,
-0.008461957797408104,
-0.011483958922326565,
0.02351001650094986,
0.12029650062322617,
0.023472387343645096,
0.04805056378245354,
0.04655282199382782,
-0.0022412820253521204,
-0.023361017927527428,
0.18445584177970886,
-0.0863114520907402,
-0.051848575472831726,
0.07342564314603806,
-0.07890944182872772,
0.021022368222475052,
0.006431409157812595,
-0.00979657843708992,
-0.00802701711654663,
-0.05920363590121269,
-0.028568726032972336,
0.26881489157676697,
-0.031767550855875015,
0.07245898246765137,
-0.09891556948423386,
-0.021439706906676292,
0.05567970499396324,
-0.07486356794834137,
0.03445940464735031,
0.10681582242250443,
0.052714042365550995,
0.09595119953155518,
0.02310950681567192,
-0.04568490758538246,
-0.0660548284649849,
0.2778618037700653,
-0.019115330651402473,
-0.1364322006702423,
0.004679271951317787,
-0.0297397430986166,
-0.03535797446966171,
0.1194387897849083,
-0.21860606968402863,
-0.026236578822135925,
0.05566324666142464,
0.07193867862224579,
0.059022754430770874,
-0.09841640293598175,
0.004569103941321373,
0.004078394267708063,
-0.14417830109596252,
-0.19621644914150238,
0.08929120749235153,
-0.04052700102329254,
0.05042215809226036,
-0.10866808891296387,
-0.02838115394115448,
0.009927426464855671,
-0.04979408532381058,
-0.1422663778066635,
0.08794762194156647,
-0.07719126343727112,
-0.1536409705877304,
-0.17255941033363342,
0.030371952801942825,
-0.015572858974337578,
0.010275079868733883,
0.09751094877719879,
-0.08449195325374603,
-0.02555815875530243,
-0.0076314592733979225,
0.13257566094398499,
0.006328612100332975,
-0.04360470548272133,
-0.10524720698595047,
0.03186509385704994,
0.014189036563038826,
-0.11770381778478622,
-0.00123932387214154,
-0.0630558431148529,
0.008514601737260818,
-0.010005326941609383,
0.026635056361556053,
0.04675264656543732,
0.111045241355896,
0.016337290406227112,
0.03055737540125847,
-0.00549753662198782,
0.17792819440364838,
-0.10890109837055206,
-0.03273143246769905,
0.21253708004951477,
-0.032731346786022186,
-0.05318803712725639,
0.09111595153808594,
0.029239334166049957,
-0.04191737249493599,
-0.02755601517856121,
-0.017845438793301582,
-0.10190261900424957,
-0.1952820122241974,
-0.1595965176820755,
-0.09288156777620316,
-0.11852160096168518,
-0.04628654196858406,
-0.006579521112143993,
0.01827206462621689,
0.022655749693512917,
-0.028908703476190567,
-0.11875133961439133,
0.03235004469752312,
-0.006302988156676292,
0.12088750302791595,
-0.004333890974521637,
0.08211055397987366,
-0.050108782947063446,
-0.06773346662521362,
0.028965499252080917,
-0.005871552508324385,
0.17325259745121002,
0.03660932928323746,
0.07950689643621445,
0.09189033508300781,
0.13213500380516052,
0.10531327873468399,
0.09142044186592102,
-0.030943743884563446,
-0.008284589275717735,
0.02267196588218212,
-0.08636400103569031,
-0.02493508905172348,
0.01460668258368969,
0.13859978318214417,
-0.039913322776556015,
-0.06991449743509293,
-0.07372555881738663,
0.035482537001371384,
0.23142828047275543,
0.06990145146846771,
-0.17951583862304688,
-0.07421740889549255,
-0.029710764065384865,
-0.04637215659022331,
-0.013867217116057873,
0.02207249030470848,
0.23768237233161926,
-0.13428720831871033,
-0.012684744782745838,
0.023414529860019684,
0.07244601845741272,
-0.01566406898200512,
0.020734885707497597,
-0.09787245094776154,
0.035764869302511215,
0.006365690380334854,
0.11925607174634933,
-0.25541916489601135,
0.20577172935009003,
-0.003235539887100458,
0.14449311792850494,
-0.11000846326351166,
-0.03600281849503517,
-0.026874583214521408,
0.01238872017711401,
0.13536891341209412,
0.05117727443575859,
-0.053139131516218185,
-0.08228395134210587,
-0.1135018914937973,
0.06185000762343407,
0.007310499437153339,
0.02248718962073326,
0.02821917086839676,
0.0020608236081898212,
-0.0017451868625357747,
-0.002772222738713026,
-0.09688511490821838,
-0.10800083726644516,
-0.047230951488018036,
0.00883768405765295,
0.19670036435127258,
0.06113852187991142,
-0.004084332846105099,
-0.10024893283843994,
-0.039239201694726944,
0.030997583642601967,
-0.07879147678613663,
-0.050946302711963654,
-0.07595332711935043,
-0.08518185466527939,
0.13753940165042877,
-0.05058044567704201,
-0.009220121428370476,
0.11382071673870087,
0.13117623329162598,
-0.019245818257331848,
-0.044192634522914886,
0.029306478798389435,
-0.10314134508371353,
-0.11274024099111557,
0.00009298752411268651,
0.1825615018606186,
0.11297646164894104,
0.06428388506174088,
0.062243033200502396,
0.03353298828005791,
-0.006373440846800804,
-0.06134463846683502,
-0.010742616839706898,
0.09739749878644943,
-0.13193926215171814,
0.013911380432546139,
0.017289435490965843,
-0.0834355279803276,
-0.06640879064798355,
-0.07230880111455917,
0.18117918074131012,
0.09499533474445343,
-0.0786774531006813,
0.20057253539562225,
0.203078493475914,
-0.07949460297822952,
-0.22324682772159576,
-0.006824159529060125,
0.10028724372386932,
0.12660253047943115,
-0.031668759882450104,
-0.1112985834479332,
0.07054790109395981,
0.019241977483034134,
-0.018527036532759666,
-0.0197495948523283,
-0.3126121759414673,
-0.1571546494960785,
0.13575664162635803,
-0.02492949552834034,
0.19013182818889618,
0.07026185840368271,
-0.028686918318271637,
-0.025459274649620056,
0.024064654484391212,
-0.006117810029536486,
-0.10678165405988693,
0.09159768372774124,
0.014313392341136932,
0.1129319965839386,
0.07377585768699646,
-0.01906820386648178,
0.12075306475162506,
0.11228672415018082,
-0.004826886113733053,
-0.03026071935892105,
0.08543478697538376,
0.016470225527882576,
0.024004651233553886,
0.12383252382278442,
-0.10147220641374588,
0.028621438890695572,
-0.11285766214132309,
-0.10684012621641159,
-0.09642238914966583,
0.0945923775434494,
0.00880716647952795,
-0.051816072314977646,
0.058599941432476044,
-0.05423562973737717,
0.014829392544925213,
-0.004726619925349951,
-0.050198957324028015,
-0.16118237376213074,
0.0008464217535220087,
0.11181902140378952,
0.16073571145534515,
-0.02706090919673443,
-0.07985594123601913,
0.02477981522679329,
-0.02530348300933838,
0.10670681297779083,
-0.008110175840556622,
0.03235287219285965,
0.07854330539703369,
0.039484694600105286,
0.0830867737531662,
-0.002711609238758683,
-0.08073875308036804,
0.058180175721645355,
0.0253352802246809,
-0.08507432788610458,
-0.07798761129379272,
-0.02946125715970993,
-0.05967175215482712,
0.022527560591697693,
0.0031328555196523666,
0.1005861684679985,
-0.11123701930046082,
0.008277815766632557,
-0.07005663961172104,
-0.004357373807579279,
-0.12727318704128265,
0.20273816585540771,
0.036786384880542755,
0.07201014459133148,
-0.13256968557834625,
0.02865937538444996,
-0.04857916757464409,
-0.021377738565206528,
0.04309356212615967,
-0.05091536045074463,
-0.07820809632539749,
-0.05078163743019104,
0.0025628639850765467,
0.09145476669073105,
0.04816735163331032,
-0.16161750257015228,
-0.06968320906162262,
-0.13460539281368256,
0.0021960337180644274,
0.060180068016052246,
0.05527130141854286,
-0.008640478365123272,
-0.10859961807727814,
-0.08853265643119812,
-0.14493651688098907,
0.07186001539230347,
0.06411383301019669,
-0.038188762962818146,
-0.0666324719786644,
0.1842893660068512,
0.03530878946185112,
0.011829273775219917,
-0.05709454044699669,
-0.055513203144073486,
-0.0005447554285638034,
0.08652143180370331,
-0.09817220270633698,
-0.026777498424053192,
-0.05628233030438423,
0.011923777870833874,
0.0025076186284422874,
-0.08902530372142792,
-0.01932741515338421,
0.08329004049301147,
-0.10175879299640656,
0.08125952631235123,
-0.00331727322191,
0.055176328867673874,
-0.05097107216715813,
0.005720492918044329,
-0.001890091341920197,
-0.055287398397922516,
0.07340286672115326,
0.17469224333763123,
-0.08622688055038452,
0.14139266312122345,
-0.2156682014465332,
-0.007393854670226574,
0.04620431736111641,
0.06766793876886368,
0.001347151817753911,
-0.06020781025290489,
0.05571689456701279,
0.11990979313850403,
0.06202654168009758,
-0.002136099385097623,
0.07037729024887085,
-0.06973563134670258,
0.013406474143266678,
-0.019431287422776222,
-0.053563058376312256,
-0.018319401890039444,
0.08070487529039383,
0.029015984386205673,
0.13932976126670837,
0.15142160654067993,
-0.10607391595840454,
0.09455236792564392,
-0.06806796789169312,
0.012548845261335373,
-0.04201434180140495,
-0.024540942162275314,
-0.13553905487060547,
-0.08334499597549438,
0.06420689076185226,
-0.051241595298051834,
0.1470794975757599,
0.05938001349568367,
0.05031587928533554,
-0.03919333219528198,
-0.08902771025896072,
0.026752036064863205,
-0.02085629291832447,
0.22296646237373352,
0.02646743878722191,
0.04624856263399124,
-0.08305125683546066,
0.010387079790234566,
0.0002067504683509469,
0.12054906785488129,
-0.03398388996720314,
0.20075714588165283,
0.0713968425989151,
0.0680602565407753,
0.13281327486038208,
-0.0733247920870781,
0.0028900608886033297,
-0.027454711496829987,
-0.07348877936601639,
0.04732837527990341,
-0.08257804811000824,
0.19388270378112793,
0.09965108335018158,
-0.12134955823421478,
0.0768284872174263,
0.03965479135513306,
-0.09094759821891785,
-0.15293817222118378,
-0.1616218537092209,
-0.06416038423776627,
-0.14162077009677887,
0.028463153168559074,
-0.11687768250703812,
0.051076069474220276,
0.026223547756671906,
0.048617273569107056,
-0.045074231922626495,
0.11648190766572952,
-0.01551723200827837,
-0.08460529893636703,
0.09953222423791885,
-0.08142533153295517,
0.02988324873149395,
-0.10444007068872452,
0.010091707110404968,
0.1442253589630127,
-0.005863914266228676,
0.04759306088089943,
0.01304446067661047,
-0.0656958669424057,
-0.04172101989388466,
-0.08254706114530563,
-0.06872470676898956,
-0.0016743694432079792,
-0.033754751086235046,
0.12035778164863586,
0.1487068235874176,
0.13141421973705292,
-0.08611524105072021,
-0.01813529059290886,
0.14736127853393555,
-0.0243745818734169,
-0.146620512008667,
-0.13197281956672668,
0.07300709933042526,
0.04429863765835762,
0.05469562113285065,
-0.0019337829435244203,
-0.04065299034118652,
-0.00775856664404273,
0.2216368466615677,
0.2293044924736023,
0.11204209923744202,
0.02351965196430683,
-0.03885868564248085,
-0.015941768884658813,
-0.0022878521122038364,
0.08655453473329544,
0.028423797339200974,
0.21217547357082367,
-0.004088380839675665,
0.050813380628824234,
-0.09536748379468918,
-0.02710648439824581,
0.015339850448071957,
0.08825758844614029,
-0.08057726919651031,
-0.16047297418117523,
0.003473942168056965,
0.16169658303260803,
-0.07954226434230804,
-0.03469162434339523,
-0.07270132750272751,
-0.08095167577266693,
-0.10112249106168747,
-0.0005742195644415915,
0.005611091386526823,
0.11947692930698395,
0.02727004699409008,
-0.10317464917898178,
0.004632010590285063,
0.12654602527618408,
0.020785845816135406,
-0.04925008863210678,
-0.10348110646009445,
0.04294584318995476,
-0.05536629259586334,
0.035341426730155945,
-0.015066991560161114,
0.19204282760620117,
0.017968451604247093,
0.10406476259231567,
-0.024260349571704865,
0.1578742414712906,
-0.023581964895129204,
-0.1104632094502449,
0.055785518139600754,
0.14210624992847443,
-0.06008479744195938,
0.1465623676776886,
0.007049861829727888,
-0.13071975111961365,
0.05962839722633362,
-0.13193683326244354,
-0.0115735474973917,
-0.05491827055811882,
0.08584173023700714,
-0.03263679891824722,
0.07731129229068756,
0.10575740784406662,
-0.061426565051078796,
-0.056256286799907684,
-0.08136099576950073,
0.08687686175107956,
0.02400449849665165,
-0.07714086025953293,
-0.023528801277279854,
-0.2210719883441925,
-0.008883893489837646,
-0.038054272532463074,
-0.02575191855430603,
-0.21302245557308197,
-0.011001881211996078,
-0.02797398343682289,
-0.09956012666225433,
0.054226502776145935,
0.010295823216438293,
0.0802500769495964,
0.011212124489247799,
0.0004356222925707698,
-0.00973742175847292,
0.07031503319740295,
0.11275267601013184,
-0.14227566123008728,
-0.09205707907676697
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Tatar
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Tatar using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "tt", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-tatar")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-tatar")
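# Resample the 48 kHz Common Voice clips to the 16 kHz expected by the model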
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Tatar test data of Common Voice.
```python
import torch
import torchaudio
import urllib.request
import tarfile
import pandas as pd
from tqdm.auto import tqdm
from datasets import load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
# Download the raw data instead of using HF datasets to save disk space
data_url = "https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-6.1-2020-12-11/tt.tar.gz"
filestream = urllib.request.urlopen(data_url)
data_file = tarfile.open(fileobj=filestream, mode="r|gz")
data_file.extractall()
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-tatar")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-tatar")
model.to("cuda")
cv_test = pd.read_csv("cv-corpus-6.1-2020-12-11/tt/test.tsv", sep='\t')
clips_path = "cv-corpus-6.1-2020-12-11/tt/clips/"
def clean_sentence(sent):
sent = sent.lower()
# 'ё' is equivalent to 'е'
sent = sent.replace('ё', 'е')
# replace non-alpha characters with space
sent = "".join(ch if ch.isalpha() else " " for ch in sent)
# remove repeated spaces
sent = " ".join(sent.split())
return sent
targets = []
preds = []
for i, row in tqdm(cv_test.iterrows(), total=cv_test.shape[0]):
row["sentence"] = clean_sentence(row["sentence"])
speech_array, sampling_rate = torchaudio.load(clips_path + row["path"])
resampler = torchaudio.transforms.Resample(sampling_rate, 16_000)
row["speech"] = resampler(speech_array).squeeze().numpy()
inputs = processor(row["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
targets.append(row["sentence"])
preds.append(processor.batch_decode(pred_ids)[0])
print("WER: {:2f}".format(100 * wer.compute(predictions=preds, references=targets)))
```
**Test Result**: 26.76 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
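A minimal sketch (assumed setup, not the original training code) of loading those splits with the `datasets` library; the actual fine-tuning configuration is not included in this card.
```python
from datasets import load_dataset

# Combined train + validation splits of Tatar Common Voice
train_data = load_dataset("common_voice", "tt", split="train+validation")
print(f"Training examples: {len(train_data)}")
```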
|
{"language": "tt", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Tatar XLSR Wav2Vec2 Large 53 by Anton Lozhkov", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice tt", "type": "common_voice", "args": "tt"}, "metrics": [{"type": "wer", "value": 26.76, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anton-l/wav2vec2-large-xlsr-53-tatar
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"tt",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"tt"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #tt #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Tatar
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Tatar using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Tatar test data of Common Voice.
Test Result: 26.76 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Tatar\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Tatar using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Tatar test data of Common Voice.\n\n\n\nTest Result: 26.76 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #tt #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Tatar\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Tatar using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Tatar test data of Common Voice.\n\n\n\nTest Result: 26.76 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
80,
65,
20,
28,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #tt #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Tatar\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Tatar using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Tatar test data of Common Voice.\n\n\n\nTest Result: 26.76 %## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.12905211746692657,
0.0017941639525815845,
-0.0029940996319055557,
-0.03365982696413994,
0.08839140832424164,
-0.039137501269578934,
0.20545288920402527,
0.09105248004198074,
-0.017757399007678032,
-0.025942422449588776,
0.03438844531774521,
-0.004360031336545944,
0.06395174562931061,
0.08997074514627457,
0.011165797710418701,
-0.21060419082641602,
0.0005611567758023739,
-0.0058985925279557705,
0.03551596775650978,
0.11465796083211899,
0.10076700150966644,
-0.06608640402555466,
-0.02233765833079815,
0.0998062714934349,
-0.1258457750082016,
0.04923839494585991,
0.03854912519454956,
-0.13083532452583313,
0.1370665282011032,
0.0824185460805893,
0.0731852650642395,
0.051013484597206116,
0.08791476488113403,
-0.18513722717761993,
0.02621271088719368,
0.03642862290143967,
0.01686645857989788,
0.03234637528657913,
0.10875460505485535,
-0.00271328492090106,
0.1037728488445282,
0.062441855669021606,
-0.037713129073381424,
0.06464478373527527,
-0.044273801147937775,
-0.2055739015340805,
-0.028531452640891075,
0.022400565445423126,
0.10885360091924667,
0.13365256786346436,
-0.07312892377376556,
0.10560019314289093,
-0.11695478111505508,
0.1062353327870369,
0.064093217253685,
-0.20344915986061096,
-0.006759310141205788,
0.040009014308452606,
0.017241261899471283,
0.10407052934169769,
-0.04503902047872543,
0.056227907538414,
0.022644618526101112,
0.02388618513941765,
0.0015991717809811234,
-0.022060342133045197,
-0.18816344439983368,
-0.017713146284222603,
-0.13567134737968445,
-0.049625229090452194,
0.204353928565979,
-0.016430577263236046,
-0.05330211669206619,
-0.12002857774496078,
-0.007515307050198317,
0.000927942746784538,
-0.0110577791929245,
-0.051667869091033936,
-0.024622313678264618,
0.01805100403726101,
-0.04071837291121483,
-0.029241938143968582,
-0.12275362014770508,
-0.146859273314476,
-0.02284178137779236,
0.06017722934484482,
0.024554314091801643,
0.033468667417764664,
-0.1389130800962448,
0.05188937112689018,
-0.09190675616264343,
-0.06281381845474243,
0.010373641736805439,
0.017943326383829117,
-0.08396752178668976,
0.017404047772288322,
-0.0906882956624031,
-0.1801919937133789,
0.02662505954504013,
-0.08138564229011536,
0.056010350584983826,
0.043036673218011856,
-0.012499635107815266,
0.05673426017165184,
0.02642965316772461,
0.09790880978107452,
-0.07252353429794312,
-0.018314536660909653,
0.013998828828334808,
-0.02527162805199623,
-0.056731898337602615,
-0.026048878207802773,
-0.07761915773153305,
-0.06874917447566986,
0.01631217449903488,
0.04816652834415436,
-0.023908643051981926,
0.008099250495433807,
0.016746655106544495,
-0.028365442529320717,
-0.0037348943296819925,
-0.08003474026918411,
-0.04582105576992035,
0.06942781805992126,
-0.0028043624479323626,
0.1098857894539833,
0.019905101507902145,
0.04821624234318733,
-0.0720774456858635,
-0.034714095294475555,
0.0392882339656353,
0.045289795845746994,
-0.023363932967185974,
-0.12159854173660278,
0.005737355910241604,
0.00789088848978281,
-0.010692511685192585,
-0.10504761338233948,
-0.09818939864635468,
-0.06416856497526169,
-0.0029940290842205286,
0.028329499065876007,
-0.0036138598807156086,
-0.08418705314397812,
-0.020274102687835693,
-0.0005303526413626969,
-0.06646943837404251,
0.09439784288406372,
-0.04591992124915123,
0.07667665183544159,
-0.0016869287937879562,
0.05301743373274803,
0.06008618324995041,
0.0844232514500618,
-0.08013021945953369,
-0.055772919207811356,
0.0038931129965931177,
0.15339848399162292,
-0.03584346920251846,
-0.06994015723466873,
-0.09653763473033905,
-0.07207699120044708,
-0.053025659173727036,
0.06234299764037132,
0.06738297641277313,
0.08829968422651291,
-0.2593283951282501,
-0.08563467860221863,
0.16276375949382782,
-0.1011335551738739,
0.0005329558625817299,
0.20491579174995422,
-0.009071444161236286,
0.09735569357872009,
0.14425219595432281,
0.26665037870407104,
0.14904916286468506,
-0.15721236169338226,
0.01879938691854477,
0.023063862696290016,
-0.0055418675765395164,
-0.05406831204891205,
0.06388158351182938,
-0.03760609030723572,
-0.03141895681619644,
0.02991642989218235,
-0.05540371313691139,
0.08362384885549545,
-0.03907351568341255,
-0.06118457764387131,
-0.0045854137279093266,
-0.0881015881896019,
0.07295958697795868,
0.03035191260278225,
0.050091516226530075,
-0.034920722246170044,
-0.062248606234788895,
0.033880554139614105,
0.10855145007371902,
-0.11883396655321121,
0.051567163318395615,
-0.10521124303340912,
0.0627666711807251,
-0.10834651440382004,
-0.004398152232170105,
-0.1427500694990158,
0.17905935645103455,
-0.021549083292484283,
0.04641064256429672,
0.05501842871308327,
0.18162335455417633,
0.023141197860240936,
0.011061638593673706,
-0.043225258588790894,
-0.007405546028167009,
0.02121579460799694,
-0.015631308779120445,
-0.06510401517152786,
-0.10876921564340591,
-0.009842711500823498,
-0.06028354912996292,
0.06781663000583649,
-0.11966026574373245,
-0.008767194114625454,
0.017603084444999695,
-0.014992172829806805,
-0.011746058240532875,
0.01067753229290247,
0.06096601113677025,
0.09726475924253464,
0.012817511335015297,
0.026938093826174736,
0.04626385122537613,
-0.0002763839438557625,
-0.04964780434966087,
0.14290408790111542,
-0.14019013941287994,
-0.0178359504789114,
0.10646873712539673,
-0.03106980212032795,
-0.020708734169602394,
0.049374770373106,
-0.006852676160633564,
-0.01004986185580492,
-0.06320732086896896,
0.007066874764859676,
0.29541340470314026,
-0.0030508548952639103,
0.09255649894475937,
-0.08859649300575256,
0.00975835882127285,
0.04859111085534096,
-0.08298370242118835,
0.04388603940606117,
0.07814180850982666,
0.001508236164227128,
-0.0009268217836506665,
0.02916746959090233,
-0.042984675616025925,
-0.07665585726499557,
0.2456800788640976,
-0.012149895541369915,
-0.10129870474338531,
0.0307699553668499,
-0.045483220368623734,
-0.029628127813339233,
0.10440448671579361,
-0.1466848999261856,
-0.03335600718855858,
0.04466572031378746,
0.060945864766836166,
0.06350669264793396,
-0.15185604989528656,
0.021059583872556686,
0.015293232165277004,
-0.12770654261112213,
-0.22423186898231506,
0.07199982553720474,
-0.040801096707582474,
0.062108609825372696,
-0.10782070457935333,
-0.016115373000502586,
-0.0017685439670458436,
-0.04655318334698677,
-0.17518576979637146,
0.11138518899679184,
-0.07962673157453537,
-0.2126055508852005,
-0.13808442652225494,
0.0020887746941298246,
0.01420055516064167,
0.01520228199660778,
0.1067938283085823,
-0.14230598509311676,
0.0006750200409442186,
-0.016627302393317223,
0.10808961093425751,
-0.006578049156814814,
-0.013405829668045044,
-0.06013456732034683,
0.01276247389614582,
0.05583375692367554,
-0.1262575387954712,
0.028090206906199455,
-0.06441643089056015,
-0.022585850208997726,
0.01999116688966751,
-0.005488511174917221,
0.013754865154623985,
0.1610030084848404,
0.012903066352009773,
0.016848905012011528,
-0.05187463387846947,
0.18358702957630157,
-0.09321600198745728,
-0.02855704352259636,
0.18328984081745148,
0.0010000794427469373,
-0.02114875242114067,
0.0937703400850296,
0.0225291196256876,
-0.036127954721450806,
-0.008470321074128151,
-0.017579438164830208,
-0.09339050203561783,
-0.2332128882408142,
-0.11163569241762161,
-0.06558006256818771,
-0.06311868876218796,
-0.043305713683366776,
-0.012826149351894855,
0.03922155499458313,
0.02777450531721115,
-0.031355660408735275,
-0.06632387638092041,
0.03343089669942856,
-0.024811306968331337,
0.1389230191707611,
0.0059326691552996635,
0.10254783928394318,
-0.07096279412508011,
-0.023113736882805824,
0.01630999892950058,
0.020219018682837486,
0.1641978770494461,
0.04489674046635628,
0.06441207975149155,
0.06975474208593369,
0.1255417764186859,
0.11825994402170181,
0.07861993461847305,
-0.04111980274319649,
-0.005206017289310694,
0.010859990492463112,
-0.06417693197727203,
-0.039263200014829636,
0.032918184995651245,
0.14658883213996887,
-0.059172309935092926,
-0.04762090742588043,
-0.02294374629855156,
0.023428862914443016,
0.17730867862701416,
0.05887288972735405,
-0.18460115790367126,
-0.10005314648151398,
-0.030273104086518288,
-0.07308182120323181,
0.0037516287993639708,
0.042545974254608154,
0.15886658430099487,
-0.12209136039018631,
0.026002777740359306,
-0.001596734975464642,
0.09188313037157059,
0.034027524292469025,
0.03177432715892792,
-0.07628006488084793,
0.025502117350697517,
-0.015963643789291382,
0.11086077243089676,
-0.2815729081630707,
0.22279982268810272,
0.016150692477822304,
0.1255691945552826,
-0.07230053842067719,
-0.01489664800465107,
0.016736820340156555,
0.05010645464062691,
0.09389251470565796,
0.01358713023364544,
-0.010505064390599728,
-0.10120893269777298,
-0.10006417334079742,
0.0685708075761795,
-0.009531150572001934,
0.05345301702618599,
0.051713038235902786,
0.016133692115545273,
0.009427911601960659,
0.010793493129312992,
-0.03153764083981514,
-0.14027519524097443,
-0.06563185155391693,
0.005846051964908838,
0.13445420563220978,
0.08494009077548981,
-0.006780359428375959,
-0.09515658020973206,
-0.1163712739944458,
-0.015672583132982254,
-0.09821978956460953,
-0.0718679279088974,
-0.057808615267276764,
-0.04456266015768051,
0.10708179324865341,
-0.0666557252407074,
-0.012831607833504677,
0.09497081488370895,
0.10233703255653381,
-0.030792932957410812,
-0.06679214537143707,
0.024789974093437195,
-0.11577059328556061,
-0.1069665402173996,
-0.008281150832772255,
0.1605948954820633,
0.10496752709150314,
0.1073404923081398,
0.05708656087517738,
0.005732997786253691,
-0.010530058294534683,
-0.0315948948264122,
0.009159397333860397,
0.07129725813865662,
-0.1215827688574791,
-0.024087034165859222,
0.05035044625401497,
-0.14446914196014404,
-0.1357259899377823,
-0.05888922140002251,
0.15991243720054626,
0.07202911376953125,
-0.07403531670570374,
0.2244747132062912,
0.19355948269367218,
-0.09316523373126984,
-0.19598063826560974,
-0.017015786841511726,
0.1203264594078064,
0.12845930457115173,
0.005858056712895632,
-0.18906310200691223,
0.041546791791915894,
-0.0361410416662693,
-0.03359384089708328,
-0.04689791426062584,
-0.27474772930145264,
-0.1508694291114807,
0.16168710589408875,
-0.031210586428642273,
0.15825875103473663,
-0.003585626371204853,
-0.023886190727353096,
-0.020978450775146484,
0.059661708772182465,
0.038602713495492935,
-0.09826087951660156,
0.11425469815731049,
0.03598404675722122,
0.1373981386423111,
0.04977339506149292,
-0.017015686258673668,
0.08981599658727646,
0.06882057338953018,
-0.02098909392952919,
-0.005138328764587641,
0.11241079866886139,
0.06774263828992844,
0.022000283002853394,
0.08764074742794037,
-0.07603967934846878,
0.03684749826788902,
-0.1139434278011322,
-0.08988992124795914,
-0.0881042405962944,
0.056179750710725784,
0.028138455003499985,
-0.048130229115486145,
0.020886853337287903,
-0.06717070937156677,
0.04069356247782707,
-0.026733888313174248,
-0.04442921280860901,
-0.10991410166025162,
0.04263512045145035,
0.11840494722127914,
0.17448702454566956,
-0.020109767094254494,
-0.09015800058841705,
-0.013021092861890793,
-0.038606636226177216,
0.12919996678829193,
-0.09115061163902283,
0.015412639826536179,
0.06007750332355499,
0.04405534267425537,
0.08150851726531982,
0.00986691378057003,
-0.09666237235069275,
0.08744814246892929,
0.030615201219916344,
-0.07098867744207382,
-0.12053713947534561,
-0.01808881387114525,
0.012229789979755878,
-0.013491126708686352,
0.021759288385510445,
0.11222606152296066,
-0.10994905978441238,
-0.0138274310156703,
-0.025139065459370613,
0.011886844411492348,
-0.11064909398555756,
0.24278850853443146,
0.02604527585208416,
0.06854292750358582,
-0.1064685508608818,
0.015901872888207436,
-0.011990636587142944,
-0.046419281512498856,
0.05561622232198715,
-0.04466492310166359,
-0.07273620367050171,
-0.05848276615142822,
0.006759318057447672,
0.08097448199987411,
0.03782478719949722,
-0.14161400496959686,
-0.05547672137618065,
-0.11063169687986374,
0.009786409325897694,
0.09896140545606613,
0.045171938836574554,
-0.0076577626168727875,
-0.1316586285829544,
-0.057579897344112396,
-0.10729606449604034,
0.06792479753494263,
0.06496208906173706,
-0.0367603562772274,
-0.10497357696294785,
0.20130877196788788,
0.05740487203001976,
0.03617249056696892,
-0.053402744233608246,
-0.06134747713804245,
-0.0008425622945651412,
0.0891818031668663,
-0.11186359077692032,
-0.006159564945846796,
-0.054420921951532364,
0.006414102856069803,
-0.014644045382738113,
-0.0617561936378479,
-0.020856335759162903,
0.07213784009218216,
-0.09637525677680969,
0.08353865146636963,
-0.01618773117661476,
0.06358710676431656,
-0.07750330865383148,
0.042575180530548096,
-0.0016399479936808348,
-0.03745728358626366,
0.08353376388549805,
0.15836714208126068,
-0.10033850371837616,
0.13103002309799194,
-0.18588107824325562,
-0.027386952191591263,
0.038624584674835205,
0.06738503277301788,
-0.029866855591535568,
-0.08821035921573639,
0.03698267042636871,
0.07513954490423203,
0.04421745985746384,
0.0036481120623648167,
0.10126079618930817,
-0.04437832161784172,
-0.002969451481476426,
-0.0708201602101326,
0.026867877691984177,
-0.03959153965115547,
0.0505095012485981,
0.04145784303545952,
0.14987218379974365,
0.16126146912574768,
-0.1120234876871109,
0.10054852813482285,
-0.11592007428407669,
0.002926391316577792,
-0.047096796333789825,
-0.025568779557943344,
-0.1301504373550415,
-0.09832189977169037,
0.08664519339799881,
-0.040653079748153687,
0.13648083806037903,
0.05716584622859955,
0.05861850455403328,
-0.030403856188058853,
-0.06591188162565231,
0.030609076842665672,
-0.023809654638171196,
0.2725769281387329,
0.03484908118844032,
0.03879985213279724,
-0.03563227131962776,
0.011986935511231422,
0.013840436935424805,
0.12414062768220901,
0.004437115043401718,
0.1403341293334961,
0.021075844764709473,
0.07172102481126785,
0.08359719067811966,
-0.0689227506518364,
-0.026711730286478996,
-0.03965000435709953,
-0.1337287575006485,
0.03422681614756584,
-0.0679963082075119,
0.15187010169029236,
0.129850834608078,
-0.10137778520584106,
0.08404657244682312,
-0.008849142119288445,
-0.07847004383802414,
-0.14607951045036316,
-0.12847112119197845,
-0.046180956065654755,
-0.163280189037323,
0.016867339611053467,
-0.08902197331190109,
0.031279731541872025,
0.06443987786769867,
0.03641793504357338,
-0.02142830565571785,
0.14448076486587524,
-0.010777154937386513,
-0.09889749437570572,
0.072165347635746,
-0.0799175277352333,
-0.010761087760329247,
-0.0855703055858612,
0.024362042546272278,
0.18403856456279755,
-0.010510892607271671,
0.0512358658015728,
0.013254767283797264,
-0.07041295617818832,
0.02424481138586998,
-0.06430863589048386,
-0.06095279008150101,
-0.01649085059762001,
-0.009241699241101742,
0.10475470125675201,
0.14998523890972137,
0.11706866323947906,
-0.06456466019153595,
-0.01572129689157009,
0.12609198689460754,
-0.032211627811193466,
-0.13239096105098724,
-0.12885601818561554,
0.1540093570947647,
0.016872551292181015,
0.01861795224249363,
0.013874327763915062,
-0.03663844242691994,
-0.012605849653482437,
0.24753354489803314,
0.22899112105369568,
0.042781587690114975,
0.016435666009783745,
-0.04348064959049225,
-0.005177655257284641,
-0.030910667032003403,
0.09710806608200073,
0.07459630817174911,
0.21752330660820007,
-0.028752148151397705,
0.023659422993659973,
-0.09823890775442123,
-0.06142982840538025,
0.002721830504015088,
0.08559806644916534,
-0.05853354558348656,
-0.12297870963811874,
0.01811615750193596,
0.1585918664932251,
-0.0793192982673645,
-0.10892704874277115,
-0.11190129071474075,
-0.07164858281612396,
-0.07792248576879501,
0.011911682784557343,
0.07156529277563095,
0.11895554512739182,
0.01800617203116417,
-0.08660886436700821,
0.0508136972784996,
0.12352899461984634,
0.0017500676913186908,
-0.06329739838838577,
-0.053724098950624466,
0.05602162331342697,
-0.08308630436658859,
-0.004198337439447641,
0.00012274360051378608,
0.1844584345817566,
0.009442382492125034,
0.10328581184148788,
-0.021712524816393852,
0.14255250990390778,
-0.0001852199638960883,
-0.0867905393242836,
0.02814565785229206,
0.13955990970134735,
-0.024738572537899017,
0.09841453284025192,
0.020080816000699997,
-0.1309346854686737,
0.06158948317170143,
-0.12464598566293716,
-0.0008075851947069168,
-0.048364460468292236,
0.0663723275065422,
-0.03668257221579552,
0.07374245673418045,
0.08392658829689026,
-0.06620094925165176,
-0.04234657064080238,
-0.054270025342702866,
0.042905740439891815,
0.026540914550423622,
-0.07900398224592209,
-0.03850512579083443,
-0.24747538566589355,
-0.024945374578237534,
-0.08891338109970093,
-0.02452358789741993,
-0.20697157084941864,
-0.022249853238463402,
-0.01973458006978035,
-0.08025909215211868,
0.013497603125870228,
0.02884840779006481,
0.0848691537976265,
0.00016670918557792902,
-0.00005752028664574027,
0.026581738144159317,
0.04661530628800392,
0.12293138355016708,
-0.16574348509311676,
-0.10882429778575897
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Ukrainian
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Ukrainian using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "uk", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-ukrainian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-ukrainian")
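# Resample the 48 kHz Common Voice clips to the 16 kHz expected by the model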
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Ukrainian test data of Common Voice.
```python
import torch
import torchaudio
import urllib.request
import tarfile
import pandas as pd
from tqdm.auto import tqdm
from datasets import load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
# Download the raw data instead of using HF datasets to save disk space
data_url = "https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-6.1-2020-12-11/uk.tar.gz"
filestream = urllib.request.urlopen(data_url)
data_file = tarfile.open(fileobj=filestream, mode="r|gz")
data_file.extractall()
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anton-l/wav2vec2-large-xlsr-53-ukrainian")
model = Wav2Vec2ForCTC.from_pretrained("anton-l/wav2vec2-large-xlsr-53-ukrainian")
model.to("cuda")
cv_test = pd.read_csv("cv-corpus-6.1-2020-12-11/uk/test.tsv", sep='\t')
clips_path = "cv-corpus-6.1-2020-12-11/uk/clips/"
def clean_sentence(sent):
sent = sent.lower()
# normalize apostrophes
sent = sent.replace("’", "'")
# replace non-alpha characters with space
sent = "".join(ch if ch.isalpha() or ch == "'" else " " for ch in sent)
# remove repeated spaces
sent = " ".join(sent.split())
return sent
targets = []
preds = []
for i, row in tqdm(cv_test.iterrows(), total=cv_test.shape[0]):
row["sentence"] = clean_sentence(row["sentence"])
speech_array, sampling_rate = torchaudio.load(clips_path + row["path"])
resampler = torchaudio.transforms.Resample(sampling_rate, 16_000)
row["speech"] = resampler(speech_array).squeeze().numpy()
inputs = processor(row["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
targets.append(row["sentence"])
preds.append(processor.batch_decode(pred_ids)[0])
print("WER: {:2f}".format(100 * wer.compute(predictions=preds, references=targets)))
```
**Test Result**: 32.29 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
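A minimal sketch (assumed setup, not the original training code) of loading those splits with the `datasets` library; the preprocessing and hyperparameters used for this checkpoint are not documented here.
```python
from datasets import load_dataset

# Combined train + validation splits of Ukrainian Common Voice
train_data = load_dataset("common_voice", "uk", split="train+validation")
print(f"Training examples: {len(train_data)}")
```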
|
{"language": "uk", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Ukrainian XLSR Wav2Vec2 Large 53 by Anton Lozhkov", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice uk", "type": "common_voice", "args": "uk"}, "metrics": [{"type": "wer", "value": 32.29, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anton-l/wav2vec2-large-xlsr-53-ukrainian
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"uk",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"uk"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #uk #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Ukrainian
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Ukrainian using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Ukrainian test data of Common Voice.
Test Result: 32.29 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Ukrainian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Ukrainian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Ukrainian test data of Common Voice.\n\n\n\nTest Result: 32.29 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #uk #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Ukrainian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Ukrainian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Ukrainian test data of Common Voice.\n\n\n\nTest Result: 32.29 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
80,
67,
20,
28,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #uk #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Ukrainian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Ukrainian using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Ukrainian test data of Common Voice.\n\n\n\nTest Result: 32.29 %## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.1593949943780899,
-0.0025023766793310642,
-0.0034795228857547045,
-0.039131201803684235,
0.10510560870170593,
-0.04800867289304733,
0.22333180904388428,
0.07749715447425842,
0.019409647211432457,
-0.006363320164382458,
0.04772629588842392,
-0.017024915665388107,
0.0387553945183754,
0.07370880991220474,
0.027018386870622635,
-0.20652706921100616,
0.03302418068051338,
-0.01663120836019516,
0.00617957254871726,
0.07952875643968582,
0.12041718512773514,
-0.09279372543096542,
-0.030515437945723534,
0.0883282944560051,
-0.13560815155506134,
0.06547580659389496,
0.0625210627913475,
-0.12125330418348312,
0.1364433914422989,
0.08613328635692596,
0.06337897479534149,
0.01701653189957142,
0.0696420893073082,
-0.20380352437496185,
0.026342807337641716,
0.032231565564870834,
0.009812400676310062,
0.0229342058300972,
0.09137119352817535,
-0.02550678327679634,
0.18567752838134766,
0.018839439377188683,
-0.04509366676211357,
0.06830115616321564,
-0.04854109138250351,
-0.2253568172454834,
-0.007494289427995682,
0.05233035981655121,
0.10584499686956406,
0.17393723130226135,
-0.0843137875199318,
0.11319120973348618,
-0.09170175343751907,
0.10810138285160065,
0.06728799641132355,
-0.1447778344154358,
-0.0023919385857880116,
0.06115749105811119,
-0.0026724364142864943,
0.09668625891208649,
-0.04782983288168907,
0.0573013499379158,
0.023946966975927353,
0.011241638101637363,
-0.038164932280778885,
-0.0009580118930898607,
-0.23471148312091827,
-0.0023982960265129805,
-0.15943112969398499,
-0.07018959522247314,
0.2271631509065628,
-0.0033867547754198313,
-0.03491438925266266,
-0.13006791472434998,
0.00008914313366403803,
-0.003454501973465085,
-0.012295260094106197,
-0.03494483605027199,
-0.034668829292058945,
0.008853988721966743,
-0.049074918031692505,
-0.053097810596227646,
-0.14449994266033173,
-0.12495381385087967,
-0.008270814083516598,
0.12817835807800293,
0.024819647893309593,
0.06092038378119469,
-0.09707828611135483,
0.06744495034217834,
-0.08062718063592911,
-0.042432963848114014,
0.01248091273009777,
0.04776981472969055,
-0.09823928028345108,
0.0033595655113458633,
-0.09271406382322311,
-0.14370854198932648,
0.054555196315050125,
-0.06532199680805206,
-0.0129975825548172,
-0.0028632963076233864,
0.03129749000072479,
0.06215710937976837,
0.021542930975556374,
0.07958394289016724,
-0.07786804437637329,
0.03990018367767334,
-0.032634370028972626,
-0.0649653896689415,
-0.05306681618094444,
-0.003140621120110154,
-0.09380315244197845,
-0.09540852904319763,
-0.006093574687838554,
0.06122840195894241,
-0.02531266398727894,
-0.016623103991150856,
-0.006883649155497551,
-0.030091380700469017,
-0.018558887764811516,
-0.08071199804544449,
-0.03341636434197426,
0.05990548059344292,
0.019072357565164566,
0.11742377281188965,
0.010095073841512203,
0.05699468031525612,
-0.09499794989824295,
-0.020932286977767944,
0.04880252853035927,
0.06694353371858597,
-0.01174478605389595,
-0.12376934289932251,
0.026355907320976257,
-0.0038996960502117872,
0.01776151731610298,
-0.09941671043634415,
-0.06947183609008789,
-0.07588960975408554,
-0.007007810287177563,
0.03962132707238197,
0.01898876391351223,
-0.08169076591730118,
-0.040065113455057144,
-0.015285240486264229,
-0.04106691852211952,
0.07892462611198425,
-0.061507415026426315,
0.06037748232483864,
-0.02354610711336136,
0.035257432609796524,
0.012143803760409355,
0.06301084905862808,
-0.08690257370471954,
-0.10242216289043427,
0.040688011795282364,
0.1337154507637024,
-0.03340260311961174,
-0.07686179131269455,
-0.0731256976723671,
-0.06297215074300766,
-0.02108263224363327,
0.07939029484987259,
0.05802018940448761,
0.08734715729951859,
-0.221932053565979,
-0.10153819620609283,
0.16519081592559814,
-0.13051782548427582,
-0.00008909798634704202,
0.1965128481388092,
-0.017995040863752365,
0.07973315566778183,
0.16971391439437866,
0.19272367656230927,
0.15020771324634552,
-0.21435347199440002,
-0.021115867421030998,
-0.006528638303279877,
-0.05584518611431122,
-0.06124846637248993,
0.08459147065877914,
-0.06530798226594925,
-0.000720157811883837,
0.038485538214445114,
-0.07978133112192154,
0.07586409151554108,
-0.049588825553655624,
-0.0820588693022728,
-0.019599558785557747,
-0.06896995007991791,
0.04913533478975296,
0.04975196719169617,
0.0041970135644078255,
-0.07106179744005203,
-0.06436340510845184,
0.07171013206243515,
0.12527692317962646,
-0.1315418928861618,
0.06982342898845673,
-0.09227883815765381,
0.03780054301023483,
-0.08640442788600922,
-0.013218364678323269,
-0.15370425581932068,
0.1551564633846283,
-0.05027420446276665,
0.06356155872344971,
0.05976860597729683,
0.17811810970306396,
0.017411569133400917,
0.01626221090555191,
-0.05274525657296181,
-0.0036503891460597515,
-0.018898246809840202,
-0.03681137412786484,
-0.04970971494913101,
-0.15215730667114258,
-0.014241600409150124,
-0.0503850020468235,
0.05640038102865219,
-0.1451793760061264,
-0.014699897728860378,
0.037003859877586365,
0.03129403293132782,
-0.000014790354725846555,
0.0023564386647194624,
0.022206567227840424,
0.11548119783401489,
-0.00862446241080761,
0.039607543498277664,
0.053592607378959656,
-0.01833026483654976,
-0.047174736857414246,
0.12158545106649399,
-0.11759860813617706,
-0.05247346684336662,
0.12559981644153595,
-0.027735481038689613,
-0.008845651522278786,
0.03690562769770622,
-0.017968803644180298,
-0.0005990417557768524,
-0.07475322484970093,
0.0005175851401872933,
0.24048581719398499,
-0.01851719804108143,
0.054807379841804504,
-0.07495851814746857,
0.004396891687065363,
0.05034191161394119,
-0.08429092913866043,
0.02202344313263893,
0.08819624036550522,
-0.008225950412452221,
0.048869043588638306,
0.03138954937458038,
-0.1052476018667221,
-0.06865376979112625,
0.31192588806152344,
-0.04510585218667984,
-0.13214543461799622,
0.02007754147052765,
-0.04259759187698364,
-0.020552005618810654,
0.15117773413658142,
-0.2078785002231598,
-0.03829890489578247,
0.04152395576238632,
0.04653186723589897,
0.06888462603092194,
-0.15152084827423096,
0.021023951470851898,
-0.002505298238247633,
-0.17048928141593933,
-0.19428256154060364,
0.07070295512676239,
-0.03687988966703415,
0.04944425821304321,
-0.11690118908882141,
0.010651076212525368,
-0.01495365984737873,
-0.06225539371371269,
-0.1771179437637329,
0.09048281610012054,
-0.08785036951303482,
-0.1611478477716446,
-0.1755380928516388,
-0.0019313868833705783,
0.0069300029426813126,
0.021296560764312744,
0.10614166408777237,
-0.15772464871406555,
-0.0014402722008526325,
0.010220186784863472,
0.18871578574180603,
-0.0015760438982397318,
-0.007485281676054001,
-0.08388730883598328,
0.06961609423160553,
0.043059926480054855,
-0.11539570242166519,
0.01989927515387535,
-0.08084594458341599,
-0.0018982025794684887,
0.01342734694480896,
0.0019701572600752115,
0.018017854541540146,
0.13829441368579865,
0.001274425769224763,
0.022009016945958138,
-0.04696550592780113,
0.17852835357189178,
-0.12947458028793335,
-0.0648937001824379,
0.1753861904144287,
0.009101031348109245,
-0.04701460152864456,
0.10766679048538208,
0.03031262569129467,
-0.02216317690908909,
-0.003903374308720231,
-0.020566990599036217,
-0.08343962579965591,
-0.25423723459243774,
-0.180805966258049,
-0.06921248883008957,
-0.03928080201148987,
-0.03860468789935112,
-0.013448446989059448,
0.018668394535779953,
0.025243639945983887,
-0.07026632875204086,
-0.12831968069076538,
0.05087772384285927,
-0.006494420580565929,
0.12902997434139252,
0.01929323375225067,
0.0815080776810646,
-0.06861911714076996,
-0.023342888802289963,
0.054233368486166,
-0.035643693059682846,
0.20303750038146973,
0.05508018657565117,
0.05412250757217407,
0.08110584318637848,
0.14082612097263336,
0.09042409062385559,
0.09395954757928848,
0.027862973511219025,
-0.004826321732252836,
0.05557456240057945,
-0.08041904121637344,
0.027073310688138008,
0.04668469354510307,
0.13851557672023773,
-0.058566659688949585,
-0.08027936518192291,
-0.014231467619538307,
0.027248725295066833,
0.20620596408843994,
0.10464245080947876,
-0.16984790563583374,
-0.1377648413181305,
-0.0453336276113987,
-0.07126636058092117,
0.001237650983966887,
0.014192371629178524,
0.16879883408546448,
-0.13839246332645416,
0.015024758875370026,
0.008968410082161427,
0.0827852115035057,
0.024826863780617714,
0.04105702415108681,
-0.09500357508659363,
0.035740792751312256,
-0.01353425532579422,
0.12008680403232574,
-0.2538128197193146,
0.26464125514030457,
0.004448217805474997,
0.14851811528205872,
-0.06413470208644867,
-0.029476657509803772,
0.005411099176853895,
0.03278061002492905,
0.0878264382481575,
0.021619876846671104,
-0.05458837375044823,
-0.11695768684148788,
-0.08504696935415268,
0.06458274275064468,
0.00436298968270421,
0.046061672270298004,
0.07760845124721527,
0.018799247220158577,
0.018033290281891823,
0.016721436753869057,
-0.06407766044139862,
-0.16764679551124573,
-0.08260883390903473,
-0.013684726320207119,
0.18374162912368774,
0.11236865818500519,
-0.00826242659240961,
-0.11008772999048233,
-0.04929433763027191,
-0.009855014272034168,
-0.110124871134758,
-0.09377791732549667,
-0.07037022709846497,
-0.0049323998391628265,
0.14454030990600586,
-0.07417471706867218,
0.0031416756100952625,
0.10138227790594101,
0.13905762135982513,
-0.04680846258997917,
-0.04501912370324135,
0.029648160561919212,
-0.10189885646104813,
-0.11804832518100739,
0.017193354666233063,
0.18952122330665588,
0.15815484523773193,
0.07808543741703033,
0.03177895024418831,
0.028204213827848434,
-0.024494841694831848,
-0.022784631699323654,
0.01525755412876606,
0.07451920211315155,
-0.08771739900112152,
0.06902173906564713,
0.0481012798845768,
-0.19641703367233276,
-0.15193632245063782,
-0.10111609846353531,
0.1644999086856842,
0.08878079801797867,
-0.08870328217744827,
0.18606312572956085,
0.16320091485977173,
-0.09162008762359619,
-0.19928708672523499,
0.001740624662488699,
0.12599240243434906,
0.1520356386899948,
-0.021490655839443207,
-0.18119128048419952,
0.05735264718532562,
-0.02433575503528118,
-0.01608615182340145,
-0.03472049906849861,
-0.23972748219966888,
-0.1768042892217636,
0.1543693095445633,
-0.046503301709890366,
0.13152338564395905,
0.04484272748231888,
0.002916700206696987,
-0.030303331092000008,
0.004875837359577417,
0.02797221578657627,
-0.10742975026369095,
0.10696446895599365,
0.030395090579986572,
0.120223768055439,
0.06307566910982132,
-0.0028398504946380854,
0.11596798896789551,
0.09506344050168991,
-0.008581159636378288,
0.008309622295200825,
0.09897486120462418,
0.07416136562824249,
0.029656708240509033,
0.11328363418579102,
-0.03795802220702171,
0.0209182295948267,
-0.10336003452539444,
-0.09590210020542145,
-0.08044278621673584,
0.06900957971811295,
0.02172689512372017,
-0.04276789724826813,
-0.015351828187704086,
-0.054196517914533615,
0.010425930842757225,
-0.013156408444046974,
-0.0035129867028445005,
-0.1057477593421936,
0.03163006529211998,
0.08023614436388016,
0.17656897008419037,
-0.02226072922348976,
-0.10494212806224823,
0.0190410315990448,
-0.044507451355457306,
0.11827385425567627,
-0.0790005773305893,
0.019335206598043442,
0.06505297869443893,
0.038008008152246475,
0.0590464286506176,
0.010067977011203766,
-0.11769231408834457,
0.0758390724658966,
0.036564528942108154,
-0.09472338110208511,
-0.1400560736656189,
-0.03087940625846386,
-0.07087933272123337,
-0.020206496119499207,
0.014787260442972183,
0.11690372228622437,
-0.10492762178182602,
0.009897652082145214,
-0.03030160441994667,
-0.025137102231383324,
-0.12697714567184448,
0.22973141074180603,
0.015693647786974907,
0.06141502410173416,
-0.10715843737125397,
0.017465949058532715,
-0.025484275072813034,
-0.051392849534749985,
0.04796445369720459,
-0.06077723950147629,
-0.08246570080518723,
-0.05155132710933685,
0.012993469834327698,
0.08593010157346725,
-0.042912039905786514,
-0.1697329729795456,
-0.048729829490184784,
-0.12071547657251358,
0.023298369720578194,
0.07901860028505325,
0.07599253207445145,
-0.009482355788350105,
-0.09309720993041992,
-0.04372309893369675,
-0.13407766819000244,
0.0908406600356102,
0.10577680170536041,
-0.05402074754238129,
-0.11676842719316483,
0.22551991045475006,
0.05787703022360802,
0.01930491253733635,
-0.046225305646657944,
-0.061275966465473175,
0.00979897752404213,
0.10374736040830612,
-0.13712762296199799,
-0.0217671487480402,
-0.06902918219566345,
-0.02582504041492939,
-0.004629189148545265,
-0.09208153933286667,
-0.018857497721910477,
0.09544287621974945,
-0.0749165490269661,
0.07596161216497421,
-0.028520891442894936,
0.06019344925880432,
-0.08110251277685165,
0.047041963785886765,
-0.0028083669021725655,
-0.03882426768541336,
0.09063239395618439,
0.19059953093528748,
-0.07934990525245667,
0.15564391016960144,
-0.1586126983165741,
-0.008080816827714443,
0.056832533329725266,
0.06000137701630592,
-0.018399527296423912,
-0.09176681190729141,
0.02302813157439232,
0.09143557399511337,
0.09328947216272354,
0.01527270209044218,
0.08948411792516708,
-0.037837810814380646,
0.005686309188604355,
-0.05622190982103348,
0.021729890257120132,
-0.030845005065202713,
0.06985060125589371,
0.0342126302421093,
0.15992146730422974,
0.21145841479301453,
-0.13149923086166382,
0.09554082900285721,
-0.09911929816007614,
0.02118125930428505,
-0.06752152740955353,
-0.006913674063980579,
-0.13411933183670044,
-0.07669541984796524,
0.07312475144863129,
-0.05521673709154129,
0.14931975305080414,
0.03999047726392746,
0.08815935254096985,
0.00006929059600224718,
-0.06165295094251633,
0.021593846380710602,
-0.009542915970087051,
0.2752310335636139,
0.050814609974622726,
0.03281291201710701,
-0.039692532271146774,
0.030262291431427002,
0.025077948346734047,
0.10856565833091736,
0.0021893607918173075,
0.1423986256122589,
0.07437719404697418,
0.08179011195898056,
0.09122251719236374,
-0.03689214959740639,
-0.027454501017928123,
-0.08618127554655075,
-0.11824921518564224,
0.004112883470952511,
-0.07639815658330917,
0.14843590557575226,
0.1774076670408249,
-0.13809911906719208,
0.0972694531083107,
0.03621969744563103,
-0.09766021370887756,
-0.15531247854232788,
-0.12052340805530548,
-0.05276854336261749,
-0.18290087580680847,
0.022217905148863792,
-0.10668366402387619,
0.00754920020699501,
0.07328229397535324,
0.05108090862631798,
-0.038164764642715454,
0.12271282821893692,
0.031249182298779488,
-0.10120924562215805,
0.079616479575634,
-0.07458297908306122,
0.02733401022851467,
-0.06750356405973434,
0.045090679079294205,
0.18396249413490295,
-0.04878856614232063,
0.04831242933869362,
0.0327063724398613,
-0.08805546164512634,
0.034567978233098984,
-0.07856912910938263,
-0.070543572306633,
-0.0033056032843887806,
0.008260786533355713,
0.1088239848613739,
0.169551819562912,
0.1244964674115181,
-0.06698726862668991,
-0.018944475799798965,
0.15605510771274567,
-0.029850874096155167,
-0.1585742086172104,
-0.14008267223834991,
0.11972443759441376,
0.029650401324033737,
0.028028324246406555,
0.016420191153883934,
-0.060898203402757645,
-0.03454608470201492,
0.23953969776630402,
0.22706128656864166,
0.10039766132831573,
0.030209308490157127,
-0.04619739577174187,
-0.012755964882671833,
-0.03299499303102493,
0.07562891393899918,
0.0385049469769001,
0.23218606412410736,
-0.015415665693581104,
0.008055835030972958,
-0.11381492018699646,
-0.05100172758102417,
-0.029014982283115387,
0.030375415459275246,
-0.054390810430049896,
-0.15513193607330322,
0.022729739546775818,
0.16731791198253632,
-0.06744759529829025,
-0.0979994386434555,
-0.0968165323138237,
-0.059876449406147,
-0.06548864394426346,
0.014979694038629532,
0.06940849870443344,
0.14351928234100342,
0.0326276533305645,
-0.09438925236463547,
0.032195962965488434,
0.12633156776428223,
0.015018518082797527,
-0.09796761721372604,
-0.07383286952972412,
0.05539316684007645,
-0.05393436178565025,
-0.013211419805884361,
-0.0061483182944357395,
0.20806299149990082,
0.005951727740466595,
0.09293720126152039,
-0.025983842089772224,
0.11553338915109634,
-0.026163844391703606,
-0.06514772027730942,
0.03701692447066307,
0.11343339830636978,
-0.042143773287534714,
0.08368207514286041,
0.04237079247832298,
-0.14314110577106476,
0.05315704643726349,
-0.08146650344133377,
0.0062922821380198,
-0.06616644561290741,
0.08477573096752167,
-0.05308166891336441,
0.04165501892566681,
0.09623550623655319,
-0.06491704285144806,
-0.03092631883919239,
-0.04454778507351875,
0.062254294753074646,
0.05594441294670105,
-0.05073468014597893,
-0.024414459243416786,
-0.235666424036026,
-0.016138100996613503,
-0.048216793686151505,
-0.03157949075102806,
-0.17761942744255066,
-0.02956569381058216,
0.0006974192219786346,
-0.08569730073213577,
0.026373960077762604,
0.04438501223921776,
0.1099056601524353,
-0.004449389409273863,
0.010737009346485138,
0.03311722353100777,
0.06423145532608032,
0.11268753558397293,
-0.1574210673570633,
-0.13288266956806183
] |
null | null | null |
This is a standalone Turkish Wav2Vec2 tokenizer config intended for use with `run_speech_recognition_ctc_streaming.py`
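A minimal loading sketch, assuming the repository ships a standard `Wav2Vec2CTCTokenizer` vocabulary and config:

```python
from transformers import Wav2Vec2CTCTokenizer

# Load the standalone Turkish tokenizer config from this repository
tokenizer = Wav2Vec2CTCTokenizer.from_pretrained("anton-l/wav2vec2-tokenizer-turkish")
print(len(tokenizer.get_vocab()), "tokens in the vocabulary")

# When launching the streaming CTC script, point its tokenizer argument at this
# repository id (the exact flag name depends on the script version and is assumed here):
#   python run_speech_recognition_ctc_streaming.py \
#       --tokenizer_name_or_path="anton-l/wav2vec2-tokenizer-turkish" ...
```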
|
{"license": "cc0-1.0"}
| null |
anton-l/wav2vec2-tokenizer-turkish
|
[
"license:cc0-1.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#license-cc0-1.0 #region-us
|
This is a standalone Turkish Wav2Vec2 tokenizer config intended for use with 'run_speech_recognition_ctc_streaming.py'
|
[] |
[
"TAGS\n#license-cc0-1.0 #region-us \n"
] |
[
14
] |
[
"passage: TAGS\n#license-cc0-1.0 #region-us \n"
] |
[
-0.015827827155590057,
0.0207549799233675,
-0.006497819907963276,
-0.04290052875876427,
-0.008116698823869228,
0.07027317583560944,
0.17494411766529083,
0.04256151616573334,
0.21734844148159027,
-0.020508237183094025,
0.17894452810287476,
0.05383828282356262,
0.012251714244484901,
0.013546179048717022,
-0.031910061836242676,
-0.07461658120155334,
0.06502379477024078,
-0.01734144240617752,
0.061062395572662354,
0.025558017194271088,
-0.0035808782558888197,
-0.045654892921447754,
0.016418294981122017,
-0.02278972417116165,
-0.11548300832509995,
-0.007016019895672798,
0.08283799141645432,
-0.034927476197481155,
0.09569909423589706,
-0.03669785335659981,
0.1488795280456543,
0.12566709518432617,
0.047976598143577576,
-0.21645991504192352,
0.007226462010294199,
-0.074689120054245,
-0.14275676012039185,
0.04021013155579567,
-0.002384168328717351,
0.038798946887254715,
0.029695669189095497,
0.08111638575792313,
-0.026238055899739265,
0.043591033667325974,
-0.19998596608638763,
-0.1481328010559082,
-0.10702550411224365,
0.03871911019086838,
0.06454846262931824,
0.0650472566485405,
0.11070746183395386,
0.08298075199127197,
-0.1759617030620575,
-0.019254064187407494,
0.04221664369106293,
-0.3985176384449005,
0.10433376580476761,
0.2524176836013794,
0.015190071426331997,
0.08450863510370255,
-0.01707599312067032,
0.06626342982053757,
0.09601504355669022,
-0.057876840233802795,
-0.09547819942235947,
-0.07555457204580307,
-0.004737574141472578,
0.10565069317817688,
0.008250449784100056,
-0.12633982300758362,
0.31104162335395813,
0.005960594862699509,
-0.02898787520825863,
0.12754853069782257,
-0.004315655212849379,
-0.1518958956003189,
0.0220046304166317,
0.07752695679664612,
0.06581158190965652,
0.1794823259115219,
0.1345766931772232,
-0.03197266533970833,
-0.18553856015205383,
-0.07936955243349075,
-0.2299686074256897,
0.050761427730321884,
-0.04536014050245285,
0.13710550963878632,
-0.11852117627859116,
0.029426462948322296,
-0.168117493391037,
-0.011613838374614716,
-0.08339264988899231,
-0.025684406980872154,
0.09128108620643616,
-0.007883008569478989,
-0.06916080415248871,
0.18163245916366577,
0.0767785906791687,
0.13714002072811127,
-0.03617716208100319,
0.0010543889366090298,
-0.08272183686494827,
0.17322269082069397,
-0.039915166795253754,
0.03520270064473152,
0.1707763373851776,
0.21105235815048218,
0.034060288220644,
-0.17040330171585083,
0.05841115489602089,
-0.03097379207611084,
-0.20284830033779144,
-0.028764287009835243,
-0.1235160306096077,
0.178719624876976,
-0.07272069156169891,
-0.1120561882853508,
-0.07850244641304016,
0.11062582582235336,
0.1746472865343094,
0.006332656368613243,
0.0015781199326738715,
0.052597835659980774,
0.04972757399082184,
-0.06476195901632309,
-0.03800445795059204,
0.009140897542238235,
0.11390510201454163,
0.08567581325769424,
-0.15639014542102814,
0.009338120929896832,
0.00847355555742979,
0.04368881508708,
0.1489730179309845,
-0.10177883505821228,
0.0031057363376021385,
-0.0988050326704979,
-0.1214434802532196,
0.02210068888962269,
-0.018870722502470016,
-0.0013024756917729974,
0.0697474479675293,
0.07618407905101776,
0.02391822822391987,
-0.00707099586725235,
-0.06859580427408218,
-0.13437165319919586,
-0.09691616892814636,
0.09489395469427109,
-0.08992120623588562,
0.001434947014786303,
-0.31798821687698364,
-0.03579287976026535,
-0.12229394912719727,
0.06312623620033264,
0.007610782980918884,
-0.11566945910453796,
-0.12733198702335358,
0.11878558993339539,
-0.021428963169455528,
0.02241014502942562,
-0.09214052557945251,
0.010798091068863869,
-0.04514536261558533,
0.0935865119099617,
-0.12018639594316483,
-0.02178751304745674,
0.13598762452602386,
-0.1433018445968628,
-0.1446334421634674,
-0.03895501047372818,
0.019982166588306427,
0.006798567716032267,
0.024144388735294342,
0.3136502802371979,
-0.1047099232673645,
-0.18356886506080627,
0.07592222094535828,
0.18125678598880768,
-0.10997959971427917,
-0.32262301445007324,
0.1311771422624588,
-0.1998024582862854,
-0.187198206782341,
0.0014316501328721642,
-0.06889328360557556,
0.015019197016954422,
-0.05126368626952171,
-0.05998285487294197,
-0.006330584641546011,
-0.00101490318775177,
-0.043657414615154266,
-0.02342895045876503,
0.05602645128965378,
-0.04209410771727562,
0.07747554033994675,
-0.045815009623765945,
-0.006485676858574152,
0.10631822049617767,
0.042011212557554245,
-0.054521460086107254,
0.06824697554111481,
-0.03340939059853554,
-0.04051266610622406,
-0.03962068259716034,
-0.09377148747444153,
0.045961927622556686,
0.01636369340121746,
0.1081070527434349,
0.1499423384666443,
0.010570126585662365,
-0.03262319043278694,
0.031655292958021164,
0.03699525073170662,
0.026514539495110512,
0.06497609615325928,
0.0337197445333004,
-0.05697975680232048,
0.044706832617521286,
-0.00427925493568182,
-0.003093044040724635,
-0.04839417338371277,
-0.03098047524690628,
0.16389934718608856,
-0.11026563495397568,
-0.02162609063088894,
0.048698510974645615,
-0.05410287529230118,
0.011966915801167488,
0.05759406089782715,
0.028150059282779694,
0.13676424324512482,
0.04184141010046005,
-0.10588086396455765,
0.23670996725559235,
-0.04869065061211586,
0.1752292364835739,
0.18292628228664398,
-0.10221138596534729,
-0.012063030153512955,
-0.12153352797031403,
0.03150835260748863,
0.007108768448233604,
0.0687030777335167,
-0.0007574483752250671,
-0.010056051425635815,
-0.02561722882091999,
0.04204174503684044,
-0.05724786967039108,
0.0762060284614563,
-0.006334691308438778,
-0.07775887101888657,
-0.09490123391151428,
0.002592582954093814,
0.29607272148132324,
-0.13206219673156738,
0.11347667127847672,
0.43130505084991455,
0.07491658627986908,
0.08675660938024521,
-0.1222805306315422,
-0.05011172965168953,
-0.09899600595235825,
0.02446424961090088,
-0.008660665713250637,
0.1864352524280548,
-0.06780783832073212,
-0.018456993624567986,
0.05519538372755051,
0.053670529276132584,
0.08048956841230392,
-0.22636665403842926,
-0.1373836249113083,
-0.011820163577795029,
-0.06036384776234627,
-0.21180830895900726,
0.024661719799041748,
-0.06467945128679276,
0.04326812922954559,
0.031106621026992798,
-0.1218821108341217,
0.15767976641654968,
-0.021642114967107773,
-0.08097198605537415,
0.09410100430250168,
-0.2045363038778305,
-0.0674152672290802,
-0.2070121467113495,
-0.05083119124174118,
0.05370981991291046,
0.05475202575325966,
0.048254210501909256,
-0.024410801008343697,
-0.05628518387675285,
0.019049039110541344,
-0.10748866200447083,
-0.14164051413536072,
-0.03148767724633217,
0.04221876338124275,
0.09188827872276306,
-0.07903469353914261,
-0.08440255373716354,
-0.09073856472969055,
-0.03989356383681297,
-0.06643612682819366,
0.057834167033433914,
-0.10433634370565414,
0.09688135981559753,
0.20073744654655457,
-0.011589284986257553,
0.06887412816286087,
-0.06287635117769241,
0.08089158684015274,
-0.01656951755285263,
-0.12636809051036835,
0.06265083700418472,
-0.009094814769923687,
0.02501223422586918,
0.18434375524520874,
0.10860177874565125,
-0.11188825964927673,
-0.0426454059779644,
-0.14142030477523804,
-0.15677288174629211,
-0.2284785360097885,
-0.0722619891166687,
-0.07796493172645569,
0.10695859789848328,
0.049525342881679535,
0.1110767275094986,
0.13268820941448212,
0.02224240079522133,
0.09962653368711472,
-0.017251551151275635,
-0.00040528737008571625,
0.03874737024307251,
0.20739394426345825,
-0.03365937992930412,
-0.027623334899544716,
-0.14161135256290436,
0.0691787451505661,
0.14249524474143982,
0.139995276927948,
0.16600163280963898,
0.3162063956260681,
0.22497759759426117,
0.12792226672172546,
0.1330370157957077,
0.16604992747306824,
0.0472303070127964,
0.09019041806459427,
-0.019124092534184456,
-0.017582710832357407,
-0.02325849048793316,
0.013185273855924606,
0.04936143010854721,
0.062171682715415955,
-0.21109169721603394,
-0.0013586258282884955,
-0.25098541378974915,
-0.027053730562329292,
-0.029432402923703194,
0.1569761335849762,
-0.06975451111793518,
0.09832741320133209,
0.043669864535331726,
0.09876921027898788,
-0.015986476093530655,
0.1536543369293213,
-0.03239721804857254,
-0.014573484659194946,
-0.011625164188444614,
0.05013485625386238,
0.057652924209833145,
0.013699566945433617,
0.03295789659023285,
-0.05778241157531738,
-0.2052765190601349,
0.02501329965889454,
0.08287971466779709,
-0.19295738637447357,
0.31623080372810364,
0.042994868010282516,
-0.08778177201747894,
-0.0479305237531662,
-0.08007439970970154,
-0.02403469756245613,
0.1669764518737793,
0.11431702226400375,
0.05748489871621132,
-0.2534176707267761,
-0.15290476381778717,
-0.06660697609186172,
-0.01424906775355339,
0.11530584841966629,
0.014175640419125557,
-0.13470326364040375,
-0.039485979825258255,
0.04143144562840462,
0.002195778302848339,
0.0799378752708435,
-0.07753798365592957,
-0.05727759376168251,
0.024149620905518532,
0.13680347800254822,
0.027762344107031822,
-0.07132229208946228,
0.07605323195457458,
-0.0003505912027321756,
0.04955020174384117,
-0.18805696070194244,
0.05452967807650566,
-0.06911149621009827,
-0.2425590604543686,
0.04135144501924515,
-0.02663508988916874,
0.01498977467417717,
-0.05824420228600502,
-0.12647001445293427,
-0.12276162207126617,
-0.15032735466957092,
0.11973794549703598,
-0.03637445718050003,
-0.0015369025059044361,
-0.06563126295804977,
0.16168387234210968,
-0.10179031640291214,
0.055348824709653854,
0.014502118341624737,
0.053833503276109695,
-0.0014364761300384998,
-0.10604432970285416,
0.11800113320350647,
-0.12128034234046936,
0.05950487405061722,
0.04800638556480408,
-0.020337622612714767,
0.030141154304146767,
0.07355710864067078,
-0.06245864927768707,
0.19461935758590698,
0.3996458649635315,
-0.06735697388648987,
0.21052922308444977,
0.29975077509880066,
-0.11306716501712799,
-0.23508331179618835,
-0.0946723073720932,
-0.28400173783302307,
-0.09579131752252579,
0.031889621168375015,
-0.12271463871002197,
0.004994374234229326,
0.21855145692825317,
-0.12724006175994873,
0.32071229815483093,
-0.2246803492307663,
-0.0602361224591732,
0.11491435021162033,
-0.05154652148485184,
0.46896660327911377,
-0.11816392838954926,
-0.14725728332996368,
-0.03627987205982208,
-0.23439070582389832,
0.14878790080547333,
0.024235181510448456,
0.06288565695285797,
-0.006328092887997627,
-0.04687908664345741,
-0.021492354571819305,
-0.047317106276750565,
0.21927282214164734,
0.018123384565114975,
0.1139855682849884,
-0.05366784334182739,
-0.1671951413154602,
0.25412479043006897,
-0.015160803683102131,
0.00700352992862463,
-0.026240788400173187,
-0.04796367511153221,
-0.027118513360619545,
0.046213872730731964,
-0.03789704293012619,
0.057056162506341934,
-0.019342228770256042,
-0.07026860117912292,
-0.09488411247730255,
-0.023786107078194618,
-0.11998835951089859,
-0.04897436127066612,
0.315212219953537,
-0.04751734435558319,
0.0377190001308918,
0.14091940224170685,
-0.05110262334346771,
-0.1145578920841217,
0.03313014656305313,
-0.035150352865457535,
-0.1092200055718422,
0.0739675909280777,
-0.1602747142314911,
0.004372658208012581,
0.10586830973625183,
-0.007210183423012495,
0.08530297130346298,
0.10976134985685349,
-0.04242664948105812,
0.01910211332142353,
0.15716031193733215,
-0.09256551414728165,
0.00018715320038609207,
0.06702352315187454,
0.032279256731271744,
0.17425884306430817,
0.008439388126134872,
0.04447799548506737,
0.006953931413590908,
0.042794425040483475,
0.0017727067461237311,
0.012158522382378578,
-0.12908081710338593,
-0.06533278524875641,
0.05416157469153404,
-0.0037941618356853724,
-0.1149822324514389,
0.2116687297821045,
0.04668469727039337,
-0.05039292201399803,
-0.06049032881855965,
0.04103400930762291,
-0.08662349730730057,
-0.07701002061367035,
-0.17442163825035095,
0.005864012986421585,
-0.20347069203853607,
-0.11531949043273926,
0.02061457559466362,
-0.1133415549993515,
-0.015374796465039253,
0.15724389255046844,
0.039288803935050964,
0.17433549463748932,
0.057092487812042236,
-0.012051633559167385,
0.07364100217819214,
-0.10508841276168823,
-0.27986207604408264,
0.019700372591614723,
-0.042031969875097275,
-0.024145841598510742,
0.010130248963832855,
0.03936855494976044,
-0.053051792085170746,
-0.07699041813611984,
-0.15602891147136688,
0.08743733912706375,
-0.10756352543830872,
0.042746059596538544,
-0.09767849743366241,
-0.058715805411338806,
0.08856650441884995,
0.016887011006474495,
-0.04606348276138306,
0.015388241969048977,
-0.16701138019561768,
0.040576010942459106,
0.013213451951742172,
0.056745436042547226,
-0.06186240166425705,
-0.028149524703621864,
0.07983431220054626,
0.05936424061655998,
0.07884479314088821,
0.07507334649562836,
0.04281532019376755,
0.12806454300880432,
-0.16458827257156372,
-0.005474859848618507,
0.10466022044420242,
-0.022124456241726875,
0.006573962047696114,
0.080918088555336,
-0.04723747447133064,
0.09231097251176834,
-0.09327706694602966,
0.042155634611845016,
-0.08635548502206802,
-0.09218272566795349,
-0.10282430052757263,
-0.004678824916481972,
-0.18162137269973755,
0.020271798595786095,
-0.14090165495872498,
0.22998687624931335,
0.011611484922468662,
0.08870216459035873,
0.05398242920637131,
-0.004889402538537979,
0.041578181087970734,
-0.014590299688279629,
-0.006731848232448101,
-0.038029205054044724,
-0.1499195694923401,
-0.06091319024562836,
-0.08210523426532745,
-0.00894157961010933,
0.3709453344345093,
-0.03841117024421692,
-0.16051799058914185,
0.05174537003040314,
0.12808112800121307,
-0.004495458677411079,
0.028847621753811836,
0.31545841693878174,
0.07578998804092407,
-0.021821945905685425,
-0.126447856426239,
0.05344526469707489,
-0.06069440394639969,
-0.17001225054264069,
0.1430491805076599,
0.03048855811357498,
0.08475354313850403,
0.04582849517464638,
0.1244293600320816,
-0.10502518713474274,
0.0002800747752189636,
-0.05334014445543289,
0.08342163264751434,
0.005081727635115385,
0.03116319328546524,
0.047058623284101486,
0.1639874279499054,
-0.03309374675154686,
-0.01276120450347662,
-0.052541930228471756,
-0.016218816861510277,
-0.17245665192604065,
-0.1259782910346985,
0.01787005178630352,
-0.12343088537454605,
0.07029888033866882,
0.02314363420009613,
0.0693025290966034,
0.27778932452201843,
0.02227831445634365,
-0.037991445511579514,
-0.06574410200119019,
-0.16170920431613922,
-0.043377168476581573,
-0.013468199409544468,
-0.00225620623677969,
0.03427441418170929,
-0.1587003618478775,
-0.11113078147172928,
0.010077851824462414,
-0.23150713741779327,
-0.030404146760702133,
0.04201243445277214,
0.07477619498968124,
-0.027351640164852142,
-0.09659646451473236,
-0.05292573198676109,
-0.04989369213581085,
0.0978633314371109,
-0.02792114019393921,
0.23666667938232422,
-0.005066197831183672,
0.03435163199901581,
0.07493063807487488,
0.035143837332725525,
-0.017472874373197556,
-0.05500023066997528,
0.036954253911972046,
0.13017874956130981,
-0.0026822062209248543,
0.10952424257993698,
-0.05029696598649025,
-0.018618782982230186,
0.022701913490891457,
0.1532883197069168,
0.2295265942811966,
-0.06098145246505737,
0.016661008819937706,
-0.00993722677230835,
0.018924623727798462,
0.06299242377281189,
0.18725284934043884,
-0.029469145461916924,
0.2254985123872757,
-0.07282853871583939,
-0.0450349897146225,
-0.040356263518333435,
0.05596847087144852,
-0.07746271044015884,
0.01888779178261757,
0.006103279069066048,
-0.11678760498762131,
-0.11230915039777756,
0.09417769312858582,
-0.12361448258161545,
0.1333419233560562,
0.2602144479751587,
-0.08850222080945969,
0.08162997663021088,
0.009261987172067165,
0.10575367510318756,
-0.028377942740917206,
0.06303248554468155,
-0.14998990297317505,
-0.09340017288923264,
-0.07175090163946152,
0.010366047732532024,
-0.35277682542800903,
-0.1527424454689026,
0.05217723548412323,
0.13464516401290894,
0.19973614811897278,
0.0193361546844244,
0.17953117191791534,
0.00010866340744541958,
0.12162303924560547,
-0.07970237731933594,
0.16948284208774567,
0.04674931988120079,
-0.07192179560661316,
-0.14947031438350677,
-0.18935741484165192,
-0.04290371760725975,
0.02863827347755432,
0.04246106371283531,
-0.04032924026250839,
0.04598650708794594,
0.1590558886528015,
-0.09096425026655197,
-0.02076193317770958,
-0.0870942771434784,
-0.10240193456411362,
0.07305222749710083,
-0.06793145835399628,
0.00989602878689766,
-0.08067302405834198,
0.003461869666352868,
-0.036065466701984406,
0.11958018690347672,
-0.18655182421207428,
-0.08325619250535965,
0.13677145540714264,
0.031074590981006622,
0.20743191242218018,
-0.0042498474940657616,
-0.06865084916353226,
-0.006513423286378384,
-0.07911794632673264,
0.0904112383723259,
-0.13705450296401978,
0.03480922058224678,
0.09766968339681625,
0.01710483245551586,
0.020746951922774315,
-0.21379989385604858,
0.07336468249559402,
-0.006897794082760811,
-0.05903564393520355,
-0.09125455468893051
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-xls-r-common_voice-tr-ft
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the COMMON_VOICE - TR dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5806
- Wer: 0.3998
- Cer: 0.1053
## Model description
More information needed
## Intended uses & limitations
More information needed
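A hedged inference sketch with the `pipeline` API; the checkpoint id matches this repository's listing, and the audio path is a hypothetical placeholder that should point to 16 kHz mono speech:

```python
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="anton-l/wav2vec2-xls-r-common_voice-tr-ft-100sh",
)

# "sample.wav" is a hypothetical path; the input should be 16 kHz mono speech
print(asr("sample.wav")["text"])
```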
## Training and evaluation data
More information needed
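A hedged sketch of loading the data referenced above, assuming the TR config of the legacy `common_voice` dataset as suggested by the card's dataset tag:

```python
from datasets import load_dataset

# Assumption: the TR config of the legacy `common_voice` dataset, matching the
# COMMON_VOICE - TR reference above and the card's dataset tag
common_voice_tr = load_dataset("common_voice", "tr", split="train+validation")
print(common_voice_tr)
```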
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 0.0005
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- total_eval_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 5000
- mixed_precision_training: Native AMP
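The settings above correspond roughly to the following `TrainingArguments` sketch; it is a reconstruction, not the exact command used for this run. The per-device batch size of 8 across 4 GPUs with 2 accumulation steps yields the effective train batch size of 64.

```python
from transformers import TrainingArguments

# Approximate reconstruction of the settings listed above (a sketch, not the exact
# command used for this run): 8 samples per device x 4 GPUs x 2 accumulation steps
# gives the effective train batch size of 64.
training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-common_voice-tr-ft",
    learning_rate=5e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,
    max_steps=5000,
    warmup_steps=500,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
    fp16=True,  # Native AMP mixed-precision training
)
```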
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|
| 0.5369 | 17.0 | 500 | 0.6021 | 0.6366 | 0.1727 |
| 0.3542 | 34.0 | 1000 | 0.5265 | 0.4906 | 0.1278 |
| 0.1866 | 51.0 | 1500 | 0.5805 | 0.4768 | 0.1261 |
| 0.1674 | 68.01 | 2000 | 0.5336 | 0.4518 | 0.1186 |
| 0.19 | 86.0 | 2500 | 0.5676 | 0.4427 | 0.1151 |
| 0.0815 | 103.0 | 3000 | 0.5510 | 0.4268 | 0.1125 |
| 0.0545 | 120.0 | 3500 | 0.5608 | 0.4175 | 0.1099 |
| 0.0299 | 137.01 | 4000 | 0.5875 | 0.4222 | 0.1124 |
| 0.0267 | 155.0 | 4500 | 0.5882 | 0.4026 | 0.1063 |
| 0.025 | 172.0 | 5000 | 0.5806 | 0.3998 | 0.1053 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2
- Datasets 1.18.2
- Tokenizers 0.10.3
|
{"language": ["tr"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "common_voice", "generated_from_trainer"], "model-index": [{"name": "wav2vec2-xls-r-common_voice-tr-ft", "results": []}]}
|
automatic-speech-recognition
|
anton-l/wav2vec2-xls-r-common_voice-tr-ft-100sh
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"common_voice",
"generated_from_trainer",
"tr",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"tr"
] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #tr #license-apache-2.0 #endpoints_compatible #region-us
|
wav2vec2-xls-r-common\_voice-tr-ft
==================================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the COMMON\_VOICE - TR dataset.
It achieves the following results on the evaluation set:
* Loss: 0.5806
* Wer: 0.3998
* Cer: 0.1053
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0005
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* distributed\_type: multi-GPU
* num\_devices: 4
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 64
* total\_eval\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* training\_steps: 5000
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2
* Datasets 1.18.2
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0005\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 4\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 64\n* total\\_eval\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 5000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.2\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #tr #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0005\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 4\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 64\n* total\\_eval\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 5000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.2\n* Tokenizers 0.10.3"
] |
[
64,
191,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #tr #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0005\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 4\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 64\n* total\\_eval\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 5000\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.2\n* Tokenizers 0.10.3"
] |
[
-0.09650726616382599,
0.12117193639278412,
-0.004527002107352018,
0.06514307856559753,
0.11457565426826477,
0.038704510778188705,
0.08740687370300293,
0.15096403658390045,
-0.07117082923650742,
0.10256026685237885,
0.09016414731740952,
0.0954277291893959,
0.06730622053146362,
0.121658556163311,
-0.0018589081009849906,
-0.30084073543548584,
-0.0006064401823095977,
-0.035291507840156555,
-0.14174160361289978,
0.11107107251882553,
0.06938152760267258,
-0.10199827700853348,
0.035254497081041336,
-0.008177969604730606,
-0.09387392550706863,
-0.015441520139575005,
-0.033370617777109146,
-0.04377298429608345,
0.10544753819704056,
0.03250429034233093,
0.05154293030500412,
0.03388528525829315,
0.11582028120756149,
-0.2696355879306793,
0.0036237118765711784,
0.07798583060503006,
0.03489015996456146,
0.06552547961473465,
0.11395703256130219,
-0.01575239934027195,
0.155069038271904,
-0.07745800167322159,
0.07093654572963715,
0.04058479145169258,
-0.09942469000816345,
-0.26683321595191956,
-0.07749894261360168,
0.023178834468126297,
0.11465299874544144,
0.07137655466794968,
-0.01727989874780178,
0.03503761813044548,
-0.08861485123634338,
0.07530400902032852,
0.21005716919898987,
-0.2613418996334076,
-0.05939130857586861,
0.011426916345953941,
0.02991652302443981,
0.06295371800661087,
-0.10645954310894012,
-0.010591779835522175,
0.002896168502047658,
0.021983426064252853,
0.10758715122938156,
0.016453802585601807,
0.03283093869686127,
0.021959437057375908,
-0.14630067348480225,
-0.03533899411559105,
0.07027588784694672,
0.06160784140229225,
0.000020418443455128,
-0.10918369144201279,
-0.03621574491262436,
-0.21188385784626007,
-0.04603882506489754,
0.010660563595592976,
0.025481369346380234,
-0.03806672990322113,
-0.07220764458179474,
0.03760514408349991,
-0.036025211215019226,
-0.08902984112501144,
0.024924386292696,
0.12707112729549408,
0.06156732514500618,
-0.018294665962457657,
0.03176337480545044,
0.10607672482728958,
0.04564259573817253,
-0.14285829663276672,
0.010038777254521847,
0.030723029747605324,
-0.12570980191230774,
0.0032613875810056925,
-0.006937896367162466,
0.013869810849428177,
0.0398072749376297,
0.1311415582895279,
-0.012562629766762257,
0.09423021227121353,
0.036090951412916183,
0.0015186886303126812,
-0.07671743631362915,
0.1566774547100067,
-0.09057160466909409,
-0.09620054811239243,
-0.052357107400894165,
0.10791704803705215,
0.0013356053968891501,
-0.011795931495726109,
-0.060727328062057495,
0.034138571470975876,
0.08319862186908722,
0.05953458696603775,
-0.00788439903408289,
0.03258901834487915,
-0.06455041468143463,
-0.02305104397237301,
0.026378018781542778,
-0.1114794909954071,
0.0498797744512558,
0.04513555020093918,
-0.0766843780875206,
-0.010874662548303604,
-0.011593121103942394,
-0.005367440637201071,
-0.031230654567480087,
0.11955691874027252,
-0.0568624809384346,
-0.017134033143520355,
-0.07421621680259705,
-0.09237245470285416,
0.01933680661022663,
-0.05092591792345047,
-0.006972816307097673,
-0.03509507328271866,
-0.0755171999335289,
-0.055336419492959976,
0.0721474438905716,
-0.08773519843816757,
-0.06590203940868378,
-0.06506765633821487,
-0.07392776757478714,
0.05690854415297508,
-0.012676794081926346,
0.1672700047492981,
-0.06899184733629227,
0.10178542882204056,
0.003811935195699334,
0.07161282002925873,
0.10332481563091278,
0.06601402163505554,
-0.025667661800980568,
0.04806382209062576,
-0.15635289251804352,
0.09336093813180923,
-0.10886199027299881,
0.02737072855234146,
-0.14595738053321838,
-0.1177339255809784,
0.012437171302735806,
-0.012858756817877293,
0.09155197441577911,
0.10355601459741592,
-0.17630049586296082,
-0.07944928109645844,
0.16015970706939697,
-0.05935999006032944,
-0.05750656872987747,
0.12639184296131134,
-0.0034450942184776068,
-0.08173701167106628,
0.01364877913147211,
0.18014028668403625,
0.11558209359645844,
-0.10465649515390396,
0.0258636474609375,
-0.03916841000318527,
0.09431231766939163,
0.03784100338816643,
0.08851591497659683,
-0.03213164582848549,
0.01864403858780861,
0.005747093353420496,
-0.028984401375055313,
0.05206282064318657,
-0.09483376145362854,
-0.0845029428601265,
-0.015388154424726963,
-0.08690223097801208,
0.018435847014188766,
0.05585990846157074,
0.014366020448505878,
-0.07803957164287567,
-0.1408282369375229,
-0.015206616371870041,
0.10370628535747528,
-0.1025032326579094,
0.014914145693182945,
-0.07959572970867157,
0.04273127019405365,
-0.00136687105987221,
0.004191636107861996,
-0.1508346050977707,
-0.02572690136730671,
0.0373956672847271,
-0.07912580668926239,
0.015243024565279484,
-0.01703687757253647,
0.07285024225711823,
0.05898039788007736,
-0.05843387544155121,
-0.053451456129550934,
-0.029981613159179688,
-0.0005431451718322933,
-0.05321763455867767,
-0.24829073250293732,
-0.06063608080148697,
-0.017463689669966698,
0.11950088292360306,
-0.19546708464622498,
-0.005130867473781109,
0.009505760855972767,
0.1062871590256691,
0.015317042358219624,
-0.057556916028261185,
0.0067534781992435455,
0.07056788355112076,
-0.009248172864317894,
-0.08137263357639313,
0.0394209586083889,
-0.01087295450270176,
-0.09308500587940216,
-0.003892136737704277,
-0.12488295137882233,
0.07878796756267548,
0.08454620093107224,
0.047100335359573364,
-0.09303056448698044,
-0.017544720321893692,
-0.06039084121584892,
-0.061199188232421875,
-0.016729922965168953,
0.041393253952264786,
0.1784866452217102,
0.020322374999523163,
0.09032922238111496,
-0.05505417659878731,
-0.051257021725177765,
0.03983549773693085,
0.02499673143029213,
-0.004433903377503157,
0.1588943600654602,
0.08472131192684174,
-0.03040558286011219,
0.0873672366142273,
0.05281677842140198,
-0.05540505051612854,
0.11064938455820084,
-0.06559780985116959,
-0.09643793106079102,
-0.04491405561566353,
0.01520730834454298,
0.032546184957027435,
0.1147979199886322,
-0.12804773449897766,
0.001790310605429113,
0.022178659215569496,
0.026831630617380142,
0.014585593715310097,
-0.18266832828521729,
-0.014595383778214455,
0.029792703688144684,
-0.09041447192430496,
-0.02908508852124214,
-0.011465982533991337,
-0.004178459290415049,
0.09733716398477554,
0.010962596163153648,
-0.06074533611536026,
-0.03238954395055771,
-0.027315450832247734,
-0.07195647060871124,
0.17312005162239075,
-0.10649093240499496,
-0.12859269976615906,
-0.09693026542663574,
-0.015674591064453125,
-0.01656954362988472,
-0.01806756481528282,
0.03416433557868004,
-0.10802824050188065,
-0.051601435989141464,
-0.06305265426635742,
0.035556040704250336,
-0.03626472130417824,
0.041519805788993835,
0.025757957249879837,
0.006891528144478798,
0.06608571112155914,
-0.09295368939638138,
0.020806055516004562,
-0.022632718086242676,
-0.04870061203837395,
0.028026899322867393,
0.06267334520816803,
0.09605301171541214,
0.1769210547208786,
0.03849821165204048,
0.03229113295674324,
-0.013883654028177261,
0.16069002449512482,
-0.10870067030191422,
0.009016416035592556,
0.08731616288423538,
0.018506791442632675,
0.04322023317217827,
0.15436333417892456,
0.05321677401661873,
-0.09025441855192184,
0.0074879880994558334,
0.05073830485343933,
-0.028257351368665695,
-0.2320399433374405,
-0.05448167398571968,
-0.05091191455721855,
-0.01398072112351656,
0.11123993247747421,
0.03806055709719658,
-0.05097490921616554,
0.003313750261440873,
0.009692511521279812,
-0.020374102517962456,
0.02252248302102089,
0.03781552240252495,
0.07318771630525589,
0.04457278549671173,
0.10708366334438324,
-0.019281012937426567,
-0.02620791271328926,
0.043712861835956573,
0.00202089617960155,
0.23092514276504517,
-0.026585832238197327,
0.1515386700630188,
0.04335152730345726,
0.1428251713514328,
0.002357055200263858,
0.03922758251428604,
0.006063040345907211,
-0.009093119762837887,
0.02340478077530861,
-0.042141810059547424,
-0.016171788796782494,
0.015427989885210991,
0.09812910854816437,
-0.00715153943747282,
-0.0940544381737709,
0.02236972190439701,
0.03422601521015167,
0.3009266257286072,
0.07420157641172409,
-0.26897305250167847,
-0.07192477583885193,
0.0009158822358585894,
-0.05961545929312706,
-0.025527682155370712,
0.0429861806333065,
0.12171454727649689,
-0.06417351961135864,
0.07781057804822922,
-0.06511335074901581,
0.07244770973920822,
-0.09089361876249313,
-0.00297488272190094,
0.1191580668091774,
0.11426755785942078,
0.009926973842084408,
0.05497979745268822,
-0.24731574952602386,
0.2645098865032196,
-0.016467560082674026,
0.06259509176015854,
-0.04513252526521683,
0.03186352923512459,
0.00009752627374837175,
-0.03447254002094269,
0.08653848618268967,
-0.0028904250357300043,
-0.09974342584609985,
-0.1672193855047226,
-0.1084480956196785,
0.02069324627518654,
0.13463392853736877,
-0.04463903233408928,
0.10218235850334167,
-0.03758090361952782,
-0.043390605598688126,
0.04232379421591759,
-0.12660428881645203,
-0.08346126228570938,
-0.11520218104124069,
0.029666809365153313,
-0.03263112157583237,
0.02876753732562065,
-0.06936533004045486,
-0.09426268935203552,
-0.08865419030189514,
0.16735237836837769,
-0.08708090335130692,
-0.021659687161445618,
-0.12905645370483398,
0.05929184705018997,
0.18283024430274963,
-0.06577032059431076,
0.04774423688650131,
0.020175259560346603,
0.0815303847193718,
0.04569804668426514,
-0.04685577377676964,
0.11585226655006409,
-0.07228069752454758,
-0.20218445360660553,
-0.05953390896320343,
0.15543992817401886,
0.047235824167728424,
0.07142528891563416,
-0.043016884475946426,
0.03287417069077492,
-0.0021519677247852087,
-0.09723682701587677,
0.08130601048469543,
0.047964420169591904,
0.01750962622463703,
0.03227691724896431,
-0.03229621425271034,
0.02166011743247509,
-0.05720260366797447,
-0.05649036541581154,
0.10849513113498688,
0.26616954803466797,
-0.10367998480796814,
0.0788557380437851,
0.04322325810790062,
-0.05226834863424301,
-0.17440110445022583,
-0.02594470977783203,
0.11275532841682434,
0.02316022850573063,
-0.0017402281519025564,
-0.2179841250181198,
0.039338596165180206,
0.09111566841602325,
-0.01732790842652321,
0.08656188100576401,
-0.37155354022979736,
-0.1328900307416916,
0.07441651076078415,
0.08302823454141617,
-0.03580643981695175,
-0.16398796439170837,
-0.055205825716257095,
0.004730965476483107,
-0.0922386422753334,
0.058545827865600586,
0.03238997980952263,
0.12864381074905396,
-0.02110089175403118,
0.016660818830132484,
0.019528035074472427,
-0.05383923649787903,
0.13569529354572296,
0.0033540388103574514,
0.04024743661284447,
-0.018322423100471497,
0.041622571647167206,
-0.011826783418655396,
-0.05390524864196777,
-0.0007410020334646106,
-0.07462171465158463,
0.03651195392012596,
-0.09747902303934097,
-0.02385615184903145,
-0.08681236952543259,
-0.0008977237157523632,
-0.03432483598589897,
-0.034052833914756775,
-0.029403725638985634,
0.05407027155160904,
0.09365296363830566,
-0.022348325699567795,
0.08515956252813339,
-0.031427234411239624,
0.11632730066776276,
0.09280026704072952,
0.07378551363945007,
0.007391475606709719,
-0.10513000935316086,
-0.00743955560028553,
0.0007005666266195476,
0.02865145169198513,
-0.12448158115148544,
0.03713028505444527,
0.15550534427165985,
0.044977132230997086,
0.12117257714271545,
0.05319216102361679,
-0.06144792214035988,
0.0004034526355098933,
0.07230211794376373,
-0.09920009225606918,
-0.16381962597370148,
0.008197021670639515,
-0.038876764476299286,
-0.10440065711736679,
-0.001688867574557662,
0.07588072866201401,
-0.030340440571308136,
-0.007327147759497166,
0.01673533394932747,
0.06482790410518646,
-0.029286077246069908,
0.24501995742321014,
0.013233632780611515,
0.07933172583580017,
-0.10929732024669647,
0.09834078699350357,
0.05016762763261795,
-0.14422036707401276,
0.030390016734600067,
0.10513386130332947,
-0.06351803988218307,
-0.009594671428203583,
0.07744330912828445,
0.06385409086942673,
0.032533757388591766,
-0.02142919786274433,
-0.1076708436012268,
-0.14297744631767273,
0.10264381021261215,
0.061468422412872314,
0.025061914697289467,
0.030823374167084694,
-0.03422961384057999,
0.03526155650615692,
-0.10935273766517639,
0.10692198574542999,
0.09399983286857605,
0.06231293827295303,
-0.1284976601600647,
0.10511147975921631,
0.008788865059614182,
0.006144794635474682,
-0.009202076122164726,
0.017180537804961205,
-0.1200597882270813,
0.011027061380445957,
-0.09358393400907516,
-0.024891065433621407,
-0.056636590510606766,
0.00972068216651678,
0.0030919990967959166,
-0.05646122246980667,
-0.048842523247003555,
0.019678102806210518,
-0.10384080559015274,
-0.05550370365381241,
-0.03154940903186798,
0.06059170514345169,
-0.10915971547365189,
-0.01491143461316824,
0.026812691241502762,
-0.11137153953313828,
0.10208720713853836,
0.03846249729394913,
0.02717260830104351,
0.027643674984574318,
-0.08544818311929703,
-0.002567353192716837,
0.032409343868494034,
-0.0006897735293023288,
0.03911951556801796,
-0.18174827098846436,
-0.005338575225323439,
-0.03631289303302765,
0.0049512386322021484,
-0.0007479509222321212,
0.036468468606472015,
-0.12859341502189636,
0.022926101461052895,
-0.06685075908899307,
-0.05210915952920914,
-0.05452004075050354,
0.04754200577735901,
0.08671417087316513,
0.027895305305719376,
0.14331623911857605,
-0.08837692439556122,
0.05759640783071518,
-0.21715058386325836,
-0.009373086504638195,
-0.013277309946715832,
-0.05512319877743721,
-0.04960798844695091,
-0.004228977952152491,
0.10535820573568344,
-0.052263420075178146,
0.11168516427278519,
-0.0233757346868515,
0.033922214061021805,
0.02441100776195526,
-0.07933463901281357,
-0.0011383831733837724,
0.057502083480358124,
0.16502638161182404,
0.03995382413268089,
-0.031749412417411804,
0.06735267490148544,
0.0014011558378115296,
0.0434248186647892,
0.12941449880599976,
0.1657746434211731,
0.1264907866716385,
0.05398554727435112,
0.04796602949500084,
0.06247594952583313,
-0.16479066014289856,
-0.17059847712516785,
0.16798456013202667,
-0.09403549134731293,
0.1484539806842804,
-0.026357892900705338,
0.20697996020317078,
0.061630766838788986,
-0.2028375118970871,
0.07272141426801682,
-0.03962358832359314,
-0.10476059466600418,
-0.10600815713405609,
-0.07937774062156677,
-0.08039525151252747,
-0.18543218076229095,
0.020505433902144432,
-0.09887391328811646,
0.08545331656932831,
0.06046439707279205,
0.05069505050778389,
0.0441245436668396,
0.09642977267503738,
0.05187873914837837,
0.020142411813139915,
0.07694606482982635,
0.04939291253685951,
-0.034809134900569916,
-0.010421297512948513,
-0.0727028176188469,
0.02227138727903366,
-0.028406711295247078,
0.06023168936371803,
-0.0352761410176754,
-0.10891562700271606,
0.072280153632164,
0.010031148791313171,
-0.08480335026979446,
0.025013089179992676,
-0.01962737925350666,
0.05326996371150017,
0.09589334577322006,
0.041658613830804825,
-0.017702845856547356,
-0.024416808038949966,
0.2402745485305786,
-0.09461177885532379,
-0.06592399626970291,
-0.12304878234863281,
0.2732031047344208,
-0.0005129066994413733,
0.004362833686172962,
0.035331759601831436,
-0.059652362018823624,
-0.03335849940776825,
0.15209072828292847,
0.17547371983528137,
-0.029461288824677467,
-0.026274021714925766,
0.022546999156475067,
-0.008922592736780643,
-0.01743367686867714,
0.10404233634471893,
0.12582863867282867,
0.07405202090740204,
-0.05486546829342842,
-0.038770899176597595,
-0.046554576605558395,
-0.05376826971769333,
-0.037523310631513596,
0.08614908903837204,
0.03389982134103775,
-0.011954160407185555,
-0.026441430673003197,
0.09846607595682144,
-0.07897283136844635,
-0.14057864248752594,
0.060372188687324524,
-0.18550674617290497,
-0.19662149250507355,
-0.03292994946241379,
0.07228443026542664,
0.007707824464887381,
0.04552292823791504,
0.012731441296637058,
-0.02853395789861679,
0.101543128490448,
0.008695941418409348,
-0.06344974786043167,
-0.0904678925871849,
0.05386866256594658,
-0.07269972562789917,
0.18769490718841553,
-0.02542051672935486,
0.019987402483820915,
0.11124308407306671,
0.05846165493130684,
-0.10489511489868164,
0.05290534719824791,
0.07746513187885284,
-0.1457371860742569,
0.046982187777757645,
0.18174487352371216,
-0.04381519556045532,
0.11560030281543732,
0.04037439450621605,
-0.04501301795244217,
0.013095674104988575,
-0.10848080366849899,
-0.04496502876281738,
-0.03941347822546959,
-0.017140787094831467,
-0.035666681826114655,
0.15170004963874817,
0.22492875158786774,
-0.0535806305706501,
0.004094380885362625,
-0.054887693375349045,
0.005916445516049862,
0.01393944676965475,
0.16744928061962128,
-0.03273686766624451,
-0.25648555159568787,
0.026252107694745064,
-0.013258266262710094,
0.04055996611714363,
-0.173325777053833,
-0.08320947736501694,
0.029206736013293266,
-0.04867954179644585,
-0.07764510065317154,
0.13923712074756622,
0.0567268431186676,
0.04915127903223038,
-0.058029480278491974,
-0.09960106760263443,
-0.022662555798888206,
0.17042776942253113,
-0.19349786639213562,
-0.05798094719648361
] |
null | null |
transformers
|
# wav2vec2-xls-r-common_voice-tr-ft-stream
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the COMMON_VOICE - TR dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3519
- Wer: 0.2927
- Cer: 0.0694
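As a quick way to try this checkpoint, here is a minimal inference sketch using the 🤗 Transformers ASR pipeline. The file name `sample.wav` is a placeholder, and decoding audio paths through the pipeline assumes `ffmpeg` is available; both are assumptions for illustration, not part of the training setup described below.

```python
# Minimal inference sketch for this fine-tuned Turkish ASR checkpoint.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="anton-l/wav2vec2-xls-r-common_voice-tr-ft-stream",
)

# "sample.wav" is a placeholder for any Turkish speech recording;
# the pipeline decodes and resamples it before CTC decoding.
print(asr("sample.wav")["text"])
```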
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- total_eval_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 5000
- mixed_precision_training: Native AMP
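For readers who want to approximate this run, the sketch below maps the hyperparameters above onto `transformers.TrainingArguments`. It is only an illustrative mapping (the actual training script and data pipeline are not shown in this card); the per-device batch size of 8 across 4 GPUs with 2 accumulation steps reproduces the listed totals, and the Adam betas/epsilon are the optimizer defaults.

```python
# Illustrative TrainingArguments mirroring the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-common_voice-tr-ft-stream",  # placeholder path
    learning_rate=5e-4,
    per_device_train_batch_size=8,   # 8 x 4 GPUs x 2 accumulation = 64 total
    per_device_eval_batch_size=8,    # 8 x 4 GPUs = 32 total
    gradient_accumulation_steps=2,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=5000,
    fp16=True,  # "Native AMP" mixed-precision training
)
```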
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 0.6768 | 9.01 | 500 | 0.4220 | 0.5143 | 0.1235 |
| 0.3801 | 19.01 | 1000 | 0.3303 | 0.4403 | 0.1055 |
| 0.3616 | 29.0 | 1500 | 0.3540 | 0.3716 | 0.0878 |
| 0.2334 | 39.0 | 2000 | 0.3666 | 0.3671 | 0.0842 |
| 0.3141 | 49.0 | 2500 | 0.3407 | 0.3373 | 0.0819 |
| 0.1926 | 58.01 | 3000 | 0.3886 | 0.3520 | 0.0867 |
| 0.1372 | 68.01 | 3500 | 0.3415 | 0.3189 | 0.0743 |
| 0.091 | 78.0 | 4000 | 0.3750 | 0.3164 | 0.0757 |
| 0.0893 | 88.0 | 4500 | 0.3559 | 0.2968 | 0.0712 |
| 0.095 | 98.0 | 5000 | 0.3519 | 0.2927 | 0.0694 |
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2
- Datasets 1.18.2
- Tokenizers 0.10.3
|
{"language": ["tr"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "common_voice", "generated_from_trainer"], "model-index": [{"name": "wav2vec2-xls-r-common_voice-tr-ft-stream", "results": []}]}
|
automatic-speech-recognition
|
anton-l/wav2vec2-xls-r-common_voice-tr-ft-stream
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"common_voice",
"generated_from_trainer",
"tr",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"tr"
] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #tr #license-apache-2.0 #endpoints_compatible #region-us
|
wav2vec2-xls-r-common\_voice-tr-ft-stream
=========================================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the COMMON\_VOICE - TR dataset.
It achieves the following results on the evaluation set:
* Loss: 0.3519
* Wer: 0.2927
* Cer: 0.0694
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0005
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* distributed\_type: multi-GPU
* num\_devices: 4
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 64
* total\_eval\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* training\_steps: 5000
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.16.0.dev0
* Pytorch 1.10.2
* Datasets 1.18.2
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0005\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 4\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 64\n* total\\_eval\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 5000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.2\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #tr #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0005\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 4\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 64\n* total\\_eval\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 5000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.2\n* Tokenizers 0.10.3"
] |
[
64,
191,
4,
35
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #tr #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0005\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 4\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 64\n* total\\_eval\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 5000\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.2\n* Tokenizers 0.10.3"
] |
[
-0.09708340466022491,
0.123112253844738,
-0.004483203403651714,
0.06645161658525467,
0.11221414059400558,
0.03873610496520996,
0.09424649924039841,
0.15145829319953918,
-0.06672719866037369,
0.10680897533893585,
0.08587447553873062,
0.09450570493936539,
0.06739989668130875,
0.12236902862787247,
-0.005881501827389002,
-0.29590681195259094,
0.00259216851554811,
-0.03583206236362457,
-0.14229947328567505,
0.10848086327314377,
0.06852485239505768,
-0.09881313145160675,
0.035926561802625656,
-0.007723635993897915,
-0.09560459107160568,
-0.01535017229616642,
-0.032503336668014526,
-0.043750911951065063,
0.10358407348394394,
0.03833933174610138,
0.05198868736624718,
0.030914293602108955,
0.11175552755594254,
-0.2675160765647888,
0.004727359861135483,
0.08062958717346191,
0.030080994591116905,
0.06426609307527542,
0.11634349822998047,
-0.020817957818508148,
0.16065841913223267,
-0.07891291379928589,
0.07443206012248993,
0.03996475785970688,
-0.09950761497020721,
-0.2609938383102417,
-0.07413537055253983,
0.02186519466340542,
0.11565817892551422,
0.06828462332487106,
-0.015377731062471867,
0.03559396415948868,
-0.08348655700683594,
0.0776565745472908,
0.21313846111297607,
-0.2593514621257782,
-0.05945070832967758,
0.013487910851836205,
0.028895758092403412,
0.06137877702713013,
-0.10731623321771622,
-0.012809005565941334,
-0.0003380915441084653,
0.019853204488754272,
0.10958023369312286,
0.0158661138266325,
0.029265420511364937,
0.018711606040596962,
-0.14479728043079376,
-0.03429912403225899,
0.06172749772667885,
0.05743872746825218,
0.0011235193815082312,
-0.11186368018388748,
-0.04093155264854431,
-0.20539641380310059,
-0.0476541742682457,
0.01268367562443018,
0.023853575810790062,
-0.03747273609042168,
-0.06668829917907715,
0.03694699704647064,
-0.0306050106883049,
-0.09075834602117538,
0.02018078975379467,
0.13885018229484558,
0.06120777130126953,
-0.019033897668123245,
0.02871895767748356,
0.10237082839012146,
0.04604865238070488,
-0.14665155112743378,
0.007176143117249012,
0.033307723701000214,
-0.13215133547782898,
0.0035967573057860136,
-0.007440968882292509,
0.009212667122483253,
0.03924066573381424,
0.13343589007854462,
-0.00946157705038786,
0.09906964749097824,
0.031788405030965805,
0.0001197719102492556,
-0.0769546777009964,
0.15922307968139648,
-0.08610229194164276,
-0.09762619435787201,
-0.04844046011567116,
0.10526161640882492,
0.0019324104068800807,
-0.015156375244259834,
-0.0583476684987545,
0.03534655272960663,
0.07878172397613525,
0.060069404542446136,
-0.008935652673244476,
0.03204844519495964,
-0.06516791135072708,
-0.023017289116978645,
0.02420804463326931,
-0.1163308322429657,
0.051324982196092606,
0.04577057063579559,
-0.07623889297246933,
-0.008890870027244091,
-0.011653809808194637,
-0.010214331559836864,
-0.033838916569948196,
0.1206173226237297,
-0.05692020803689957,
-0.018341686576604843,
-0.07458939403295517,
-0.09407215565443039,
0.018845967948436737,
-0.0538947656750679,
-0.0099942022934556,
-0.033827949315309525,
-0.07823479175567627,
-0.05661749094724655,
0.07415056973695755,
-0.0869675725698471,
-0.059192098677158356,
-0.06345511972904205,
-0.07218916714191437,
0.05955604463815689,
-0.009108062833547592,
0.15989801287651062,
-0.0707208439707756,
0.09741216152906418,
-0.0021615016739815474,
0.07434287667274475,
0.10886005312204361,
0.0671105608344078,
-0.02648168057203293,
0.05002281442284584,
-0.156598761677742,
0.09508836269378662,
-0.11178920418024063,
0.027280965819954872,
-0.14780382812023163,
-0.11424428224563599,
0.01684003882110119,
-0.011427397839725018,
0.09069259464740753,
0.10135223716497421,
-0.17936694622039795,
-0.07778509706258774,
0.15532764792442322,
-0.06475087255239487,
-0.05673462152481079,
0.12510058283805847,
-0.005675689782947302,
-0.0837876945734024,
0.014820326119661331,
0.1830972582101822,
0.12084019184112549,
-0.10573367774486542,
0.02633851021528244,
-0.03823259100317955,
0.09497940540313721,
0.037827469408512115,
0.08428142219781876,
-0.03008653037250042,
0.01622471585869789,
0.0044092643074691296,
-0.028322581201791763,
0.050468094646930695,
-0.08794873207807541,
-0.08457810431718826,
-0.016816837713122368,
-0.08618170022964478,
0.012989334762096405,
0.05378377437591553,
0.010719497688114643,
-0.07645794004201889,
-0.14069461822509766,
-0.013696789741516113,
0.10373178869485855,
-0.10592744499444962,
0.014781592413783073,
-0.08021073788404465,
0.04181743785738945,
0.0022921813651919365,
0.00783771276473999,
-0.152248814702034,
-0.032904863357543945,
0.036486972123384476,
-0.09137055277824402,
0.015004989691078663,
-0.02099612168967724,
0.072693832218647,
0.06112779304385185,
-0.055391233414411545,
-0.04986253008246422,
-0.02885362319648266,
-0.0023435482289642096,
-0.05151592567563057,
-0.24863743782043457,
-0.059494707733392715,
-0.02004953846335411,
0.10826984792947769,
-0.1928507536649704,
-0.004371516406536102,
0.014787214808166027,
0.10423990339040756,
0.019686227664351463,
-0.05938434600830078,
0.010159268043935299,
0.06926847249269485,
-0.011121412739157677,
-0.08498495817184448,
0.036378465592861176,
-0.011542785912752151,
-0.09433671087026596,
-0.00018440262647345662,
-0.13099460303783417,
0.08345429599285126,
0.08138634264469147,
0.05928793549537659,
-0.0923287570476532,
-0.01849854178726673,
-0.057766713201999664,
-0.0610104501247406,
-0.021235434338450432,
0.04348202794790268,
0.17967666685581207,
0.02238420397043228,
0.0888364240527153,
-0.05595608055591583,
-0.04990651085972786,
0.03547351062297821,
0.028844766318798065,
-0.005199737381190062,
0.15609970688819885,
0.08306778967380524,
-0.022630389779806137,
0.08500739932060242,
0.04962661489844322,
-0.06305274367332458,
0.11041969805955887,
-0.06701112538576126,
-0.09261739253997803,
-0.0446883924305439,
0.014481857419013977,
0.03586587309837341,
0.11379998177289963,
-0.12282341718673706,
0.005387065466493368,
0.023171141743659973,
0.026376953348517418,
0.012405704706907272,
-0.18084409832954407,
-0.010743345133960247,
0.028735347092151642,
-0.09267506003379822,
-0.026901811361312866,
-0.008016243577003479,
-0.0039520263671875,
0.09405568987131119,
0.007516327779740095,
-0.05912691727280617,
-0.034546077251434326,
-0.03084167279303074,
-0.07071563601493835,
0.17579306662082672,
-0.10572537034749985,
-0.12729455530643463,
-0.09777630865573883,
-0.014667659066617489,
-0.022385794669389725,
-0.018955668434500694,
0.033816516399383545,
-0.10333633422851562,
-0.05795477703213692,
-0.06629710644483566,
0.035963863134384155,
-0.027019396424293518,
0.04374279826879501,
0.024270517751574516,
0.003100456902757287,
0.06646516174077988,
-0.09821505844593048,
0.020059993490576744,
-0.020153110846877098,
-0.04876183345913887,
0.022958260029554367,
0.06500347703695297,
0.09763052314519882,
0.17618656158447266,
0.03759557753801346,
0.0329168438911438,
-0.012602957896888256,
0.16606412827968597,
-0.1081203892827034,
0.012204133905470371,
0.08598629385232925,
0.016289709135890007,
0.046253859996795654,
0.15550871193408966,
0.05418463051319122,
-0.09390797466039658,
0.007658233400434256,
0.05204140394926071,
-0.031006719917058945,
-0.2318192571401596,
-0.05408565327525139,
-0.05282386764883995,
-0.018226509913802147,
0.11426588147878647,
0.0403752438724041,
-0.05112558603286743,
0.004752570763230324,
0.009370364248752594,
-0.017480574548244476,
0.02417500503361225,
0.04060673713684082,
0.07543949037790298,
0.0534135103225708,
0.10691442340612411,
-0.022000817582011223,
-0.02174784615635872,
0.0431549958884716,
0.004371676128357649,
0.22942829132080078,
-0.029225870966911316,
0.15772657096385956,
0.040746625512838364,
0.13663560152053833,
0.005159077700227499,
0.04217987507581711,
0.0011250199750065804,
-0.007270888891071081,
0.023982875049114227,
-0.04505401477217674,
-0.01605888642370701,
0.016737505793571472,
0.0937945693731308,
-0.009135554544627666,
-0.08942463248968124,
0.02079571783542633,
0.03507209196686745,
0.2972845435142517,
0.06898444890975952,
-0.2707175016403198,
-0.07026863098144531,
0.0017686458304524422,
-0.05896444618701935,
-0.02515997737646103,
0.04304073750972748,
0.12316393107175827,
-0.06355014443397522,
0.08046714961528778,
-0.06527342647314072,
0.07602956146001816,
-0.09460511058568954,
-0.004949511494487524,
0.1163039356470108,
0.12127923965454102,
0.006524837575852871,
0.05138779059052467,
-0.24865122139453888,
0.2611266076564789,
-0.01609685830771923,
0.06555280089378357,
-0.04554546996951103,
0.03186098486185074,
0.003940739203244448,
-0.03671325370669365,
0.08341854810714722,
-0.006082299631088972,
-0.10793697834014893,
-0.162232905626297,
-0.10663843154907227,
0.02222195267677307,
0.13057394325733185,
-0.04244270920753479,
0.09951233118772507,
-0.03767655789852142,
-0.044844042509794235,
0.042047131806612015,
-0.13086703419685364,
-0.08120281994342804,
-0.11757500469684601,
0.029717860743403435,
-0.02812616527080536,
0.031613245606422424,
-0.06762196123600006,
-0.09505794197320938,
-0.0837499350309372,
0.16413839161396027,
-0.0886337012052536,
-0.024821294471621513,
-0.13209427893161774,
0.047963935881853104,
0.18268372118473053,
-0.06774618476629257,
0.04789208248257637,
0.017850002273917198,
0.08873315155506134,
0.04413802921772003,
-0.05096021667122841,
0.11451714485883713,
-0.0736694261431694,
-0.19919463992118835,
-0.060116272419691086,
0.15322642028331757,
0.04772886261343956,
0.06842628121376038,
-0.04112133011221886,
0.033398766070604324,
-0.0014903752598911524,
-0.10215732455253601,
0.07605016231536865,
0.05037326738238335,
0.008682230487465858,
0.03432811424136162,
-0.03084961511194706,
0.02000494860112667,
-0.05655832588672638,
-0.050784409046173096,
0.10213914513587952,
0.2626713812351227,
-0.10347425937652588,
0.08452125638723373,
0.04472040385007858,
-0.05264737457036972,
-0.17573857307434082,
-0.02353517711162567,
0.11173569411039352,
0.023353448137640953,
-0.0015581020852550864,
-0.21739614009857178,
0.04317023605108261,
0.09017045795917511,
-0.016490770503878593,
0.08603951334953308,
-0.37637585401535034,
-0.13633210957050323,
0.0766601487994194,
0.08732996135950089,
-0.03746713325381279,
-0.16340483725070953,
-0.05533153936266899,
-0.0010075541213154793,
-0.08887722343206406,
0.06491657346487045,
0.03188692405819893,
0.12639084458351135,
-0.021021980792284012,
0.016467230394482613,
0.020406030118465424,
-0.05158364772796631,
0.1373956948518753,
0.00986742414534092,
0.04457302764058113,
-0.01835484243929386,
0.05071257799863815,
-0.013672188855707645,
-0.0576375350356102,
0.000856157683301717,
-0.08371006697416306,
0.03699376806616783,
-0.09649200737476349,
-0.02411068044602871,
-0.08089148998260498,
0.0010509849525988102,
-0.03260461241006851,
-0.03190239518880844,
-0.0316913016140461,
0.050628095865249634,
0.0990065336227417,
-0.02415432408452034,
0.08868148177862167,
-0.02891821227967739,
0.1172749251127243,
0.09455347061157227,
0.06960716843605042,
0.005849598441272974,
-0.1142682284116745,
-0.00241970201022923,
0.004929282236844301,
0.033355165272951126,
-0.12818607687950134,
0.039970993995666504,
0.15632154047489166,
0.04532111808657646,
0.1224624291062355,
0.05283626914024353,
-0.05844571813941002,
-0.00013993348693475127,
0.07173637300729752,
-0.09845138341188431,
-0.16486354172229767,
0.01182908471673727,
-0.038329437375068665,
-0.10234199464321136,
-0.004535104148089886,
0.07684481143951416,
-0.03107282891869545,
-0.0029568749014288187,
0.019669432193040848,
0.06508081406354904,
-0.023201633244752884,
0.2465585470199585,
0.013892162591218948,
0.07979929447174072,
-0.11202466487884521,
0.10006172955036163,
0.051158249378204346,
-0.14658589661121368,
0.03257300332188606,
0.10880515724420547,
-0.06343161314725876,
-0.00836294423788786,
0.08136594295501709,
0.06826445460319519,
0.03169620409607887,
-0.021650416776537895,
-0.1062016710639,
-0.14263387024402618,
0.10315902531147003,
0.07145597785711288,
0.026947034522891045,
0.03262803703546524,
-0.03813564032316208,
0.03307680785655975,
-0.10711977630853653,
0.10711926221847534,
0.09755418449640274,
0.06382139027118683,
-0.12234892696142197,
0.09811203926801682,
0.010760939680039883,
0.0036821034736931324,
-0.00986426044255495,
0.013810639269649982,
-0.1256767213344574,
0.009147027507424355,
-0.09592614322900772,
-0.017716698348522186,
-0.060825929045677185,
0.012539324350655079,
0.0038721070159226656,
-0.05880214646458626,
-0.049153223633766174,
0.019449036568403244,
-0.10407531261444092,
-0.056828200817108154,
-0.03339381515979767,
0.06053215637803078,
-0.11276260763406754,
-0.013393530622124672,
0.02932538464665413,
-0.11453311145305634,
0.10003460943698883,
0.03538758307695389,
0.023767167702317238,
0.027515392750501633,
-0.09022776782512665,
-0.001944559277035296,
0.030843187123537064,
-0.0005022844998165965,
0.03940568119287491,
-0.1819745898246765,
-0.002539536450058222,
-0.037109918892383575,
0.003759484039619565,
-0.002415100345388055,
0.04036194086074829,
-0.1295977532863617,
0.027242405340075493,
-0.06940453499555588,
-0.05100986734032631,
-0.052777208387851715,
0.046800870448350906,
0.08992382138967514,
0.026811381801962852,
0.14648517966270447,
-0.08748271316289902,
0.05872950702905655,
-0.2114032506942749,
-0.012455154210329056,
-0.010283258743584156,
-0.05666137859225273,
-0.04995337873697281,
-0.00534874526783824,
0.10564364492893219,
-0.05397621914744377,
0.11021674424409866,
-0.018815284594893456,
0.03212767839431763,
0.025259453803300858,
-0.07361817359924316,
-0.0057209874503314495,
0.055188097059726715,
0.1632082462310791,
0.0413675419986248,
-0.030257582664489746,
0.07259989529848099,
-0.0021634548902511597,
0.043395254760980606,
0.1278219223022461,
0.16817434132099152,
0.12381463497877121,
0.04976625740528107,
0.05022549629211426,
0.06132610887289047,
-0.16539502143859863,
-0.17093166708946228,
0.16811490058898926,
-0.09225627779960632,
0.14998412132263184,
-0.026081601157784462,
0.20322097837924957,
0.05930488184094429,
-0.20296482741832733,
0.07699404656887054,
-0.030323106795549393,
-0.10427676141262054,
-0.10661925375461578,
-0.0827418714761734,
-0.08086570352315903,
-0.1832336038351059,
0.01937785930931568,
-0.09835866838693619,
0.08644725382328033,
0.06136118993163109,
0.050474103540182114,
0.04680264741182327,
0.10224292427301407,
0.04490302875638008,
0.018161125481128693,
0.0750197023153305,
0.05119473859667778,
-0.03463025391101837,
-0.005007974803447723,
-0.07230564951896667,
0.016826899722218513,
-0.029157467186450958,
0.059798866510391235,
-0.030623242259025574,
-0.09968660771846771,
0.07237901538610458,
0.006752936635166407,
-0.08297285437583923,
0.02590889297425747,
-0.01973455213010311,
0.047127865254879,
0.09716176986694336,
0.04557202011346817,
-0.018396694213151932,
-0.023332709446549416,
0.23910871148109436,
-0.09419196099042892,
-0.06606562435626984,
-0.12360741943120956,
0.2700648307800293,
0.004492291249334812,
0.0035594685468822718,
0.032594338059425354,
-0.05736343190073967,
-0.03819623216986656,
0.15169082581996918,
0.17039379477500916,
-0.02768678218126297,
-0.024763058871030807,
0.017330320551991463,
-0.008090423420071602,
-0.014867434278130531,
0.10816439986228943,
0.12426535785198212,
0.07943955808877945,
-0.04949444532394409,
-0.03710078075528145,
-0.04629074037075043,
-0.05630826950073242,
-0.041434984654188156,
0.08660101145505905,
0.02895866520702839,
-0.01604476571083069,
-0.02479700744152069,
0.09681287407875061,
-0.08074696362018585,
-0.1453768014907837,
0.061752188950777054,
-0.183742955327034,
-0.19735026359558105,
-0.03379570320248604,
0.06989771872758865,
0.007399208843708038,
0.04352550208568573,
0.015397170558571815,
-0.031156975775957108,
0.10249149799346924,
0.008804005570709705,
-0.06433547288179398,
-0.09285733848810196,
0.05349551513791084,
-0.07480040192604065,
0.18776708841323853,
-0.023415785282850266,
0.01658002659678459,
0.10934573411941528,
0.059715427458286285,
-0.10691707581281662,
0.0542290098965168,
0.0775686576962471,
-0.15278401970863342,
0.04608899727463722,
0.1814330369234085,
-0.044454947113990784,
0.1154862642288208,
0.041124776005744934,
-0.04104189574718475,
0.010763371363282204,
-0.1096528172492981,
-0.041866909712553024,
-0.039108216762542725,
-0.020934078842401505,
-0.03641657158732414,
0.1507704257965088,
0.22590196132659912,
-0.050375621765851974,
0.0021592658013105392,
-0.05231296271085739,
0.008295176550745964,
0.01298009417951107,
0.16475220024585724,
-0.031875789165496826,
-0.2564619481563568,
0.027200322598218918,
-0.010774344205856323,
0.04073798656463623,
-0.1668022722005844,
-0.08673913776874542,
0.028686899691820145,
-0.046488240361213684,
-0.07449369132518768,
0.14081574976444244,
0.05739855021238327,
0.047937579452991486,
-0.05841344594955444,
-0.1132873073220253,
-0.02030651643872261,
0.16920135915279388,
-0.19153012335300446,
-0.06009126827120781
] |
null | null |
transformers
|
# wav2vec2-xls-r-common_voice-tr-ft-500sh
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the COMMON_VOICE - TR dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5794
- Wer: 0.4009
- Cer: 0.1032
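For a lower-level alternative to the high-level ASR pipeline, the sketch below loads the processor and CTC model directly from this repository (`anton-l/wav2vec2-xls-r-common_voice-tr-ft`). The audio path and the use of `librosa` for loading are assumptions for illustration; any loader that yields a 16 kHz mono waveform works.

```python
# Minimal CTC decoding sketch; "sample.wav" is a placeholder audio file.
import librosa  # assumed available for audio loading/resampling
import torch
from transformers import AutoModelForCTC, AutoProcessor

model_id = "anton-l/wav2vec2-xls-r-common_voice-tr-ft"
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)

# Resample to the 16 kHz rate expected by XLS-R.
speech, _ = librosa.load("sample.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```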
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- total_eval_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 5000
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|
| 0.5288 | 17.0 | 500 | 0.5099 | 0.5426 | 0.1432 |
| 0.2967 | 34.0 | 1000 | 0.5421 | 0.4746 | 0.1256 |
| 0.2447 | 51.0 | 1500 | 0.5347 | 0.4831 | 0.1267 |
| 0.122 | 68.01 | 2000 | 0.5854 | 0.4479 | 0.1161 |
| 0.1035 | 86.0 | 2500 | 0.5597 | 0.4457 | 0.1166 |
| 0.081 | 103.0 | 3000 | 0.5748 | 0.4250 | 0.1144 |
| 0.0849 | 120.0 | 3500 | 0.5598 | 0.4337 | 0.1145 |
| 0.0542 | 137.01 | 4000 | 0.5687 | 0.4223 | 0.1097 |
| 0.0318 | 155.0 | 4500 | 0.5904 | 0.4057 | 0.1052 |
| 0.0106 | 172.0 | 5000 | 0.5794 | 0.4009 | 0.1032 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2
- Datasets 1.18.2
- Tokenizers 0.10.3
|
{"language": ["tr"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "common_voice", "generated_from_trainer"], "model-index": [{"name": "wav2vec2-xls-r-common_voice-tr-ft-500sh", "results": []}]}
|
automatic-speech-recognition
|
anton-l/wav2vec2-xls-r-common_voice-tr-ft
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"common_voice",
"generated_from_trainer",
"tr",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"tr"
] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #tr #license-apache-2.0 #endpoints_compatible #region-us
|
wav2vec2-xls-r-common\_voice-tr-ft-500sh
========================================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the COMMON\_VOICE - TR dataset.
It achieves the following results on the evaluation set:
* Loss: 0.5794
* Wer: 0.4009
* Cer: 0.1032
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0005
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* distributed\_type: multi-GPU
* num\_devices: 4
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 64
* total\_eval\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* training\_steps: 5000
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2
* Datasets 1.18.2
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0005\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 4\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 64\n* total\\_eval\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 5000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.2\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #tr #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0005\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 4\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 64\n* total\\_eval\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 5000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.2\n* Tokenizers 0.10.3"
] |
[
64,
191,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #tr #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0005\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 4\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 64\n* total\\_eval\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 5000\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.2\n* Tokenizers 0.10.3"
] |
[
-0.09650726616382599,
0.12117193639278412,
-0.004527002107352018,
0.06514307856559753,
0.11457565426826477,
0.038704510778188705,
0.08740687370300293,
0.15096403658390045,
-0.07117082923650742,
0.10256026685237885,
0.09016414731740952,
0.0954277291893959,
0.06730622053146362,
0.121658556163311,
-0.0018589081009849906,
-0.30084073543548584,
-0.0006064401823095977,
-0.035291507840156555,
-0.14174160361289978,
0.11107107251882553,
0.06938152760267258,
-0.10199827700853348,
0.035254497081041336,
-0.008177969604730606,
-0.09387392550706863,
-0.015441520139575005,
-0.033370617777109146,
-0.04377298429608345,
0.10544753819704056,
0.03250429034233093,
0.05154293030500412,
0.03388528525829315,
0.11582028120756149,
-0.2696355879306793,
0.0036237118765711784,
0.07798583060503006,
0.03489015996456146,
0.06552547961473465,
0.11395703256130219,
-0.01575239934027195,
0.155069038271904,
-0.07745800167322159,
0.07093654572963715,
0.04058479145169258,
-0.09942469000816345,
-0.26683321595191956,
-0.07749894261360168,
0.023178834468126297,
0.11465299874544144,
0.07137655466794968,
-0.01727989874780178,
0.03503761813044548,
-0.08861485123634338,
0.07530400902032852,
0.21005716919898987,
-0.2613418996334076,
-0.05939130857586861,
0.011426916345953941,
0.02991652302443981,
0.06295371800661087,
-0.10645954310894012,
-0.010591779835522175,
0.002896168502047658,
0.021983426064252853,
0.10758715122938156,
0.016453802585601807,
0.03283093869686127,
0.021959437057375908,
-0.14630067348480225,
-0.03533899411559105,
0.07027588784694672,
0.06160784140229225,
0.000020418443455128,
-0.10918369144201279,
-0.03621574491262436,
-0.21188385784626007,
-0.04603882506489754,
0.010660563595592976,
0.025481369346380234,
-0.03806672990322113,
-0.07220764458179474,
0.03760514408349991,
-0.036025211215019226,
-0.08902984112501144,
0.024924386292696,
0.12707112729549408,
0.06156732514500618,
-0.018294665962457657,
0.03176337480545044,
0.10607672482728958,
0.04564259573817253,
-0.14285829663276672,
0.010038777254521847,
0.030723029747605324,
-0.12570980191230774,
0.0032613875810056925,
-0.006937896367162466,
0.013869810849428177,
0.0398072749376297,
0.1311415582895279,
-0.012562629766762257,
0.09423021227121353,
0.036090951412916183,
0.0015186886303126812,
-0.07671743631362915,
0.1566774547100067,
-0.09057160466909409,
-0.09620054811239243,
-0.052357107400894165,
0.10791704803705215,
0.0013356053968891501,
-0.011795931495726109,
-0.060727328062057495,
0.034138571470975876,
0.08319862186908722,
0.05953458696603775,
-0.00788439903408289,
0.03258901834487915,
-0.06455041468143463,
-0.02305104397237301,
0.026378018781542778,
-0.1114794909954071,
0.0498797744512558,
0.04513555020093918,
-0.0766843780875206,
-0.010874662548303604,
-0.011593121103942394,
-0.005367440637201071,
-0.031230654567480087,
0.11955691874027252,
-0.0568624809384346,
-0.017134033143520355,
-0.07421621680259705,
-0.09237245470285416,
0.01933680661022663,
-0.05092591792345047,
-0.006972816307097673,
-0.03509507328271866,
-0.0755171999335289,
-0.055336419492959976,
0.0721474438905716,
-0.08773519843816757,
-0.06590203940868378,
-0.06506765633821487,
-0.07392776757478714,
0.05690854415297508,
-0.012676794081926346,
0.1672700047492981,
-0.06899184733629227,
0.10178542882204056,
0.003811935195699334,
0.07161282002925873,
0.10332481563091278,
0.06601402163505554,
-0.025667661800980568,
0.04806382209062576,
-0.15635289251804352,
0.09336093813180923,
-0.10886199027299881,
0.02737072855234146,
-0.14595738053321838,
-0.1177339255809784,
0.012437171302735806,
-0.012858756817877293,
0.09155197441577911,
0.10355601459741592,
-0.17630049586296082,
-0.07944928109645844,
0.16015970706939697,
-0.05935999006032944,
-0.05750656872987747,
0.12639184296131134,
-0.0034450942184776068,
-0.08173701167106628,
0.01364877913147211,
0.18014028668403625,
0.11558209359645844,
-0.10465649515390396,
0.0258636474609375,
-0.03916841000318527,
0.09431231766939163,
0.03784100338816643,
0.08851591497659683,
-0.03213164582848549,
0.01864403858780861,
0.005747093353420496,
-0.028984401375055313,
0.05206282064318657,
-0.09483376145362854,
-0.0845029428601265,
-0.015388154424726963,
-0.08690223097801208,
0.018435847014188766,
0.05585990846157074,
0.014366020448505878,
-0.07803957164287567,
-0.1408282369375229,
-0.015206616371870041,
0.10370628535747528,
-0.1025032326579094,
0.014914145693182945,
-0.07959572970867157,
0.04273127019405365,
-0.00136687105987221,
0.004191636107861996,
-0.1508346050977707,
-0.02572690136730671,
0.0373956672847271,
-0.07912580668926239,
0.015243024565279484,
-0.01703687757253647,
0.07285024225711823,
0.05898039788007736,
-0.05843387544155121,
-0.053451456129550934,
-0.029981613159179688,
-0.0005431451718322933,
-0.05321763455867767,
-0.24829073250293732,
-0.06063608080148697,
-0.017463689669966698,
0.11950088292360306,
-0.19546708464622498,
-0.005130867473781109,
0.009505760855972767,
0.1062871590256691,
0.015317042358219624,
-0.057556916028261185,
0.0067534781992435455,
0.07056788355112076,
-0.009248172864317894,
-0.08137263357639313,
0.0394209586083889,
-0.01087295450270176,
-0.09308500587940216,
-0.003892136737704277,
-0.12488295137882233,
0.07878796756267548,
0.08454620093107224,
0.047100335359573364,
-0.09303056448698044,
-0.017544720321893692,
-0.06039084121584892,
-0.061199188232421875,
-0.016729922965168953,
0.041393253952264786,
0.1784866452217102,
0.020322374999523163,
0.09032922238111496,
-0.05505417659878731,
-0.051257021725177765,
0.03983549773693085,
0.02499673143029213,
-0.004433903377503157,
0.1588943600654602,
0.08472131192684174,
-0.03040558286011219,
0.0873672366142273,
0.05281677842140198,
-0.05540505051612854,
0.11064938455820084,
-0.06559780985116959,
-0.09643793106079102,
-0.04491405561566353,
0.01520730834454298,
0.032546184957027435,
0.1147979199886322,
-0.12804773449897766,
0.001790310605429113,
0.022178659215569496,
0.026831630617380142,
0.014585593715310097,
-0.18266832828521729,
-0.014595383778214455,
0.029792703688144684,
-0.09041447192430496,
-0.02908508852124214,
-0.011465982533991337,
-0.004178459290415049,
0.09733716398477554,
0.010962596163153648,
-0.06074533611536026,
-0.03238954395055771,
-0.027315450832247734,
-0.07195647060871124,
0.17312005162239075,
-0.10649093240499496,
-0.12859269976615906,
-0.09693026542663574,
-0.015674591064453125,
-0.01656954362988472,
-0.01806756481528282,
0.03416433557868004,
-0.10802824050188065,
-0.051601435989141464,
-0.06305265426635742,
0.035556040704250336,
-0.03626472130417824,
0.041519805788993835,
0.025757957249879837,
0.006891528144478798,
0.06608571112155914,
-0.09295368939638138,
0.020806055516004562,
-0.022632718086242676,
-0.04870061203837395,
0.028026899322867393,
0.06267334520816803,
0.09605301171541214,
0.1769210547208786,
0.03849821165204048,
0.03229113295674324,
-0.013883654028177261,
0.16069002449512482,
-0.10870067030191422,
0.009016416035592556,
0.08731616288423538,
0.018506791442632675,
0.04322023317217827,
0.15436333417892456,
0.05321677401661873,
-0.09025441855192184,
0.0074879880994558334,
0.05073830485343933,
-0.028257351368665695,
-0.2320399433374405,
-0.05448167398571968,
-0.05091191455721855,
-0.01398072112351656,
0.11123993247747421,
0.03806055709719658,
-0.05097490921616554,
0.003313750261440873,
0.009692511521279812,
-0.020374102517962456,
0.02252248302102089,
0.03781552240252495,
0.07318771630525589,
0.04457278549671173,
0.10708366334438324,
-0.019281012937426567,
-0.02620791271328926,
0.043712861835956573,
0.00202089617960155,
0.23092514276504517,
-0.026585832238197327,
0.1515386700630188,
0.04335152730345726,
0.1428251713514328,
0.002357055200263858,
0.03922758251428604,
0.006063040345907211,
-0.009093119762837887,
0.02340478077530861,
-0.042141810059547424,
-0.016171788796782494,
0.015427989885210991,
0.09812910854816437,
-0.00715153943747282,
-0.0940544381737709,
0.02236972190439701,
0.03422601521015167,
0.3009266257286072,
0.07420157641172409,
-0.26897305250167847,
-0.07192477583885193,
0.0009158822358585894,
-0.05961545929312706,
-0.025527682155370712,
0.0429861806333065,
0.12171454727649689,
-0.06417351961135864,
0.07781057804822922,
-0.06511335074901581,
0.07244770973920822,
-0.09089361876249313,
-0.00297488272190094,
0.1191580668091774,
0.11426755785942078,
0.009926973842084408,
0.05497979745268822,
-0.24731574952602386,
0.2645098865032196,
-0.016467560082674026,
0.06259509176015854,
-0.04513252526521683,
0.03186352923512459,
0.00009752627374837175,
-0.03447254002094269,
0.08653848618268967,
-0.0028904250357300043,
-0.09974342584609985,
-0.1672193855047226,
-0.1084480956196785,
0.02069324627518654,
0.13463392853736877,
-0.04463903233408928,
0.10218235850334167,
-0.03758090361952782,
-0.043390605598688126,
0.04232379421591759,
-0.12660428881645203,
-0.08346126228570938,
-0.11520218104124069,
0.029666809365153313,
-0.03263112157583237,
0.02876753732562065,
-0.06936533004045486,
-0.09426268935203552,
-0.08865419030189514,
0.16735237836837769,
-0.08708090335130692,
-0.021659687161445618,
-0.12905645370483398,
0.05929184705018997,
0.18283024430274963,
-0.06577032059431076,
0.04774423688650131,
0.020175259560346603,
0.0815303847193718,
0.04569804668426514,
-0.04685577377676964,
0.11585226655006409,
-0.07228069752454758,
-0.20218445360660553,
-0.05953390896320343,
0.15543992817401886,
0.047235824167728424,
0.07142528891563416,
-0.043016884475946426,
0.03287417069077492,
-0.0021519677247852087,
-0.09723682701587677,
0.08130601048469543,
0.047964420169591904,
0.01750962622463703,
0.03227691724896431,
-0.03229621425271034,
0.02166011743247509,
-0.05720260366797447,
-0.05649036541581154,
0.10849513113498688,
0.26616954803466797,
-0.10367998480796814,
0.0788557380437851,
0.04322325810790062,
-0.05226834863424301,
-0.17440110445022583,
-0.02594470977783203,
0.11275532841682434,
0.02316022850573063,
-0.0017402281519025564,
-0.2179841250181198,
0.039338596165180206,
0.09111566841602325,
-0.01732790842652321,
0.08656188100576401,
-0.37155354022979736,
-0.1328900307416916,
0.07441651076078415,
0.08302823454141617,
-0.03580643981695175,
-0.16398796439170837,
-0.055205825716257095,
0.004730965476483107,
-0.0922386422753334,
0.058545827865600586,
0.03238997980952263,
0.12864381074905396,
-0.02110089175403118,
0.016660818830132484,
0.019528035074472427,
-0.05383923649787903,
0.13569529354572296,
0.0033540388103574514,
0.04024743661284447,
-0.018322423100471497,
0.041622571647167206,
-0.011826783418655396,
-0.05390524864196777,
-0.0007410020334646106,
-0.07462171465158463,
0.03651195392012596,
-0.09747902303934097,
-0.02385615184903145,
-0.08681236952543259,
-0.0008977237157523632,
-0.03432483598589897,
-0.034052833914756775,
-0.029403725638985634,
0.05407027155160904,
0.09365296363830566,
-0.022348325699567795,
0.08515956252813339,
-0.031427234411239624,
0.11632730066776276,
0.09280026704072952,
0.07378551363945007,
0.007391475606709719,
-0.10513000935316086,
-0.00743955560028553,
0.0007005666266195476,
0.02865145169198513,
-0.12448158115148544,
0.03713028505444527,
0.15550534427165985,
0.044977132230997086,
0.12117257714271545,
0.05319216102361679,
-0.06144792214035988,
0.0004034526355098933,
0.07230211794376373,
-0.09920009225606918,
-0.16381962597370148,
0.008197021670639515,
-0.038876764476299286,
-0.10440065711736679,
-0.001688867574557662,
0.07588072866201401,
-0.030340440571308136,
-0.007327147759497166,
0.01673533394932747,
0.06482790410518646,
-0.029286077246069908,
0.24501995742321014,
0.013233632780611515,
0.07933172583580017,
-0.10929732024669647,
0.09834078699350357,
0.05016762763261795,
-0.14422036707401276,
0.030390016734600067,
0.10513386130332947,
-0.06351803988218307,
-0.009594671428203583,
0.07744330912828445,
0.06385409086942673,
0.032533757388591766,
-0.02142919786274433,
-0.1076708436012268,
-0.14297744631767273,
0.10264381021261215,
0.061468422412872314,
0.025061914697289467,
0.030823374167084694,
-0.03422961384057999,
0.03526155650615692,
-0.10935273766517639,
0.10692198574542999,
0.09399983286857605,
0.06231293827295303,
-0.1284976601600647,
0.10511147975921631,
0.008788865059614182,
0.006144794635474682,
-0.009202076122164726,
0.017180537804961205,
-0.1200597882270813,
0.011027061380445957,
-0.09358393400907516,
-0.024891065433621407,
-0.056636590510606766,
0.00972068216651678,
0.0030919990967959166,
-0.05646122246980667,
-0.048842523247003555,
0.019678102806210518,
-0.10384080559015274,
-0.05550370365381241,
-0.03154940903186798,
0.06059170514345169,
-0.10915971547365189,
-0.01491143461316824,
0.026812691241502762,
-0.11137153953313828,
0.10208720713853836,
0.03846249729394913,
0.02717260830104351,
0.027643674984574318,
-0.08544818311929703,
-0.002567353192716837,
0.032409343868494034,
-0.0006897735293023288,
0.03911951556801796,
-0.18174827098846436,
-0.005338575225323439,
-0.03631289303302765,
0.0049512386322021484,
-0.0007479509222321212,
0.036468468606472015,
-0.12859341502189636,
0.022926101461052895,
-0.06685075908899307,
-0.05210915952920914,
-0.05452004075050354,
0.04754200577735901,
0.08671417087316513,
0.027895305305719376,
0.14331623911857605,
-0.08837692439556122,
0.05759640783071518,
-0.21715058386325836,
-0.009373086504638195,
-0.013277309946715832,
-0.05512319877743721,
-0.04960798844695091,
-0.004228977952152491,
0.10535820573568344,
-0.052263420075178146,
0.11168516427278519,
-0.0233757346868515,
0.033922214061021805,
0.02441100776195526,
-0.07933463901281357,
-0.0011383831733837724,
0.057502083480358124,
0.16502638161182404,
0.03995382413268089,
-0.031749412417411804,
0.06735267490148544,
0.0014011558378115296,
0.0434248186647892,
0.12941449880599976,
0.1657746434211731,
0.1264907866716385,
0.05398554727435112,
0.04796602949500084,
0.06247594952583313,
-0.16479066014289856,
-0.17059847712516785,
0.16798456013202667,
-0.09403549134731293,
0.1484539806842804,
-0.026357892900705338,
0.20697996020317078,
0.061630766838788986,
-0.2028375118970871,
0.07272141426801682,
-0.03962358832359314,
-0.10476059466600418,
-0.10600815713405609,
-0.07937774062156677,
-0.08039525151252747,
-0.18543218076229095,
0.020505433902144432,
-0.09887391328811646,
0.08545331656932831,
0.06046439707279205,
0.05069505050778389,
0.0441245436668396,
0.09642977267503738,
0.05187873914837837,
0.020142411813139915,
0.07694606482982635,
0.04939291253685951,
-0.034809134900569916,
-0.010421297512948513,
-0.0727028176188469,
0.02227138727903366,
-0.028406711295247078,
0.06023168936371803,
-0.0352761410176754,
-0.10891562700271606,
0.072280153632164,
0.010031148791313171,
-0.08480335026979446,
0.025013089179992676,
-0.01962737925350666,
0.05326996371150017,
0.09589334577322006,
0.041658613830804825,
-0.017702845856547356,
-0.024416808038949966,
0.2402745485305786,
-0.09461177885532379,
-0.06592399626970291,
-0.12304878234863281,
0.2732031047344208,
-0.0005129066994413733,
0.004362833686172962,
0.035331759601831436,
-0.059652362018823624,
-0.03335849940776825,
0.15209072828292847,
0.17547371983528137,
-0.029461288824677467,
-0.026274021714925766,
0.022546999156475067,
-0.008922592736780643,
-0.01743367686867714,
0.10404233634471893,
0.12582863867282867,
0.07405202090740204,
-0.05486546829342842,
-0.038770899176597595,
-0.046554576605558395,
-0.05376826971769333,
-0.037523310631513596,
0.08614908903837204,
0.03389982134103775,
-0.011954160407185555,
-0.026441430673003197,
0.09846607595682144,
-0.07897283136844635,
-0.14057864248752594,
0.060372188687324524,
-0.18550674617290497,
-0.19662149250507355,
-0.03292994946241379,
0.07228443026542664,
0.007707824464887381,
0.04552292823791504,
0.012731441296637058,
-0.02853395789861679,
0.101543128490448,
0.008695941418409348,
-0.06344974786043167,
-0.0904678925871849,
0.05386866256594658,
-0.07269972562789917,
0.18769490718841553,
-0.02542051672935486,
0.019987402483820915,
0.11124308407306671,
0.05846165493130684,
-0.10489511489868164,
0.05290534719824791,
0.07746513187885284,
-0.1457371860742569,
0.046982187777757645,
0.18174487352371216,
-0.04381519556045532,
0.11560030281543732,
0.04037439450621605,
-0.04501301795244217,
0.013095674104988575,
-0.10848080366849899,
-0.04496502876281738,
-0.03941347822546959,
-0.017140787094831467,
-0.035666681826114655,
0.15170004963874817,
0.22492875158786774,
-0.0535806305706501,
0.004094380885362625,
-0.054887693375349045,
0.005916445516049862,
0.01393944676965475,
0.16744928061962128,
-0.03273686766624451,
-0.25648555159568787,
0.026252107694745064,
-0.013258266262710094,
0.04055996611714363,
-0.173325777053833,
-0.08320947736501694,
0.029206736013293266,
-0.04867954179644585,
-0.07764510065317154,
0.13923712074756622,
0.0567268431186676,
0.04915127903223038,
-0.058029480278491974,
-0.09960106760263443,
-0.022662555798888206,
0.17042776942253113,
-0.19349786639213562,
-0.05798094719648361
] |
null | null |
transformers
|
# Italian BERT Base Uncased on SQuAD-it
## Model description
This model is the uncased base version of Italian BERT (`dbmdz/bert-base-italian-uncased`) fine-tuned on the question answering task.
#### How to use
```python
from transformers import pipeline
nlp = pipeline('question-answering', model='antoniocappiello/bert-base-italian-uncased-squad-it')
# nlp(context="D'Annunzio nacque nel 1863", question="Quando nacque D'Annunzio?")
# {'score': 0.9990354180335999, 'start': 22, 'end': 25, 'answer': '1863'}
```
## Training data
It has been trained on the question answering task using [SQuAD-it](http://sag.art.uniroma2.it/demo-software/squadit/), a dataset derived from the original SQuAD through semi-automatic translation into Italian.
## Training procedure
```bash
python ./examples/run_squad.py \
--model_type bert \
--model_name_or_path dbmdz/bert-base-italian-uncased \
--do_train \
--do_eval \
--train_file ./squad_it_uncased/train-v1.1.json \
--predict_file ./squad_it_uncased/dev-v1.1.json \
--learning_rate 3e-5 \
--num_train_epochs 2 \
--max_seq_length 384 \
--doc_stride 128 \
--output_dir ./models/bert-base-italian-uncased-squad-it/ \
--per_gpu_eval_batch_size=3 \
--per_gpu_train_batch_size=3 \
  --do_lower_case
```
## Eval Results
| Metric | # Value |
| ------ | --------- |
| **EM** | **63.8** |
| **F1** | **75.30** |
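EM and F1 above follow the standard SQuAD evaluation protocol. As a small illustration (with a toy prediction/reference pair invented for the example), scores of this kind can be computed with the `squad` metric from 🤗 Datasets:

```python
# Toy sketch of SQuAD-style EM/F1 scoring; the example pair is invented.
from datasets import load_metric

squad_metric = load_metric("squad")

predictions = [{"id": "0", "prediction_text": "1863"}]
references = [{"id": "0", "answers": {"text": ["1863"], "answer_start": [22]}}]

print(squad_metric.compute(predictions=predictions, references=references))
# e.g. {'exact_match': 100.0, 'f1': 100.0}
```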
## Comparison
| Model | EM | F1 score |
| -------------------------------------------------------------------------------------------------------------------------------- | --------- | --------- |
| [DrQA-it trained on SQuAD-it](https://github.com/crux82/squad-it/blob/master/README.md#evaluating-a-neural-model-over-squad-it) | 56.1 | 65.9 |
| This one | **63.8** | **75.30** |
|
{"language": "it", "widget": [{"text": "Quando nacque D'Annunzio?", "context": "D'Annunzio nacque nel 1863"}]}
|
question-answering
|
antoniocappiello/bert-base-italian-uncased-squad-it
|
[
"transformers",
"pytorch",
"question-answering",
"it",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"it"
] |
TAGS
#transformers #pytorch #question-answering #it #endpoints_compatible #has_space #region-us
|
Italian Bert Base Uncased on Squad-it
=====================================
Model description
-----------------
This model is the uncased base version of the italian BERT (which you may find at 'dbmdz/bert-base-italian-uncased') trained on the question answering task.
#### How to use
Training data
-------------
It has been trained on the question answering task using SQuAD-it, derived from the original SQuAD dataset and obtained through the semi-automatic translation of the SQuAD dataset in Italian.
Training procedure
------------------
Eval Results
------------
Comparison
----------
Model: DrQA-it trained on SQuAD-it, EM: 56.1, F1 score: 65.9
Model: This one, EM: 63.8, F1 score: 75.30
|
[
"#### How to use\n\n\nTraining data\n-------------\n\n\nIt has been trained on the question answering task using SQuAD-it, derived from the original SQuAD dataset and obtained through the semi-automatic translation of the SQuAD dataset in Italian.\n\n\nTraining procedure\n------------------\n\n\nEval Results\n------------\n\n\n\nComparison\n----------\n\n\nModel: DrQA-it trained on SQuAD-it, EM: 56.1, F1 score: 65.9\nModel: This one, EM: 63.8, F1 score: 75.30"
] |
[
"TAGS\n#transformers #pytorch #question-answering #it #endpoints_compatible #has_space #region-us \n",
"#### How to use\n\n\nTraining data\n-------------\n\n\nIt has been trained on the question answering task using SQuAD-it, derived from the original SQuAD dataset and obtained through the semi-automatic translation of the SQuAD dataset in Italian.\n\n\nTraining procedure\n------------------\n\n\nEval Results\n------------\n\n\n\nComparison\n----------\n\n\nModel: DrQA-it trained on SQuAD-it, EM: 56.1, F1 score: 65.9\nModel: This one, EM: 63.8, F1 score: 75.30"
] |
[
33,
113
] |
[
"passage: TAGS\n#transformers #pytorch #question-answering #it #endpoints_compatible #has_space #region-us \n#### How to use\n\n\nTraining data\n-------------\n\n\nIt has been trained on the question answering task using SQuAD-it, derived from the original SQuAD dataset and obtained through the semi-automatic translation of the SQuAD dataset in Italian.\n\n\nTraining procedure\n------------------\n\n\nEval Results\n------------\n\n\n\nComparison\n----------\n\n\nModel: DrQA-it trained on SQuAD-it, EM: 56.1, F1 score: 65.9\nModel: This one, EM: 63.8, F1 score: 75.30"
] |
[
-0.1093226969242096,
0.0031429894734174013,
0.0005049300962127745,
0.07040009647607803,
0.13437451422214508,
0.08587895333766937,
0.03698419779539108,
0.1214599460363388,
0.0022225650027394295,
0.022817391902208328,
0.10599853843450546,
0.08414679020643234,
0.03438630327582359,
0.0931738093495369,
-0.04800209775567055,
-0.14116422832012177,
0.025723718106746674,
0.0407346673309803,
-0.12926998734474182,
0.1383921504020691,
0.10831847041845322,
-0.1128004863858223,
0.012041643261909485,
-0.032799869775772095,
-0.16503570973873138,
0.04130074009299278,
-0.02817758172750473,
-0.05580112338066101,
0.12839630246162415,
0.005730492994189262,
0.21051321923732758,
0.023198990151286125,
0.0742359459400177,
-0.1733413189649582,
0.03300613537430763,
0.017534108832478523,
-0.030115947127342224,
0.07301439344882965,
0.0002961471618618816,
-0.013415198773145676,
0.06529422104358673,
-0.0030174183193594217,
0.024768739938735962,
0.004116428084671497,
-0.11251001060009003,
-0.1554720103740692,
-0.061728451400995255,
-0.013776990585029125,
0.11959286779165268,
0.14645987749099731,
-0.031573984771966934,
0.10725072026252747,
-0.2266661524772644,
0.060049280524253845,
0.08298086374998093,
-0.2591642737388611,
-0.046214133501052856,
0.13995808362960815,
-0.00923971924930811,
0.0009653830784372985,
-0.1020180881023407,
0.039122529327869415,
0.036800384521484375,
0.05141964182257652,
-0.021116185933351517,
-0.022687342017889023,
-0.07083921879529953,
0.0756046399474144,
-0.13852836191654205,
-0.07987944781780243,
0.28149527311325073,
0.015280336141586304,
-0.03665696457028389,
0.007028771098703146,
0.007470614742487669,
-0.08109667152166367,
-0.0026507002767175436,
-0.1101861447095871,
-0.007336189970374107,
-0.0044352891854941845,
-0.0548454225063324,
-0.024162938818335533,
-0.14440728724002838,
-0.17530333995819092,
-0.12228576838970184,
0.15127165615558624,
0.03504783660173416,
0.025102294981479645,
-0.10847970098257065,
0.13494566082954407,
-0.018688613548874855,
-0.04273901507258415,
-0.05625537410378456,
-0.034232594072818756,
-0.11513391137123108,
-0.04079378768801689,
-0.09477660059928894,
-0.035420287400484085,
0.028995834290981293,
0.10770158469676971,
-0.009582140482962132,
0.012952563352882862,
0.05652725324034691,
0.0614561028778553,
0.0005787643021903932,
0.1683548539876938,
-0.09826862812042236,
-0.023128706961870193,
-0.08060971647500992,
-0.0007177299703471363,
-0.09772910177707672,
-0.009016117081046104,
-0.12719540297985077,
-0.09727304428815842,
0.09973429888486862,
0.045665379613637924,
0.004002148751169443,
0.027528388425707817,
-0.04338992014527321,
-0.0293102003633976,
-0.13858619332313538,
-0.07577227801084518,
0.02476184256374836,
0.001172723714262247,
-0.12130195647478104,
-0.0054966481402516365,
0.00002056698031083215,
0.010867896489799023,
-0.014840351417660713,
0.04612109810113907,
-0.041142236441373825,
0.007814832031726837,
-0.1149996966123581,
-0.10084235668182373,
-0.007090344559401274,
-0.08407971262931824,
0.018183734267950058,
-0.0982561782002449,
-0.12076210230588913,
-0.03874248266220093,
0.042905520647764206,
-0.06175461411476135,
0.053683090955019,
-0.04718044772744179,
-0.0400511734187603,
0.03562093898653984,
-0.026086099445819855,
0.1256413608789444,
-0.045992814004421234,
0.07286117970943451,
-0.0035562312696129084,
0.11788627505302429,
-0.06562423706054688,
0.06764617562294006,
-0.02456522546708584,
-0.005922835785895586,
-0.06008139252662659,
0.09150359779596329,
-0.07760855555534363,
-0.03922726958990097,
-0.07527525722980499,
-0.092926025390625,
-0.08626488596200943,
0.024107493460178375,
0.13049757480621338,
0.1669807732105255,
-0.18004903197288513,
0.0329115130007267,
0.1131725162267685,
-0.060588087886571884,
-0.1282729059457779,
0.12149696052074432,
-0.0663393884897232,
0.07408326864242554,
0.011162144131958485,
0.19233353435993195,
0.03842023015022278,
-0.11050183326005936,
-0.04013853520154953,
0.09907087683677673,
-0.057873763144016266,
-0.0293539110571146,
0.08894326537847519,
0.022716909646987915,
-0.07092604041099548,
0.0024963365867733955,
-0.12352745980024338,
0.03174464404582977,
-0.1251204013824463,
-0.09362497180700302,
-0.02525925077497959,
-0.05576108768582344,
0.0604688785970211,
0.042863789945840836,
0.11060693114995956,
-0.0326305590569973,
-0.04261362925171852,
-0.030740607529878616,
0.09522830694913864,
0.0020327975507825613,
-0.0037580064963549376,
-0.09734039008617401,
0.1381198763847351,
-0.10782889276742935,
-0.005929024424403906,
-0.16619272530078888,
-0.06625514477491379,
-0.04379523918032646,
0.1187211200594902,
-0.030438095331192017,
0.13332007825374603,
0.07225383818149567,
-0.020844360813498497,
-0.02902965620160103,
-0.03996637463569641,
-0.08823787420988083,
0.0018286977428942919,
-0.08499214798212051,
-0.08730776607990265,
-0.05849892273545265,
-0.05343702435493469,
0.13142408430576324,
-0.12729068100452423,
0.003910587634891272,
-0.01062085572630167,
0.04118328168988228,
-0.027280844748020172,
0.04517335444688797,
-0.02576049044728279,
0.03228263184428215,
-0.030581122264266014,
0.0044081006199121475,
0.0633668303489685,
-0.0005833149771206081,
-0.09635354578495026,
-0.010388761758804321,
0.013622498139739037,
0.1724178045988083,
0.14113134145736694,
-0.14868992567062378,
-0.10445256531238556,
-0.06329129636287689,
-0.04293477535247803,
0.025551754981279373,
-0.03709596395492554,
-0.006088686175644398,
0.10987164825201035,
-0.020494369789958,
0.09805835038423538,
-0.05066937208175659,
0.024730579927563667,
-0.01649809628725052,
-0.05536329373717308,
0.000014911691323504783,
0.09748508781194687,
0.0793832540512085,
-0.08155795186758041,
0.1026335135102272,
0.16616269946098328,
-0.06379321217536926,
0.1591462641954422,
-0.010436911135911942,
-0.0926157757639885,
-0.06851913779973984,
0.013278018683195114,
-0.044120341539382935,
0.14363183081150055,
-0.1890934556722641,
-0.01906951330602169,
0.03397541865706444,
0.09621639549732208,
0.09097670763731003,
-0.19227296113967896,
-0.08297812193632126,
0.04711585491895676,
-0.0376468300819397,
-0.1414746642112732,
0.026681432500481606,
0.0017435100162401795,
0.11990849673748016,
0.028416277840733528,
-0.05371091142296791,
0.04306803271174431,
-0.022395793348550797,
-0.06067350506782532,
0.2009996920824051,
-0.06824751198291779,
-0.15342625975608826,
-0.06552798300981522,
0.038317322731018066,
0.018065033480525017,
0.015003830194473267,
0.04436836764216423,
-0.12600409984588623,
0.026601174846291542,
0.020594481378793716,
0.06173674762248993,
-0.11998485773801804,
-0.0030691996216773987,
-0.05635804310441017,
-0.0051735700108110905,
-0.024240106344223022,
-0.10264217108488083,
-0.007757762912660837,
-0.08782240003347397,
-0.08082418143749237,
0.07528311759233475,
-0.09164149314165115,
0.13830998539924622,
0.08077383786439896,
-0.051395315676927567,
0.08747777342796326,
-0.034485869109630585,
0.20772677659988403,
-0.1269194483757019,
-0.04023841395974159,
0.05882830172777176,
0.05532628297805786,
-0.022054284811019897,
0.10254953056573868,
0.01932128518819809,
-0.07240667939186096,
0.0069526019506156445,
0.02028607949614525,
-0.06019314378499985,
-0.2385014295578003,
-0.05163724347949028,
-0.08009234070777893,
-0.030278297141194344,
-0.021359028294682503,
0.025774432346224785,
0.0476832278072834,
0.0741809830069542,
0.059578992426395416,
-0.0967211201786995,
-0.07439368963241577,
0.0028730081394314766,
0.013432302512228489,
-0.01981966942548752,
0.13195998966693878,
-0.0543096661567688,
-0.05378670245409012,
0.07453183829784393,
0.06529506295919418,
0.26452040672302246,
0.015036025084555149,
-0.007368396036326885,
0.06734681129455566,
0.2356017529964447,
0.0161306019872427,
0.10795186460018158,
0.020279299467802048,
-0.06467810273170471,
-0.019933128729462624,
0.031803615391254425,
-0.022726239636540413,
0.04060959443449974,
0.1118561401963234,
-0.05977781116962433,
-0.10866891592741013,
0.022061944007873535,
0.057354386895895004,
0.16994866728782654,
0.066217802464962,
-0.19253265857696533,
-0.03978985920548439,
-0.01685815490782261,
0.04091129079461098,
-0.04506887495517731,
0.04599850997328758,
0.14660482108592987,
-0.11256249994039536,
-0.07712464034557343,
0.003735880134627223,
0.10936781764030457,
0.019132692366838455,
0.0030968384817242622,
-0.03698865696787834,
-0.05195657163858414,
-0.0051954020746052265,
0.13483919203281403,
-0.3474605083465576,
0.28439363837242126,
-0.004948095418512821,
0.06957485526800156,
-0.04237766191363335,
-0.05931926146149635,
0.047115471214056015,
0.09538102149963379,
0.1769237071275711,
0.0011026161955669522,
-0.057897891849279404,
-0.18069219589233398,
-0.009233738295733929,
0.021817941218614578,
0.09429451078176498,
-0.012460894882678986,
0.07320491969585419,
-0.020227918401360512,
0.055659763514995575,
0.04783089831471443,
0.20102085173130035,
-0.06401896476745605,
-0.07239434123039246,
-0.02327021211385727,
0.05975555256009102,
0.049935828894376755,
-0.02801661193370819,
-0.08497808128595352,
-0.04650780186057091,
0.03542288392782211,
-0.05548692122101784,
0.025631699711084366,
-0.12304306030273438,
0.10786038637161255,
0.053729381412267685,
-0.053839024156332016,
0.0031061279587447643,
-0.00003307620863779448,
-0.0118922283872962,
0.037189967930316925,
-0.004198702983558178,
0.09932580590248108,
-0.09048757702112198,
0.013040918856859207,
-0.023141665384173393,
0.027411457151174545,
0.01567087136209011,
0.07384485751390457,
0.06623876094818115,
-0.009679856710135937,
-0.08349839597940445,
-0.10794686526060104,
0.11220826208591461,
-0.09378669410943985,
0.07692443579435349,
0.11970998346805573,
-0.06153258681297302,
-0.000586880836635828,
-0.02737506292760372,
0.0006072560208849609,
0.298289954662323,
0.10454961657524109,
-0.1287713199853897,
-0.03409286215901375,
0.06629382818937302,
-0.06882983446121216,
-0.2666914463043213,
0.06388834118843079,
0.05196349322795868,
0.07959897816181183,
-0.035507313907146454,
-0.1614494025707245,
0.017093922942876816,
0.044408880174160004,
-0.02055238001048565,
-0.015953117981553078,
-0.25864818692207336,
-0.0484112910926342,
0.13229583203792572,
0.07010912895202637,
0.24458114802837372,
-0.09803961217403412,
0.005861416459083557,
0.048594970256090164,
-0.12180587649345398,
0.09041125327348709,
-0.13992001116275787,
0.13116705417633057,
-0.027372855693101883,
0.0355994813144207,
0.011169427074491978,
-0.0793825015425682,
0.15815918147563934,
0.05803694576025009,
0.1111985296010971,
-0.04154829680919647,
-0.08010726422071457,
0.046821437776088715,
-0.023341741412878036,
0.1114281564950943,
0.07286996394395828,
0.04391659423708916,
-0.18623894453048706,
-0.035376328974962234,
-0.09413604438304901,
0.032473497092723846,
-0.018546883016824722,
-0.049972038716077805,
-0.011073580943048,
0.05621969699859619,
0.01881733536720276,
0.008738214150071144,
0.006563983391970396,
-0.07206575572490692,
0.09724166244268417,
0.033193305134773254,
0.05835495516657829,
-0.008948421105742455,
-0.06644520908594131,
0.04309087246656418,
0.0029160864651203156,
0.09082317352294922,
-0.07967420667409897,
0.0411197803914547,
0.1370646357536316,
0.02692991867661476,
0.1433863490819931,
0.10873731225728989,
-0.012877615168690681,
0.05991575866937637,
0.08408761024475098,
-0.08441658318042755,
-0.16710352897644043,
-0.01634388603270054,
-0.07591719180345535,
-0.02703067846596241,
0.04078815504908562,
0.03533702716231346,
-0.00954199768602848,
0.023181920871138573,
-0.03743370249867439,
-0.05358732119202614,
-0.09863553941249847,
0.1657668799161911,
0.10831030458211899,
0.039701178669929504,
-0.07521935552358627,
0.06226136535406113,
0.003199162660166621,
-0.0852799192070961,
0.05032826587557793,
-0.021738702431321144,
-0.08310957252979279,
-0.08732451498508453,
-0.04638352990150452,
0.20896591246128082,
-0.08268986642360687,
-0.10192413628101349,
-0.08654671907424927,
-0.07796650379896164,
0.05730392411351204,
0.03216295316815376,
0.10626747459173203,
-0.004807050805538893,
-0.04796872287988663,
-0.005844666622579098,
-0.08297163248062134,
0.04274792969226837,
0.029407793655991554,
-0.00982606690376997,
-0.07618337869644165,
0.08774378895759583,
-0.00026086787693202496,
0.12296507507562637,
-0.056302476674318314,
-0.07382068783044815,
-0.09632302820682526,
0.07456129789352417,
-0.16807126998901367,
-0.046473708003759384,
-0.00512186111882329,
-0.028544088825583458,
0.0008919549873098731,
-0.09436137974262238,
-0.08439642935991287,
0.01082345750182867,
-0.13746999204158783,
-0.0058007738552987576,
0.01940179616212845,
0.005619367118924856,
-0.10368667542934418,
-0.0437312014400959,
0.0189658235758543,
-0.01339796744287014,
0.07277774065732956,
0.12938664853572845,
0.018741978332400322,
0.009921138174831867,
-0.08507755398750305,
-0.0419745035469532,
-0.0428469181060791,
0.00490451417863369,
0.08671826124191284,
-0.11019152402877808,
0.037837326526641846,
0.01185236219316721,
0.01366367656737566,
0.04732365161180496,
0.03918624296784401,
-0.07155387103557587,
-0.015559453517198563,
-0.08609029650688171,
-0.05314579978585243,
-0.05504095181822777,
0.0738053247332573,
0.10409650951623917,
0.09640676528215408,
0.12653256952762604,
-0.00815546978265047,
0.06652285903692245,
-0.12012830376625061,
0.0022023734636604786,
-0.0319710373878479,
-0.02398708462715149,
-0.07688852399587631,
-0.0964292362332344,
0.062032584100961685,
-0.051102519035339355,
0.10289710015058517,
0.051468830555677414,
0.06516918540000916,
-0.0028699920512735844,
0.0005293710855767131,
0.03665800392627716,
-0.02803868055343628,
0.15305906534194946,
0.03730543330311775,
-0.03324958309531212,
0.04200552776455879,
0.12337515503168106,
-0.0141025735065341,
0.12846535444259644,
0.08790052682161331,
0.18178865313529968,
0.04374784976243973,
0.07404665648937225,
0.0019180004019290209,
-0.08126668632030487,
-0.09210442751646042,
-0.015109898522496223,
-0.04359442740678787,
-0.01898445002734661,
-0.03308772295713425,
0.04229773208498955,
0.09509828686714172,
-0.1482240855693817,
0.06307106465101242,
-0.07797311991453171,
-0.08521965891122818,
-0.083580382168293,
-0.018070537596940994,
-0.04778698459267616,
-0.1505744904279709,
0.016838017851114273,
-0.11884568631649017,
-0.007834605872631073,
0.2052460014820099,
0.07511794567108154,
-0.019730525091290474,
0.19884218275547028,
-0.044413406401872635,
-0.023187773302197456,
0.07835760712623596,
-0.045884404331445694,
0.0734776109457016,
-0.11891307681798935,
0.002226690761744976,
-0.0016607126453891397,
-0.07892590761184692,
0.05589864030480385,
-0.003116501960903406,
-0.0492168627679348,
-0.014824140816926956,
-0.043022122234106064,
-0.03252222016453743,
-0.04219197481870651,
0.042330335825681686,
0.06848664581775665,
0.14407122135162354,
-0.0069391424767673016,
-0.006309401243925095,
-0.01204720139503479,
0.2941392958164215,
-0.05136127024888992,
-0.05430328845977783,
-0.17734302580356598,
0.1067904382944107,
0.07669054716825485,
0.058479055762290955,
-0.011031074449419975,
-0.06379356235265732,
0.0008923099376261234,
0.24692825973033905,
0.1564764529466629,
-0.10749945044517517,
-0.027083968743681908,
0.06512197107076645,
0.01011032983660698,
0.05494721233844757,
0.07042752206325531,
0.11012975126504898,
0.09782472252845764,
-0.12247610837221146,
0.001884732278995216,
-0.13816048204898834,
-0.023535985499620438,
0.08180694282054901,
0.14205724000930786,
0.09410178661346436,
-0.035073067992925644,
-0.09676234424114227,
0.09794874489307404,
-0.046456772834062576,
-0.08831106126308441,
0.0678718090057373,
-0.16327142715454102,
-0.14172598719596863,
-0.011412840336561203,
0.07626990228891373,
0.021388422697782516,
0.0905999019742012,
-0.07883064448833466,
-0.005750351585447788,
0.028873056173324585,
0.011126975528895855,
-0.07666828483343124,
-0.12054771184921265,
0.13145901262760162,
-0.008115793578326702,
0.14424943923950195,
-0.030461590737104416,
0.10870052129030228,
0.11078886687755585,
0.04350879415869713,
-0.06852828711271286,
0.06425125896930695,
0.045868489891290665,
0.00142399943433702,
-0.0023268244694918394,
0.034748684614896774,
-0.017955850809812546,
0.012419069185853004,
0.034282803535461426,
-0.20814938843250275,
0.05662461742758751,
-0.07615004479885101,
-0.07649706304073334,
-0.11927298456430435,
0.06558355689048767,
-0.042308058589696884,
0.12751899659633636,
0.18205823004245758,
-0.02234797365963459,
0.031225893646478653,
-0.07138221710920334,
0.058963075280189514,
0.024115582928061485,
0.02550475113093853,
-0.07470913231372833,
-0.11330600827932358,
0.022503923624753952,
0.03107135370373726,
-0.042421605437994,
-0.1623310148715973,
-0.05389619991183281,
0.056652482599020004,
-0.06865871697664261,
0.03719908371567726,
0.03156762570142746,
0.14604593813419342,
0.08370504528284073,
-0.04320329800248146,
-0.016759861260652542,
-0.030830292031168938,
0.10952041298151016,
-0.08513861894607544,
-0.10212692618370056
] |
null | null |
transformers
|
# Question answering model for Estonian
This is a question answering model based on the XLM-Roberta base model. It is fine-tuned sequentially on:
1. English SQuAD v1.1
2. SQuAD v1.1 translated into Estonian
3. Small native Estonian dataset (800 samples)
The model retains good multilingual properties and can be used for extractive QA in any language covered by XLM-Roberta. Performance is best in the fine-tuning languages, Estonian and English.
| Tested on | F1 | EM |
| ----------- | --- | --- |
| EstQA test set | 82.4 | 75.3 |
| SQuAD v1.1 dev set | 86.9 | 77.9 |
The Estonian dataset used for fine-tuning and validation is available at https://huggingface.co/datasets/anukaver/EstQA/ (version 1.0).
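## Usage
A minimal usage sketch with the 🤗 `pipeline` API (the question and context below are made-up examples, not taken from EstQA):
```python
from transformers import pipeline

# Load the extractive QA pipeline with this model's checkpoint
qa = pipeline("question-answering", model="anukaver/xlm-roberta-est-qa")

# Hypothetical Estonian example: "What is the capital of Estonia?"
result = qa(
    question="Mis on Eesti pealinn?",
    context="Tallinn on Eesti pealinn ja suurim linn.",
)
print(result["answer"])  # expected: "Tallinn"
```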
|
{"tags": ["question-answering"], "datasets": ["squad", "anukaver/EstQA"]}
|
question-answering
|
anukaver/xlm-roberta-est-qa
|
[
"transformers",
"pytorch",
"xlm-roberta",
"question-answering",
"dataset:squad",
"dataset:anukaver/EstQA",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #xlm-roberta #question-answering #dataset-squad #dataset-anukaver/EstQA #endpoints_compatible #region-us
|
Question answering model for Estonian
=====================================
This is a question answering model based on the XLM-Roberta base model. It is fine-tuned sequentially on:
1. English SQuAD v1.1
2. SQuAD v1.1 translated into Estonian
3. Small native Estonian dataset (800 samples)
The model retains good multilingual properties and can be used for extractive QA in any language covered by XLM-Roberta. Performance is best in the fine-tuning languages, Estonian and English.
Tested on: EstQA test set, F1: 82.4, EM: 75.3
Tested on: SQuAD v1.1 dev set, F1: 86.9, EM: 77.9
The Estonian dataset used for fine-tuning and validation is available at URL (version 1.0).
|
[] |
[
"TAGS\n#transformers #pytorch #xlm-roberta #question-answering #dataset-squad #dataset-anukaver/EstQA #endpoints_compatible #region-us \n"
] |
[
49
] |
[
"passage: TAGS\n#transformers #pytorch #xlm-roberta #question-answering #dataset-squad #dataset-anukaver/EstQA #endpoints_compatible #region-us \n"
] |
[
-0.12480838596820831,
0.09146685153245926,
-0.007749066222459078,
-0.0004266511823516339,
0.11852368712425232,
0.07320306450128555,
0.054531168192625046,
0.10624785721302032,
0.09028752893209457,
-0.003647019388154149,
0.13094070553779602,
0.2226932942867279,
-0.006166793406009674,
0.055633094161748886,
-0.10539251565933228,
-0.12235067784786224,
0.04467181861400604,
0.10080049932003021,
-0.07800672948360443,
0.12052025645971298,
0.07697542011737823,
-0.12065239995718002,
0.06855230033397675,
-0.039916615933179855,
-0.11820831894874573,
0.047901999205350876,
0.006098079960793257,
-0.06458276510238647,
0.11186202615499496,
0.05637618899345398,
0.1319209188222885,
0.04874057695269585,
-0.08662239462137222,
-0.19989855587482452,
0.04670624062418938,
-0.012469409964978695,
-0.058416157960891724,
0.044550031423568726,
0.02430585026741028,
-0.0632789134979248,
-0.007305409759283066,
-0.029106823727488518,
-0.0170159712433815,
0.054797254502773285,
-0.16723918914794922,
-0.1345091015100479,
-0.0879276916384697,
-0.04587606340646744,
0.07914812117815018,
0.07664043456315994,
-0.022237339988350868,
0.22845527529716492,
-0.15247996151447296,
0.1045033261179924,
0.14397543668746948,
-0.28379330039024353,
-0.026580194011330605,
0.044136177748441696,
0.10736719518899918,
0.04753099009394646,
-0.021016571670770645,
0.060677338391542435,
0.04873933270573616,
0.025869295001029968,
-0.07266708463430405,
-0.10242343693971634,
-0.13513219356536865,
0.09879238158464432,
-0.08682142943143845,
-0.06362666934728622,
0.34268155694007874,
0.02514377422630787,
0.05827701836824417,
-0.002507016295567155,
-0.061626896262168884,
0.05277455970644951,
0.002460239455103874,
-0.019876016303896904,
-0.01850719191133976,
-0.009043176658451557,
-0.030544353649020195,
-0.004069629590958357,
-0.11170607060194016,
-0.0005569480708800256,
-0.2465575635433197,
0.12928035855293274,
0.01692262850701809,
0.08795752376317978,
-0.22947998344898224,
0.013023201376199722,
0.005779660306870937,
-0.060217101126909256,
-0.022143078967928886,
-0.08634530007839203,
-0.11268066614866257,
-0.020638126879930496,
-0.024495819583535194,
-0.0013553804019466043,
0.12802599370479584,
0.14289453625679016,
-0.03195595741271973,
0.013119278475642204,
0.015860212966799736,
0.06616420298814774,
0.11516116559505463,
0.08361920714378357,
-0.0743291974067688,
-0.06304554641246796,
-0.003865570994094014,
-0.1255851536989212,
-0.05164355784654617,
-0.023120107129216194,
-0.05102432519197464,
-0.05860016122460365,
-0.001192991971038282,
0.13959696888923645,
0.1007039025425911,
0.01420558150857687,
-0.07740041613578796,
-0.03919677063822746,
0.00687195872887969,
-0.033210523426532745,
-0.003425422590225935,
-0.0007783149485476315,
-0.0019097994081676006,
0.12582358717918396,
-0.06698063760995865,
0.03953368216753006,
-0.030297748744487762,
-0.036505408585071564,
-0.08751246333122253,
-0.022667039185762405,
0.01442453358322382,
-0.04992019012570381,
0.06976324319839478,
-0.1366882026195526,
0.07339698076248169,
-0.12066880613565445,
-0.1040419489145279,
0.004956747405230999,
0.01749400980770588,
-0.01948815956711769,
-0.0020832892041653395,
-0.02186678908765316,
-0.008328523486852646,
-0.02079739421606064,
-0.04867744445800781,
0.007340279407799244,
-0.07600585371255875,
0.08890195190906525,
0.004121416248381138,
0.06320659071207047,
-0.07634365558624268,
0.059688325971364975,
-0.08590321242809296,
0.02657642960548401,
-0.03326595947146416,
0.04596371203660965,
-0.040801312774419785,
0.09982042759656906,
-0.04587678983807564,
-0.054990414530038834,
-0.017636677250266075,
0.02679314836859703,
-0.030498197302222252,
0.18331830203533173,
-0.13994885981082916,
-0.06528033316135406,
0.1840183138847351,
-0.05088476464152336,
-0.2543193995952606,
0.07687553018331528,
-0.017454931512475014,
-0.003172286320477724,
0.0772525891661644,
0.16804111003875732,
0.030978461727499962,
-0.07908951491117477,
-0.03541572019457817,
0.07730978727340698,
-0.13320539891719818,
-0.14948812127113342,
0.053929898887872696,
0.019191008061170578,
-0.01699244976043701,
0.03551729768514633,
0.025476116687059402,
0.05418261140584946,
-0.11600004881620407,
-0.07702597975730896,
-0.038761578500270844,
-0.03968140855431557,
0.012948849238455296,
0.07070618122816086,
0.02957743965089321,
-0.05591598525643349,
0.04809520021080971,
-0.007179593201726675,
0.02441423200070858,
0.0217587873339653,
-0.004525118041783571,
-0.09791102260351181,
0.13056066632270813,
-0.1721130609512329,
-0.0028657333459705114,
-0.20043708384037018,
-0.16534192860126495,
-0.04029713571071625,
0.11465124785900116,
-0.05370436981320381,
0.19153353571891785,
0.07360009849071503,
-0.1231403723359108,
-0.0013396823778748512,
-0.026424968615174294,
0.1238512173295021,
0.0205394234508276,
-0.05957770720124245,
-0.09061634540557861,
0.034018173813819885,
-0.0934039056301117,
0.0021208624821156263,
-0.04008942097425461,
-0.03981315344572067,
0.02268623374402523,
0.1214025542140007,
0.009946501813828945,
0.0553637333214283,
0.03293583542108536,
0.045069754123687744,
-0.023265812546014786,
0.016027214005589485,
0.10065165907144547,
-0.022336017340421677,
-0.08689237385988235,
0.09599226713180542,
-0.017324266955256462,
0.21148943901062012,
0.1409028172492981,
-0.19507808983325958,
0.028771843761205673,
-0.05109678953886032,
-0.052967872470617294,
0.019304288551211357,
0.010451956652104855,
0.015478799119591713,
0.03230362385511398,
0.024473724886775017,
0.07730947434902191,
-0.051744524389505386,
-0.045837968587875366,
0.019835276529192924,
-0.040934283286333084,
-0.05999917909502983,
0.11607789248228073,
0.13518063724040985,
-0.1911778301000595,
0.157151460647583,
0.19627155363559723,
0.10496794432401657,
0.10578355193138123,
-0.06717219948768616,
-0.04522975906729698,
-0.0289436187595129,
-0.020634610205888748,
-0.059386998414993286,
0.09283578395843506,
-0.2157677561044693,
0.027348477393388748,
0.10107598453760147,
-0.006131735164672136,
0.06919534504413605,
-0.12156819552183151,
-0.10463539510965347,
-0.006636247504502535,
-0.005104709416627884,
-0.13009996712207794,
0.12405994534492493,
0.06430041790008545,
0.10707249492406845,
-0.02158665843307972,
-0.02482227422297001,
0.08755110949277878,
-0.01677239127457142,
-0.06756820529699326,
0.18038637936115265,
-0.09842990338802338,
-0.2993029057979584,
-0.00718895997852087,
-0.056577399373054504,
-0.037903349846601486,
-0.029436537995934486,
0.06627720594406128,
-0.1090439185500145,
-0.01147229690104723,
0.07972955703735352,
0.007795940153300762,
-0.0935133844614029,
0.007466775830835104,
-0.021393222734332085,
0.05997015908360481,
-0.06533781439065933,
-0.07870226353406906,
-0.03783828765153885,
-0.06661715358495712,
-0.025786638259887695,
0.14120526611804962,
-0.10138633102178574,
0.13256682455539703,
0.09089639037847519,
0.04696483165025711,
0.0649348646402359,
-0.0323849655687809,
0.27021506428718567,
-0.11573877930641174,
0.00043673504842445254,
0.1535256952047348,
0.03076031431555748,
0.05250925570726395,
0.13363949954509735,
0.012382258661091328,
-0.09113737940788269,
-0.023583630099892616,
-0.006785445846617222,
-0.0649716705083847,
-0.3025449812412262,
-0.10014209896326065,
-0.13352079689502716,
0.011987310834228992,
-0.02937217243015766,
0.014925043098628521,
0.013585254549980164,
0.09731252491474152,
0.03364274650812149,
-0.09928037226200104,
-0.11384373158216476,
0.003448422998189926,
0.1652652472257614,
-0.029254188761115074,
0.08880723267793655,
-0.09199649095535278,
-0.0807570368051529,
0.08056661486625671,
0.12102150171995163,
0.20060689747333527,
0.09646537899971008,
0.000327875604853034,
0.10165713727474213,
0.1461225003004074,
0.07515903562307358,
0.08288631588220596,
0.010359319858253002,
-0.06608850508928299,
-0.011706887744367123,
0.02532554604113102,
-0.05428086966276169,
-0.007965237833559513,
0.15420246124267578,
-0.10282104462385178,
-0.05451716110110283,
-0.08768031001091003,
0.09763920307159424,
0.09284244477748871,
0.09130831807851791,
-0.14400160312652588,
0.020085841417312622,
0.07647839933633804,
-0.009797676466405392,
-0.0464613251388073,
0.04150272160768509,
0.04007493704557419,
-0.13265550136566162,
0.04951845109462738,
-0.04321736842393875,
0.1250385046005249,
0.017939113080501556,
0.02997409552335739,
-0.026442309841513634,
-0.09344889223575592,
0.04146070405840874,
0.09958042949438095,
-0.24941740930080414,
0.29754459857940674,
0.020935015752911568,
-0.054705679416656494,
-0.09263727813959122,
-0.05164485424757004,
-0.04850732535123825,
0.11524356156587601,
0.19410724937915802,
0.032814282923936844,
-0.097687728703022,
-0.10034145414829254,
0.015357978641986847,
0.09753898531198502,
0.04558998718857765,
0.03221049904823303,
0.029369618743658066,
0.019989144057035446,
0.016827870160341263,
-0.0168844573199749,
0.05426819249987602,
0.0013937221374362707,
-0.12054383009672165,
0.024081915616989136,
0.029412273317575455,
-0.027580181136727333,
0.012423031963407993,
-0.03514279052615166,
-0.10931205749511719,
0.12296842038631439,
-0.037843652069568634,
-0.07936316728591919,
-0.09401598572731018,
-0.0460512675344944,
0.13382303714752197,
-0.09431856870651245,
-0.004985813982784748,
-0.042855069041252136,
-0.020149044692516327,
-0.03474424034357071,
-0.13688264787197113,
0.08701453357934952,
-0.11371470987796783,
0.019903216511011124,
-0.04225011169910431,
0.14859795570373535,
-0.038338176906108856,
0.03557521104812622,
0.03749656304717064,
0.0395757257938385,
-0.10160987079143524,
-0.08755506575107574,
0.05540715530514717,
-0.034173231571912766,
0.0762898400425911,
0.12446589767932892,
-0.01610110141336918,
0.01725873351097107,
0.005662852432578802,
-0.014958875253796577,
0.25993141531944275,
0.16813762485980988,
-0.07516803592443466,
0.10665062814950943,
0.09745802730321884,
-0.0379047654569149,
-0.2438262552022934,
-0.028787797316908836,
-0.09820403158664703,
-0.023559710010886192,
-0.001666245050728321,
-0.10411674529314041,
0.11951271444559097,
0.06461896747350693,
0.005866487976163626,
0.08166452497243881,
-0.29928281903266907,
-0.01762755587697029,
0.13863752782344818,
0.003117190208286047,
0.33584848046302795,
-0.13839003443717957,
-0.05579259246587753,
-0.019561106339097023,
-0.1883557140827179,
0.16123266518115997,
-0.030884351581335068,
0.07960978895425797,
-0.050838690251111984,
0.10173989832401276,
0.021092932671308517,
-0.07392441481351852,
0.17208515107631683,
0.06542911380529404,
0.0243977103382349,
-0.04402880743145943,
-0.09217556565999985,
0.10238289088010788,
0.013636436313390732,
0.0157818291336298,
0.03792211413383484,
0.03783384710550308,
-0.22624215483665466,
-0.0019568384159356356,
-0.12169669568538666,
0.039171312004327774,
0.01693127490580082,
-0.024295242503285408,
-0.03488830476999283,
0.0013197918888181448,
0.018176548182964325,
-0.011668180115520954,
0.19706177711486816,
-0.028284450992941856,
0.14219225943088531,
-0.03101964294910431,
0.15556612610816956,
-0.15502367913722992,
-0.1369137465953827,
-0.07272014021873474,
-0.03400328382849693,
0.05025652050971985,
-0.06874237209558487,
0.05669606849551201,
0.1916862279176712,
-0.0006925227353349328,
0.06147036328911781,
0.08313383907079697,
0.018521221354603767,
0.02707107923924923,
0.11091645061969757,
-0.16176588833332062,
-0.15122579038143158,
0.006523903924971819,
-0.06946064531803131,
-0.010224779136478901,
0.09951141476631165,
0.08911837637424469,
0.107960544526577,
-0.04196787253022194,
-0.02606174163520336,
0.014409746043384075,
-0.04014674574136734,
0.12936340272426605,
0.11127382516860962,
0.0343506820499897,
-0.15276983380317688,
0.05742349848151207,
-0.021708974614739418,
-0.17686183750629425,
-0.014900144189596176,
0.0368347130715847,
-0.0943957194685936,
-0.10722602903842926,
-0.022911496460437775,
0.11384917050600052,
-0.19789201021194458,
-0.06199832260608673,
-0.07666348665952682,
-0.0951567217707634,
0.08417510986328125,
0.18137674033641815,
0.07330676913261414,
0.06640003621578217,
0.01675810106098652,
-0.09259825199842453,
0.0020561600103974342,
0.0074564553797245026,
0.019507789984345436,
-0.00469157425686717,
-0.09125899523496628,
-0.05376354977488518,
-0.0372285395860672,
0.22190755605697632,
-0.06539211422204971,
-0.06394027918577194,
-0.1589754968881607,
0.08018246293067932,
-0.2003975212574005,
-0.04525561258196831,
-0.11922474950551987,
-0.04361230880022049,
-0.0017603624146431684,
-0.10601992905139923,
-0.08135131001472473,
-0.027237001806497574,
-0.10382847487926483,
0.07814640551805496,
0.05499505624175072,
0.047038160264492035,
-0.09412647038698196,
-0.06471800804138184,
0.08368625491857529,
-0.012590273283421993,
0.09931550174951553,
0.12501324713230133,
-0.08270393311977386,
0.05280604958534241,
-0.11368143558502197,
-0.14627696573734283,
0.049943506717681885,
0.0531141571700573,
0.1257971078157425,
-0.057892195880413055,
0.033826522529125214,
0.1087128296494484,
0.01576709933578968,
0.0887981653213501,
-0.04941103607416153,
-0.0916406586766243,
0.005696361884474754,
-0.020024895668029785,
-0.11541429907083511,
-0.060710176825523376,
-0.07536802440881729,
0.13802872598171234,
0.04053225740790367,
0.1192571371793747,
0.014655742794275284,
0.11272130161523819,
-0.11843768507242203,
-0.00633102934807539,
-0.02475026436150074,
-0.13945017755031586,
0.04677209258079529,
-0.01897561177611351,
0.05615666136145592,
-0.03040509857237339,
0.2843298316001892,
-0.0357656367123127,
0.07940903306007385,
0.019543159753084183,
0.091245137155056,
0.023319832980632782,
-0.00924327876418829,
0.18300406634807587,
0.06455188989639282,
-0.03726717829704285,
-0.019165808334946632,
0.09515516459941864,
-0.047609828412532806,
0.03519193455576897,
0.15130122005939484,
0.11564355343580246,
0.11184350401163101,
0.035081345587968826,
0.03749202936887741,
-0.0016322378069162369,
0.0380273275077343,
-0.14517374336719513,
-0.04361701384186745,
0.0157563928514719,
0.0434269905090332,
0.04321328178048134,
0.11728417128324509,
-0.08800962567329407,
0.06983530521392822,
-0.07688125222921371,
-0.04557415097951889,
-0.12807349860668182,
-0.06259945034980774,
-0.08401059359312057,
-0.07253509759902954,
0.03111584298312664,
-0.141324982047081,
-0.02473726123571396,
0.12244636565446854,
0.058103885501623154,
-0.03160673379898071,
0.05386020615696907,
0.08794356137514114,
-0.038862816989421844,
0.014627539552748203,
0.00972866453230381,
0.054439228028059006,
-0.004933733027428389,
0.07104703783988953,
-0.04114781692624092,
-0.03156469762325287,
-0.032627396285533905,
0.02436693198978901,
-0.06970072537660599,
0.02302606962621212,
-0.15559224784374237,
-0.08875439316034317,
-0.08274678140878677,
0.07785511016845703,
-0.036181822419166565,
0.07513311505317688,
0.019498717039823532,
0.06655776500701904,
0.04296107590198517,
0.1773400902748108,
-0.02421777881681919,
-0.07102217525243759,
-0.11970601975917816,
0.10858796536922455,
-0.004947703797370195,
0.0541405975818634,
0.024126488715410233,
-0.012500101700425148,
-0.022688670083880424,
0.2923106253147125,
0.2633183002471924,
-0.0883219912648201,
0.02635693922638893,
0.006662076339125633,
0.03578740358352661,
0.048225920647382736,
0.023127304390072823,
0.08918988704681396,
0.25084760785102844,
-0.09284573048353195,
-0.06404519826173782,
-0.07256656140089035,
0.022653251886367798,
-0.005275750067085028,
0.04141400009393692,
0.08328638225793839,
-0.020482635125517845,
-0.030924523249268532,
0.1503015160560608,
-0.12242441624403,
0.07664661854505539,
0.02656591683626175,
-0.17413972318172455,
-0.10279668867588043,
-0.04524426534771919,
0.10612408816814423,
0.036322642117738724,
0.08037302643060684,
-0.050333861261606216,
-0.07264198362827301,
0.08061744272708893,
0.03953545168042183,
-0.2419227808713913,
-0.08136077225208282,
0.14569790661334991,
-0.024661147966980934,
0.006423562299460173,
-0.0205026064068079,
0.09035378694534302,
0.09784910082817078,
0.01831045374274254,
-0.0686483234167099,
0.02312418259680271,
0.0780843049287796,
-0.02220802567899227,
-0.009681771509349346,
-0.012280913069844246,
0.03439688682556152,
-0.061579614877700806,
0.09574694186449051,
-0.10450655221939087,
0.049886565655469894,
0.0055689215660095215,
-0.0016880178591236472,
-0.05861955136060715,
0.09067666530609131,
-0.05462032929062843,
0.06900230795145035,
0.07242130488157272,
-0.028768453747034073,
0.0019430327229201794,
-0.046208467334508896,
0.028410719707608223,
0.02930843085050583,
-0.034898072481155396,
-0.10142533481121063,
-0.08871957659721375,
-0.031565070152282715,
0.0521099790930748,
-0.02561725676059723,
-0.09800312668085098,
-0.0225636325776577,
-0.0532100684940815,
-0.011390496976673603,
-0.025980589911341667,
0.04929773882031441,
0.072679802775383,
0.04988623037934303,
-0.012860886752605438,
-0.0027125792112201452,
0.018361881375312805,
0.07675626128911972,
-0.08158101886510849,
-0.06468789279460907
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-large-xls-r-300m-as
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 1.9068
- Wer: 0.6679
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.12
- num_epochs: 240
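For illustration, these settings map onto the 🤗 `TrainingArguments` API roughly as follows; this is a hedged sketch, not the exact training script, and `output_dir` is an assumption:
```python
from transformers import TrainingArguments

# Hedged reconstruction of the hyperparameters listed above; output_dir is an
# assumption, and anything not listed keeps its library default (e.g. Adam betas/epsilon).
training_args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-300m-as",
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,  # effective train batch size of 32
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.12,
    num_train_epochs=240,
)
```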
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:------:|:----:|:---------------:|:------:|
| 5.7027 | 21.05 | 400 | 3.4157 | 1.0 |
| 1.1638 | 42.1 | 800 | 1.3498 | 0.7461 |
| 0.2266 | 63.15 | 1200 | 1.6147 | 0.7273 |
| 0.1473 | 84.21 | 1600 | 1.6649 | 0.7108 |
| 0.1043 | 105.26 | 2000 | 1.7691 | 0.7090 |
| 0.0779 | 126.31 | 2400 | 1.8300 | 0.7009 |
| 0.0613 | 147.36 | 2800 | 1.8681 | 0.6916 |
| 0.0471 | 168.41 | 3200 | 1.8567 | 0.6875 |
| 0.0343 | 189.46 | 3600 | 1.9054 | 0.6840 |
| 0.0265 | 210.51 | 4000 | 1.9020 | 0.6786 |
| 0.0219 | 231.56 | 4400 | 1.9068 | 0.6679 |
### Framework versions
- Transformers 4.16.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
#### Evaluation Commands
1. To evaluate on `mozilla-foundation/common_voice_7_0` with split `test`
```bash
python eval.py --model_id anuragshas/wav2vec2-large-xls-r-300m-as --dataset mozilla-foundation/common_voice_7_0 --config as --split test
```
### Inference With LM
```python
import torch
from datasets import load_dataset
from transformers import AutoModelForCTC, AutoProcessor
import torchaudio.functional as F
model_id = "anuragshas/wav2vec2-large-xls-r-300m-as"
sample_iter = iter(load_dataset("mozilla-foundation/common_voice_7_0", "as", split="test", streaming=True, use_auth_token=True))
sample = next(sample_iter)
resampled_audio = F.resample(torch.tensor(sample["audio"]["array"]), 48_000, 16_000).numpy()
model = AutoModelForCTC.from_pretrained(model_id)
processor = AutoProcessor.from_pretrained(model_id)
input_values = processor(resampled_audio, return_tensors="pt").input_values
with torch.no_grad():
logits = model(input_values).logits
transcription = processor.batch_decode(logits.numpy()).text
# => "জাহাজত তো তিশকুৰলৈ যাব কিন্তু জহাজিটো আহিপনে"
```
### Eval results on Common Voice 7 "test" (WER):
| Without LM | With LM (run `./eval.py`) |
|---|---|
| 67 | 56.995 |
|
{"language": ["as"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "hf-asr-leaderboard", "robust-speech-event"], "datasets": ["mozilla-foundation/common_voice_7_0"], "metrics": ["wer"], "model-index": [{"name": "wav2vec2-large-xls-r-300m-as", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice 7", "type": "mozilla-foundation/common_voice_7_0", "args": "as"}, "metrics": [{"type": "wer", "value": 56.995, "name": "Test WER"}, {"type": "cer", "value": 20.39, "name": "Test CER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-large-xls-r-300m-as
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"hf-asr-leaderboard",
"robust-speech-event",
"as",
"dataset:mozilla-foundation/common_voice_7_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"as"
] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #hf-asr-leaderboard #robust-speech-event #as #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
wav2vec2-large-xls-r-300m-as
============================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the common\_voice dataset.
It achieves the following results on the evaluation set:
* Loss: 1.9068
* Wer: 0.6679
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0003
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.12
* num\_epochs: 240
### Training results
### Framework versions
* Transformers 4.16.0
* Pytorch 1.10.0+cu111
* Datasets 1.17.0
* Tokenizers 0.10.3
#### Evaluation Commands
1. To evaluate on 'mozilla-foundation/common\_voice\_7\_0' with split 'test'
### Inference With LM
### Eval results on Common Voice 7 "test" (WER):
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.12\n* num\\_epochs: 240",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_7\\_0' with split 'test'",
"### Inference With LM",
"### Eval results on Common Voice 7 \"test\" (WER):"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #hf-asr-leaderboard #robust-speech-event #as #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.12\n* num\\_epochs: 240",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_7\\_0' with split 'test'",
"### Inference With LM",
"### Eval results on Common Voice 7 \"test\" (WER):"
] |
[
92,
144,
4,
35,
36,
8,
15
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #hf-asr-leaderboard #robust-speech-event #as #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.12\n* num\\_epochs: 240### Training results### Framework versions\n\n\n* Transformers 4.16.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_7\\_0' with split 'test'### Inference With LM### Eval results on Common Voice 7 \"test\" (WER):"
] |
[
-0.11215634644031525,
0.12959110736846924,
-0.00697297090664506,
0.03118935599923134,
0.08367100358009338,
0.014747746288776398,
0.10080301016569138,
0.16885869204998016,
-0.04974254593253136,
0.12505799531936646,
0.06866682320833206,
0.08732202649116516,
0.08489532023668289,
0.09508002549409866,
-0.03664928302168846,
-0.22645555436611176,
0.013786058872938156,
-0.044199004769325256,
-0.09800765663385391,
0.11329706013202667,
0.10154468566179276,
-0.09157131612300873,
0.027298709377646446,
0.01924421638250351,
-0.07532370090484619,
-0.009349706582725048,
-0.042120277881622314,
-0.04724779352545738,
0.07781581580638885,
0.04135490581393242,
0.019675685092806816,
0.0493333674967289,
0.05713183432817459,
-0.28578364849090576,
0.0017655910924077034,
0.05767187848687172,
0.04179012030363083,
0.06201343238353729,
0.12392336130142212,
-0.03097614087164402,
0.1223268210887909,
-0.06741843372583389,
0.03324790298938751,
0.06782886385917664,
-0.07450060546398163,
-0.22698567807674408,
-0.07770849019289017,
0.03681791573762894,
0.13968199491500854,
0.06645721197128296,
-0.04529942572116852,
0.06174403801560402,
-0.09721367806196213,
0.09393557906150818,
0.226456418633461,
-0.21545428037643433,
-0.05679988116025925,
-0.005593718029558659,
0.01671345718204975,
-0.003497318597510457,
-0.11899729073047638,
-0.01796908862888813,
0.017660198733210564,
-0.00613428233191371,
0.06821338087320328,
0.004347852431237698,
0.0632329210639,
0.004196114372462034,
-0.13853183388710022,
-0.05700480192899704,
0.12898893654346466,
0.07752527296543121,
-0.03161676600575447,
-0.12228385359048843,
-0.011690566316246986,
-0.15802933275699615,
-0.033984046429395676,
0.010370765812695026,
0.01760357990860939,
-0.03982258960604668,
-0.017501365393400192,
0.05416988953948021,
-0.06203395500779152,
-0.06510896980762482,
0.054098136723041534,
0.1456449329853058,
0.036834027618169785,
-0.03468908742070198,
0.015730969607830048,
0.09142518043518066,
0.048444706946611404,
-0.1727917194366455,
-0.032778169959783554,
0.045175790786743164,
-0.13188302516937256,
-0.0211178045719862,
-0.012700720690190792,
0.0004236951644998044,
0.09322789311408997,
0.13936564326286316,
0.010231740772724152,
0.10590671747922897,
0.013598243705928326,
0.009383167140185833,
-0.04496128484606743,
0.183583602309227,
-0.03717430680990219,
-0.10053633153438568,
-0.0469340980052948,
0.1376655101776123,
-0.007312556263059378,
-0.0022062186617404222,
-0.043916892260313034,
0.03932734951376915,
0.12931497395038605,
0.07321424037218094,
0.006903381552547216,
0.01153787411749363,
-0.0736609622836113,
-0.027710655704140663,
-0.001112273195758462,
-0.1474715918302536,
0.051584888249635696,
0.09169093519449234,
-0.056226152926683426,
-0.00017449491133447737,
-0.013696555979549885,
-0.0024704423267394304,
-0.04577341303229332,
0.0867566391825676,
-0.0416136234998703,
-0.0031357426196336746,
-0.06082763895392418,
-0.06905154138803482,
0.03115287981927395,
-0.02525179460644722,
-0.008886358700692654,
-0.03926057368516922,
-0.05004351586103439,
-0.07842473685741425,
0.05410202220082283,
-0.07930126041173935,
-0.07291574776172638,
-0.07514359802007675,
-0.10392056405544281,
0.04333950951695442,
-0.0070860134437680244,
0.15147271752357483,
-0.04611275717616081,
0.08210258930921555,
0.05266399681568146,
0.04712345823645592,
0.16287176311016083,
0.05930871516466141,
-0.024366267025470734,
0.05552434176206589,
-0.15477254986763,
0.12397680431604385,
-0.12924684584140778,
0.040130313485860825,
-0.15520431101322174,
-0.08771388232707977,
0.008895747363567352,
-0.007462439592927694,
0.08468735963106155,
0.15097397565841675,
-0.16185778379440308,
-0.08459240943193436,
0.12418190389871597,
-0.058639150112867355,
-0.0848231092095375,
0.14423824846744537,
-0.004808126483112574,
-0.04157853126525879,
-0.0018159914761781693,
0.1831568330526352,
0.12703658640384674,
-0.09349552541971207,
-0.0025363913737237453,
-0.04397334158420563,
0.08249595761299133,
0.08206502348184586,
0.0933137908577919,
-0.06956229358911514,
0.032327696681022644,
-0.003966010175645351,
-0.05501411855220795,
0.01882525347173214,
-0.0648229643702507,
-0.07656271755695343,
-0.007986766286194324,
-0.050765447318553925,
-0.0022162203676998615,
0.04412337392568588,
-0.014466171152889729,
-0.08917061239480972,
-0.137797549366951,
-0.040122903883457184,
0.09504281729459763,
-0.09170159697532654,
0.013953286223113537,
-0.09614431858062744,
0.0828215479850769,
-0.02266329899430275,
0.006969408132135868,
-0.13705381751060486,
0.011907230131328106,
0.04803356155753136,
-0.06811311841011047,
0.0008163873571902514,
-0.0037178087513893843,
0.061544567346572876,
0.015548585914075375,
-0.018475355580449104,
-0.056074198335409164,
-0.015033644624054432,
-0.007421183865517378,
-0.047281112521886826,
-0.24688634276390076,
-0.06878213584423065,
-0.01976209692656994,
0.17748111486434937,
-0.19761547446250916,
0.0051326691173017025,
0.11260874569416046,
0.109639473259449,
0.00444257166236639,
-0.051653891801834106,
0.029903879389166832,
0.02271486632525921,
-0.023806797340512276,
-0.06396623700857162,
0.005548322107642889,
-0.0011580770369619131,
-0.11334259063005447,
-0.0016807422507554293,
-0.1412086933851242,
0.04177027940750122,
0.07936398684978485,
0.04345940425992012,
-0.07585179805755615,
-0.0471314862370491,
-0.06531963497400284,
-0.05361616238951683,
-0.03287559747695923,
-0.013957880437374115,
0.17002728581428528,
0.04293211176991463,
0.0804123654961586,
-0.08001357316970825,
-0.07178547233343124,
0.02660294994711876,
-0.002765029203146696,
-0.01268803421407938,
0.15842850506305695,
0.038940079510211945,
-0.04694659262895584,
0.0669059157371521,
0.03370323032140732,
-0.0647052600979805,
0.10446666181087494,
-0.08519779145717621,
-0.09193967282772064,
-0.058519601821899414,
0.06111213564872742,
0.0336494967341423,
0.10528389364480972,
-0.191831573843956,
0.002361863385885954,
0.038784850388765335,
0.010271917097270489,
0.02296067215502262,
-0.1651066541671753,
0.018630685284733772,
0.04881761223077774,
-0.0926278606057167,
-0.0042509459890425205,
0.017973391339182854,
0.0010847547091543674,
0.06706243008375168,
-0.0071719735860824585,
-0.08514295518398285,
-0.02937997691333294,
-0.061096254736185074,
-0.10876458138227463,
0.16697929799556732,
-0.06170231103897095,
-0.13119377195835114,
-0.11365329474210739,
0.009385303594172001,
-0.04259699583053589,
-0.02460860088467598,
0.046530865132808685,
-0.09018725901842117,
-0.06060611829161644,
-0.07775674760341644,
0.012986534275114536,
-0.006093519274145365,
0.030678603798151016,
0.06594846397638321,
0.005211179610341787,
0.07549416273832321,
-0.09875410795211792,
0.0012165114749222994,
-0.016001882031559944,
-0.007445952855050564,
0.012755490839481354,
0.04120131582021713,
0.07980439066886902,
0.1645820587873459,
0.03781653568148613,
0.05730472505092621,
-0.014282363466918468,
0.19598641991615295,
-0.1394709348678589,
0.01025757472962141,
0.0979832261800766,
-0.00824833195656538,
0.050082430243492126,
0.14469477534294128,
0.032709356397390366,
-0.07652612775564194,
0.006535505875945091,
0.04786510020494461,
-0.0047723217867314816,
-0.24702847003936768,
-0.01802525483071804,
-0.06759468466043472,
-0.02690107375383377,
0.07090259343385696,
0.028192643076181412,
0.0061553725972771645,
0.012535314075648785,
-0.002885655965656042,
-0.05588092282414436,
0.044465042650699615,
0.05076610669493675,
0.08544452488422394,
0.05503265559673309,
0.1028488501906395,
-0.01949373632669449,
-0.026223545894026756,
0.013099274598062038,
-0.007225793786346912,
0.19813919067382812,
-0.0036914495285600424,
0.1885615885257721,
0.05801697075366974,
0.12443649768829346,
-0.02496512420475483,
0.04504632577300072,
-0.012820706702768803,
-0.00016502919606864452,
0.03969774395227432,
-0.06366016715765,
-0.04059700667858124,
0.02631155401468277,
0.13656727969646454,
0.026403969153761864,
-0.07488168776035309,
0.0565389022231102,
0.05977518856525421,
0.32393527030944824,
0.07307669520378113,
-0.23507776856422424,
-0.05834648385643959,
0.018602514639496803,
-0.07606364041566849,
-0.02019531838595867,
0.014317126013338566,
0.10754307359457016,
-0.08424224704504013,
0.08549173176288605,
-0.05120011419057846,
0.08010613918304443,
-0.08029817044734955,
0.014671668410301208,
0.06735490262508392,
0.10411199927330017,
0.016162678599357605,
0.06986073404550552,
-0.23015964031219482,
0.23987258970737457,
-0.005297926254570484,
0.059878651052713394,
-0.0637289509177208,
0.06287746876478195,
0.02279839850962162,
-0.044666822999715805,
0.09502705186605453,
-0.010852954350411892,
-0.07208030670881271,
-0.11797469854354858,
-0.10478116571903229,
0.014325335621833801,
0.12388788908720016,
-0.07299348711967468,
0.1254795491695404,
-0.03922779858112335,
-0.056631218641996384,
0.023907607421278954,
-0.056826625019311905,
-0.09307082742452621,
-0.09676496684551239,
0.06429500132799149,
-0.018130922690033913,
0.03309204801917076,
-0.0708090290427208,
-0.08894608914852142,
-0.11144932359457016,
0.18156373500823975,
-0.1437525749206543,
-0.0340409092605114,
-0.12628233432769775,
0.02403484843671322,
0.163272887468338,
-0.06325534731149673,
0.0073228939436376095,
0.01989690214395523,
0.1367838829755783,
0.033473849296569824,
-0.0033267808612436056,
0.09691913425922394,
-0.0811966061592102,
-0.20181311666965485,
-0.03188667073845863,
0.20443762838840485,
0.016066057607531548,
0.06504266709089279,
-0.012019002810120583,
0.004375832621008158,
-0.0003400437708478421,
-0.09295383840799332,
0.08698739111423492,
0.01639578863978386,
-0.037063319236040115,
0.04722561687231064,
-0.0019236464286223054,
-0.020617591217160225,
-0.109603151679039,
-0.03883642330765724,
0.09314006567001343,
0.2652915120124817,
-0.07235313951969147,
0.043670643121004105,
0.009248712100088596,
-0.06007780507206917,
-0.12292111665010452,
-0.008342338725924492,
0.12265683710575104,
0.0370849147439003,
-0.008866120129823685,
-0.14510920643806458,
0.0270024836063385,
0.04611605405807495,
-0.018169522285461426,
0.06961970031261444,
-0.3165038526058197,
-0.12800566852092743,
0.09742044657468796,
0.03792541101574898,
-0.06651107221841812,
-0.16863833367824554,
-0.08161366730928421,
-0.001861974480561912,
-0.06542576104402542,
0.009070617146790028,
0.007634959649294615,
0.11678268760442734,
-0.0034645141568034887,
0.0183585025370121,
0.03640307858586311,
-0.05297870934009552,
0.15140998363494873,
0.04576200619339943,
0.02994869090616703,
0.00007056618778733537,
0.01946181058883667,
-0.001198760000988841,
-0.06305281072854996,
0.052702996879816055,
-0.08197613805532455,
0.01850777305662632,
-0.13955511152744293,
-0.01608884520828724,
-0.077029287815094,
0.01941148191690445,
-0.058406274765729904,
-0.0020012427121400833,
-0.01792021282017231,
0.028401363641023636,
0.0999397337436676,
0.015206949785351753,
0.08035631477832794,
-0.06668750196695328,
0.07736150920391083,
0.19225700199604034,
0.1265823245048523,
0.006603528745472431,
-0.14561395347118378,
0.013184143230319023,
0.02673429250717163,
0.017302338033914566,
-0.11380105465650558,
0.05762967839837074,
0.13154295086860657,
0.051586754620075226,
0.1492750495672226,
0.030769621953368187,
-0.10948667675256729,
-0.021154705435037613,
0.05939100310206413,
-0.077792689204216,
-0.159651979804039,
-0.01465666200965643,
0.014502117410302162,
-0.1499660611152649,
-0.007194888778030872,
0.09758244454860687,
-0.016498733311891556,
0.004280173685401678,
0.024139756336808205,
0.06768035888671875,
-0.027535326778888702,
0.21281798183918,
0.023472003638744354,
0.11998549103736877,
-0.09364337474107742,
0.07334640622138977,
0.039328642189502716,
-0.08143262565135956,
0.04443494975566864,
0.08534635603427887,
-0.03678674250841141,
-0.01892165280878544,
0.028913293033838272,
0.08180595189332962,
0.06657279282808304,
-0.039682891219854355,
-0.13578669726848602,
-0.1562056690454483,
0.07835270464420319,
0.061471812427043915,
0.04837355762720108,
0.03363744542002678,
-0.013035431504249573,
0.02420658990740776,
-0.09970186650753021,
0.12494426220655441,
0.11607973277568817,
0.05162901431322098,
-0.12014241516590118,
0.057503752410411835,
-0.012667636387050152,
0.00027477365802042186,
0.0023939884267747402,
-0.009857908822596073,
-0.09610497951507568,
0.021298393607139587,
-0.06997615844011307,
-0.00020096544176340103,
-0.05879487842321396,
0.014404105953872204,
0.03810105845332146,
-0.06424899399280548,
-0.04575013369321823,
0.023442741483449936,
-0.11169075220823288,
-0.05248856171965599,
-0.02133321203291416,
0.0791950523853302,
-0.10935768485069275,
-0.004759086761623621,
0.04721564054489136,
-0.1523495316505432,
0.10699138790369034,
0.03386731818318367,
0.003555276431143284,
-0.008286671712994576,
-0.08648762851953506,
-0.008611760102212429,
0.03744452819228172,
0.018024850636720657,
0.034141767770051956,
-0.23572982847690582,
-0.0023784346412867308,
-0.02310861460864544,
0.0035682180896401405,
-0.008670148439705372,
0.024597574025392532,
-0.12789830565452576,
0.014535498805344105,
-0.0538942888379097,
-0.053071215748786926,
-0.0474112406373024,
0.05946231260895729,
0.07100684940814972,
0.0016365725314244628,
0.17893634736537933,
-0.06754440814256668,
0.07548585534095764,
-0.21498991549015045,
0.002747712889686227,
0.000968282634858042,
-0.0325530506670475,
-0.0477319061756134,
-0.006819024682044983,
0.10483525693416595,
-0.06511665135622025,
0.06348904967308044,
-0.03300140053033829,
0.054526157677173615,
0.02921302616596222,
-0.08947371691465378,
0.04294263571500778,
0.054088134318590164,
0.1165306493639946,
0.026499243453145027,
-0.015026046894490719,
0.07988817989826202,
-0.056152667850255966,
0.043974295258522034,
0.03339033201336861,
0.13919880986213684,
0.13668256998062134,
0.035384371876716614,
0.0796598345041275,
0.0881422907114029,
-0.12960968911647797,
-0.12037873268127441,
0.179296612739563,
-0.08032330870628357,
0.1336173266172409,
-0.01857770048081875,
0.178888201713562,
0.11129903793334961,
-0.19749467074871063,
0.08349697291851044,
-0.04085426777601242,
-0.08892223984003067,
-0.09639084339141846,
-0.09201525151729584,
-0.08040869235992432,
-0.15900301933288574,
0.021018587052822113,
-0.093452587723732,
0.07225217670202255,
0.01383985299617052,
0.052035555243492126,
0.024264363572001457,
0.10665100067853928,
0.03385654091835022,
0.0005742058856412768,
0.11529245227575302,
0.006986656691879034,
-0.03997938707470894,
-0.016970688477158546,
-0.05432623624801636,
0.05116613954305649,
-0.0041682301089167595,
0.08612467348575592,
-0.013934679329395294,
-0.0743870958685875,
0.05631019175052643,
0.002382786711677909,
-0.10913734883069992,
0.034718334674835205,
-0.041685234755277634,
0.043365586549043655,
0.08509689569473267,
0.04130524396896362,
-0.004004278220236301,
-0.021954549476504326,
0.1644248366355896,
-0.08638744801282883,
-0.060278624296188354,
-0.12931527197360992,
0.16743585467338562,
0.007507564965635538,
0.02137092687189579,
0.014271937310695648,
-0.08450069278478622,
-0.01714390143752098,
0.1725909560918808,
0.12285240739583969,
-0.007691375445574522,
-0.019329087808728218,
0.02075830101966858,
-0.0031801590230315924,
-0.017731908708810806,
0.026813272386789322,
0.10905688256025314,
0.04150780662894249,
-0.02544538490474224,
0.0016880992334336042,
-0.025360509753227234,
-0.07888420671224594,
-0.024436818435788155,
0.08327573537826538,
0.029186587780714035,
-0.006746851839125156,
-0.016839461401104927,
0.11778201907873154,
-0.07134652137756348,
-0.19142383337020874,
0.012974963523447514,
-0.1647171825170517,
-0.18596652150154114,
-0.029838819056749344,
0.08155954629182816,
0.029348941519856453,
0.06253533810377121,
0.0005236956640146673,
-0.05914366990327835,
0.12907730042934418,
0.009189240634441376,
-0.037918660789728165,
-0.06553800404071808,
0.05455389246344566,
-0.13856631517410278,
0.1600050926208496,
-0.019174419343471527,
0.05686623975634575,
0.1301901787519455,
0.03167995810508728,
-0.10190040618181229,
0.028494514524936676,
0.09494572132825851,
-0.14146779477596283,
0.06654106825590134,
0.20017856359481812,
-0.006005510687828064,
0.12514758110046387,
0.05444585904479027,
-0.06147237494587898,
-0.0036065084859728813,
-0.08642943948507309,
-0.013034519739449024,
-0.0745556578040123,
-0.0034572635777294636,
-0.04888433590531349,
0.1143723800778389,
0.20672781765460968,
-0.0727531835436821,
-0.0011894344352185726,
-0.04634535312652588,
0.021630296483635902,
0.009744611568748951,
0.12609577178955078,
-0.04426804184913635,
-0.25751084089279175,
0.05130928382277489,
-0.03174954280257225,
0.028731677681207657,
-0.1758173257112503,
-0.09309511631727219,
0.04228460416197777,
-0.05137256160378456,
-0.061698637902736664,
0.11460750550031662,
0.05535770580172539,
0.05279460921883583,
-0.05091683566570282,
-0.11429352313280106,
-0.016912000253796577,
0.17484746873378754,
-0.1759422868490219,
-0.04593603312969208
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# XLS-R-300M - Bulgarian
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - BG dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2473
- Wer: 0.3002
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7.5e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 50.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 3.1589 | 3.48 | 400 | 3.0830 | 1.0 |
| 2.8921 | 6.96 | 800 | 2.6605 | 0.9982 |
| 1.3049 | 10.43 | 1200 | 0.5069 | 0.5707 |
| 1.1349 | 13.91 | 1600 | 0.4159 | 0.5041 |
| 1.0686 | 17.39 | 2000 | 0.3815 | 0.4746 |
| 0.999 | 20.87 | 2400 | 0.3541 | 0.4343 |
| 0.945 | 24.35 | 2800 | 0.3266 | 0.4132 |
| 0.9058 | 27.83 | 3200 | 0.2969 | 0.3771 |
| 0.8672 | 31.3 | 3600 | 0.2802 | 0.3553 |
| 0.8313 | 34.78 | 4000 | 0.2662 | 0.3380 |
| 0.8068 | 38.26 | 4400 | 0.2528 | 0.3181 |
| 0.7796 | 41.74 | 4800 | 0.2537 | 0.3073 |
| 0.7621 | 45.22 | 5200 | 0.2503 | 0.3036 |
| 0.7611 | 48.7 | 5600 | 0.2477 | 0.2991 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2.dev0
- Tokenizers 0.11.0
#### Evaluation Commands
1. To evaluate on `mozilla-foundation/common_voice_8_0` with split `test`
```bash
python eval.py --model_id anuragshas/wav2vec2-large-xls-r-300m-bg --dataset mozilla-foundation/common_voice_8_0 --config bg --split test
```
2. To evaluate on `speech-recognition-community-v2/dev_data`
```bash
python eval.py --model_id anuragshas/wav2vec2-large-xls-r-300m-bg --dataset speech-recognition-community-v2/dev_data --config bg --split validation --chunk_length_s 5.0 --stride_length_s 1.0
```
### Inference With LM
```python
import torch
from datasets import load_dataset
from transformers import AutoModelForCTC, AutoProcessor
import torchaudio.functional as F
model_id = "anuragshas/wav2vec2-large-xls-r-300m-bg"
# Stream a single Common Voice 8 test sample and resample it from 48 kHz to the 16 kHz the model expects
sample_iter = iter(load_dataset("mozilla-foundation/common_voice_8_0", "bg", split="test", streaming=True, use_auth_token=True))
sample = next(sample_iter)
resampled_audio = F.resample(torch.tensor(sample["audio"]["array"]), 48_000, 16_000).numpy()
# Load the fine-tuned acoustic model and its processor (which bundles the n-gram LM decoder)
model = AutoModelForCTC.from_pretrained(model_id)
processor = AutoProcessor.from_pretrained(model_id)
input_values = processor(resampled_audio, return_tensors="pt").input_values
with torch.no_grad():
    logits = model(input_values).logits
# Beam-search decoding with the language model
transcription = processor.batch_decode(logits.numpy()).text
# => "и надутият му ката блоонкурем взе да се събира"
```
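For comparison, greedy CTC decoding without the language model can be obtained from the same `logits` and `processor` (a continuation of the snippet above; the WER gap between the two decoding modes is shown in the table below).
```python
# Continues from the snippet above: plain argmax decoding, bypassing the n-gram LM.
import torch
pred_ids = torch.argmax(logits, dim=-1)
greedy_transcription = processor.tokenizer.batch_decode(pred_ids)
print(greedy_transcription)
```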
### Eval results on Common Voice 8 "test" (WER):
| Without LM | With LM (run `./eval.py`) |
|---|---|
| 30.07 | 21.195 |
|
{"language": ["bg"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "generated_from_trainer", "hf-asr-leaderboard", "mozilla-foundation/common_voice_8_0", "robust-speech-event"], "datasets": ["mozilla-foundation/common_voice_8_0"], "model-index": [{"name": "XLS-R-300M - Bulgarian", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 8", "type": "mozilla-foundation/common_voice_8_0", "args": "bg"}, "metrics": [{"type": "wer", "value": 21.195, "name": "Test WER"}, {"type": "cer", "value": 4.786, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "bg"}, "metrics": [{"type": "wer", "value": 32.667, "name": "Test WER"}, {"type": "cer", "value": 12.452, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Test Data", "type": "speech-recognition-community-v2/eval_data", "args": "bg"}, "metrics": [{"type": "wer", "value": 31.03, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-large-xls-r-300m-bg
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"hf-asr-leaderboard",
"mozilla-foundation/common_voice_8_0",
"robust-speech-event",
"bg",
"dataset:mozilla-foundation/common_voice_8_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"bg"
] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #bg #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
XLS-R-300M - Bulgarian
======================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the MOZILLA-FOUNDATION/COMMON\_VOICE\_8\_0 - BG dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2473
* Wer: 0.3002
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7.5e-05
* train\_batch\_size: 32
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1000
* num\_epochs: 50.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2+cu102
* Datasets 1.18.2.dev0
* Tokenizers 0.11.0
#### Evaluation Commands
1. To evaluate on 'mozilla-foundation/common\_voice\_8\_0' with split 'test'
2. To evaluate on 'speech-recognition-community-v2/dev\_data'
### Inference With LM
### Eval results on Common Voice 8 "test" (WER):
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 50.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'",
"### Inference With LM",
"### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #bg #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 50.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'",
"### Inference With LM",
"### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
111,
132,
4,
39,
60,
8,
15
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #bg #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 50.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'### Inference With LM### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
-0.10503694415092468,
0.14951837062835693,
-0.006366066634654999,
0.02167411521077156,
0.10409969836473465,
0.03131416067481041,
0.10555198043584824,
0.16428419947624207,
-0.03923604637384415,
0.14233830571174622,
0.05767828971147537,
0.10852036625146866,
0.08806639909744263,
0.09803421795368195,
-0.0327577218413353,
-0.21015851199626923,
0.03647731989622116,
-0.06525789946317673,
-0.06967666745185852,
0.10381212830543518,
0.08995483815670013,
-0.08354352414608002,
0.02726409025490284,
-0.012360455468297005,
-0.061244044452905655,
-0.011641127988696098,
-0.030832476913928986,
-0.043584778904914856,
0.06355471909046173,
0.03034580871462822,
0.021152198314666748,
0.02764756605029106,
0.05562597140669823,
-0.3064557909965515,
-0.0014091901248320937,
0.07728344947099686,
0.025527039542794228,
0.0429081916809082,
0.0950700119137764,
-0.02659958228468895,
0.10142752528190613,
-0.095610611140728,
0.04565524682402611,
0.07916493713855743,
-0.08402806520462036,
-0.23738668859004974,
-0.10697171837091446,
0.03783446550369263,
0.15220783650875092,
0.07866668701171875,
-0.05033568665385246,
0.039712920784950256,
-0.09183035045862198,
0.09004233032464981,
0.19817006587982178,
-0.2126588523387909,
-0.04205911606550217,
0.001054273685440421,
0.02616853453218937,
0.0356222465634346,
-0.0906243696808815,
-0.01419915072619915,
0.006937080062925816,
0.00471725407987833,
0.04417090862989426,
-0.004490914288908243,
0.0485469214618206,
0.0025385261978954077,
-0.14386755228042603,
-0.07951971143484116,
0.14257913827896118,
0.06566353887319565,
-0.03925858810544014,
-0.12270034849643707,
-0.010254823602735996,
-0.1802334189414978,
-0.042105644941329956,
0.008936612866818905,
0.014855480752885342,
-0.024713808670639992,
-0.012007979676127434,
0.03144382685422897,
-0.05606577545404434,
-0.07197101414203644,
0.06216626986861229,
0.11176111549139023,
0.04492633417248726,
-0.04441088065505028,
0.011704325675964355,
0.09055034816265106,
0.02982439659535885,
-0.15580841898918152,
-0.05167591571807861,
0.03796590492129326,
-0.14030936360359192,
-0.008896115235984325,
-0.025432607159018517,
0.010702589526772499,
0.0891341120004654,
0.1907109171152115,
0.01746491715312004,
0.10270746052265167,
-0.014127054251730442,
0.009286132641136646,
-0.04541417956352234,
0.15525944530963898,
-0.029950447380542755,
-0.09483201056718826,
-0.0345027856528759,
0.12199798971414566,
-0.001961835427209735,
-0.010756713338196278,
-0.03383544087409973,
0.030240491032600403,
0.11644624173641205,
0.10047019273042679,
0.024296939373016357,
0.007663310039788485,
-0.08748847991228104,
-0.019469989463686943,
-0.018419617787003517,
-0.14968077838420868,
0.061208341270685196,
0.0797085240483284,
-0.04730257764458656,
-0.025323569774627686,
-0.017023930326104164,
0.012354590930044651,
-0.06307259947061539,
0.0940636545419693,
-0.04823477938771248,
-0.004472975153476,
-0.06568317115306854,
-0.10168725997209549,
0.05583624541759491,
-0.022137915715575218,
-0.033644214272499084,
-0.0492180734872818,
-0.07017450034618378,
-0.08846130222082138,
0.03995196148753166,
-0.055411629378795624,
-0.04854908958077431,
-0.08498714119195938,
-0.09753038734197617,
0.04646793380379677,
-0.01221480779349804,
0.13708378374576569,
-0.05780179798603058,
0.08778279274702072,
0.02154027484357357,
0.04183214530348778,
0.12693363428115845,
0.06443341821432114,
-0.017517652362585068,
0.06298168003559113,
-0.12952126562595367,
0.12418320029973984,
-0.13663004338741302,
0.04282912611961365,
-0.16331642866134644,
-0.0826030895113945,
0.01678823120892048,
-0.001560780918225646,
0.09158454090356827,
0.14485380053520203,
-0.18863657116889954,
-0.0710228979587555,
0.16188408434391022,
-0.05390813201665878,
-0.0849759429693222,
0.1405525803565979,
0.00003379098779987544,
-0.04692321643233299,
0.026932545006275177,
0.16303251683712006,
0.1284879595041275,
-0.1039256677031517,
-0.03136879950761795,
-0.06870302557945251,
0.07318636029958725,
0.06039440631866455,
0.10395815223455429,
-0.07802365720272064,
0.030641647055745125,
-0.005846619140356779,
-0.05575563758611679,
0.005455591715872288,
-0.0602596215903759,
-0.0794629454612732,
-0.003567848354578018,
-0.040588218718767166,
-0.014676358550786972,
0.019404994323849678,
-0.03661556914448738,
-0.0876724123954773,
-0.12954573333263397,
-0.044284526258707047,
0.10328032076358795,
-0.08721470832824707,
0.027602307498455048,
-0.0959349200129509,
0.0720018595457077,
0.0018615839071571827,
0.027926089242100716,
-0.14627140760421753,
-0.030111221596598625,
0.04860847443342209,
-0.07662595808506012,
0.004925748333334923,
-0.04137267544865608,
0.03496164456009865,
0.021356618031859398,
-0.006988652050495148,
-0.05222664400935173,
-0.04090423882007599,
-0.010276081040501595,
-0.04962664097547531,
-0.21668951213359833,
-0.06366506218910217,
-0.023811081424355507,
0.2046305388212204,
-0.1916375309228897,
0.02260778099298477,
0.10878852754831314,
0.09516948461532593,
0.005677422042936087,
-0.053777601569890976,
0.015203922986984253,
0.051603492349386215,
-0.02008497528731823,
-0.05819179117679596,
0.011477827094495296,
-0.009743605740368366,
-0.094856858253479,
-0.020353760570287704,
-0.15179695188999176,
0.003484871005639434,
0.08310067653656006,
0.04958266392350197,
-0.05974585562944412,
-0.045108020305633545,
-0.06232549995183945,
-0.04376300424337387,
-0.0510837584733963,
-0.052032534033060074,
0.10187121480703354,
0.05151655524969101,
0.08421605080366135,
-0.06904410570859909,
-0.06302386522293091,
0.029159532859921455,
0.007596694864332676,
-0.017194725573062897,
0.15592515468597412,
0.058211732655763626,
-0.04484869912266731,
0.08478476852178574,
0.03011312521994114,
-0.03684568032622337,
0.1022237166762352,
-0.06242403760552406,
-0.09183653444051743,
-0.06031433492898941,
0.07277797162532806,
0.04170217365026474,
0.07690152525901794,
-0.18801544606685638,
-0.010360936634242535,
0.03712084889411926,
0.034776851534843445,
0.01990940421819687,
-0.1700371354818344,
0.022405853495001793,
0.024547718465328217,
-0.09010526537895203,
-0.029490932822227478,
0.021017516031861305,
-0.000915192358661443,
0.07327334582805634,
0.01405572984367609,
-0.043268948793411255,
-0.03217453882098198,
-0.0639599859714508,
-0.1258397251367569,
0.15504111349582672,
-0.10524772852659225,
-0.14541810750961304,
-0.11778813600540161,
-0.03803315758705139,
-0.03681964799761772,
-0.02761542610824108,
0.06902025640010834,
-0.0998770222067833,
-0.06069783866405487,
-0.07933077216148376,
-0.007409440353512764,
-0.023556821048259735,
0.012896198779344559,
0.04034404084086418,
0.0164616871625185,
0.04823436960577965,
-0.11168022453784943,
-0.01273130252957344,
-0.0032868869602680206,
-0.019739562645554543,
-0.0001262138830497861,
0.051836997270584106,
0.0885128453373909,
0.16221940517425537,
0.06187743321061134,
0.06434422731399536,
-0.0183112695813179,
0.22671638429164886,
-0.1186777725815773,
-0.005683956202119589,
0.09668504446744919,
-0.0014750304399058223,
0.06416698545217514,
0.16017480194568634,
0.02808217890560627,
-0.089070625603199,
0.021537672728300095,
0.054895561188459396,
-0.0064534954726696014,
-0.2573924660682678,
-0.033279843628406525,
-0.07492037862539291,
-0.009236529469490051,
0.08243390172719955,
0.04291063919663429,
0.015187709592282772,
-0.001973414560779929,
-0.019489919766783714,
-0.01932273432612419,
0.05660063400864601,
0.06892571598291397,
0.09082219749689102,
0.03454267978668213,
0.0877571627497673,
-0.02005106396973133,
-0.0023967227898538113,
0.021267490461468697,
-0.0005501608829945326,
0.221963569521904,
0.018600618466734886,
0.19793418049812317,
0.07781822234392166,
0.130912184715271,
-0.020865241065621376,
0.03508155047893524,
-0.001819463330321014,
0.02269594930112362,
0.039616361260414124,
-0.06984441727399826,
-0.04877244308590889,
0.03900177404284477,
0.12908416986465454,
-0.0077471546828746796,
-0.0808451771736145,
0.020491860806941986,
0.05136559158563614,
0.30582988262176514,
0.08492086082696915,
-0.23476161062717438,
-0.04772809520363808,
0.030077993869781494,
-0.06641508638858795,
-0.020937731489539146,
0.004632832016795874,
0.10053495317697525,
-0.0859791710972786,
0.06718458980321884,
-0.04546013101935387,
0.08951745182275772,
-0.07195588946342468,
0.003718134481459856,
0.06192703917622566,
0.10356515645980835,
0.014771071262657642,
0.06134485453367233,
-0.25571611523628235,
0.21369321644306183,
0.0006822828436270356,
0.07583850622177124,
-0.06950541585683823,
0.05920112505555153,
0.021522607654333115,
-0.06435887515544891,
0.10399692505598068,
-0.0010348900686949492,
-0.09869177639484406,
-0.16172726452350616,
-0.10225389897823334,
0.004379215184599161,
0.1385062336921692,
-0.0733819454908371,
0.1356000155210495,
-0.03489372879266739,
-0.05772807076573372,
0.012251392938196659,
-0.01801796816289425,
-0.12841211259365082,
-0.10091040283441544,
0.07657015323638916,
0.0213785283267498,
0.07649459689855576,
-0.07677628099918365,
-0.06727567315101624,
-0.06029682978987694,
0.150204598903656,
-0.16063235700130463,
-0.02849350869655609,
-0.12842713296413422,
0.05255049467086792,
0.1563093662261963,
-0.07076762616634369,
0.024137718603014946,
0.013678773306310177,
0.1209784597158432,
0.030300868675112724,
0.002644067630171776,
0.08288148790597916,
-0.07939180731773376,
-0.19835743308067322,
-0.039714228361845016,
0.2028302401304245,
0.01989421620965004,
0.06311734765768051,
-0.01092672348022461,
0.011111735366284847,
0.006024322006851435,
-0.09344162791967392,
0.08457858115434647,
0.07305891811847687,
0.0008940822444856167,
0.07944481074810028,
-0.04962252452969551,
-0.04869869351387024,
-0.11415578424930573,
-0.04961089789867401,
0.12186464667320251,
0.26493608951568604,
-0.061792198568582535,
0.0472218282520771,
0.0335153192281723,
-0.06587634235620499,
-0.13079264760017395,
-0.021079840138554573,
0.107433021068573,
0.02623797580599785,
-0.016444914042949677,
-0.17160604894161224,
0.005618685390800238,
0.08168066293001175,
-0.011405007913708687,
0.10354862362146378,
-0.33888643980026245,
-0.1302444189786911,
0.05952787399291992,
0.05703403055667877,
-0.03360018879175186,
-0.17384904623031616,
-0.0967835858464241,
-0.02632996253669262,
-0.09190371632575989,
0.06548760831356049,
-0.011668442748486996,
0.12196420133113861,
0.014663656242191792,
0.016565779224038124,
0.026703299954533577,
-0.05597027763724327,
0.14748354256153107,
0.0626833513379097,
0.015509424731135368,
-0.01583269238471985,
0.007938966155052185,
0.027260281145572662,
-0.07141953706741333,
0.051683127880096436,
-0.07665284723043442,
0.014164955355226994,
-0.1471240073442459,
-0.02235633321106434,
-0.07397201657295227,
-0.004520919639617205,
-0.06641106307506561,
-0.0038098799996078014,
-0.021680476143956184,
0.04407921060919762,
0.1143515557050705,
0.008626162074506283,
0.06643046438694,
-0.05111522227525711,
0.08366350829601288,
0.13564081490039825,
0.10732517391443253,
0.02461095340549946,
-0.12417317181825638,
0.007249347399920225,
0.0060435812920331955,
0.014146272093057632,
-0.11122731119394302,
0.05301353707909584,
0.13185198605060577,
0.0410204641520977,
0.1708185076713562,
0.03871086612343788,
-0.10822271555662155,
-0.008264805190265179,
0.06618594378232956,
-0.06230040639638901,
-0.18096591532230377,
-0.003243084764108062,
0.006278121843934059,
-0.13722041249275208,
-0.016767723485827446,
0.1179906576871872,
-0.012015293352305889,
0.00315866875462234,
0.02080574817955494,
0.07201676070690155,
-0.036402758210897446,
0.22371241450309753,
0.013155573047697544,
0.11539291590452194,
-0.09122715145349503,
0.07587258517742157,
0.05056912451982498,
-0.09808919578790665,
0.04613565281033516,
0.11440271139144897,
-0.05688565969467163,
-0.031975846737623215,
-0.03696441277861595,
0.07020305097103119,
0.07350905984640121,
-0.0425480417907238,
-0.09993080049753189,
-0.12605459988117218,
0.09837900847196579,
0.040154825896024704,
0.02396841160953045,
0.041024573147296906,
-0.008307204581797123,
0.026212090626358986,
-0.0922752246260643,
0.1164373829960823,
0.0977555364370346,
0.043014708906412125,
-0.10202140361070633,
0.07291308790445328,
0.0015236481558531523,
0.004286996554583311,
0.01728832721710205,
-0.018586646765470505,
-0.10538490861654282,
0.03262083977460861,
-0.08216559886932373,
-0.013069969601929188,
-0.0766962394118309,
-0.010384557768702507,
0.03056548349559307,
-0.053062088787555695,
-0.04704749956727028,
0.024667127057909966,
-0.10837509483098984,
-0.08357121795415878,
-0.043843407183885574,
0.09193369001150131,
-0.12321582436561584,
-0.005702881142497063,
0.03565993905067444,
-0.151882141828537,
0.10995963215827942,
0.044414788484573364,
0.00003337908492540009,
0.0024882520083338022,
-0.07302457839250565,
-0.02070254273712635,
0.029432181268930435,
0.018575960770249367,
0.03311440348625183,
-0.2253183126449585,
-0.0018513601971790195,
-0.022018427029252052,
0.003384583629667759,
0.0019587029237300158,
0.031083399429917336,
-0.12468492239713669,
-0.0040748524479568005,
-0.04373334348201752,
-0.0501522496342659,
-0.05391103774309158,
0.04262048751115799,
0.0797996073961258,
0.01588364690542221,
0.171126589179039,
-0.05136353522539139,
0.08257881551980972,
-0.19870375096797943,
-0.003908186219632626,
0.0013190165627747774,
-0.03339647129178047,
-0.014926795847713947,
-0.0210398081690073,
0.10353634506464005,
-0.06062335520982742,
0.05348624661564827,
-0.04400581866502762,
0.048905737698078156,
0.027103064581751823,
-0.06931019574403763,
0.009147326461970806,
0.041028812527656555,
0.14356742799282074,
0.05759349465370178,
-0.025404412299394608,
0.057196732610464096,
-0.052279990166425705,
0.04535548388957977,
0.048180870711803436,
0.15601012110710144,
0.15419639647006989,
0.11638542264699936,
0.06944899260997772,
0.08208820968866348,
-0.13611289858818054,
-0.13333556056022644,
0.1614759862422943,
-0.08438489586114883,
0.1374778300523758,
-0.044594548642635345,
0.18190814554691315,
0.09474571794271469,
-0.1958865374326706,
0.08154483139514923,
-0.023800883442163467,
-0.08663514256477356,
-0.09714476764202118,
-0.10821353644132614,
-0.07570680975914001,
-0.1335122287273407,
0.012413562275469303,
-0.09748046100139618,
0.08180519938468933,
0.024611331522464752,
0.045846641063690186,
0.0343925878405571,
0.08635035902261734,
0.011847153306007385,
-0.014985991641879082,
0.11046814173460007,
-0.005865638609975576,
-0.022856157273054123,
0.0018700623186305165,
-0.06640055030584335,
0.054439399391412735,
-0.02142014354467392,
0.1047813668847084,
0.014433300122618675,
-0.0823265016078949,
0.05586028844118118,
-0.013365608640015125,
-0.10093715786933899,
0.029046908020973206,
-0.022733479738235474,
0.03085019811987877,
0.11214444786310196,
0.05432197079062462,
-0.019481111317873,
0.004511246457695961,
0.1710791140794754,
-0.07399190217256546,
-0.075550876557827,
-0.13847379386425018,
0.14475998282432556,
0.02847856469452381,
0.013401811011135578,
0.015069287270307541,
-0.0967179387807846,
-0.02720445580780506,
0.1616801768541336,
0.11403156816959381,
-0.001492599956691265,
-0.016850532963871956,
0.04005253687500954,
-0.0029637515544891357,
-0.02532634325325489,
0.06608837097883224,
0.10836546123027802,
0.10159019380807877,
-0.017133427783846855,
0.015543864108622074,
-0.021488571539521217,
-0.08308883011341095,
-0.03288422152400017,
0.07899884134531021,
0.008306274190545082,
-0.010606685653328896,
-0.009266517125070095,
0.12734469771385193,
-0.05494419485330582,
-0.15756092965602875,
0.019314981997013092,
-0.14633427560329437,
-0.17780858278274536,
-0.019616641104221344,
0.08095663040876389,
0.03753693029284477,
0.05837968364357948,
0.00505357189103961,
-0.04713042080402374,
0.15464025735855103,
-0.007978585548698902,
-0.03338483348488808,
-0.0919218584895134,
0.059339072555303574,
-0.07611321657896042,
0.16872262954711914,
-0.021062154322862625,
0.06384976208209991,
0.1455264687538147,
0.03371208533644676,
-0.11139782518148422,
0.05356220155954361,
0.09349381923675537,
-0.12542182207107544,
0.05708632990717888,
0.1889445185661316,
-0.032690227031707764,
0.12159360200166702,
0.04838244244456291,
-0.08407749980688095,
0.00688175018876791,
-0.02429826743900776,
-0.009168720804154873,
-0.08897974342107773,
-0.004925335757434368,
-0.06349620223045349,
0.11452262848615646,
0.19122864305973053,
-0.07314882427453995,
0.014164498075842857,
-0.04414685070514679,
0.014520361088216305,
0.0017055505886673927,
0.13035358488559723,
-0.046759143471717834,
-0.2669249475002289,
0.049021996557712555,
-0.002515127183869481,
0.033137623220682144,
-0.18407392501831055,
-0.08642317354679108,
0.041770074516534805,
-0.036281224340200424,
-0.06820403039455414,
0.1203530803322792,
0.05866033583879471,
0.03694521263241768,
-0.04832034558057785,
-0.18974937498569489,
-0.011523962952196598,
0.19523149728775024,
-0.1678726226091385,
-0.06027854233980179
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# XLS-R-300M - Hausa
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6094
- Wer: 0.5234
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 13
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_steps: 1000
- num_epochs: 100
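The `cosine_with_restarts` schedule above corresponds to the following helper in 🤗 Transformers; the dummy optimizer, parameters, and total step count here are placeholder assumptions used only to illustrate the schedule, not the author's training code.
```python
import torch
from transformers import get_cosine_with_hard_restarts_schedule_with_warmup

# Placeholder parameter/optimizer just to build the schedule shape.
params = [torch.nn.Parameter(torch.zeros(1))]
optimizer = torch.optim.Adam(params, lr=1e-4, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_cosine_with_hard_restarts_schedule_with_warmup(
    optimizer,
    num_warmup_steps=1000,     # lr_scheduler_warmup_steps above
    num_training_steps=6100,   # assumption: ~100 epochs, extrapolated from the step counts above
    num_cycles=1,
)
```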
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 2.9599 | 6.56 | 400 | 2.8650 | 1.0 |
| 2.7357 | 13.11 | 800 | 2.7377 | 0.9951 |
| 1.3012 | 19.67 | 1200 | 0.6686 | 0.7111 |
| 1.0454 | 26.23 | 1600 | 0.5686 | 0.6137 |
| 0.9069 | 32.79 | 2000 | 0.5576 | 0.5815 |
| 0.82 | 39.34 | 2400 | 0.5502 | 0.5591 |
| 0.7413 | 45.9 | 2800 | 0.5970 | 0.5586 |
| 0.6872 | 52.46 | 3200 | 0.5817 | 0.5428 |
| 0.634 | 59.02 | 3600 | 0.5636 | 0.5314 |
| 0.6022 | 65.57 | 4000 | 0.5780 | 0.5229 |
| 0.5705 | 72.13 | 4400 | 0.6036 | 0.5323 |
| 0.5408 | 78.69 | 4800 | 0.6119 | 0.5336 |
| 0.5225 | 85.25 | 5200 | 0.6105 | 0.5270 |
| 0.5265 | 91.8 | 5600 | 0.6034 | 0.5231 |
| 0.5154 | 98.36 | 6000 | 0.6094 | 0.5234 |
### Framework versions
- Transformers 4.16.1
- Pytorch 1.10.0+cu111
- Datasets 1.18.2
- Tokenizers 0.11.0
#### Evaluation Commands
1. To evaluate on `mozilla-foundation/common_voice_8_0` with split `test`
```bash
python eval.py --model_id anuragshas/wav2vec2-large-xls-r-300m-ha-cv8 --dataset mozilla-foundation/common_voice_8_0 --config ha --split test
```
### Inference With LM
```python
import torch
from datasets import load_dataset
from transformers import AutoModelForCTC, AutoProcessor
import torchaudio.functional as F
model_id = "anuragshas/wav2vec2-large-xls-r-300m-ha-cv8"
# Stream a single Common Voice 8 test sample and resample it from 48 kHz to the 16 kHz the model expects
sample_iter = iter(load_dataset("mozilla-foundation/common_voice_8_0", "ha", split="test", streaming=True, use_auth_token=True))
sample = next(sample_iter)
resampled_audio = F.resample(torch.tensor(sample["audio"]["array"]), 48_000, 16_000).numpy()
# Load the fine-tuned acoustic model and its processor (which bundles the n-gram LM decoder)
model = AutoModelForCTC.from_pretrained(model_id)
processor = AutoProcessor.from_pretrained(model_id)
input_values = processor(resampled_audio, return_tensors="pt").input_values
with torch.no_grad():
    logits = model(input_values).logits
# Beam-search decoding with the language model
transcription = processor.batch_decode(logits.numpy()).text
# => "kakin hade ya ke da kyautar"
```
### Eval results on Common Voice 8 "test" (WER):
| Without LM | With LM (run `./eval.py`) |
|---|---|
| 47.821 | 36.295 |
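The official numbers above come from `eval.py`; the metric itself can be approximated with `datasets.load_metric`. A minimal sketch follows, where the reference and prediction strings are placeholders rather than real Common Voice data.
```python
from datasets import load_metric

wer_metric = load_metric("wer")
# Placeholder strings; in practice these are the Common Voice references and the model outputs.
references = ["kakin hade ya ke da kyautar"]
predictions = ["kakin hade ya ke da kyauta"]
print(100 * wer_metric.compute(predictions=predictions, references=references))
```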
|
{"language": ["ha"], "license": "apache-2.0", "tags": ["generated_from_trainer", "robust-speech-event", "hf-asr-leaderboard"], "datasets": ["mozilla-foundation/common_voice_8_0"], "metrics": ["wer"], "model-index": [{"name": "XLS-R-300M - Hausa", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice 8", "type": "mozilla-foundation/common_voice_8_0", "args": "ha"}, "metrics": [{"type": "wer", "value": 36.295, "name": "Test WER"}, {"type": "cer", "value": 11.073, "name": "Test CER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-large-xls-r-300m-ha-cv8
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"robust-speech-event",
"hf-asr-leaderboard",
"ha",
"dataset:mozilla-foundation/common_voice_8_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"ha"
] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #ha #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
XLS-R-300M - Hausa
==================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the common\_voice dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6094
* Wer: 0.5234
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0001
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 13
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: cosine\_with\_restarts
* lr\_scheduler\_warmup\_steps: 1000
* num\_epochs: 100
### Training results
### Framework versions
* Transformers 4.16.1
* Pytorch 1.10.0+cu111
* Datasets 1.18.2
* Tokenizers 0.11.0
#### Evaluation Commands
1. To evaluate on 'mozilla-foundation/common\_voice\_8\_0' with split 'test'
### Inference With LM
### Eval results on Common Voice 8 "test" (WER):
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 13\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\\_with\\_restarts\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 100",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.1\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.2\n* Tokenizers 0.11.0",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'",
"### Inference With LM",
"### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #ha #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 13\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\\_with\\_restarts\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 100",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.1\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.2\n* Tokenizers 0.11.0",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'",
"### Inference With LM",
"### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
99,
152,
4,
33,
36,
8,
15
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #ha #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 13\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\\_with\\_restarts\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 100### Training results### Framework versions\n\n\n* Transformers 4.16.1\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.2\n* Tokenizers 0.11.0#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'### Inference With LM### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
-0.12004068493843079,
0.127579927444458,
-0.006354670040309429,
0.04779105633497238,
0.0824429839849472,
0.03261575475335121,
0.11854729056358337,
0.17347407341003418,
-0.015935774892568588,
0.14628592133522034,
0.063412144780159,
0.07127576321363449,
0.08749507367610931,
0.12883128225803375,
-0.034704774618148804,
-0.17544710636138916,
0.04206801950931549,
-0.0644168108701706,
-0.07511810958385468,
0.10965023189783096,
0.08678556233644485,
-0.09798213839530945,
0.035628095269203186,
-0.012778034433722496,
-0.05976369231939316,
-0.02990100346505642,
-0.03954054042696953,
-0.016726570203900337,
0.08039691299200058,
0.025990162044763565,
0.050038520246744156,
0.045053597539663315,
0.06125206500291824,
-0.27100905776023865,
0.005193373188376427,
0.06870857626199722,
0.03522774949669838,
0.05664888769388199,
0.08622466772794724,
-0.009913435205817223,
0.08233626186847687,
-0.07173563539981842,
0.03774794563651085,
0.05665835738182068,
-0.09554737061262131,
-0.1924450397491455,
-0.10280756652355194,
0.0419476144015789,
0.13446079194545746,
0.06731200963258743,
-0.04805015027523041,
0.07670649141073227,
-0.09849616140127182,
0.09288182854652405,
0.2327001988887787,
-0.2042478621006012,
-0.047842543572187424,
0.02562776580452919,
0.04451620206236839,
0.017126156017184258,
-0.09632434695959091,
-0.025577522814273834,
0.007732137106359005,
-0.0022211354225873947,
0.025527717545628548,
-0.01352293137460947,
0.04670917987823486,
-0.004683595150709152,
-0.14304006099700928,
-0.08590353280305862,
0.12273471057415009,
0.065815269947052,
-0.023367807269096375,
-0.1111779436469078,
-0.0329609252512455,
-0.21799765527248383,
-0.03659718111157417,
0.00157265760935843,
0.016643183305859566,
-0.04251083359122276,
-0.011017873883247375,
0.056167371571063995,
-0.06614147126674652,
-0.07730719447135925,
0.06124322861433029,
0.06364667415618896,
0.04043308272957802,
-0.038507040590047836,
0.018461817875504494,
0.10897443443536758,
0.03371986374258995,
-0.17773418128490448,
-0.05235082656145096,
0.04016070440411568,
-0.16218870878219604,
-0.023067539557814598,
-0.022952590137720108,
0.0209813192486763,
0.10322850197553635,
0.19953522086143494,
-0.0005774120218120515,
0.0883352980017662,
0.005663164891302586,
0.020467791706323624,
-0.05469755455851555,
0.17326685786247253,
-0.02962576411664486,
-0.10552143305540085,
-0.0354284942150116,
0.1474231481552124,
-0.013221118599176407,
-0.011903089471161366,
-0.04989984631538391,
0.033662348985672,
0.1342177838087082,
0.09270606935024261,
0.013197849504649639,
0.01903100125491619,
-0.08676102757453918,
-0.01386087667196989,
-0.007298050913959742,
-0.15347571671009064,
0.04625209420919418,
0.08174985647201538,
-0.04849011078476906,
-0.03044964373111725,
0.013738814741373062,
-0.014322448521852493,
-0.04631200060248375,
0.07167134433984756,
-0.05495671182870865,
-0.01014444138854742,
-0.06605958193540573,
-0.08467139303684235,
0.045566000044345856,
-0.03685890883207321,
-0.027344800531864166,
-0.05297438055276871,
-0.0744219571352005,
-0.10520602762699127,
0.03743140771985054,
-0.0670914575457573,
-0.06218548119068146,
-0.08114147186279297,
-0.10647809505462646,
0.04450363665819168,
-0.009474231861531734,
0.10938568413257599,
-0.0641317367553711,
0.07546161860227585,
0.03690144047141075,
0.03792053833603859,
0.14496643841266632,
0.05300569534301758,
-0.04348975419998169,
0.07105571776628494,
-0.1687818020582199,
0.10398616641759872,
-0.13468697667121887,
0.05170179158449173,
-0.1605219841003418,
-0.080641008913517,
0.024381158873438835,
-0.00970093160867691,
0.08116079866886139,
0.1521548628807068,
-0.20685647428035736,
-0.06930085271596909,
0.16520020365715027,
-0.06834571063518524,
-0.09372219443321228,
0.14145050942897797,
-0.010750574059784412,
-0.023186752572655678,
0.012234161607921124,
0.14204204082489014,
0.13675037026405334,
-0.11199832707643509,
-0.009704889729619026,
-0.05010209605097771,
0.1041860356926918,
0.08633233606815338,
0.0865337923169136,
-0.06816353648900986,
0.07101334631443024,
-0.007035098038613796,
-0.08767667412757874,
0.008073162287473679,
-0.0602981299161911,
-0.0840064138174057,
-0.01091829314827919,
-0.05163763836026192,
-0.017871597781777382,
0.03449837863445282,
-0.023695379495620728,
-0.07297895103693008,
-0.13944192230701447,
-0.08450198173522949,
0.10223190486431122,
-0.08825094252824783,
0.02491290494799614,
-0.09381256252527237,
0.10149466246366501,
0.01267291884869337,
0.019139550626277924,
-0.1443677693605423,
-0.050051383674144745,
0.05667903646826744,
-0.06633026152849197,
0.013271473348140717,
0.01936635747551918,
0.04786665737628937,
0.016896896064281464,
-0.002291498938575387,
-0.05087114870548248,
-0.031139634549617767,
-0.019057434052228928,
-0.057009752839803696,
-0.2399812638759613,
-0.06849993765354156,
-0.023780107498168945,
0.18737344443798065,
-0.1942296326160431,
0.010498457588255405,
0.0989556685090065,
0.11241348087787628,
0.00011285512300673872,
-0.0533483549952507,
0.01085018739104271,
0.03945976868271828,
-0.034270644187927246,
-0.06400967389345169,
0.0022454552818089724,
-0.007941575720906258,
-0.10287251323461533,
-0.005930513143539429,
-0.17024169862270355,
0.005306584760546684,
0.07238646596670151,
0.07455279678106308,
-0.10231742262840271,
-0.0375092513859272,
-0.05749589949846268,
-0.052006207406520844,
-0.027958944439888,
-0.03866202384233475,
0.1361931413412094,
0.055677179247140884,
0.07984524965286255,
-0.06955739110708237,
-0.06517833471298218,
0.016954822465777397,
0.014242241159081459,
0.02166142500936985,
0.15947498381137848,
0.05416553094983101,
-0.06069669872522354,
0.08058836311101913,
0.03554132580757141,
-0.07518880814313889,
0.07144317775964737,
-0.08618767559528351,
-0.09803878515958786,
-0.054118141531944275,
0.0811891034245491,
0.04729342460632324,
0.09800220280885696,
-0.17446312308311462,
-0.0005586215993389487,
0.043192800134420395,
0.014558639377355576,
0.0020886415150016546,
-0.15797951817512512,
0.016553891822695732,
0.03651292622089386,
-0.09246040135622025,
-0.0015098282601684332,
0.01859801448881626,
0.013103259727358818,
0.06737164407968521,
-0.01223921962082386,
-0.10021410137414932,
-0.04216742143034935,
-0.061878617852926254,
-0.10437603294849396,
0.17811422049999237,
-0.06937935948371887,
-0.14570461213588715,
-0.11464499682188034,
-0.006726545747369528,
-0.021595846861600876,
-0.014026110991835594,
0.051179639995098114,
-0.07025576382875443,
-0.0631575807929039,
-0.08076710999011993,
0.010479169897735119,
0.02576795034110546,
0.018765652552247047,
0.03324910253286362,
0.007332692388445139,
0.06052073836326599,
-0.11480504274368286,
-0.003368955571204424,
-0.000026177207473665476,
-0.029102887958288193,
0.011078888550400734,
0.04798800125718117,
0.0929882824420929,
0.18250569701194763,
0.05866404250264168,
0.044108085334300995,
-0.015249378979206085,
0.2113211750984192,
-0.1431444138288498,
0.002972709946334362,
0.08631843328475952,
-0.03838130831718445,
0.0563850998878479,
0.1760101318359375,
0.01747225411236286,
-0.08716967701911926,
0.008304592221975327,
0.05055378004908562,
-0.010275563225150108,
-0.23642387986183167,
-0.01153554581105709,
-0.07377767562866211,
0.011081553995609283,
0.08141165226697922,
0.03631589189171791,
0.021136676892638206,
0.013164938427507877,
-0.01622840389609337,
-0.046078652143478394,
0.05222482234239578,
0.04785199090838432,
0.07485895603895187,
0.04002133384346962,
0.11368998140096664,
-0.012858763337135315,
-0.015484499745070934,
0.006034614518284798,
0.010634058155119419,
0.2087852656841278,
-0.012728475034236908,
0.21784599125385284,
0.06897088885307312,
0.13831397891044617,
-0.02457893081009388,
0.047579552978277206,
0.003463695291429758,
0.01705443672835827,
0.033448390662670135,
-0.06550426036119461,
-0.04139120876789093,
0.03383029252290726,
0.11945447325706482,
0.0012209918349981308,
-0.06793854385614395,
0.05643920600414276,
0.052334435284137726,
0.2952350080013275,
0.09197792410850525,
-0.26100242137908936,
-0.06461415439844131,
0.010781307704746723,
-0.06973810493946075,
-0.015950551256537437,
0.0014730460243299603,
0.11038126796483994,
-0.10137668997049332,
0.059890858829021454,
-0.045400723814964294,
0.08683327585458755,
-0.08105632662773132,
0.012801449745893478,
0.07137200236320496,
0.06842883676290512,
0.027306780219078064,
0.05293857678771019,
-0.22399915754795074,
0.23542486131191254,
-0.01339942216873169,
0.040900103747844696,
-0.05369483307003975,
0.06965256482362747,
0.022448286414146423,
-0.03172032907605171,
0.12119927257299423,
-0.013952379114925861,
-0.09475868195295334,
-0.1614617258310318,
-0.10409862548112869,
0.0033047778997570276,
0.12199564278125763,
-0.08956068009138107,
0.13773053884506226,
-0.041273266077041626,
-0.05310368165373802,
0.02603224478662014,
-0.017957443371415138,
-0.09197244793176651,
-0.08806540071964264,
0.08560694009065628,
-0.02065492793917656,
0.04529218748211861,
-0.06651724129915237,
-0.07756195217370987,
-0.10364966839551926,
0.17506279051303864,
-0.1453312188386917,
-0.025357889011502266,
-0.12074390053749084,
0.03824339061975479,
0.19129523634910583,
-0.0838194340467453,
0.019068999215960503,
0.003948403522372246,
0.11551525443792343,
0.022146793082356453,
0.0036065811291337013,
0.08862433582544327,
-0.07465191185474396,
-0.23395755887031555,
-0.03629434481263161,
0.1922035813331604,
0.01776249147951603,
0.06577538698911667,
-0.0020690381061285734,
0.022723058238625526,
-0.0029480732046067715,
-0.088632732629776,
0.07074696570634842,
0.033458925783634186,
-0.01376238465309143,
0.058740466833114624,
-0.018964575603604317,
-0.02380876988172531,
-0.10052580386400223,
-0.02497490495443344,
0.09490203857421875,
0.2902853190898895,
-0.06480196863412857,
0.04439388960599899,
0.004566694609820843,
-0.07162337005138397,
-0.1485324651002884,
-0.019471300765872,
0.10746163874864578,
0.036495503038167953,
-0.009850939735770226,
-0.1396566778421402,
0.02662714757025242,
0.06252643465995789,
-0.023082571104168892,
0.10883031040430069,
-0.3343459367752075,
-0.1388724446296692,
0.09746789932250977,
0.019519174471497536,
-0.06174352020025253,
-0.179861381649971,
-0.08645661920309067,
-0.003124586772173643,
-0.10100234299898148,
0.009812925010919571,
-0.005012877751141787,
0.12313112616539001,
0.006087099201977253,
0.02866428904235363,
0.029818594455718994,
-0.05832066386938095,
0.15235254168510437,
0.06524307280778885,
0.033221445977687836,
-0.022856881842017174,
0.005915079731494188,
0.025992121547460556,
-0.07729440927505493,
0.04555092751979828,
-0.0778224989771843,
0.02132796309888363,
-0.12506890296936035,
-0.008395391516387463,
-0.07485022395849228,
0.02038678713142872,
-0.06066644564270973,
0.0070825559087097645,
-0.024984296411275864,
0.046385470777750015,
0.09872789680957794,
0.01436417642980814,
0.10126268863677979,
-0.06220337003469467,
0.0995316281914711,
0.16562877595424652,
0.10427071154117584,
0.013958817347884178,
-0.1562698483467102,
0.02854043059051037,
0.023608703166246414,
0.018739843741059303,
-0.08221610635519028,
0.05901119485497475,
0.14007647335529327,
0.03987642005085945,
0.1498778909444809,
0.03790666162967682,
-0.1072009950876236,
-0.017373619601130486,
0.07046150416135788,
-0.07965065538883209,
-0.15539346635341644,
0.013257413171231747,
-0.016502046957612038,
-0.16134151816368103,
-0.006764765828847885,
0.1331748515367508,
-0.007119054906070232,
0.004801748786121607,
0.026699505746364594,
0.06985757499933243,
-0.02431483566761017,
0.23662614822387695,
0.016432588919997215,
0.12265058606863022,
-0.09922029823064804,
0.060744039714336395,
0.05453689768910408,
-0.08963888883590698,
0.03725240007042885,
0.11360469460487366,
-0.05983657017350197,
-0.040611982345581055,
0.01912447065114975,
0.09634239226579666,
0.04354692995548248,
-0.018866121768951416,
-0.1370396614074707,
-0.12231618165969849,
0.07691881060600281,
0.09970103949308395,
0.04108040779829025,
0.05941537022590637,
0.0044882153160870075,
0.02006012387573719,
-0.07608890533447266,
0.1282011866569519,
0.12323471903800964,
0.06508267670869827,
-0.10126356035470963,
0.08692899346351624,
-0.023691793903708458,
0.0008217552094720304,
0.005648319609463215,
0.010993283241987228,
-0.11475532501935959,
0.026207396760582924,
-0.07935106754302979,
0.004304906353354454,
-0.05422138050198555,
-0.01020255871117115,
0.03763996809720993,
-0.05580225586891174,
-0.03424089401960373,
0.024319401010870934,
-0.11391758918762207,
-0.0653153732419014,
-0.03658382222056389,
0.08153203129768372,
-0.13358661532402039,
-0.007939525879919529,
0.05225207284092903,
-0.1465350240468979,
0.11950782686471939,
0.02026231214404106,
-0.002723668236285448,
0.001445624977350235,
-0.09954145550727844,
-0.007519010920077562,
0.038820233196020126,
0.02181430719792843,
0.033815450966358185,
-0.20549306273460388,
0.002874744823202491,
-0.03458692133426666,
0.007728638593107462,
0.00725879380479455,
0.012636453844606876,
-0.12351009249687195,
0.003428500844165683,
-0.03553779050707817,
-0.04390811547636986,
-0.04946490377187729,
0.038880929350852966,
0.06259780377149582,
0.022999456152319908,
0.164214625954628,
-0.05652659013867378,
0.09428759664297104,
-0.2141079604625702,
-0.010937804356217384,
0.01937476173043251,
-0.038795340806245804,
-0.046174056828022,
-0.030368365347385406,
0.11911080032587051,
-0.07314395159482956,
0.056324686855077744,
-0.05022815614938736,
0.05863223597407341,
0.020236240699887276,
-0.05534731224179268,
0.04649795964360237,
0.05796927586197853,
0.09923218935728073,
0.04934922233223915,
-0.025244897231459618,
0.05496346205472946,
-0.05258963629603386,
0.038059260696172714,
-0.01553731132298708,
0.19032476842403412,
0.14000199735164642,
0.08727896213531494,
0.07672055065631866,
0.08612073212862015,
-0.14493969082832336,
-0.09998048841953278,
0.14678771793842316,
-0.09891676902770996,
0.13384705781936646,
-0.039724092930555344,
0.18704384565353394,
0.08548367023468018,
-0.18546217679977417,
0.09185329079627991,
-0.05897989124059677,
-0.08696860820055008,
-0.10285967588424683,
-0.11232071369886398,
-0.08332475274801254,
-0.1511865258216858,
0.0158960223197937,
-0.0918484553694725,
0.06380055099725723,
0.04316495731472969,
0.03981969133019447,
0.024380378425121307,
0.10278861969709396,
0.02357093244791031,
-0.0008452306501567364,
0.09450533986091614,
0.015782328322529793,
-0.0215824693441391,
-0.0036976069677621126,
-0.06921423226594925,
0.05793708190321922,
0.006893510930240154,
0.09661825746297836,
-0.014591973274946213,
-0.057575974613428116,
0.06169939786195755,
-0.013229580596089363,
-0.10482027381658554,
0.02664289064705372,
-0.023270640522241592,
0.019727610051631927,
0.08590694516897202,
0.03643962740898132,
0.019933870062232018,
-0.00734289363026619,
0.16441716253757477,
-0.07887900620698929,
-0.09180958569049835,
-0.13850019872188568,
0.15698085725307465,
0.005971542559564114,
0.012333442457020283,
0.0021140081807971,
-0.08539178222417831,
-0.012455124408006668,
0.16474874317646027,
0.13997331261634827,
-0.016605239361524582,
-0.00822885986417532,
0.04139260575175285,
0.0022248616442084312,
-0.03127116337418556,
0.03484998643398285,
0.1099715530872345,
0.06709211319684982,
-0.0250791497528553,
0.0010837034787982702,
0.0055246502161026,
-0.08378542959690094,
-0.023984890431165695,
0.05509481951594353,
0.010172564536333084,
0.01734621450304985,
-0.018095627427101135,
0.11154399812221527,
-0.082877978682518,
-0.1758696436882019,
0.028340045362710953,
-0.17357197403907776,
-0.17609742283821106,
-0.0349031537771225,
0.0949958860874176,
0.020871983841061592,
0.05127359926700592,
-0.004938167054206133,
-0.05690070986747742,
0.1543750762939453,
-0.011463074944913387,
-0.03026743046939373,
-0.10134562104940414,
0.05741393193602562,
-0.1114557608962059,
0.1557389199733734,
-0.0236042533069849,
0.0675664097070694,
0.13138025999069214,
0.030170965939760208,
-0.09868661314249039,
0.029591260477900505,
0.11299940943717957,
-0.13613058626651764,
0.05482111871242523,
0.1865951120853424,
-0.016738779842853546,
0.13318464159965515,
0.060312431305646896,
-0.08156916499137878,
0.0003317768860142678,
-0.06353449821472168,
0.005792471580207348,
-0.09208079427480698,
-0.013667270541191101,
-0.05923634395003319,
0.11974276602268219,
0.20482227206230164,
-0.06688714027404785,
0.007442709524184465,
-0.04335925728082657,
0.01624840684235096,
0.008719385601580143,
0.11495284736156464,
-0.05136384442448616,
-0.2597285509109497,
0.05952664837241173,
-0.01219850592315197,
0.041667740792036057,
-0.15822291374206543,
-0.09737574309110641,
0.05519694462418556,
-0.03719150647521019,
-0.06631486862897873,
0.12895943224430084,
0.047526028007268906,
0.045363493263721466,
-0.042731449007987976,
-0.1585467904806137,
-0.0043626753613352776,
0.17174483835697174,
-0.17722833156585693,
-0.06507731974124908
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-large-xls-r-300m-hi
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 2.4156
- Wer: 0.7181
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
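A toy illustration of how `gradient_accumulation_steps: 2` turns the per-device batch size of 16 into the effective `total_train_batch_size` of 32 is sketched below; the model and data are dummies, and this is not the Trainer's internal loop.
```python
import torch
from torch import nn

model = nn.Linear(10, 2)                      # dummy stand-in for the acoustic model
optimizer = torch.optim.Adam(model.parameters(), lr=3e-4, betas=(0.9, 0.999), eps=1e-8)
accumulation_steps = 2                        # gradient_accumulation_steps above

batches = [torch.randn(16, 10) for _ in range(4)]   # per-device train batch size 16
for step, batch in enumerate(batches):
    loss = model(batch).pow(2).mean() / accumulation_steps  # scale loss per micro-batch
    loss.backward()
    if (step + 1) % accumulation_steps == 0:
        optimizer.step()                      # one update per 2 batches => 32 samples
        optimizer.zero_grad()
```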
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 5.7703 | 2.72 | 400 | 2.2274 | 0.9259 |
| 0.6515 | 5.44 | 800 | 1.5812 | 0.7581 |
| 0.339 | 8.16 | 1200 | 2.0590 | 0.7825 |
| 0.2262 | 10.88 | 1600 | 2.0324 | 0.7603 |
| 0.1665 | 13.6 | 2000 | 2.1396 | 0.7481 |
| 0.1311 | 16.33 | 2400 | 2.2090 | 0.7379 |
| 0.1079 | 19.05 | 2800 | 2.3907 | 0.7612 |
| 0.0927 | 21.77 | 3200 | 2.5294 | 0.7478 |
| 0.0748 | 24.49 | 3600 | 2.5024 | 0.7452 |
| 0.0644 | 27.21 | 4000 | 2.4715 | 0.7307 |
| 0.0569 | 29.93 | 4400 | 2.4156 | 0.7181 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
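This card does not include a usage snippet; a minimal greedy-decoding sketch in the style of the author's other XLS-R cards is given below. The audio path is a placeholder, and loading the processor with `AutoProcessor` is an assumption about the repository contents.
```python
import torch
import torchaudio
from transformers import AutoModelForCTC, AutoProcessor

model_id = "anuragshas/wav2vec2-large-xls-r-300m-hi"
model = AutoModelForCTC.from_pretrained(model_id)
processor = AutoProcessor.from_pretrained(model_id)

# "clip.wav" is a placeholder path to any Hindi speech recording.
waveform, sr = torchaudio.load("clip.wav")
speech = torchaudio.functional.resample(waveform[0], sr, 16_000).numpy()

input_values = processor(speech, sampling_rate=16_000, return_tensors="pt").input_values
with torch.no_grad():
    logits = model(input_values).logits
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids))
```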
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["common_voice"], "model-index": [{"name": "wav2vec2-large-xls-r-300m-hi", "results": []}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-large-xls-r-300m-hi
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us
|
wav2vec2-large-xls-r-300m-hi
============================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the common\_voice dataset.
It achieves the following results on the evaluation set:
* Loss: 2.4156
* Wer: 0.7181
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0003
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* num\_epochs: 30
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.17.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 30",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 30",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
65,
143,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 30### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
-0.13605695962905884,
0.10239491611719131,
-0.0026786846574395895,
0.07665535062551498,
0.13536588847637177,
0.01284562423825264,
0.12584595382213593,
0.13119475543498993,
-0.10427722334861755,
0.07636521011590958,
0.10076141357421875,
0.10336264967918396,
0.04141414910554886,
0.09404017776250839,
-0.02573116309940815,
-0.2943366765975952,
-0.01087476871907711,
0.03317609056830406,
-0.11779751628637314,
0.1315605342388153,
0.088501937687397,
-0.12749990820884705,
0.04830952361226082,
0.03468766808509827,
-0.1711147278547287,
-0.005857060197740793,
-0.00029149831971153617,
-0.08969627320766449,
0.1418512910604477,
0.013135407119989395,
0.09932567924261093,
0.028485571965575218,
0.09925160557031631,
-0.18461495637893677,
0.005597007926553488,
0.052762024104595184,
0.0385911725461483,
0.10189276933670044,
0.0831102728843689,
-0.00703202560544014,
0.1165509968996048,
-0.06437796354293823,
0.0649537742137909,
0.038342852145433426,
-0.10067152231931686,
-0.2876797616481781,
-0.08016005903482437,
0.07612010091543198,
0.08476627618074417,
0.10822449624538422,
-0.01453719288110733,
0.10887031257152557,
-0.07690957933664322,
0.09702697396278381,
0.27027884125709534,
-0.28371185064315796,
-0.06743940711021423,
-0.020821934565901756,
0.03727772459387779,
0.02064397931098938,
-0.12196439504623413,
-0.020092889666557312,
0.033480480313301086,
0.04205891862511635,
0.11735992133617401,
-0.0016312990337610245,
-0.0488961860537529,
0.008680208586156368,
-0.14104284346103668,
-0.04406943917274475,
0.12627021968364716,
0.03981347382068634,
-0.036580804735422134,
-0.08232709765434265,
-0.05045979470014572,
-0.22561658918857574,
-0.03593229502439499,
0.0026502106338739395,
0.027682607993483543,
-0.07338247448205948,
-0.12959297001361847,
0.008916248567402363,
-0.08871240168809891,
-0.0953882560133934,
-0.010684018954634666,
0.18795771896839142,
0.046802207827568054,
0.004114482551813126,
-0.022879377007484436,
0.1201908141374588,
0.04864693433046341,
-0.15977266430854797,
0.025640975683927536,
0.05347494035959244,
-0.04830186441540718,
0.0018738718936219811,
-0.0462639182806015,
-0.027164828032255173,
0.0037481561303138733,
0.13622085750102997,
-0.06060492992401123,
0.03317798674106598,
0.026759525761008263,
0.030597053468227386,
-0.09311944991350174,
0.21115632355213165,
-0.08813326060771942,
-0.03573340177536011,
-0.01865321211516857,
0.10638561844825745,
0.02902466244995594,
-0.020254619419574738,
-0.09212665259838104,
0.007712156046181917,
0.11315334588289261,
0.037503570318222046,
-0.01648613065481186,
0.031883303076028824,
-0.03626000136137009,
-0.02785300277173519,
0.01824166253209114,
-0.10643081367015839,
0.029713261872529984,
0.026212144643068314,
-0.09288428723812103,
0.01767084002494812,
0.010093043558299541,
0.005399216897785664,
-0.025264309719204903,
0.14901107549667358,
-0.0853915736079216,
0.006065030116587877,
-0.07201535999774933,
-0.09940149635076523,
0.01896924152970314,
-0.07757503539323807,
-0.0002452850167173892,
-0.07103274017572403,
-0.1078566238284111,
-0.021552320569753647,
0.02568138763308525,
-0.04233047366142273,
-0.07735670357942581,
-0.04452931880950928,
-0.11287009716033936,
0.03866640850901604,
-0.018717942759394646,
0.1510591059923172,
-0.04757915064692497,
0.13137218356132507,
0.05919174477458,
0.058110058307647705,
0.013014313764870167,
0.06146659702062607,
-0.0696670189499855,
0.01622924767434597,
-0.15677975118160248,
0.048756908625364304,
-0.06425507366657257,
0.028935139998793602,
-0.10245729982852936,
-0.1340140551328659,
0.01630403846502304,
-0.012364747002720833,
0.09591358155012131,
0.10277808457612991,
-0.16964705288410187,
-0.11669649928808212,
0.15527918934822083,
-0.08154106140136719,
-0.0955335721373558,
0.13270890712738037,
-0.010186264291405678,
-0.03236167132854462,
0.042619768530130386,
0.1630445271730423,
0.05759001523256302,
-0.11005822569131851,
-0.023372618481516838,
-0.036181531846523285,
0.09726771712303162,
-0.013848541304469109,
0.09441940486431122,
-0.030238093808293343,
0.05816261097788811,
0.013382968492805958,
-0.04543089121580124,
0.0488956943154335,
-0.10686193406581879,
-0.09031737595796585,
-0.03667615354061127,
-0.09391535073518753,
0.05578818544745445,
0.06917215138673782,
0.04880334436893463,
-0.08591397851705551,
-0.12662190198898315,
0.03653203696012497,
0.115383081138134,
-0.0794975608587265,
0.03593605384230614,
-0.0815940573811531,
0.08300592005252838,
-0.03776697441935539,
-0.02202998287975788,
-0.19455276429653168,
0.00671153049916029,
0.024012833833694458,
-0.03789603337645531,
0.026630457490682602,
-0.027886223047971725,
0.07311876118183136,
0.06900624185800552,
-0.056135665625333786,
-0.06722376495599747,
-0.07553709298372269,
-0.017366299405694008,
-0.08455687016248703,
-0.2379463016986847,
-0.0847070962190628,
-0.009374934248626232,
0.14352145791053772,
-0.16796065866947174,
0.018478794023394585,
0.019131092354655266,
0.12436765432357788,
0.029134348034858704,
-0.03633487969636917,
-0.015695547685027122,
0.0926998108625412,
-0.025867905467748642,
-0.051067862659692764,
0.03278595581650734,
0.0028608550783246756,
-0.10104616731405258,
-0.016984669491648674,
-0.12084122747182846,
0.14290380477905273,
0.13687361776828766,
-0.030475623905658722,
-0.07506000250577927,
0.022209271788597107,
-0.08383188396692276,
-0.05003196746110916,
-0.03464868664741516,
-0.0010491431457921863,
0.17177334427833557,
0.03367258608341217,
0.13382643461227417,
-0.07762616127729416,
-0.05774146318435669,
0.042900074273347855,
0.003686982672661543,
0.006264188326895237,
0.11820534616708755,
0.08120889216661453,
-0.011120847426354885,
0.12021585553884506,
0.08222392201423645,
-0.10789736360311508,
0.14087660610675812,
-0.06650780141353607,
-0.09974949806928635,
-0.033395785838365555,
-0.021372579038143158,
0.02159574069082737,
0.1439150869846344,
-0.14272022247314453,
-0.022787300869822502,
0.0310828760266304,
-0.006790405139327049,
0.016306979581713676,
-0.22695539891719818,
-0.009444250725209713,
0.02485269494354725,
-0.06021800637245178,
-0.04196568951010704,
-0.00186703831423074,
0.015460791997611523,
0.10816733539104462,
-0.0008528516627848148,
-0.07934156060218811,
-0.004265303257852793,
-0.004062593448907137,
-0.06338826566934586,
0.19014446437358856,
-0.0747535452246666,
-0.15594618022441864,
-0.13965684175491333,
-0.01190739031881094,
-0.05826067924499512,
-0.008878281340003014,
0.05041502043604851,
-0.10103810578584671,
-0.022000430151820183,
-0.03779440000653267,
0.05118609964847565,
-0.02361445687711239,
0.05160941556096077,
0.013272619806230068,
0.004320778418332338,
0.07136386632919312,
-0.11994165182113647,
0.018109850585460663,
-0.053521495312452316,
-0.04149617254734039,
0.010982178151607513,
0.08640585094690323,
0.11176277697086334,
0.16828544437885284,
0.009908533655107021,
0.017331739887595177,
-0.029624048620462418,
0.1756455898284912,
-0.10561512410640717,
-0.035848069936037064,
0.1435987800359726,
-0.002020129468291998,
0.044633544981479645,
0.11502636224031448,
0.07584954053163528,
-0.06538429111242294,
-0.015751956030726433,
0.03309250250458717,
-0.02398618310689926,
-0.2349790334701538,
-0.044932641088962555,
-0.04570277780294418,
0.00780847342684865,
0.09524356573820114,
0.027281807735562325,
0.033361662179231644,
0.029660223051905632,
-0.008008616976439953,
0.03307690843939781,
-0.03187357634305954,
0.058467455208301544,
0.09849710017442703,
0.04382983222603798,
0.13430574536323547,
-0.020866621285676956,
-0.05998693034052849,
0.019946372136473656,
-0.02027721330523491,
0.2091505378484726,
-0.028078503906726837,
0.1571962982416153,
0.043028853833675385,
0.17954356968402863,
0.012727362103760242,
0.0886034294962883,
0.008299318142235279,
-0.022040992975234985,
0.02878522500395775,
-0.06105665862560272,
-0.029751580208539963,
0.008446482941508293,
0.05329538881778717,
0.08471213281154633,
-0.12752947211265564,
-0.0030439049005508423,
0.03241065889596939,
0.3414963185787201,
0.055799636989831924,
-0.31358101963996887,
-0.11776673793792725,
-0.026327380910515785,
-0.053150445222854614,
-0.026814624667167664,
0.02860545739531517,
0.11436879634857178,
-0.09465743601322174,
0.044661521911621094,
-0.07267492264509201,
0.07576300948858261,
-0.0628131702542305,
0.026679694652557373,
0.09269652515649796,
0.09340646117925644,
0.006218818947672844,
0.052281491458415985,
-0.25643134117126465,
0.29207843542099,
0.0033088894560933113,
0.08077815920114517,
-0.05808592215180397,
0.022377852350473404,
0.016044409945607185,
-0.011675366200506687,
0.06285165995359421,
-0.02293909527361393,
-0.006421213503926992,
-0.1925460398197174,
-0.09235963225364685,
0.011577589437365532,
0.1316409409046173,
-0.06669263541698456,
0.11743135005235672,
-0.025559114292263985,
-0.02985967881977558,
0.05116063356399536,
-0.07720690965652466,
-0.05978880822658539,
-0.08781441301107407,
0.02568228356540203,
0.0421772375702858,
0.0270999725908041,
-0.08775157481431961,
-0.13563618063926697,
-0.0792778953909874,
0.13235613703727722,
-0.10320863872766495,
-0.04022596776485443,
-0.12141509354114532,
0.08929135650396347,
0.16736827790737152,
-0.0757690966129303,
0.04693593829870224,
0.01683514378964901,
0.11542639881372452,
0.017229072749614716,
-0.03717080503702164,
0.09659630060195923,
-0.07881394028663635,
-0.23221230506896973,
-0.04584871977567673,
0.185688778758049,
0.021228820085525513,
0.07424402981996536,
-0.03619571775197983,
0.03380158543586731,
-0.0270957313477993,
-0.07575058192014694,
0.052303142845630646,
-0.0034636708442121744,
0.026715703308582306,
0.028200004249811172,
-0.01913668029010296,
-0.017130373045802116,
-0.07975330948829651,
-0.033578600734472275,
0.1512901484966278,
0.25874629616737366,
-0.0952153429389,
0.032731205224990845,
0.0679997131228447,
-0.030681682750582695,
-0.16199782490730286,
0.006863605231046677,
0.11347652226686478,
0.02592809870839119,
-0.0015694403555244207,
-0.19448810815811157,
0.08129028230905533,
0.07780350744724274,
-0.02994876354932785,
0.09849273413419724,
-0.3295871317386627,
-0.14263220131397247,
0.10845334082841873,
0.09477347880601883,
0.009885520674288273,
-0.158455491065979,
-0.053199950605630875,
-0.0038517978973686695,
-0.1055208221077919,
0.08746841549873352,
-0.03862294554710388,
0.12660135328769684,
-0.02406269684433937,
0.08263763040304184,
0.019755393266677856,
-0.0601358637213707,
0.11742973327636719,
0.007381190080195665,
0.05372411012649536,
-0.00370333855971694,
-0.019094226881861687,
0.05406370013952255,
-0.02974538318812847,
0.016605325043201447,
-0.056601230055093765,
0.037317074835300446,
-0.07737818360328674,
-0.018708940595388412,
-0.1164223700761795,
0.03726210072636604,
-0.0458039864897728,
-0.05375391244888306,
-0.011999668553471565,
0.016728827729821205,
0.02111993357539177,
-0.015904299914836884,
0.13218605518341064,
0.007011016830801964,
0.16731998324394226,
0.11412090808153152,
0.0794367790222168,
-0.02311405912041664,
-0.11873766034841537,
-0.015170780010521412,
-0.024151509627699852,
0.07039958238601685,
-0.11019694060087204,
0.013704436831176281,
0.14084243774414062,
0.08382617682218552,
0.10954851657152176,
0.07298646122217178,
-0.07550136744976044,
0.011294559575617313,
0.06472120434045792,
-0.1436174511909485,
-0.10123240202665329,
-0.024090969935059547,
-0.011345910839736462,
-0.13010868430137634,
0.06761287152767181,
0.10799771547317505,
-0.06579140573740005,
-0.020860105752944946,
0.011101609095931053,
0.004985006991773844,
-0.04137556999921799,
0.2396412342786789,
0.051925573498010635,
0.08281935751438141,
-0.12191285192966461,
0.0721748024225235,
0.04785218834877014,
-0.1399976909160614,
0.019872061908245087,
0.07641155272722244,
-0.061212215572595596,
-0.007730850018560886,
0.012105618603527546,
0.07608634978532791,
-0.0423651747405529,
-0.05315939709544182,
-0.15307676792144775,
-0.13742302358150482,
0.08075080811977386,
0.13859252631664276,
0.05966074392199516,
0.03169307857751846,
-0.0614265538752079,
0.04335540905594826,
-0.1408320665359497,
0.11314355581998825,
0.07038422673940659,
0.07435433566570282,
-0.15546034276485443,
0.17454560101032257,
0.02206384763121605,
0.03550809249281883,
-0.005524214822798967,
0.013311530463397503,
-0.09292453527450562,
0.021093279123306274,
-0.09815220534801483,
-0.04477229341864586,
-0.030217422172427177,
-0.0068687861785292625,
-0.003672074992209673,
-0.06543091684579849,
-0.05886315926909447,
0.033280886709690094,
-0.11352900415658951,
-0.041427504271268845,
0.007898147217929363,
0.02955017425119877,
-0.13708756864070892,
-0.0010219392133876681,
0.03753900155425072,
-0.10813798755407333,
0.1011262983083725,
0.08713695406913757,
0.02545839175581932,
0.06037246435880661,
-0.06919526308774948,
-0.017313463613390923,
0.04890661686658859,
0.0010162513935938478,
0.05619041994214058,
-0.12533023953437805,
-0.008779949508607388,
-0.025176584720611572,
0.042783863842487335,
0.004459843505173922,
0.060618557035923004,
-0.14334094524383545,
0.00013456273882184178,
-0.013045188039541245,
-0.04359987750649452,
-0.07026823610067368,
0.03407030552625656,
0.09066317230463028,
0.030182551592588425,
0.17324362695217133,
-0.0798678770661354,
0.05254368856549263,
-0.22363609075546265,
0.010101991705596447,
-0.04550158232450485,
-0.09135206788778305,
-0.11034657806158066,
-0.014724234119057655,
0.0896105095744133,
-0.06405670940876007,
0.08436229079961777,
-0.028976716101169586,
0.08910979330539703,
0.031693004071712494,
-0.0396939255297184,
-0.015010488219559193,
0.04197276383638382,
0.20037996768951416,
0.04464622959494591,
-0.028769630938768387,
0.071703240275383,
0.01702709123492241,
0.07456768304109573,
0.13929717242717743,
0.1674216091632843,
0.12318792194128036,
0.04157830402255058,
0.08602017164230347,
0.09144621342420578,
-0.09219691157341003,
-0.1906067281961441,
0.056072164326906204,
-0.054029546678066254,
0.12087639421224594,
-0.01713385246694088,
0.2322724461555481,
0.10108864307403564,
-0.1662161946296692,
0.057296715676784515,
-0.03369240090250969,
-0.07835713028907776,
-0.09957101941108704,
-0.023621203377842903,
-0.05993763357400894,
-0.15781563520431519,
0.014414597302675247,
-0.11725849658250809,
0.039895400404930115,
0.07333557307720184,
0.02371862903237343,
0.010996864177286625,
0.150360107421875,
0.04566650465130806,
0.01370087917894125,
0.0933636948466301,
0.03748143091797829,
-0.033684391528367996,
-0.059608470648527145,
-0.07022830098867416,
0.01947004348039627,
-0.016836170107126236,
0.04998663440346718,
-0.058300167322158813,
-0.12291464954614639,
0.05879542604088783,
0.0062131634913384914,
-0.10777003318071365,
0.03653734177350998,
-0.01663435809314251,
0.09166327118873596,
0.06650987267494202,
0.013196207582950592,
0.0051308926194906235,
-0.027603115886449814,
0.2541216015815735,
-0.11651084572076797,
-0.07769760489463806,
-0.11394571512937546,
0.2638086676597595,
0.01592622697353363,
-0.035711925476789474,
0.03794771432876587,
-0.08524362742900848,
-0.03986123949289322,
0.19177056849002838,
0.18491128087043762,
-0.02174147218465805,
-0.01778492145240307,
0.026635807007551193,
-0.01436215452849865,
-0.06273052841424942,
0.0702790766954422,
0.14130397140979767,
0.1250072419643402,
-0.07097984105348587,
-0.022574443370103836,
-0.058193519711494446,
-0.04283289611339569,
-0.019293950870633125,
0.08770202100276947,
0.015490634366869926,
-0.025038208812475204,
-0.03984344005584717,
0.07554324716329575,
-0.06518705934286118,
-0.15810750424861908,
0.04514399170875549,
-0.22970212996006012,
-0.18708986043930054,
-0.01705465279519558,
0.10384690016508102,
0.02804560400545597,
0.06913267821073532,
0.007583111524581909,
-0.01782197132706642,
0.09874231368303299,
-0.0006384752341546118,
-0.07236038893461227,
-0.09507904201745987,
0.0927719920873642,
-0.09742465615272522,
0.17850853502750397,
-0.05312737077474594,
0.07208110392093658,
0.11523633450269699,
0.08075427263975143,
-0.08665690571069717,
0.03420770913362503,
0.07142234593629837,
-0.1545378565788269,
0.027502184733748436,
0.19687232375144958,
-0.02228252962231636,
0.10155563801527023,
0.023890217766165733,
-0.12942802906036377,
0.008010745979845524,
-0.07429429143667221,
-0.03728441148996353,
-0.05541176348924637,
-0.026877230033278465,
-0.03281424567103386,
0.12499657273292542,
0.20998157560825348,
-0.05702705308794975,
-0.012711060233414173,
-0.05944878235459328,
0.004578939639031887,
0.0722796618938446,
0.07509079575538635,
-0.039562009274959564,
-0.28285497426986694,
0.02225811779499054,
-0.00678151985630393,
-0.007798443082720041,
-0.2505411207675934,
-0.08194083720445633,
0.05097193643450737,
-0.07228125631809235,
-0.08751131594181061,
0.07262127846479416,
0.05459529533982277,
0.049800317734479904,
-0.03919244930148125,
-0.02293914556503296,
-0.051654916256666183,
0.17410729825496674,
-0.20637930929660797,
-0.07141242921352386
] |
null | null |
transformers
|
# wav2vec2-large-xls-r-300m-mr
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5479
- Wer: 0.5740
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 200
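
For reference, a minimal sketch of how the settings above map onto a 🤗 `TrainingArguments` object is given below. The output directory and any option not listed above (e.g. `fp16`) are illustrative assumptions, not the exact configuration used for this run; the Adam betas and epsilon match the library defaults.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above.
# output_dir and fp16 are assumptions for illustration only.
training_args = TrainingArguments(
    output_dir="./wav2vec2-large-xls-r-300m-mr",  # assumed path
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # effective train batch size: 16 * 2 = 32
    num_train_epochs=200,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    seed=42,
    fp16=True,                       # assumption; common for XLS-R fine-tuning on GPU
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the library defaults
)
```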
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:------:|:----:|:---------------:|:------:|
| 3.7378 | 18.18 | 400 | 3.5047 | 1.0 |
| 3.1707 | 36.36 | 800 | 2.6166 | 0.9912 |
| 1.4942 | 54.55 | 1200 | 0.5778 | 0.6927 |
| 1.2058 | 72.73 | 1600 | 0.5168 | 0.6362 |
| 1.0558 | 90.91 | 2000 | 0.5105 | 0.6069 |
| 0.9488 | 109.09 | 2400 | 0.5151 | 0.6089 |
| 0.8588 | 127.27 | 2800 | 0.5157 | 0.5989 |
| 0.7991 | 145.45 | 3200 | 0.5179 | 0.5740 |
| 0.7545 | 163.64 | 3600 | 0.5348 | 0.5740 |
| 0.7144 | 181.82 | 4000 | 0.5518 | 0.5724 |
| 0.7041 | 200.0 | 4400 | 0.5479 | 0.5740 |
### Framework versions
- Transformers 4.16.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.1
- Tokenizers 0.11.0
#### Evaluation Commands
1. To evaluate on `mozilla-foundation/common_voice_8_0` with split `test`
```bash
python eval.py --model_id anuragshas/wav2vec2-large-xls-r-300m-mr --dataset mozilla-foundation/common_voice_8_0 --config mr --split test
```
### Inference With LM
```python
import torch
from datasets import load_dataset
from transformers import AutoModelForCTC, AutoProcessor
import torchaudio.functional as F

model_id = "anuragshas/wav2vec2-large-xls-r-300m-mr"

# Stream a single example from the Common Voice 8 Marathi test split (needs an authenticated HF token)
sample_iter = iter(load_dataset("mozilla-foundation/common_voice_8_0", "mr", split="test", streaming=True, use_auth_token=True))
sample = next(sample_iter)

# Common Voice audio is 48 kHz; the model expects 16 kHz input
resampled_audio = F.resample(torch.tensor(sample["audio"]["array"]), 48_000, 16_000).numpy()

model = AutoModelForCTC.from_pretrained(model_id)
processor = AutoProcessor.from_pretrained(model_id)

input_values = processor(resampled_audio, return_tensors="pt").input_values
with torch.no_grad():
    logits = model(input_values).logits

# Decoding the raw logits runs the bundled KenLM beam-search decoder
transcription = processor.batch_decode(logits.numpy()).text
# => "या पानास लेखाचे स्वरूप यायला हावे"
```
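
For comparison with the "Without LM" column in the table below, a plain greedy (argmax) CTC decode that bypasses the KenLM decoder can be sketched as follows. It reuses `logits` and `processor` from the snippet above and assumes the processor exposes its underlying CTC tokenizer via `processor.tokenizer` (as `Wav2Vec2ProcessorWithLM` does); it is an illustrative sketch, not the exact evaluation script.

```python
# Greedy CTC decoding without the language model (illustrative sketch)
predicted_ids = torch.argmax(logits, dim=-1)
# The CTC tokenizer collapses repeated tokens and removes blanks during decoding
greedy_transcription = processor.tokenizer.batch_decode(predicted_ids)[0]
print(greedy_transcription)
```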
### Eval results on Common Voice 8 "test" (WER):
| Without LM | With LM (run `./eval.py`) |
|---|---|
| 49.177 | 32.811 |
|
{"language": ["mr"], "license": "apache-2.0", "tags": ["generated_from_trainer", "robust-speech-event", "hf-asr-leaderboard"], "datasets": ["mozilla-foundation/common_voice_8_0"], "metrics": ["wer"], "model-index": [{"name": "wav2vec2-large-xls-r-300m-mr", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice 8", "type": "mozilla-foundation/common_voice_8_0", "args": "mr"}, "metrics": [{"type": "wer", "value": 32.811, "name": "Test WER"}, {"type": "cer", "value": 7.692, "name": "Test CER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-large-xls-r-300m-mr
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"robust-speech-event",
"hf-asr-leaderboard",
"mr",
"dataset:mozilla-foundation/common_voice_8_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"mr"
] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #mr #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
wav2vec2-large-xls-r-300m-mr
============================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the common\_voice dataset.
It achieves the following results on the evaluation set:
* Loss: 0.5479
* Wer: 0.5740
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0001
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1000
* num\_epochs: 200
### Training results
### Framework versions
* Transformers 4.16.0
* Pytorch 1.10.0+cu111
* Datasets 1.18.1
* Tokenizers 0.11.0
#### Evaluation Commands
1. To evaluate on 'mozilla-foundation/common\_voice\_8\_0' with split 'test'
### Inference With LM
### Eval results on Common Voice 8 "test" (WER):
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 200",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.11.0",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'",
"### Inference With LM",
"### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #mr #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 200",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.11.0",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'",
"### Inference With LM",
"### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
99,
143,
4,
35,
36,
8,
15
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #mr #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 200### Training results### Framework versions\n\n\n* Transformers 4.16.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.11.0#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'### Inference With LM### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
-0.10975132882595062,
0.14081859588623047,
-0.006386423017829657,
0.05950456112623215,
0.08650501817464828,
0.019948182627558708,
0.11409088224172592,
0.17628790438175201,
-0.030270228162407875,
0.12522822618484497,
0.07593602687120438,
0.09648412466049194,
0.07438722252845764,
0.10391324013471603,
-0.01855769194662571,
-0.228324756026268,
0.0030055351089686155,
-0.05424068495631218,
-0.08967257291078568,
0.10916206240653992,
0.09292592108249664,
-0.090910404920578,
0.04835518077015877,
0.0028573856689035892,
-0.06361926347017288,
-0.010654487647116184,
-0.03661249950528145,
-0.030583476647734642,
0.07859405875205994,
0.03267978876829147,
0.011875145137310028,
0.020230945199728012,
0.07490827888250351,
-0.29157328605651855,
0.0035768109373748302,
0.06613890081644058,
0.043770238757133484,
0.05764566734433174,
0.113744355738163,
-0.026841096580028534,
0.10512915998697281,
-0.06898054480552673,
0.03678817301988602,
0.0700063556432724,
-0.08631899952888489,
-0.21436789631843567,
-0.0873718336224556,
0.02823072113096714,
0.14830870926380157,
0.06203344091773033,
-0.03525247424840927,
0.04469317942857742,
-0.06966965645551682,
0.10092753171920776,
0.246455118060112,
-0.20991264283657074,
-0.057725172489881516,
0.0020404416136443615,
0.029669512063264847,
0.0011082901619374752,
-0.1122741848230362,
-0.009076900780200958,
0.012346195988357067,
0.004283074289560318,
0.0696345716714859,
-0.015827199444174767,
0.05215608328580856,
0.014756088145077229,
-0.13559651374816895,
-0.05509885773062706,
0.13047274947166443,
0.07010287046432495,
-0.025404108688235283,
-0.12642595171928406,
-0.025499893352389336,
-0.1670263111591339,
-0.03891461342573166,
0.03063190169632435,
0.01949651911854744,
-0.0402565598487854,
-0.005194309167563915,
0.035634204745292664,
-0.045882079750299454,
-0.0757010281085968,
0.05494493618607521,
0.12064303457736969,
0.036202896386384964,
-0.036446407437324524,
0.019869700074195862,
0.10089575499296188,
0.04673021659255028,
-0.17516443133354187,
-0.021183328703045845,
0.03642027825117111,
-0.13710878789424896,
-0.007748597301542759,
0.0015649889828637242,
0.024197615683078766,
0.0881531834602356,
0.14626233279705048,
0.03156285732984543,
0.09624936431646347,
0.023664306849241257,
0.007924042642116547,
-0.044987719506025314,
0.16136792302131653,
-0.056218139827251434,
-0.11765703558921814,
-0.04564138129353523,
0.13804182410240173,
-0.0075762830674648285,
-0.002258897991850972,
-0.04819713532924652,
0.048163846135139465,
0.09834049642086029,
0.08578906208276749,
0.023137299343943596,
0.019889770075678825,
-0.07923231273889542,
-0.020022116601467133,
-0.015639975666999817,
-0.1472519189119339,
0.06058146432042122,
0.0845169648528099,
-0.07676322013139725,
-0.03911804407835007,
-0.013399798423051834,
-0.007706313859671354,
-0.04599636420607567,
0.07221301645040512,
-0.052040066570043564,
0.007714299485087395,
-0.07481436431407928,
-0.0837818831205368,
0.029349680989980698,
-0.026844743639230728,
-0.01259784959256649,
-0.03608696907758713,
-0.08743966370820999,
-0.08148767799139023,
0.06024254485964775,
-0.07798643410205841,
-0.051085248589515686,
-0.07484672963619232,
-0.1006401851773262,
0.045403700321912766,
-0.0020529702305793762,
0.14443925023078918,
-0.053972531110048294,
0.08449475467205048,
0.03035358525812626,
0.04224101081490517,
0.15644782781600952,
0.060911357402801514,
-0.028580317273736,
0.06929841637611389,
-0.14341934025287628,
0.1006365418434143,
-0.1348164975643158,
0.05621081590652466,
-0.1553894281387329,
-0.08272601664066315,
0.03661718964576721,
-0.0006341672851704061,
0.08613147586584091,
0.15530480444431305,
-0.19404597580432892,
-0.06315433233976364,
0.12511327862739563,
-0.03712696209549904,
-0.09514646977186203,
0.13628140091896057,
-0.012093083932995796,
-0.02745744213461876,
0.0039879740215837955,
0.17718379199504852,
0.13710114359855652,
-0.10969889163970947,
0.003471308620646596,
-0.03784214332699776,
0.09325402975082397,
0.0860166996717453,
0.08707966655492783,
-0.0606149397790432,
0.0526575967669487,
-0.0018973236437886953,
-0.056955866515636444,
0.015185872092843056,
-0.06763342022895813,
-0.07992598414421082,
-0.010058612562716007,
-0.05457578971982002,
-0.012598167173564434,
0.038373447954654694,
-0.021572217345237732,
-0.08071282505989075,
-0.13089022040367126,
-0.03905940800905228,
0.10443009436130524,
-0.0828707218170166,
0.007167642470449209,
-0.0838412195444107,
0.08307770639657974,
-0.007647163700312376,
0.010419032536447048,
-0.13237792253494263,
-0.022705763578414917,
0.047120485454797745,
-0.0815124660730362,
0.007724304683506489,
-0.0111216576769948,
0.0562613308429718,
0.032464783638715744,
-0.013319728896021843,
-0.059185463935136795,
-0.017233604565262794,
-0.012006409466266632,
-0.052251383662223816,
-0.2525762617588043,
-0.07088359445333481,
-0.019852284342050552,
0.14773403108119965,
-0.1946791410446167,
0.010848957113921642,
0.07890588790178299,
0.11093871295452118,
0.014673244208097458,
-0.05198274552822113,
0.04018118605017662,
0.021397072821855545,
-0.031908780336380005,
-0.07356207072734833,
0.011309843510389328,
-0.005928593687713146,
-0.09343600273132324,
-0.004639190621674061,
-0.1471664160490036,
0.04993622004985809,
0.06900806725025177,
0.03206908330321312,
-0.07969339191913605,
-0.039780616760253906,
-0.06077258288860321,
-0.0532052144408226,
-0.04193738475441933,
-0.02950213849544525,
0.14745543897151947,
0.03532407432794571,
0.08886981755495071,
-0.079337939620018,
-0.07542874664068222,
0.024030646309256554,
-0.005871723871678114,
-0.008104433305561543,
0.17645269632339478,
0.06118018925189972,
-0.03467075526714325,
0.0818454772233963,
0.02382657304406166,
-0.040894970297813416,
0.11404919624328613,
-0.072563037276268,
-0.08315810561180115,
-0.0688377097249031,
0.059391554445028305,
0.029881520196795464,
0.11543629318475723,
-0.17771907150745392,
-0.008557895198464394,
0.03933404013514519,
0.02056996338069439,
0.018642036244273186,
-0.16422516107559204,
0.008266039192676544,
0.03410585969686508,
-0.09398064017295837,
-0.0040556457825005054,
0.016070539131760597,
-0.006061336491256952,
0.07990679144859314,
-0.011479641310870647,
-0.07394817471504211,
-0.040015242993831635,
-0.05681680887937546,
-0.10471680760383606,
0.16432124376296997,
-0.08829915523529053,
-0.11480309069156647,
-0.11795099079608917,
-0.016806496307253838,
-0.05441029369831085,
-0.021112024784088135,
0.042900726199150085,
-0.07632710784673691,
-0.051027487963438034,
-0.08481273055076599,
0.02075943350791931,
-0.0013428807724267244,
0.021346217021346092,
0.03989914804697037,
0.014904795214533806,
0.06068923696875572,
-0.10696055740118027,
-0.0008376940386369824,
-0.0017718434100970626,
-0.002327544381842017,
-0.0009287483408115804,
0.0508134625852108,
0.09717723727226257,
0.15624798834323883,
0.06174414977431297,
0.052618589252233505,
-0.0045897881500422955,
0.21100105345249176,
-0.1335318237543106,
0.021833349019289017,
0.10733991861343384,
0.0007185251452028751,
0.051795393228530884,
0.15421970188617706,
0.0400107279419899,
-0.0917469710111618,
0.008907323703169823,
0.0590948723256588,
-0.009953780099749565,
-0.2377922534942627,
-0.018275048583745956,
-0.06570449471473694,
-0.0009193448349833488,
0.08060984313488007,
0.03421112895011902,
-0.027266046032309532,
0.006931532174348831,
0.00009450000652577728,
-0.04495833441615105,
0.04561043530702591,
0.04978537932038307,
0.0501667745411396,
0.03492671996355057,
0.09300757199525833,
-0.01876174472272396,
-0.024799613282084465,
0.027170579880475998,
0.011398677714169025,
0.21942804753780365,
-0.0185096375644207,
0.1994779258966446,
0.07095114141702652,
0.1083860769867897,
-0.03962358459830284,
0.03956471011042595,
-0.013237424194812775,
0.007626794278621674,
0.034236516803503036,
-0.06594079732894897,
-0.021999575197696686,
0.029590606689453125,
0.10786207765340805,
0.024653296917676926,
-0.07231534272432327,
0.051207754760980606,
0.06402173638343811,
0.32223251461982727,
0.07542252540588379,
-0.23305495083332062,
-0.044880546629428864,
0.030530031770467758,
-0.07321978360414505,
-0.026796026155352592,
0.020402127876877785,
0.12193150073289871,
-0.08072447031736374,
0.07802601158618927,
-0.04767102003097534,
0.08528157323598862,
-0.07934574037790298,
0.009784693829715252,
0.08367221802473068,
0.09289862960577011,
0.008942639455199242,
0.0707310140132904,
-0.24053195118904114,
0.2504713535308838,
0.0031068320386111736,
0.05738518759608269,
-0.06799665093421936,
0.06189116835594177,
0.028170442208647728,
-0.044220682233572006,
0.11210817098617554,
-0.00881129875779152,
-0.08378515392541885,
-0.14522431790828705,
-0.09587442874908447,
0.00018814712530001998,
0.1248302087187767,
-0.07299012690782547,
0.12717051804065704,
-0.04082869738340378,
-0.060338009148836136,
0.015931840986013412,
-0.07335489988327026,
-0.09373913705348969,
-0.08649357408285141,
0.0661582350730896,
-0.023219995200634003,
0.01385505311191082,
-0.06974904984235764,
-0.07432148605585098,
-0.07724350690841675,
0.16372786462306976,
-0.14419065415859222,
-0.030368942767381668,
-0.12945140898227692,
0.02267337404191494,
0.1690922975540161,
-0.07128019630908966,
0.007925310172140598,
0.01254142913967371,
0.12675824761390686,
0.0384676568210125,
-0.002793235005810857,
0.10195889323949814,
-0.08555267006158829,
-0.20570489764213562,
-0.044856902211904526,
0.19482065737247467,
0.021236935630440712,
0.060424335300922394,
-0.019456874579191208,
0.00860838033258915,
-0.017055803909897804,
-0.09853256493806839,
0.08783335983753204,
0.025663862004876137,
-0.01909010484814644,
0.04905560985207558,
-0.03734006732702255,
-0.003229840425774455,
-0.09515007585287094,
-0.044883713126182556,
0.08514387160539627,
0.27637723088264465,
-0.0764794796705246,
0.04620383679866791,
0.0106684984639287,
-0.07900955528020859,
-0.13646869361400604,
-0.014991873875260353,
0.10141687095165253,
0.028983788564801216,
0.003613278502598405,
-0.15696769952774048,
0.024780798703432083,
0.05170907452702522,
-0.018798045814037323,
0.11170165240764618,
-0.34207743406295776,
-0.12916743755340576,
0.08081464469432831,
0.05082111433148384,
-0.08195289224386215,
-0.18467175960540771,
-0.07851845026016235,
-0.009067674167454243,
-0.07023666799068451,
0.015793997794389725,
-0.0030239252373576164,
0.12202711403369904,
-0.0031833346001803875,
0.030915342271327972,
0.03144630044698715,
-0.05779283866286278,
0.15938837826251984,
0.05345470830798149,
0.04578704014420509,
-0.01498552318662405,
-0.0024580853059887886,
0.011773735284805298,
-0.06996668875217438,
0.05432255566120148,
-0.08168993890285492,
0.02158639021217823,
-0.16519416868686676,
-0.010346981696784496,
-0.0727478638291359,
0.014954417943954468,
-0.06636711210012436,
-0.014507276937365532,
-0.02064920961856842,
0.045830294489860535,
0.09706912189722061,
0.015466718003153801,
0.07989567518234253,
-0.06662805378437042,
0.07515618205070496,
0.1675775647163391,
0.10462507605552673,
0.010638045147061348,
-0.1424189656972885,
0.013257312588393688,
0.031707245856523514,
0.008149423636496067,
-0.12496207654476166,
0.06219743937253952,
0.14103859663009644,
0.052633363753557205,
0.1479337513446808,
0.028690841048955917,
-0.0978640466928482,
-0.010548915714025497,
0.05465613678097725,
-0.08656219393014908,
-0.1462884396314621,
-0.008804293349385262,
-0.007121735252439976,
-0.14162510633468628,
-0.012011811137199402,
0.1109859049320221,
-0.003880536649376154,
0.007522581610828638,
0.023100990802049637,
0.07897516340017319,
-0.024630503728985786,
0.2311444729566574,
0.027920203283429146,
0.11349529027938843,
-0.09129559993743896,
0.07191672921180725,
0.028525400906801224,
-0.0618828721344471,
0.03343943879008293,
0.11460346728563309,
-0.042230892926454544,
-0.03556183725595474,
0.012049051001667976,
0.1099771186709404,
0.05692392960190773,
-0.027619047090411186,
-0.14842961728572845,
-0.13435670733451843,
0.08581636846065521,
0.06012093648314476,
0.041838955134153366,
0.025861505419015884,
-0.011204053647816181,
0.018061049282550812,
-0.08445915579795837,
0.1355494260787964,
0.11606907844543457,
0.05253422632813454,
-0.11726505309343338,
0.06646440923213959,
-0.010742386803030968,
0.017760291695594788,
0.002292768796905875,
0.0062677557580173016,
-0.11477428674697876,
0.02859046868979931,
-0.07976321876049042,
-0.009853710420429707,
-0.06224674731492996,
0.00962903629988432,
0.0259383674710989,
-0.0623711422085762,
-0.05523908883333206,
0.017729206010699272,
-0.1114443838596344,
-0.056597910821437836,
-0.02738596498966217,
0.0834910124540329,
-0.11270637810230255,
-0.02122824266552925,
0.04183994606137276,
-0.14259470999240875,
0.10822787880897522,
0.017532220110297203,
0.003954828716814518,
-0.016392134130001068,
-0.08648008108139038,
0.004049147013574839,
0.01791025698184967,
0.018751977011561394,
0.035341527312994,
-0.2269396334886551,
-0.0007323708850890398,
-0.04151711240410805,
0.010186385363340378,
-0.007688269950449467,
0.019312920048832893,
-0.13017231225967407,
0.01697380654513836,
-0.044033195823431015,
-0.05654934048652649,
-0.049900121986866,
0.05572095885872841,
0.0717979297041893,
0.00875153299421072,
0.16862034797668457,
-0.052331604063510895,
0.08559399843215942,
-0.21715235710144043,
-0.0001758259895723313,
0.011712496168911457,
-0.047448642551898956,
-0.020033804699778557,
-0.011886084452271461,
0.10980617254972458,
-0.07363046705722809,
0.09149174392223358,
-0.013958504423499107,
0.04385581612586975,
0.027993198484182358,
-0.07293371111154556,
0.037232983857393265,
0.05526211857795715,
0.0879787728190422,
0.010771412402391434,
-0.006288130301982164,
0.0652923658490181,
-0.040555618703365326,
0.043915022164583206,
0.058325476944446564,
0.14725792407989502,
0.13384348154067993,
0.07180553674697876,
0.057395655661821365,
0.09321844577789307,
-0.14586471021175385,
-0.13201238214969635,
0.16331623494625092,
-0.07592564076185226,
0.14957955479621887,
-0.038932155817747116,
0.17847882211208344,
0.10283362120389938,
-0.19653931260108948,
0.08712733536958694,
-0.04002569988369942,
-0.09199236333370209,
-0.10127773135900497,
-0.09863916039466858,
-0.07616641372442245,
-0.1492615044116974,
0.016059236600995064,
-0.09802208095788956,
0.07995514571666718,
0.03090483695268631,
0.044273700565099716,
0.02467772178351879,
0.09933114796876907,
0.016827646642923355,
-0.0000027834930733661167,
0.09909723699092865,
0.015491521917283535,
-0.03329169750213623,
-0.02054278366267681,
-0.06543023139238358,
0.033084820955991745,
-0.02785336598753929,
0.0721466988325119,
-0.013725345022976398,
-0.07826860249042511,
0.04761914536356926,
-0.015312130562961102,
-0.09358086436986923,
0.021766671910881996,
-0.04394855350255966,
0.035839471966028214,
0.0893157422542572,
0.0446295291185379,
-0.009301627986133099,
-0.012224521487951279,
0.19237883388996124,
-0.08359139412641525,
-0.06168796494603157,
-0.1339462697505951,
0.13719762861728668,
-0.005324316211044788,
0.012898125685751438,
0.005607607774436474,
-0.07711579650640488,
-0.019700637087225914,
0.18126781284809113,
0.14228154718875885,
-0.024105284363031387,
-0.021627960726618767,
0.021880941465497017,
0.001592464279383421,
-0.017520722001791,
0.044176213443279266,
0.10837628692388535,
0.06349500268697739,
-0.029516225680708885,
-0.00996200181543827,
-0.03195841982960701,
-0.07395981252193451,
-0.04637174308300018,
0.10083706676959991,
0.029881073161959648,
0.010527901351451874,
-0.01507579255849123,
0.11566442251205444,
-0.06411430239677429,
-0.1785055547952652,
0.01820620708167553,
-0.16776283085346222,
-0.1814383715391159,
-0.028971560299396515,
0.08055280894041061,
0.031256675720214844,
0.06035039573907852,
0.0011584772728383541,
-0.047799400985240936,
0.1315453201532364,
-0.000030471735954051837,
-0.04580957815051079,
-0.09887143224477768,
0.05638297647237778,
-0.1410108208656311,
0.19093534350395203,
-0.028060972690582275,
0.03412424400448799,
0.14020861685276031,
0.0060656871646642685,
-0.11285234242677689,
0.03812071681022644,
0.10661400854587555,
-0.1308378279209137,
0.06353193521499634,
0.18612922728061676,
-0.02197498083114624,
0.13389836251735687,
0.04682820662856102,
-0.04355475679039955,
-0.0024569567758589983,
-0.05656053498387337,
-0.019090723246335983,
-0.08314381539821625,
0.003917728550732136,
-0.0548969991505146,
0.125034362077713,
0.20639348030090332,
-0.07583726197481155,
0.00598691264167428,
-0.04847147315740585,
0.02474375069141388,
0.007773225661367178,
0.11828526854515076,
-0.0436737984418869,
-0.26354360580444336,
0.04798765107989311,
-0.006397296208888292,
0.03015541099011898,
-0.17421787977218628,
-0.09480205923318863,
0.04543124884366989,
-0.04847642034292221,
-0.0614466592669487,
0.12060975283384323,
0.060913849622011185,
0.04832494258880615,
-0.052771005779504776,
-0.13713975250720978,
-0.010276108980178833,
0.17220447957515717,
-0.165293350815773,
-0.0567108616232872
] |
null | null |
transformers
|
# wav2vec2-large-xls-r-300m-or
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6618
- Wer: 0.5166
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.12
- num_epochs: 240
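
Unlike the Marathi run above, this configuration specifies a warmup ratio rather than a fixed number of warmup steps. A small sketch of how the ratio translates into warmup steps under the usual Trainer convention is shown below; `num_train_examples` is a placeholder, not the actual size of the Odia Common Voice train split.

```python
import math

# Illustrative only: num_train_examples is a hypothetical placeholder.
num_train_examples = 1000
total_train_batch_size = 32        # 16 per device * 2 gradient accumulation steps
num_epochs = 240
warmup_ratio = 0.12

steps_per_epoch = math.ceil(num_train_examples / total_train_batch_size)
total_optimization_steps = steps_per_epoch * num_epochs
warmup_steps = int(warmup_ratio * total_optimization_steps)
print(warmup_steps)  # the learning rate rises linearly over the first ~12% of training steps
```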
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:------:|:----:|:---------------:|:------:|
| 6.0493 | 23.53 | 400 | 2.9728 | 1.0 |
| 0.5306 | 47.06 | 800 | 1.2895 | 0.6138 |
| 0.1253 | 70.59 | 1200 | 1.6854 | 0.5703 |
| 0.0763 | 94.12 | 1600 | 1.9433 | 0.5870 |
| 0.0552 | 117.65 | 2000 | 1.4393 | 0.5575 |
| 0.0382 | 141.18 | 2400 | 1.4665 | 0.5537 |
| 0.0286 | 164.71 | 2800 | 1.5441 | 0.5320 |
| 0.0212 | 188.24 | 3200 | 1.6502 | 0.5115 |
| 0.0168 | 211.76 | 3600 | 1.6411 | 0.5332 |
| 0.0129 | 235.29 | 4000 | 1.6618 | 0.5166 |
### Framework versions
- Transformers 4.16.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.0
- Tokenizers 0.10.3
#### Evaluation Commands
1. To evaluate on `mozilla-foundation/common_voice_7_0` with split `test`
```bash
python eval.py --model_id anuragshas/wav2vec2-large-xls-r-300m-or --dataset mozilla-foundation/common_voice_7_0 --config or --split test
```
### Inference With LM
```python
import torch
from datasets import load_dataset
from transformers import AutoModelForCTC, AutoProcessor
import torchaudio.functional as F

model_id = "anuragshas/wav2vec2-large-xls-r-300m-or"

# Stream a single example from the Common Voice 7 Odia test split (needs an authenticated HF token)
sample_iter = iter(load_dataset("mozilla-foundation/common_voice_7_0", "or", split="test", streaming=True, use_auth_token=True))
sample = next(sample_iter)

# Common Voice audio is 48 kHz; the model expects 16 kHz input
resampled_audio = F.resample(torch.tensor(sample["audio"]["array"]), 48_000, 16_000).numpy()

model = AutoModelForCTC.from_pretrained(model_id)
processor = AutoProcessor.from_pretrained(model_id)

input_values = processor(resampled_audio, return_tensors="pt").input_values
with torch.no_grad():
    logits = model(input_values).logits

# Decoding the raw logits runs the bundled KenLM beam-search decoder
transcription = processor.batch_decode(logits.numpy()).text
# => "ପରରାଏ ବାଲା ଗସ୍ତି ଫାଣ୍ଡି ଗୋପାଳ ପରଠାରୁ ଦେଢ଼କଶ ଦୂର"
```
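
The WER/CER figures reported in the table below can be approximated with the metrics from the `datasets` library. The sketch reuses `transcription` and `sample` from the snippet above and covers only a single example; the published numbers were computed over the full test split, and the availability of the `cer` metric depends on your `datasets` version.

```python
from datasets import load_metric

wer_metric = load_metric("wer")
cer_metric = load_metric("cer")   # assumes the community CER metric is available

predictions = transcription                  # list of decoded strings from the snippet above
references = [sample["sentence"]]            # Common Voice stores the reference text in "sentence"

print("WER:", 100 * wer_metric.compute(predictions=predictions, references=references))
print("CER:", 100 * cer_metric.compute(predictions=predictions, references=references))
```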
### Eval results on Common Voice 7 "test" (WER):
| Without LM | With LM (run `./eval.py`) |
|---|---|
| 51.92 | 47.186 |
|
{"language": ["or"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "robust-speech-event", "hf-asr-leaderboard"], "datasets": ["mozilla-foundation/common_voice_7_0"], "metrics": ["wer"], "model-index": [{"name": "wav2vec2-large-xls-r-300m-or", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice 7", "type": "mozilla-foundation/common_voice_7_0", "args": "or"}, "metrics": [{"type": "wer", "value": 47.186, "name": "Test WER"}, {"type": "cer", "value": 11.82, "name": "Test CER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-large-xls-r-300m-or
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"robust-speech-event",
"hf-asr-leaderboard",
"or",
"dataset:mozilla-foundation/common_voice_7_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"or"
] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #robust-speech-event #hf-asr-leaderboard #or #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
wav2vec2-large-xls-r-300m-or
============================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the common\_voice dataset.
It achieves the following results on the evaluation set:
* Loss: 1.6618
* Wer: 0.5166
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0003
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.12
* num\_epochs: 240
### Training results
### Framework versions
* Transformers 4.16.0
* Pytorch 1.10.0+cu111
* Datasets 1.18.0
* Tokenizers 0.10.3
#### Evaluation Commands
1. To evaluate on 'mozilla-foundation/common\_voice\_7\_0' with split 'test'
### Inference With LM
### Eval results on Common Voice 7 "test" (WER):
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.12\n* num\\_epochs: 240",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_7\\_0' with split 'test'",
"### Inference With LM",
"### Eval results on Common Voice 7 \"test\" (WER):"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #robust-speech-event #hf-asr-leaderboard #or #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.12\n* num\\_epochs: 240",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_7\\_0' with split 'test'",
"### Inference With LM",
"### Eval results on Common Voice 7 \"test\" (WER):"
] |
[
92,
144,
4,
37,
36,
8,
15
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #robust-speech-event #hf-asr-leaderboard #or #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.12\n* num\\_epochs: 240### Training results### Framework versions\n\n\n* Transformers 4.16.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_7\\_0' with split 'test'### Inference With LM### Eval results on Common Voice 7 \"test\" (WER):"
] |
[
-0.11357257515192032,
0.1292908489704132,
-0.007125264033675194,
0.03496796637773514,
0.07095500826835632,
0.012948902323842049,
0.10793504118919373,
0.17058531939983368,
-0.036167532205581665,
0.13729740679264069,
0.06947556138038635,
0.08158063143491745,
0.0816439613699913,
0.10279182344675064,
-0.04226253554224968,
-0.23462001979351044,
0.024253496900200844,
-0.04775441810488701,
-0.08031626045703888,
0.11312384903430939,
0.09832635521888733,
-0.08936414867639542,
0.0222820695489645,
0.014520514756441116,
-0.0658329650759697,
-0.016647906973958015,
-0.040434591472148895,
-0.05179455131292343,
0.07766308635473251,
0.04130098223686218,
0.026171695441007614,
0.051740147173404694,
0.04975365847349167,
-0.28644976019859314,
0.005741935223340988,
0.0550285167992115,
0.040458742529153824,
0.06110469624400139,
0.12222155183553696,
-0.033477380871772766,
0.11379573494195938,
-0.06445787847042084,
0.03541510924696922,
0.06967092305421829,
-0.07510219514369965,
-0.23290574550628662,
-0.07269471883773804,
0.029264375567436218,
0.12834005057811737,
0.061696626245975494,
-0.04643884673714638,
0.0644908994436264,
-0.09104800224304199,
0.09612800180912018,
0.2325468361377716,
-0.22305838763713837,
-0.06292091310024261,
-0.005354156717658043,
0.030098268762230873,
-0.0020774169825017452,
-0.11789800226688385,
-0.015194850042462349,
0.009272733703255653,
-0.0041231983341276646,
0.07230192422866821,
0.001899557071737945,
0.060663800686597824,
0.00220674742013216,
-0.13276110589504242,
-0.05496823787689209,
0.11322261393070221,
0.07711607217788696,
-0.03272758424282074,
-0.13888388872146606,
-0.01728922128677368,
-0.15406188368797302,
-0.03890165686607361,
0.008196542970836163,
0.01857512816786766,
-0.03521246835589409,
-0.012262290343642235,
0.05210652947425842,
-0.05470336601138115,
-0.06560937315225601,
0.03938604146242142,
0.1501137912273407,
0.041545938700437546,
-0.03574173152446747,
0.02322344109416008,
0.0862974002957344,
0.037703126668930054,
-0.18134357035160065,
-0.03936168551445007,
0.0402669757604599,
-0.1423734873533249,
-0.019137494266033173,
-0.012596079148352146,
-0.0021549288649111986,
0.09498763084411621,
0.15344442427158356,
0.005138558801263571,
0.1083420142531395,
0.008038579486310482,
0.003415229031816125,
-0.045825473964214325,
0.1731647253036499,
-0.028182534500956535,
-0.09858497977256775,
-0.04369646683335304,
0.1324886977672577,
-0.012391768395900726,
-0.005526619963347912,
-0.038966186344623566,
0.03933867812156677,
0.12371496856212616,
0.07216037809848785,
0.01228212472051382,
0.008855152875185013,
-0.07322138547897339,
-0.02342156507074833,
-0.00951351411640644,
-0.1456160992383957,
0.053822051733732224,
0.09846216440200806,
-0.05262438952922821,
-0.00026038289070129395,
-0.019770046696066856,
-0.005330689251422882,
-0.04450387507677078,
0.08663953095674515,
-0.03476552292704582,
-0.005276128184050322,
-0.059836432337760925,
-0.07043305784463882,
0.03589572757482529,
-0.02808174304664135,
-0.017720814794301987,
-0.04270603135228157,
-0.05348437651991844,
-0.07981842011213303,
0.05986596271395683,
-0.08694396167993546,
-0.06792407482862473,
-0.06792335212230682,
-0.10597183555364609,
0.04911649972200394,
-0.010373753495514393,
0.1431901454925537,
-0.051433682441711426,
0.08503156900405884,
0.049924395978450775,
0.048655085265636444,
0.1633584201335907,
0.0569048672914505,
-0.018028460443019867,
0.05728884041309357,
-0.16234701871871948,
0.12702703475952148,
-0.13342684507369995,
0.03797198086977005,
-0.15313975512981415,
-0.09137497842311859,
0.015178867615759373,
-0.008080282248556614,
0.07975363731384277,
0.15272390842437744,
-0.1661054790019989,
-0.07668913155794144,
0.12876947224140167,
-0.06219290941953659,
-0.08652574568986893,
0.14809806644916534,
-0.008642679080367088,
-0.041119176894426346,
0.007747043389827013,
0.18585239350795746,
0.1263195276260376,
-0.10929935425519943,
-0.0031688292510807514,
-0.046672239899635315,
0.09164861589670181,
0.09245418757200241,
0.09831896424293518,
-0.05916144698858261,
0.03781451657414436,
-0.002506470074877143,
-0.050509076565504074,
0.01238308660686016,
-0.06184106692671776,
-0.07871957123279572,
-0.010864506475627422,
-0.049783844500780106,
0.002825882751494646,
0.04041131213307381,
-0.01182099711149931,
-0.09302211552858353,
-0.1404581367969513,
-0.04517928883433342,
0.0926753580570221,
-0.08929120004177094,
0.009804882109165192,
-0.10091913491487503,
0.08294669538736343,
-0.007123312912881374,
0.005119164939969778,
-0.13823772966861725,
-0.0026976868975907564,
0.05356050282716751,
-0.07105041295289993,
0.004042632412165403,
-0.008156346157193184,
0.06261095404624939,
0.01817222498357296,
-0.023740217089653015,
-0.05655480548739433,
-0.011243032291531563,
-0.010123251006007195,
-0.04581160098314285,
-0.24047701060771942,
-0.06672284007072449,
-0.023528438061475754,
0.16144445538520813,
-0.1876281052827835,
0.0054835425689816475,
0.11717921495437622,
0.11451643705368042,
0.010572008788585663,
-0.049412112683057785,
0.02918180637061596,
0.021082328632473946,
-0.02071605995297432,
-0.06007205694913864,
0.0046818796545267105,
0.0012624366208910942,
-0.10879404097795486,
0.0035688120406121016,
-0.13568641245365143,
0.060486141592264175,
0.07623177766799927,
0.05896182358264923,
-0.07825244963169098,
-0.05090481787919998,
-0.061259765177965164,
-0.049041930586099625,
-0.03861324116587639,
-0.01726233959197998,
0.16793201863765717,
0.04358884319663048,
0.07872499525547028,
-0.08949317783117294,
-0.07247630506753922,
0.019128737971186638,
0.002036381047219038,
-0.01049285288900137,
0.15873940289020538,
0.039799291640520096,
-0.04131840914487839,
0.06995903700590134,
0.01639573834836483,
-0.06892788410186768,
0.10560226440429688,
-0.08906156569719315,
-0.08929149061441422,
-0.06438316404819489,
0.06757716089487076,
0.03970765694975853,
0.10037614405155182,
-0.17631317675113678,
0.0067995586432516575,
0.04184393584728241,
0.015042407438158989,
0.016621585935354233,
-0.16260285675525665,
0.014807459898293018,
0.045679040253162384,
-0.0989365205168724,
0.003653998952358961,
0.02176768332719803,
0.0034439321607351303,
0.06852611154317856,
-0.00480955746024847,
-0.08114393800497055,
-0.03488604724407196,
-0.06422432512044907,
-0.10515730828046799,
0.1629534512758255,
-0.06504960358142853,
-0.13312320411205292,
-0.10903751105070114,
-0.007290879730135202,
-0.040895216166973114,
-0.02426544763147831,
0.05263480171561241,
-0.08699659258127213,
-0.06340447813272476,
-0.07989582419395447,
0.00556051405146718,
0.004470338113605976,
0.03411863371729851,
0.051479730755090714,
0.0076681338250637054,
0.07178221642971039,
-0.09595303237438202,
-0.002775011584162712,
-0.010631621815264225,
-0.00016719022823963314,
0.014133634977042675,
0.038208719342947006,
0.08057007938623428,
0.1562880128622055,
0.052180178463459015,
0.054835546761751175,
-0.017723288387060165,
0.2030867338180542,
-0.14175307750701904,
0.017598524689674377,
0.10860072076320648,
-0.010832656174898148,
0.051863040775060654,
0.15611694753170013,
0.03554821386933327,
-0.08387386053800583,
0.007136936765164137,
0.046268679201602936,
-0.005927707999944687,
-0.2500368654727936,
-0.01934373937547207,
-0.0722176805138588,
-0.026596345007419586,
0.07015310972929001,
0.03365643694996834,
0.005323635879904032,
0.008820927701890469,
-0.007826410233974457,
-0.06157851964235306,
0.05480274558067322,
0.05287586897611618,
0.08745331317186356,
0.05432042106986046,
0.10715416818857193,
-0.017939360812306404,
-0.02832609973847866,
0.019691908732056618,
-0.0008989623747766018,
0.21053393185138702,
-0.004125908948481083,
0.19541138410568237,
0.05865579470992088,
0.12025165557861328,
-0.01911924220621586,
0.035138826817274094,
-0.014946913346648216,
0.00023600210261065513,
0.03629681095480919,
-0.0653294175863266,
-0.030303852632641792,
0.029636241495609283,
0.13062624633312225,
0.020864589139819145,
-0.06834851205348969,
0.043219730257987976,
0.062295686453580856,
0.32315248250961304,
0.06941891461610794,
-0.235558420419693,
-0.05809813365340233,
0.021101538091897964,
-0.09386822581291199,
-0.0191437266767025,
0.0132680619135499,
0.11639931052923203,
-0.09129063040018082,
0.08980761468410492,
-0.05013388767838478,
0.0857764482498169,
-0.07440880686044693,
0.013502473011612892,
0.0657840445637703,
0.09836610406637192,
0.014720824547111988,
0.0615931861102581,
-0.23244719207286835,
0.24297679960727692,
-0.008418072946369648,
0.062153350561857224,
-0.06448390334844589,
0.06623224169015884,
0.033549435436725616,
-0.04560975730419159,
0.0968630462884903,
-0.019457265734672546,
-0.0719490721821785,
-0.10902964323759079,
-0.10294987261295319,
0.015260111540555954,
0.12177760154008865,
-0.07035333663225174,
0.12196303904056549,
-0.03469071164727211,
-0.05792338401079178,
0.019087594002485275,
-0.05629386380314827,
-0.08819423615932465,
-0.08821960538625717,
0.06352904438972473,
-0.015138181857764721,
0.0442771390080452,
-0.07397893071174622,
-0.08714209496974945,
-0.11330220848321915,
0.1795213222503662,
-0.14323250949382782,
-0.036727532744407654,
-0.1318439543247223,
0.0170181542634964,
0.1676299273967743,
-0.07165533304214478,
0.014800372533500195,
0.015781190246343613,
0.14202183485031128,
0.027837591245770454,
-0.0018362280679866672,
0.09516341984272003,
-0.08086162805557251,
-0.19895635545253754,
-0.0349012166261673,
0.20061834156513214,
0.01771363988518715,
0.07032674551010132,
-0.01054059062153101,
0.009444914758205414,
-0.004775326233357191,
-0.09488259255886078,
0.0774335041642189,
0.013111154548823833,
-0.03175519034266472,
0.043337494134902954,
-0.0032897116616368294,
-0.024170685559511185,
-0.10705816000699997,
-0.03682760149240494,
0.09927647560834885,
0.2629610300064087,
-0.0729006677865982,
0.05797477066516876,
0.00914249662309885,
-0.05272833630442619,
-0.1266210377216339,
-0.012031872756779194,
0.13177509605884552,
0.039496082812547684,
-0.01863691955804825,
-0.15403005480766296,
0.02536187507212162,
0.05044630169868469,
-0.020456377416849136,
0.07037505507469177,
-0.31215718388557434,
-0.13329774141311646,
0.08688458055257797,
0.042047493159770966,
-0.07261373847723007,
-0.17231254279613495,
-0.09023471176624298,
-0.0034702292177826166,
-0.07038601487874985,
0.011196397244930267,
-0.00030486853211186826,
0.11407843232154846,
0.002706998959183693,
0.009453783743083477,
0.033412206918001175,
-0.05056500434875488,
0.16338057816028595,
0.04422058165073395,
0.030268555507063866,
-0.0029210331849753857,
0.02775704860687256,
-0.009196402505040169,
-0.06505395472049713,
0.051458053290843964,
-0.09239186346530914,
0.014029247686266899,
-0.14338168501853943,
-0.022681599482893944,
-0.07677137851715088,
0.013972445391118526,
-0.05684284865856171,
-0.00007175902283051983,
-0.022471744567155838,
0.03525076434016228,
0.10498512536287308,
0.01582670398056507,
0.07770853489637375,
-0.0746971145272255,
0.08681392669677734,
0.18421995639801025,
0.1272978037595749,
-0.004030718468129635,
-0.1499377191066742,
0.014778169803321362,
0.025195902213454247,
0.01888323947787285,
-0.11758182942867279,
0.055475108325481415,
0.13150683045387268,
0.04443274065852165,
0.1554594486951828,
0.028680291026830673,
-0.10555111616849899,
-0.016292590647935867,
0.05785304680466652,
-0.07502663880586624,
-0.1566917896270752,
-0.011357717216014862,
0.011428984813392162,
-0.14937040209770203,
-0.02129853330552578,
0.10287586599588394,
-0.008299204520881176,
0.002921187086030841,
0.028506699949502945,
0.06333290040493011,
-0.019928516820073128,
0.2165437489748001,
0.01992839202284813,
0.1196463331580162,
-0.0961296409368515,
0.05642140656709671,
0.03783784434199333,
-0.082957923412323,
0.04118046537041664,
0.1045391783118248,
-0.0425976924598217,
-0.019920341670513153,
0.031572092324495316,
0.07762563228607178,
0.07361111044883728,
-0.035218965262174606,
-0.13823114335536957,
-0.16001775860786438,
0.08632124960422516,
0.07960811257362366,
0.04962151497602463,
0.02925449050962925,
-0.012161250226199627,
0.030553752556443214,
-0.08319190889596939,
0.12851521372795105,
0.11642283201217651,
0.057413436472415924,
-0.1159483790397644,
0.061304546892642975,
-0.010745974257588387,
-0.0030510048381984234,
0.0005281531484797597,
-0.006577055435627699,
-0.1053972840309143,
0.024790503084659576,
-0.08309826254844666,
0.009450671263039112,
-0.06383466720581055,
0.0112961046397686,
0.034256696701049805,
-0.07331525534391403,
-0.048978082835674286,
0.022957082837820053,
-0.11291809380054474,
-0.049563318490982056,
-0.02382819354534149,
0.07922019064426422,
-0.1148240938782692,
-0.006338007282465696,
0.05414208397269249,
-0.15658123791217804,
0.11236080527305603,
0.035036906599998474,
-0.008306536823511124,
-0.00807906873524189,
-0.08685845881700516,
-0.011231444776058197,
0.03477159142494202,
0.023318011313676834,
0.027483651414513588,
-0.2402019053697586,
0.0049817198887467384,
-0.02057904191315174,
0.004009007476270199,
-0.00025510575505904853,
0.026050159707665443,
-0.1252758800983429,
0.009662331081926823,
-0.05301346629858017,
-0.05460092052817345,
-0.0435696542263031,
0.05306704714894295,
0.07642703503370285,
-0.003543068189173937,
0.1714242696762085,
-0.06490375101566315,
0.08364815264940262,
-0.2142437994480133,
0.0020791273564100266,
0.0022831414826214314,
-0.03584235906600952,
-0.045478612184524536,
-0.009362632408738136,
0.10861247032880783,
-0.0657743513584137,
0.06504105031490326,
-0.03312671184539795,
0.04104255884885788,
0.03076554462313652,
-0.08385996520519257,
0.04276267811655998,
0.05451502650976181,
0.11521903425455093,
0.03753221407532692,
-0.012892276048660278,
0.0804855227470398,
-0.06266969442367554,
0.03668028861284256,
0.02140658162534237,
0.14683425426483154,
0.14917711913585663,
0.05885247513651848,
0.07255157083272934,
0.09604115784168243,
-0.11512050032615662,
-0.12319473922252655,
0.1772928237915039,
-0.0874759778380394,
0.13098149001598358,
-0.015407483093440533,
0.17261552810668945,
0.1078532338142395,
-0.2015129029750824,
0.092622309923172,
-0.033640310168266296,
-0.08712494373321533,
-0.10067299753427505,
-0.10241328179836273,
-0.0870923101902008,
-0.15513746440410614,
0.023288246244192123,
-0.09393848478794098,
0.06994269788265228,
0.01408678013831377,
0.053706057369709015,
0.02633405290544033,
0.1094372496008873,
0.026137765496969223,
-0.006209214683622122,
0.12301141023635864,
0.00795053131878376,
-0.03864435479044914,
-0.006346229929476976,
-0.05858876556158066,
0.04635990411043167,
-0.0018316981149837375,
0.08230128884315491,
-0.008482041768729687,
-0.07157032936811447,
0.047104112803936005,
-0.005971732549369335,
-0.10513516515493393,
0.034450750797986984,
-0.03838657587766647,
0.0398196205496788,
0.09703399986028671,
0.04398100823163986,
-0.005637172609567642,
-0.015039213001728058,
0.16724009811878204,
-0.08408127725124359,
-0.06948751956224442,
-0.14239567518234253,
0.15851756930351257,
-0.0008588689379394054,
0.01638217829167843,
0.01499788649380207,
-0.07423218339681625,
-0.025341885164380074,
0.17436671257019043,
0.12280255556106567,
-0.015642013400793076,
-0.014837311580777168,
0.029610106721520424,
-0.0022835189010947943,
-0.007377029862254858,
0.0327429361641407,
0.11086912453174591,
0.06918809562921524,
-0.023603305220603943,
-0.002843711758032441,
-0.024918805807828903,
-0.08250320702791214,
-0.031823426485061646,
0.08133287727832794,
0.021765872836112976,
-0.011726479046046734,
-0.018362518399953842,
0.12139532715082169,
-0.07378179579973221,
-0.1827755570411682,
0.02026032656431198,
-0.16410870850086212,
-0.1882876455783844,
-0.02683546207845211,
0.07640106976032257,
0.029623519629240036,
0.04943184554576874,
0.006280712317675352,
-0.06042902171611786,
0.12849968671798706,
0.007453481201082468,
-0.029548928141593933,
-0.06801874190568924,
0.06114939972758293,
-0.1353052407503128,
0.15308959782123566,
-0.016321495175361633,
0.05975545197725296,
0.13178379833698273,
0.028097644448280334,
-0.09737509489059448,
0.03694002330303192,
0.10398149490356445,
-0.1558920443058014,
0.06267452985048294,
0.20128348469734192,
-0.008298361673951149,
0.13162748515605927,
0.06483093649148941,
-0.05642839893698692,
-0.004631847143173218,
-0.06751807034015656,
-0.017816515639424324,
-0.08090440183877945,
0.0085421372205019,
-0.06062621250748634,
0.10868354141712189,
0.20904643833637238,
-0.07217439264059067,
-0.003242375561967492,
-0.04108886793255806,
0.026471614837646484,
0.009714799001812935,
0.11976049095392227,
-0.047572072595357895,
-0.2650066018104553,
0.04936657473444939,
-0.026861945167183876,
0.03626187890768051,
-0.17901338636875153,
-0.10488341748714447,
0.037886880338191986,
-0.04291689395904541,
-0.0641426369547844,
0.11310601234436035,
0.05479076877236366,
0.044742874801158905,
-0.049167923629283905,
-0.12282826751470566,
-0.015125374309718609,
0.17328308522701263,
-0.1763557642698288,
-0.04257585480809212
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# XLS-R-300M - Punjabi
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2548
- Wer: 0.5677
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the hedged `TrainingArguments` sketch after this list):
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.12
- num_epochs: 120
- mixed_precision_training: Native AMP
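These values map one-to-one onto Transformers' `TrainingArguments`. The snippet below is a minimal, hypothetical sketch of that mapping; the `output_dir` and the surrounding `Trainer` setup are assumptions, not part of the original training script.
```python
# Hypothetical sketch: the hyperparameters listed above expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./wav2vec2-large-xls-r-300m-pa-in",  # assumed output path
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # 16 x 2 = effective train batch size of 32
    num_train_epochs=120,
    lr_scheduler_type="linear",
    warmup_ratio=0.12,
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults
)
```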
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:------:|:----:|:---------------:|:------:|
| 6.4804 | 16.65 | 400 | 1.8461 | 1.0 |
| 0.474 | 33.33 | 800 | 1.1018 | 0.6624 |
| 0.1389 | 49.98 | 1200 | 1.1918 | 0.6103 |
| 0.0919 | 66.65 | 1600 | 1.1889 | 0.6058 |
| 0.0657 | 83.33 | 2000 | 1.2266 | 0.5931 |
| 0.0479 | 99.98 | 2400 | 1.2512 | 0.5902 |
| 0.0355 | 116.65 | 2800 | 1.2548 | 0.5677 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.0
- Tokenizers 0.10.3
#### Evaluation Commands
1. To evaluate on `mozilla-foundation/common_voice_7_0` with split `test`
```bash
python eval.py --model_id anuragshas/wav2vec2-large-xls-r-300m-pa-in --dataset mozilla-foundation/common_voice_7_0 --config pa-IN --split test
```
### Inference With LM
```python
import torch
from datasets import load_dataset
from transformers import AutoModelForCTC, AutoProcessor
import torchaudio.functional as F

model_id = "anuragshas/wav2vec2-large-xls-r-300m-pa-in"

# Stream a single sample from the Common Voice 7 Punjabi test split (requires an authenticated HF token)
sample_iter = iter(load_dataset("mozilla-foundation/common_voice_7_0", "pa-IN", split="test", streaming=True, use_auth_token=True))
sample = next(sample_iter)

# Common Voice audio is 48 kHz; the model expects 16 kHz input
resampled_audio = F.resample(torch.tensor(sample["audio"]["array"]), 48_000, 16_000).numpy()

model = AutoModelForCTC.from_pretrained(model_id)
processor = AutoProcessor.from_pretrained(model_id)

input_values = processor(resampled_audio, return_tensors="pt").input_values
with torch.no_grad():
    logits = model(input_values).logits

# The processor bundles an n-gram LM, so decoding works directly on the raw logits
transcription = processor.batch_decode(logits.numpy()).text
# => "ਉਨ੍ਹਾਂ ਨੇ ਸਾਰੇ ਤੇਅਰਵੇ ਵੱਖਰੀ ਕਿਸਮ ਦੇ ਕੀਤੇ ਹਨ"
```
### Eval results on Common Voice 7 "test" (WER):
| Without LM | With LM (run `./eval.py`) |
|---|---|
| 51.968 | 45.611 |
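The figures above are word error rates in percent. Given lists of predicted and reference transcriptions, they can be recomputed with the `wer` metric from the Datasets library; below is a minimal sketch with toy placeholder strings (the real evaluation runs over the full `pa-IN` test split):
```python
# Requires the `jiwer` package, which backs the "wer" metric script.
from datasets import load_metric

wer_metric = load_metric("wer")

# Toy placeholder strings: substitute the model transcriptions and the reference sentences.
predictions = ["this is a sample transcription"]
references = ["this is the sample transcription"]
print(100 * wer_metric.compute(predictions=predictions, references=references))
```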
|
{"language": ["pa"], "license": "apache-2.0", "tags": ["generated_from_trainer", "robust-speech-event", "hf-asr-leaderboard"], "datasets": ["mozilla-foundation/common_voice_7_0"], "metrics": ["wer"], "model-index": [{"name": "XLS-R-300M - Punjabi", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice 7", "type": "mozilla-foundation/common_voice_7_0", "args": "pa-IN"}, "metrics": [{"type": "wer", "value": 45.611, "name": "Test WER"}, {"type": "cer", "value": 15.584, "name": "Test CER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-large-xls-r-300m-pa-in
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"robust-speech-event",
"hf-asr-leaderboard",
"pa",
"dataset:mozilla-foundation/common_voice_7_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"pa"
] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #pa #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
XLS-R-300M - Punjabi
====================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the common\_voice dataset.
It achieves the following results on the evaluation set:
* Loss: 1.2548
* Wer: 0.5677
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0003
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.12
* num\_epochs: 120
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.18.0
* Tokenizers 0.10.3
#### Evaluation Commands
1. To evaluate on 'mozilla-foundation/common\_voice\_7\_0' with split 'test'
### Inference With LM
### Eval results on Common Voice 7 "test" (WER):
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.12\n* num\\_epochs: 120\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_7\\_0' with split 'test'",
"### Inference With LM",
"### Eval results on Common Voice 7 \"test\" (WER):"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #pa #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.12\n* num\\_epochs: 120\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_7\\_0' with split 'test'",
"### Inference With LM",
"### Eval results on Common Voice 7 \"test\" (WER):"
] |
[
99,
159,
4,
35,
36,
8,
15
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #pa #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.12\n* num\\_epochs: 120\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_7\\_0' with split 'test'### Inference With LM### Eval results on Common Voice 7 \"test\" (WER):"
] |
[
-0.11245634406805038,
0.14045056700706482,
-0.00672190310433507,
0.05004144087433815,
0.06940706074237823,
0.022515803575515747,
0.10258011519908905,
0.1744121015071869,
-0.030650513246655464,
0.12924687564373016,
0.06367570906877518,
0.0751115083694458,
0.0794990211725235,
0.10107619315385818,
-0.028302013874053955,
-0.2467578947544098,
0.017067739740014076,
-0.055273935198783875,
-0.06561286002397537,
0.10759612917900085,
0.09699016809463501,
-0.08664147555828094,
0.02629329450428486,
0.01324778888374567,
-0.038915812969207764,
-0.007026415318250656,
-0.04954807460308075,
-0.043344397097826004,
0.08249439299106598,
0.027065496891736984,
0.0199599526822567,
0.036082904785871506,
0.06556247919797897,
-0.2887963354587555,
0.0049750227481126785,
0.05155511945486069,
0.036396969109773636,
0.060046762228012085,
0.1253655105829239,
-0.036260947585105896,
0.104182168841362,
-0.06968729943037033,
0.043424781411886215,
0.060822710394859314,
-0.08779750019311905,
-0.23304229974746704,
-0.07891663908958435,
0.04443212226033211,
0.1396932452917099,
0.054969873279333115,
-0.04263925552368164,
0.048293519765138626,
-0.07465485483407974,
0.10282433032989502,
0.24567663669586182,
-0.22974279522895813,
-0.054323893040418625,
-0.027588779106736183,
0.034885358065366745,
0.010763837024569511,
-0.10323233157396317,
-0.021927090361714363,
0.0024011770728975534,
0.015013355761766434,
0.05705874040722847,
-0.008304258808493614,
0.022177832201123238,
0.0056321085430681705,
-0.13831129670143127,
-0.05994615703821182,
0.1122799888253212,
0.06632481515407562,
-0.029091227799654007,
-0.1276095062494278,
-0.020142370834946632,
-0.17925111949443817,
-0.04367257282137871,
0.01706126146018505,
0.016744190827012062,
-0.03897278383374214,
-0.01669379509985447,
0.04597422853112221,
-0.05371829494833946,
-0.0714874342083931,
0.052393510937690735,
0.12055506557226181,
0.0506608709692955,
-0.04076574742794037,
0.022239886224269867,
0.0967799499630928,
0.03471849113702774,
-0.17256200313568115,
-0.02318393439054489,
0.04323524981737137,
-0.12307507544755936,
-0.008691461756825447,
-0.004985221195966005,
-0.02290564402937889,
0.09064342081546783,
0.16078519821166992,
-0.0006410173373296857,
0.11156098544597626,
0.012479733675718307,
0.0009422143921256065,
-0.040417082607746124,
0.1602443903684616,
-0.046398553997278214,
-0.0901256874203682,
-0.04836001619696617,
0.12801863253116608,
-0.0060109905898571014,
-0.0024035267997533083,
-0.04337774217128754,
0.04320317879319191,
0.12468218803405762,
0.08107805997133255,
0.010786814615130424,
0.012496305629611015,
-0.08051120489835739,
-0.023991920053958893,
0.006882714573293924,
-0.14464335143566132,
0.05462834611535072,
0.09089572727680206,
-0.06534317880868912,
-0.03284454718232155,
-0.003953608684241772,
-0.003245751606300473,
-0.05465424060821533,
0.09251643717288971,
-0.03535405173897743,
0.006233677268028259,
-0.046742722392082214,
-0.07791467010974884,
0.03726823255419731,
-0.029665004462003708,
-0.021975819021463394,
-0.02971755713224411,
-0.06845061480998993,
-0.08273348212242126,
0.050076860934495926,
-0.08042620867490768,
-0.05902274325489998,
-0.07908090949058533,
-0.08621159195899963,
0.049405649304389954,
-0.011331590823829174,
0.12795163691043854,
-0.05742824822664261,
0.080889031291008,
0.029704000800848007,
0.04150449112057686,
0.15173940360546112,
0.0655321553349495,
-0.033191122114658356,
0.06530273705720901,
-0.15269827842712402,
0.11174158751964569,
-0.12698765099048615,
0.03650471195578575,
-0.15962421894073486,
-0.09733737260103226,
0.030378947034478188,
-0.006583949085325003,
0.09178431332111359,
0.1453694999217987,
-0.1877889335155487,
-0.0725560113787651,
0.14146198332309723,
-0.05098005756735802,
-0.0799875557422638,
0.14578215777873993,
-0.012499484233558178,
-0.048275016248226166,
0.009074525907635689,
0.2043195217847824,
0.12224485725164413,
-0.11862750351428986,
0.007338433992117643,
-0.056986719369888306,
0.09102804958820343,
0.10183688253164291,
0.08087235689163208,
-0.07178660482168198,
0.05040428042411804,
-0.003323990385979414,
-0.04064280167222023,
0.015095277689397335,
-0.0631430596113205,
-0.08115639537572861,
-0.005171910859644413,
-0.0518733449280262,
0.0014340318739414215,
0.03607972338795662,
-0.016422918066382408,
-0.09051652252674103,
-0.13727901875972748,
-0.027993183583021164,
0.09772031009197235,
-0.09588884562253952,
0.016001615673303604,
-0.08962088078260422,
0.08063482493162155,
0.000804746407084167,
0.00028235907666385174,
-0.13623380661010742,
-0.009298770688474178,
0.053720440715551376,
-0.09664763510227203,
0.02621694654226303,
-0.037588056176900864,
0.06775562465190887,
0.030456561595201492,
-0.017798254266381264,
-0.06904982775449753,
-0.002745463978499174,
-0.005033280234783888,
-0.04791593179106712,
-0.24707655608654022,
-0.0735137015581131,
-0.013981075957417488,
0.15803849697113037,
-0.18073804676532745,
0.0011649079388007522,
0.0792606994509697,
0.12669244408607483,
0.005631860811263323,
-0.05083651840686798,
0.04815458506345749,
0.038406018167734146,
-0.01361959706991911,
-0.06261864304542542,
0.011773129925131798,
-0.008616579696536064,
-0.10593659430742264,
0.006340019404888153,
-0.15630820393562317,
0.01847062259912491,
0.058907605707645416,
0.03700202330946922,
-0.08746848255395889,
-0.03275582566857338,
-0.056478675454854965,
-0.061623819172382355,
-0.026241259649395943,
-0.01956660859286785,
0.1881571114063263,
0.04468943923711777,
0.08523272722959518,
-0.07799221575260162,
-0.07078060507774353,
0.015783989802002907,
-0.009857785888016224,
-0.006784431170672178,
0.17073489725589752,
0.028056101873517036,
-0.050536803901195526,
0.07450298219919205,
0.01198025792837143,
-0.05543499439954758,
0.1203010305762291,
-0.08284074068069458,
-0.08883623778820038,
-0.05988456308841705,
0.06668800860643387,
0.030814262107014656,
0.09257695078849792,
-0.15978319942951202,
-0.00431053014472127,
0.03762273117899895,
0.0145104443654418,
0.01500229723751545,
-0.16503198444843292,
0.01797417923808098,
0.03244154527783394,
-0.10210499167442322,
-0.0031584277749061584,
0.020655380561947823,
0.01163454633206129,
0.070149265229702,
-0.009648209437727928,
-0.08699280768632889,
-0.036008287221193314,
-0.056253187358379364,
-0.10158663988113403,
0.1629098653793335,
-0.08468776941299438,
-0.1402529627084732,
-0.113399438560009,
-0.006503035314381123,
-0.039728179574012756,
-0.03191514313220978,
0.05327799543738365,
-0.07926063984632492,
-0.05613083392381668,
-0.0809083878993988,
0.00885444600135088,
0.005897382739931345,
0.014448810368776321,
0.01808829791843891,
-0.00901571661233902,
0.0851585641503334,
-0.1146194338798523,
-0.00019794850959442556,
-0.0070798653177917,
-0.008935500867664814,
0.008099104277789593,
0.04844490438699722,
0.08528806269168854,
0.15859393775463104,
0.0592525340616703,
0.04755334183573723,
-0.010614276863634586,
0.2074868232011795,
-0.13032546639442444,
0.011525428853929043,
0.08886340260505676,
-0.030886828899383545,
0.06051582098007202,
0.1660110056400299,
0.03606991469860077,
-0.07513175159692764,
-0.0025385767221450806,
0.049354299902915955,
-0.0003276266506873071,
-0.24713774025440216,
-0.030300848186016083,
-0.06300315260887146,
-0.001430955482646823,
0.08038225769996643,
0.04461485147476196,
-0.002557019703090191,
-0.0051619005389511585,
-0.009402718394994736,
-0.06620501726865768,
0.04964839294552803,
0.0550570972263813,
0.08681250363588333,
0.04863565042614937,
0.09206277132034302,
-0.012893861159682274,
-0.03745071962475777,
0.021604206413030624,
0.007318247575312853,
0.19850389659404755,
-0.009011231362819672,
0.21757863461971283,
0.06126993149518967,
0.12678448855876923,
-0.021923739463090897,
0.036001045256853104,
-0.02291872352361679,
0.01800602674484253,
0.03906123712658882,
-0.06212371587753296,
-0.014772066846489906,
0.026134783402085304,
0.12861734628677368,
0.005441139917820692,
-0.06442048400640488,
0.020387127995491028,
0.061649199575185776,
0.32746952772140503,
0.07803133130073547,
-0.2357099950313568,
-0.04832262545824051,
0.017119161784648895,
-0.0916917473077774,
-0.02819383516907692,
0.011407779529690742,
0.10212969034910202,
-0.09363039582967758,
0.0960041955113411,
-0.051092084497213364,
0.08364000916481018,
-0.08647947758436203,
0.00842986535280943,
0.0758126899600029,
0.09560301899909973,
0.010382615961134434,
0.061123114079236984,
-0.2214011698961258,
0.24624307453632355,
-0.006224492099136114,
0.05756525695323944,
-0.05538330599665642,
0.05531027540564537,
0.023979175835847855,
-0.07582004368305206,
0.10077223926782608,
-0.010207371786236763,
-0.07440449297428131,
-0.1401628851890564,
-0.1115417554974556,
0.006117525976151228,
0.12497298419475555,
-0.059121739119291306,
0.12389809638261795,
-0.026602886617183685,
-0.06497231125831604,
0.016872141510248184,
-0.08635558933019638,
-0.09490044414997101,
-0.08154220879077911,
0.06851081550121307,
-0.005579586606472731,
0.04509405046701431,
-0.06512537598609924,
-0.08235415071249008,
-0.10161108523607254,
0.1483224332332611,
-0.12904593348503113,
-0.04588289558887482,
-0.11831609159708023,
0.008898249827325344,
0.19519782066345215,
-0.07419474422931671,
0.027216698974370956,
0.01987386867403984,
0.11614291369915009,
0.03488237038254738,
-0.004769372753798962,
0.10095962882041931,
-0.07478746026754379,
-0.21371091902256012,
-0.036124248057603836,
0.21705973148345947,
0.030139438807964325,
0.06762523204088211,
-0.022817203775048256,
0.024158397689461708,
-0.011456659995019436,
-0.08512549102306366,
0.08099406957626343,
0.0031768344342708588,
-0.029457494616508484,
0.044419508427381516,
-0.013798994943499565,
-0.028635799884796143,
-0.10357546806335449,
-0.03933241590857506,
0.10455001890659332,
0.2664998471736908,
-0.07158651947975159,
0.07858521491289139,
0.018268857151269913,
-0.0698678269982338,
-0.11858959496021271,
-0.04378610476851463,
0.13105592131614685,
0.02790982276201248,
0.005543719977140427,
-0.16792190074920654,
0.024966703727841377,
0.05390065908432007,
-0.01891671121120453,
0.08570022881031036,
-0.332704097032547,
-0.13784977793693542,
0.10449747741222382,
0.03084203042089939,
-0.09058986604213715,
-0.17049913108348846,
-0.08488649874925613,
0.0027095135301351547,
-0.07761450856924057,
0.013845874927937984,
0.011756157502532005,
0.12278129160404205,
0.0009843313600867987,
0.053862862288951874,
0.03562416508793831,
-0.04602830484509468,
0.15144740045070648,
0.039887990802526474,
0.032862745225429535,
-0.013308539986610413,
0.006057560909539461,
-0.03131125494837761,
-0.06293518841266632,
0.0660700872540474,
-0.09824154525995255,
0.011137272231280804,
-0.14364396035671234,
-0.01267995499074459,
-0.07883813232183456,
0.01885625533759594,
-0.05673668533563614,
0.00023565918672829866,
-0.010344413109123707,
0.032116856426000595,
0.09784960001707077,
0.013881350867450237,
0.08832640945911407,
-0.06591694802045822,
0.09278222173452377,
0.16298694908618927,
0.11822676658630371,
0.023655647411942482,
-0.15534822642803192,
0.01118767261505127,
0.021963272243738174,
0.005723385140299797,
-0.09930756688117981,
0.06332306563854218,
0.14612187445163727,
0.04474986717104912,
0.15787751972675323,
0.021155381575226784,
-0.10940172523260117,
-0.01340445876121521,
0.05657130852341652,
-0.094312883913517,
-0.1442098319530487,
-0.0030712883453816175,
-0.006585125345736742,
-0.13720862567424774,
-0.01136010978370905,
0.11093033850193024,
-0.009695151820778847,
-0.010933837853372097,
0.028465887531638145,
0.07676367461681366,
-0.02679833397269249,
0.2322431206703186,
0.022890061140060425,
0.1072256788611412,
-0.09862816333770752,
0.07014428079128265,
0.049356963485479355,
-0.07164106518030167,
0.053922541439533234,
0.12861204147338867,
-0.03427950292825699,
-0.02389654330909252,
0.023255927488207817,
0.11174075305461884,
0.06214878708124161,
-0.0421031191945076,
-0.13529632985591888,
-0.13889893889427185,
0.08277104049921036,
0.08432275056838989,
0.03083704598248005,
0.027282044291496277,
-0.008413740433752537,
0.022304361686110497,
-0.07594148814678192,
0.13951584696769714,
0.14130456745624542,
0.056691378355026245,
-0.11064379662275314,
0.08268177509307861,
-0.006218964233994484,
0.0021686777472496033,
0.007149994373321533,
-0.008011238649487495,
-0.10778632014989853,
0.026555173099040985,
-0.10433085262775421,
0.007684343960136175,
-0.06634923070669174,
0.01536859292536974,
0.033127255737781525,
-0.05807272344827652,
-0.04126893728971481,
0.028130708262324333,
-0.11535592377185822,
-0.05406241491436958,
-0.02866152487695217,
0.08002467453479767,
-0.12268040329217911,
-0.01810983568429947,
0.04284881800413132,
-0.13598409295082092,
0.11342363804578781,
0.04293788969516754,
-0.010275966487824917,
-0.00550507428124547,
-0.10305380821228027,
-0.014275636523962021,
0.02621980756521225,
0.014210829511284828,
0.024354618042707443,
-0.229151651263237,
-0.009497380815446377,
-0.028619101271033287,
-0.0013073254376649857,
-0.008624991402029991,
0.026842588558793068,
-0.1314302384853363,
0.014058292843401432,
-0.04213867709040642,
-0.055654097348451614,
-0.04802204668521881,
0.04381652921438217,
0.06670571118593216,
0.014528483152389526,
0.16294145584106445,
-0.07426740974187851,
0.09281470626592636,
-0.21221670508384705,
-0.0027960599400103092,
0.001689846976660192,
-0.05053297430276871,
-0.05257152020931244,
-0.005679132416844368,
0.10735156387090683,
-0.07201151549816132,
0.0736897885799408,
-0.027182143181562424,
0.0332387238740921,
0.01940663903951645,
-0.057889021933078766,
0.02403273433446884,
0.05405494198203087,
0.09176427870988846,
0.011450247839093208,
-0.02729588747024536,
0.05769334360957146,
-0.05206817388534546,
0.03114047460258007,
0.06052408367395401,
0.1339629590511322,
0.13190104067325592,
0.07145209610462189,
0.055502790957689285,
0.10307155549526215,
-0.13665327429771423,
-0.12291905283927917,
0.15284772217273712,
-0.06785032898187637,
0.14804880321025848,
-0.02438388019800186,
0.2009839415550232,
0.08285267651081085,
-0.20002664625644684,
0.09127886593341827,
-0.039270855486392975,
-0.09637121856212616,
-0.09897544980049133,
-0.10745395720005035,
-0.08333611488342285,
-0.14716662466526031,
0.01751581020653248,
-0.0950690284371376,
0.08324170857667923,
0.027580825611948967,
0.04766019433736801,
0.026339512318372726,
0.11007364839315414,
0.029417362064123154,
-0.00942905806005001,
0.10731200873851776,
0.015805872157216072,
-0.02331484481692314,
-0.010269073769450188,
-0.057194389402866364,
0.04596858099102974,
-0.009202132001519203,
0.0723760798573494,
-0.015387899242341518,
-0.06511266529560089,
0.04612809047102928,
-0.018657518550753593,
-0.11267267912626266,
0.030197303742170334,
-0.03550956770777702,
0.044295504689216614,
0.09735417366027832,
0.052922323346138,
-0.0027769175358116627,
-0.012242435477674007,
0.17088323831558228,
-0.08128952980041504,
-0.07209061831235886,
-0.13058245182037354,
0.17403405904769897,
-0.007115475367754698,
-0.0033050929196178913,
0.01693473756313324,
-0.0840734988451004,
-0.011230146512389183,
0.15025359392166138,
0.14611980319023132,
-0.00740245683118701,
-0.014705430716276169,
0.010800282470881939,
0.0006451696390286088,
-0.0024829371832311153,
0.030756399035453796,
0.10516491532325745,
0.07908362150192261,
-0.033066123723983765,
0.003625663463026285,
-0.03044547699391842,
-0.07479395717382431,
-0.04769953712821007,
0.09015580266714096,
0.022617952898144722,
-0.0037558069452643394,
-0.013564562425017357,
0.1229727566242218,
-0.08976138383150101,
-0.18122996389865875,
-0.003481443738564849,
-0.17521394789218903,
-0.1897459328174591,
-0.03160497546195984,
0.08712006360292435,
0.05133780837059021,
0.044303979724645615,
0.007949157617986202,
-0.051930394023656845,
0.12966269254684448,
0.008229522034525871,
-0.03720531612634659,
-0.08463000506162643,
0.0619136206805706,
-0.15265081822872162,
0.17391137778759003,
-0.011210743337869644,
0.06469902396202087,
0.12247860431671143,
0.02699039690196514,
-0.09433416277170181,
0.040696244686841965,
0.10541443526744843,
-0.15046989917755127,
0.05747905373573303,
0.20214535295963287,
-0.014818795956671238,
0.13183455169200897,
0.04857923835515976,
-0.04781961813569069,
-0.0012796216178685427,
-0.05115547776222229,
-0.015386510640382767,
-0.07288754731416702,
0.00021402562560979277,
-0.051242031157016754,
0.11685330420732498,
0.2064172923564911,
-0.07335717231035233,
-0.004300159402191639,
-0.04931723698973656,
0.009760569781064987,
0.021351207047700882,
0.11064627766609192,
-0.04552316665649414,
-0.266695499420166,
0.05245436355471611,
-0.010887286625802517,
0.04393012821674347,
-0.16261780261993408,
-0.09700090438127518,
0.038180045783519745,
-0.04112456366419792,
-0.06559102982282639,
0.11738674342632294,
0.05846186354756355,
0.047365326434373856,
-0.04390159249305725,
-0.08001263439655304,
-0.01007922738790512,
0.17385458946228027,
-0.17300182580947876,
-0.0446573905646801
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-large-xls-r-300m-ur-cv8
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1443
- Wer: 0.5677
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the hedged `TrainingArguments` sketch after this list):
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 200
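As a hedged illustration, the same settings could be written as `TrainingArguments` roughly as follows (the `output_dir` is an assumed name and the real training script may differ):
```python
# Hypothetical sketch of the hyperparameters above; not the original training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./wav2vec2-large-xls-r-300m-ur-cv8",  # assumed output path
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # effective train batch size of 32
    num_train_epochs=200,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    seed=42,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults
)
```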
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:------:|:----:|:---------------:|:------:|
| 3.6269 | 15.98 | 400 | 3.3246 | 1.0 |
| 3.0546 | 31.98 | 800 | 2.8148 | 0.9963 |
| 1.4589 | 47.98 | 1200 | 1.0237 | 0.6584 |
| 1.0911 | 63.98 | 1600 | 0.9524 | 0.5966 |
| 0.8879 | 79.98 | 2000 | 0.9827 | 0.5822 |
| 0.7467 | 95.98 | 2400 | 0.9923 | 0.5840 |
| 0.6427 | 111.98 | 2800 | 0.9988 | 0.5714 |
| 0.5685 | 127.98 | 3200 | 1.0872 | 0.5807 |
| 0.5068 | 143.98 | 3600 | 1.1194 | 0.5822 |
| 0.463 | 159.98 | 4000 | 1.1138 | 0.5692 |
| 0.4212 | 175.98 | 4400 | 1.1232 | 0.5714 |
| 0.4056 | 191.98 | 4800 | 1.1443 | 0.5677 |
### Framework versions
- Transformers 4.16.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.1
- Tokenizers 0.11.0
#### Evaluation Commands
1. To evaluate on `mozilla-foundation/common_voice_8_0` with split `test`
```bash
python eval.py --model_id anuragshas/wav2vec2-large-xls-r-300m-ur-cv8 --dataset mozilla-foundation/common_voice_8_0 --config ur --split test
```
### Inference With LM
```python
import torch
from datasets import load_dataset
from transformers import AutoModelForCTC, AutoProcessor
import torchaudio.functional as F

model_id = "anuragshas/wav2vec2-large-xls-r-300m-ur-cv8"

# Stream a single sample from the Common Voice 8 Urdu test split (requires an authenticated HF token)
sample_iter = iter(load_dataset("mozilla-foundation/common_voice_8_0", "ur", split="test", streaming=True, use_auth_token=True))
sample = next(sample_iter)

# Common Voice audio is 48 kHz; the model expects 16 kHz input
resampled_audio = F.resample(torch.tensor(sample["audio"]["array"]), 48_000, 16_000).numpy()

model = AutoModelForCTC.from_pretrained(model_id)
processor = AutoProcessor.from_pretrained(model_id)

input_values = processor(resampled_audio, return_tensors="pt").input_values
with torch.no_grad():
    logits = model(input_values).logits

# The processor bundles an n-gram LM, so decoding works directly on the raw logits
transcription = processor.batch_decode(logits.numpy()).text
# => "اب نے ٹ پیس ان لیتے ہیں"
```
### Eval results on Common Voice 8 "test" (WER):
| Without LM | With LM (run `./eval.py`) |
|---|---|
| 52.146 | 42.376 |
|
{"language": ["ur"], "license": "apache-2.0", "tags": ["generated_from_trainer", "robust-speech-event", "hf-asr-leaderboard"], "datasets": ["mozilla-foundation/common_voice_8_0"], "metrics": ["wer"], "model-index": [{"name": "wav2vec2-large-xls-r-300m-ur-cv8", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice 8", "type": "mozilla-foundation/common_voice_8_0", "args": "ur"}, "metrics": [{"type": "wer", "value": 42.376, "name": "Test WER"}, {"type": "cer", "value": 18.18, "name": "Test CER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-large-xls-r-300m-ur-cv8
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"robust-speech-event",
"hf-asr-leaderboard",
"ur",
"dataset:mozilla-foundation/common_voice_8_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"ur"
] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #ur #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
wav2vec2-large-xls-r-300m-ur-cv8
================================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the common\_voice dataset.
It achieves the following results on the evaluation set:
* Loss: 1.1443
* Wer: 0.5677
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0001
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1000
* num\_epochs: 200
### Training results
### Framework versions
* Transformers 4.16.0
* Pytorch 1.10.0+cu111
* Datasets 1.18.1
* Tokenizers 0.11.0
#### Evaluation Commands
1. To evaluate on 'mozilla-foundation/common\_voice\_8\_0' with split 'test'
### Inference With LM
### Eval results on Common Voice 8 "test" (WER):
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 200",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.11.0",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'",
"### Inference With LM",
"### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #ur #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 200",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.11.0",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'",
"### Inference With LM",
"### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
99,
143,
4,
35,
36,
8,
15
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #ur #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 200### Training results### Framework versions\n\n\n* Transformers 4.16.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.11.0#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'### Inference With LM### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
-0.10757747292518616,
0.14408108592033386,
-0.006464700680226088,
0.058358222246170044,
0.0882740393280983,
0.02091686800122261,
0.11482715606689453,
0.1756567507982254,
-0.030490919947624207,
0.12567636370658875,
0.07496508210897446,
0.09425866603851318,
0.07624086737632751,
0.10209965705871582,
-0.01845158077776432,
-0.2282084971666336,
0.0036660071928054094,
-0.05748185142874718,
-0.09075962007045746,
0.1095021441578865,
0.09184785187244415,
-0.09143803268671036,
0.04797894507646561,
0.004145409911870956,
-0.06518387049436569,
-0.010324294678866863,
-0.038306184113025665,
-0.028818927705287933,
0.07784882932901382,
0.03391546756029129,
0.015429848805069923,
0.019552676007151604,
0.07595376670360565,
-0.29467344284057617,
0.003403717651963234,
0.06753074377775192,
0.04242760315537453,
0.05656161159276962,
0.11306365579366684,
-0.029463117942214012,
0.10565105080604553,
-0.06906453520059586,
0.03481170907616615,
0.0698208436369896,
-0.08744000643491745,
-0.2121034562587738,
-0.08654235303401947,
0.03034208156168461,
0.14424413442611694,
0.05952318385243416,
-0.03677384927868843,
0.0427345409989357,
-0.07240239530801773,
0.09994751214981079,
0.24355973303318024,
-0.21140719950199127,
-0.05877956375479698,
-0.0014602208975702524,
0.02830866165459156,
0.0031171594746410847,
-0.11324532330036163,
-0.010186218656599522,
0.014204973354935646,
0.004029125440865755,
0.0655580684542656,
-0.013807340525090694,
0.04840444028377533,
0.01435869187116623,
-0.13624198734760284,
-0.05732749029994011,
0.1294671893119812,
0.070633165538311,
-0.027128733694553375,
-0.12206510454416275,
-0.025531353428959846,
-0.16623544692993164,
-0.03877748176455498,
0.03165440261363983,
0.02001669630408287,
-0.03860308602452278,
-0.005757217761129141,
0.03651830926537514,
-0.04833080247044563,
-0.07534590363502502,
0.056173522025346756,
0.12199821323156357,
0.0395657904446125,
-0.03443465381860733,
0.02015076018869877,
0.10094667226076126,
0.049440596252679825,
-0.175608828663826,
-0.020868655294179916,
0.03579539805650711,
-0.1372351199388504,
-0.006858071777969599,
0.0016044595977291465,
0.02445540763437748,
0.09070781618356705,
0.152878999710083,
0.03087634965777397,
0.09543570131063461,
0.02204941026866436,
0.009270790964365005,
-0.04323296993970871,
0.16340267658233643,
-0.05399494618177414,
-0.11740931123495102,
-0.046704087406396866,
0.13715553283691406,
-0.006735030561685562,
-0.0016193927731364965,
-0.048510853201150894,
0.04816232621669769,
0.0963229238986969,
0.08581320196390152,
0.02342338301241398,
0.018432198092341423,
-0.08042681217193604,
-0.0196117851883173,
-0.013109195977449417,
-0.14612050354480743,
0.061336684972047806,
0.08659293502569199,
-0.0743376761674881,
-0.03568415343761444,
-0.013306626118719578,
-0.008138217963278294,
-0.04714992642402649,
0.07096131145954132,
-0.05250727757811546,
0.007093525025993586,
-0.07482893019914627,
-0.08451614528894424,
0.029529664665460587,
-0.0280449241399765,
-0.012570232152938843,
-0.038496095687150955,
-0.08566024899482727,
-0.08213016390800476,
0.062089696526527405,
-0.07904637604951859,
-0.050514884293079376,
-0.07554460316896439,
-0.10092750936746597,
0.04548059031367302,
-0.00013566332927439362,
0.1386173516511917,
-0.0540221706032753,
0.08572398126125336,
0.029602322727441788,
0.04147988557815552,
0.15445879101753235,
0.060064103454351425,
-0.03038240410387516,
0.06862933188676834,
-0.14312072098255157,
0.09976058453321457,
-0.1363883912563324,
0.05868600308895111,
-0.1562924087047577,
-0.0838865265250206,
0.03482744097709656,
-0.0006896950071677566,
0.08794272691011429,
0.15429902076721191,
-0.19150498509407043,
-0.06341800838708878,
0.12707003951072693,
-0.039615318179130554,
-0.09265217185020447,
0.14022009074687958,
-0.010737596079707146,
-0.03089338168501854,
0.0037031699903309345,
0.17895561456680298,
0.13733498752117157,
-0.1084432378411293,
0.0013595864875242114,
-0.03721979260444641,
0.09387092292308807,
0.08642715215682983,
0.08500120788812637,
-0.06083673983812332,
0.05480077490210533,
-0.0018823111895471811,
-0.05388493090867996,
0.017012542113661766,
-0.06738278269767761,
-0.08111342787742615,
-0.011540133506059647,
-0.05687011778354645,
-0.01153893768787384,
0.03925534337759018,
-0.022310873493552208,
-0.07997782528400421,
-0.13348840177059174,
-0.03944214805960655,
0.10675643384456635,
-0.08264102786779404,
0.007476592902094126,
-0.08427475392818451,
0.08793777972459793,
-0.006362949497997761,
0.011620293371379375,
-0.13074354827404022,
-0.022587662562727928,
0.0474935807287693,
-0.07952302694320679,
0.0070307874120771885,
-0.012337147258222103,
0.05585391819477081,
0.031237032264471054,
-0.011392293497920036,
-0.05934828519821167,
-0.018964996561408043,
-0.014547315426170826,
-0.05153565853834152,
-0.2521437704563141,
-0.07011409103870392,
-0.019195904955267906,
0.14674384891986847,
-0.1973729282617569,
0.010585225187242031,
0.07844589650630951,
0.11126149445772171,
0.013737855479121208,
-0.05245937407016754,
0.03857937827706337,
0.021839899942278862,
-0.03329439088702202,
-0.07288936525583267,
0.010295486077666283,
-0.00556570291519165,
-0.09051376581192017,
-0.005873830057680607,
-0.14134526252746582,
0.04899973049759865,
0.06820026785135269,
0.033058784902095795,
-0.08026949316263199,
-0.03464656323194504,
-0.06157492473721504,
-0.05270351469516754,
-0.03945845365524292,
-0.028779860585927963,
0.14625446498394012,
0.03470587357878685,
0.08728715777397156,
-0.07778699696063995,
-0.07279287278652191,
0.02409191057085991,
-0.006431506015360355,
-0.006535131949931383,
0.1762990653514862,
0.06250029057264328,
-0.03510657325387001,
0.08210310339927673,
0.024081440642476082,
-0.043102093040943146,
0.11398568749427795,
-0.07338180392980576,
-0.08426835387945175,
-0.06613121926784515,
0.061554618179798126,
0.02965659834444523,
0.1156754195690155,
-0.1771344542503357,
-0.009867283515632153,
0.0381828248500824,
0.019341791048645973,
0.018079329282045364,
-0.16511039435863495,
0.009339535608887672,
0.03225600719451904,
-0.09435827285051346,
-0.007399470079690218,
0.01386390533298254,
-0.00660604378208518,
0.07819265127182007,
-0.00995100662112236,
-0.07496369630098343,
-0.04061255604028702,
-0.05636131763458252,
-0.10353086143732071,
0.16467435657978058,
-0.08702009171247482,
-0.1137501522898674,
-0.11868519335985184,
-0.01387450285255909,
-0.05200164020061493,
-0.020888084545731544,
0.044310543686151505,
-0.07702162861824036,
-0.05176953598856926,
-0.08114001154899597,
0.0193652231246233,
-0.001527052721939981,
0.021774213761091232,
0.03918728604912758,
0.015923401340842247,
0.06069698929786682,
-0.10622148215770721,
-0.0004247030010446906,
-0.0031856982968747616,
-0.006072748452425003,
-0.0011044733691960573,
0.05179281160235405,
0.09944064170122147,
0.15650403499603271,
0.061253391206264496,
0.05065315589308739,
-0.0027565297205001116,
0.21158263087272644,
-0.13504712283611298,
0.022118333727121353,
0.10560714453458786,
-0.003451734781265259,
0.055352408438920975,
0.15517808496952057,
0.040212422609329224,
-0.09206854552030563,
0.00930482055991888,
0.0595201775431633,
-0.008929978124797344,
-0.24037876725196838,
-0.01704457774758339,
-0.0660325437784195,
-0.0019732469227164984,
0.07996773719787598,
0.036497507244348526,
-0.030918769538402557,
0.009154125116765499,
0.00040887610521167517,
-0.0442156046628952,
0.04194730147719383,
0.04970569163560867,
0.0488014854490757,
0.03472685441374779,
0.09294700622558594,
-0.020189478993415833,
-0.027117295190691948,
0.0271061509847641,
0.010435940697789192,
0.22145697474479675,
-0.01842930167913437,
0.20062217116355896,
0.0725182369351387,
0.11192727833986282,
-0.038858577609062195,
0.03679270297288895,
-0.013532816432416439,
0.008588694036006927,
0.0341382659971714,
-0.0646681860089302,
-0.022558964788913727,
0.030619265511631966,
0.10804899781942368,
0.02295706793665886,
-0.07006178051233292,
0.04992295429110527,
0.06634046137332916,
0.3263765573501587,
0.07757969945669174,
-0.23234428465366364,
-0.046770885586738586,
0.030839215964078903,
-0.07500720024108887,
-0.02646399661898613,
0.020461568608880043,
0.121223583817482,
-0.08058421313762665,
0.07558898627758026,
-0.0463729053735733,
0.08500057458877563,
-0.08149150758981705,
0.010170506313443184,
0.08275490999221802,
0.09567577391862869,
0.008643395267426968,
0.07135855406522751,
-0.23914983868598938,
0.24792605638504028,
0.00159467663615942,
0.05799005553126335,
-0.06605523824691772,
0.06266865134239197,
0.028485551476478577,
-0.0458567850291729,
0.11131950467824936,
-0.007583832833915949,
-0.08361883461475372,
-0.14369556307792664,
-0.09624607115983963,
0.002627464011311531,
0.12650546431541443,
-0.0710148960351944,
0.1276097297668457,
-0.04297645390033722,
-0.05956451594829559,
0.015508385375142097,
-0.0737459808588028,
-0.09558546543121338,
-0.0866202861070633,
0.06457267701625824,
-0.02528884820640087,
0.013640234246850014,
-0.06876921653747559,
-0.07326298952102661,
-0.07692117989063263,
0.1674133986234665,
-0.1447107195854187,
-0.03124702163040638,
-0.12710745632648468,
0.021631285548210144,
0.1687318980693817,
-0.0714225247502327,
0.009417256340384483,
0.011897437274456024,
0.12486850470304489,
0.0392010360956192,
-0.002499809255823493,
0.10031003504991531,
-0.08610022068023682,
-0.20680227875709534,
-0.04449421167373657,
0.1918099820613861,
0.02290222980082035,
0.05851329490542412,
-0.020596938207745552,
0.00862107053399086,
-0.019760815426707268,
-0.09878826141357422,
0.08756152540445328,
0.02606154978275299,
-0.02143782377243042,
0.0508795864880085,
-0.03648762404918671,
-0.005983872804790735,
-0.09507298469543457,
-0.044313423335552216,
0.08401773124933243,
0.2726829946041107,
-0.07595560699701309,
0.04660160467028618,
0.015364292077720165,
-0.07921891659498215,
-0.13866767287254333,
-0.017189424484968185,
0.10126395523548126,
0.02922866679728031,
0.0017128295730799437,
-0.160374715924263,
0.024820268154144287,
0.05098797380924225,
-0.019764866679906845,
0.11639932543039322,
-0.33960118889808655,
-0.12747575342655182,
0.08232567459344864,
0.04845801368355751,
-0.07855556905269623,
-0.1853754222393036,
-0.07932103425264359,
-0.00898623839020729,
-0.07065945118665695,
0.015035162679851055,
0.0001862603094195947,
0.12178084254264832,
-0.004474109038710594,
0.031137023121118546,
0.03199534863233566,
-0.0580255500972271,
0.1583709418773651,
0.05225382372736931,
0.043669313192367554,
-0.017145399004220963,
-0.0022017669398337603,
0.010185386054217815,
-0.06924830377101898,
0.05302803963422775,
-0.08330613374710083,
0.022344347089529037,
-0.16349323093891144,
-0.00877483282238245,
-0.07230795919895172,
0.013853809796273708,
-0.06516898423433304,
-0.012764030136168003,
-0.022528158500790596,
0.045542825013399124,
0.09700196236371994,
0.018329260870814323,
0.08360325545072556,
-0.06241149827837944,
0.07300079613924026,
0.16575653851032257,
0.10132681578397751,
0.01451276708394289,
-0.1419866383075714,
0.012139381840825081,
0.030068689957261086,
0.00881699938327074,
-0.1281881183385849,
0.06237832084298134,
0.14131057262420654,
0.05326411873102188,
0.1477005034685135,
0.02906077355146408,
-0.10004866123199463,
-0.009798102080821991,
0.057650644332170486,
-0.08642219007015228,
-0.14811457693576813,
-0.007601221092045307,
-0.013263192027807236,
-0.14421498775482178,
-0.012324814684689045,
0.1110752522945404,
-0.004124731756746769,
0.0067032454535365105,
0.022584009915590286,
0.08027875423431396,
-0.02531678043305874,
0.22813330590724945,
0.028589636087417603,
0.11212082207202911,
-0.09125503152608871,
0.07405366003513336,
0.030414359644055367,
-0.05695997551083565,
0.03518490120768547,
0.11373865604400635,
-0.04227743297815323,
-0.035682644695043564,
0.011713828891515732,
0.10692642629146576,
0.05672736465930939,
-0.029477152973413467,
-0.14850777387619019,
-0.1326638162136078,
0.0846685990691185,
0.055787019431591034,
0.041804272681474686,
0.026714086532592773,
-0.00976475141942501,
0.020781705155968666,
-0.08361242711544037,
0.1354900300502777,
0.11716112494468689,
0.0525643415749073,
-0.11768583208322525,
0.06255132704973221,
-0.00949015375226736,
0.0176945049315691,
0.0033565633930265903,
0.007011534180492163,
-0.11732476949691772,
0.02857157774269581,
-0.08350543677806854,
-0.008647048845887184,
-0.0611053965985775,
0.009944849647581577,
0.027078205719590187,
-0.06438661366701126,
-0.05608895421028137,
0.018467212095856667,
-0.11246246099472046,
-0.05711996927857399,
-0.025817980989813805,
0.08410273492336273,
-0.11499685049057007,
-0.019261149689555168,
0.04369517043232918,
-0.14330239593982697,
0.10902296751737595,
0.01919718086719513,
0.005288832820951939,
-0.01405448466539383,
-0.08871209621429443,
0.0036647010128945112,
0.0180224422365427,
0.018299024552106857,
0.03440456837415695,
-0.22296158969402313,
0.00022208812879398465,
-0.0423775240778923,
0.00920204445719719,
-0.008704772219061852,
0.019810905680060387,
-0.13137586414813995,
0.02004496566951275,
-0.04393991455435753,
-0.058099281042814255,
-0.049329955130815506,
0.055352602154016495,
0.06879248470067978,
0.009316395036876202,
0.16743025183677673,
-0.05250097066164017,
0.08348938822746277,
-0.2164972871541977,
0.00019849547243211418,
0.011982135474681854,
-0.046102214604616165,
-0.02193743735551834,
-0.014371059834957123,
0.10777409374713898,
-0.07368101179599762,
0.08813139796257019,
-0.014392731711268425,
0.044318705797195435,
0.02653258852660656,
-0.0722595825791359,
0.03299187123775482,
0.05538037419319153,
0.08484061062335968,
0.01123817078769207,
-0.007621478289365768,
0.059405747801065445,
-0.044414911419153214,
0.043790824711322784,
0.05304263159632683,
0.14596202969551086,
0.13379983603954315,
0.07249534875154495,
0.058064550161361694,
0.09381629526615143,
-0.14572939276695251,
-0.13430200517177582,
0.165110245347023,
-0.07892578095197678,
0.14936095476150513,
-0.03908616676926613,
0.1800820529460907,
0.10202600806951523,
-0.19361726939678192,
0.08665910363197327,
-0.04054299369454384,
-0.09087002277374268,
-0.10118266940116882,
-0.09809454530477524,
-0.07526426017284393,
-0.15049944818019867,
0.017727931961417198,
-0.09720492362976074,
0.08065757155418396,
0.03205240145325661,
0.0427926667034626,
0.02372693456709385,
0.09822934120893478,
0.023119531571865082,
0.000766894721891731,
0.09836971014738083,
0.01451166346669197,
-0.03402582183480263,
-0.016936112195253372,
-0.06741315871477127,
0.03374660015106201,
-0.026223205029964447,
0.07431352138519287,
-0.01223370898514986,
-0.0774151161313057,
0.049091171473264694,
-0.014932515099644661,
-0.09348541498184204,
0.021739933639764786,
-0.04184642806649208,
0.034414831548929214,
0.08641866594552994,
0.04718952253460884,
-0.007482192013412714,
-0.011792146600782871,
0.19234733283519745,
-0.08336540311574936,
-0.060720618814229965,
-0.13474026322364807,
0.14111216366291046,
-0.006089285481721163,
0.012969768606126308,
0.006079475861042738,
-0.08080574125051498,
-0.01978529989719391,
0.18346400558948517,
0.14106327295303345,
-0.023343510925769806,
-0.0216364786028862,
0.02276957966387272,
0.0013642699923366308,
-0.017612086609005928,
0.043601781129837036,
0.10935734212398529,
0.06501543521881104,
-0.02946236915886402,
-0.01088548731058836,
-0.03082779236137867,
-0.07403858751058578,
-0.04672287777066231,
0.09938789904117584,
0.028427330777049065,
0.011336137540638447,
-0.01799163967370987,
0.11535775661468506,
-0.0687180906534195,
-0.18121853470802307,
0.019795624539256096,
-0.16802820563316345,
-0.18238703906536102,
-0.029298849403858185,
0.08210702985525131,
0.03327011689543724,
0.06248289719223976,
-0.0008312232093885541,
-0.04979183152318001,
0.13297174870967865,
-0.0008111000061035156,
-0.044475000351667404,
-0.09773227572441101,
0.055469244718551636,
-0.14113079011440277,
0.19135235249996185,
-0.02733459137380123,
0.03267006203532219,
0.14070332050323486,
0.007566956337541342,
-0.1127932146191597,
0.0348251648247242,
0.10454117506742477,
-0.1313386857509613,
0.06194941699504852,
0.18650034070014954,
-0.022919312119483948,
0.13054773211479187,
0.04937601834535599,
-0.04021186754107475,
-0.003764211433008313,
-0.055646903812885284,
-0.01718817465007305,
-0.08309563994407654,
0.0031059577595442533,
-0.05544744431972504,
0.12314736098051071,
0.20748762786388397,
-0.07469107955694199,
0.007837769575417042,
-0.04912877455353737,
0.02571818418800831,
0.008650042116641998,
0.11916550993919373,
-0.043065592646598816,
-0.2626388370990753,
0.04788418859243393,
-0.002156444825232029,
0.03214592486619949,
-0.17244850099086761,
-0.09227835386991501,
0.04611066356301308,
-0.04459280148148537,
-0.06215761974453926,
0.1210128590464592,
0.0630514919757843,
0.04916921630501747,
-0.0525679886341095,
-0.13638940453529358,
-0.011445970274508,
0.1706196665763855,
-0.16308709979057312,
-0.05651654675602913
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-large-xls-r-300m-ur
This model is a fine-tuned version of [anuragshas/wav2vec2-large-xls-r-300m-ur](https://huggingface.co/anuragshas/wav2vec2-large-xls-r-300m-ur) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 2.0508
- Wer: 0.7328
## Model description
More information needed
## Intended uses & limitations
More information needed
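Although the card leaves this section blank, a minimal inference sketch is given below. It assumes the checkpoint follows the standard Wav2Vec2 CTC layout used by XLS-R fine-tunes and ships a processor; the audio path is a placeholder.
```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Load the fine-tuned checkpoint (assumes a standard CTC head and bundled processor)
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xls-r-300m-ur")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xls-r-300m-ur")

# Read an audio file and resample it to the 16 kHz rate the model expects
speech, sampling_rate = torchaudio.load("sample.wav")  # placeholder path
speech = torchaudio.transforms.Resample(sampling_rate, 16_000)(speech).squeeze().numpy()

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```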
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7.5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.12
- num_epochs: 240
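For orientation, these settings map roughly onto a `transformers` `TrainingArguments` configuration like the sketch below; the output directory is a placeholder and the Adam betas/epsilon listed above are the library defaults, so they are not repeated here. This is not the original training script.
```python
from transformers import TrainingArguments

# Sketch of the listed hyperparameters; output_dir is an assumption, not from the original run.
training_args = TrainingArguments(
    output_dir="./wav2vec2-large-xls-r-300m-ur",  # placeholder
    learning_rate=7.5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,   # effective batch size: 16 * 4 = 64
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.12,
    num_train_epochs=240,
)
```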
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:------:|:----:|:---------------:|:------:|
| 0.0719 | 66.67 | 400 | 1.8510 | 0.7432 |
| 0.0284 | 133.33 | 800 | 2.0088 | 0.7415 |
| 0.014 | 200.0 | 1200 | 2.0508 | 0.7328 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["common_voice"], "model-index": [{"name": "wav2vec2-large-xls-r-300m-ur", "results": []}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-large-xls-r-300m-ur
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us
|
wav2vec2-large-xls-r-300m-ur
============================
This model is a fine-tuned version of anuragshas/wav2vec2-large-xls-r-300m-ur on the common\_voice dataset.
It achieves the following results on the evaluation set:
* Loss: 2.0508
* Wer: 0.7328
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7.5e-05
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 64
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.12
* num\_epochs: 240
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.17.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 64\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.12\n* num\\_epochs: 240",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 64\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.12\n* num\\_epochs: 240",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
65,
145,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 64\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.12\n* num\\_epochs: 240### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
-0.1430255025625229,
0.10313771665096283,
-0.0019136489136144519,
0.0701155811548233,
0.13828518986701965,
0.0176943801343441,
0.11748827993869781,
0.1283492147922516,
-0.11486749351024628,
0.07878625392913818,
0.10219380259513855,
0.09410135447978973,
0.03837212547659874,
0.09915472567081451,
-0.029828839004039764,
-0.2887478470802307,
-0.009611048735678196,
0.03523338586091995,
-0.12711161375045776,
0.125694140791893,
0.08906815201044083,
-0.13095858693122864,
0.050631847232580185,
0.038744889199733734,
-0.18093255162239075,
0.00177321070805192,
-0.002230173908174038,
-0.09319806098937988,
0.1315717101097107,
0.02561854012310505,
0.10244947671890259,
0.021339263767004013,
0.10102376341819763,
-0.17652466893196106,
0.007001847960054874,
0.058498844504356384,
0.036224428564310074,
0.10281676054000854,
0.08498384803533554,
-0.0009374228538945317,
0.13001343607902527,
-0.06665626913309097,
0.06569579988718033,
0.03902803361415863,
-0.09268276393413544,
-0.28490889072418213,
-0.0873311385512352,
0.07578562945127487,
0.08672115206718445,
0.10344186425209045,
-0.0140309426933527,
0.10710716247558594,
-0.0742780789732933,
0.0905262902379036,
0.2620447874069214,
-0.27948519587516785,
-0.07241225242614746,
-0.022756529971957207,
0.03703627362847328,
0.017718849703669548,
-0.12532657384872437,
-0.019569428637623787,
0.043479323387145996,
0.03937217965722084,
0.10233756899833679,
0.0022483589127659798,
-0.051232222467660904,
0.013467478565871716,
-0.13997237384319305,
-0.04334649443626404,
0.12962530553340912,
0.045866310596466064,
-0.03808450326323509,
-0.07363393157720566,
-0.05221147462725639,
-0.22777676582336426,
-0.03668444976210594,
0.0013296814868226647,
0.033188410103321075,
-0.07927810400724411,
-0.13056515157222748,
0.005133578088134527,
-0.08653941005468369,
-0.09996411204338074,
-0.016185060143470764,
0.198866605758667,
0.04998057335615158,
0.004653219133615494,
-0.017259078100323677,
0.11709357053041458,
0.040361832827329636,
-0.15476785600185394,
0.020673496648669243,
0.054658908396959305,
-0.04800097644329071,
-0.0008937862585298717,
-0.05248064547777176,
-0.029619405046105385,
0.00867290049791336,
0.1395302414894104,
-0.07260600477457047,
0.03905700519680977,
0.02415464259684086,
0.02862798050045967,
-0.09624107927083969,
0.21412067115306854,
-0.08688917011022568,
-0.03453144058585167,
-0.014763989485800266,
0.09851354360580444,
0.024335989728569984,
-0.012395706959068775,
-0.09253470599651337,
0.004045130219310522,
0.10959091037511826,
0.03275943547487259,
-0.02637622319161892,
0.03362421691417694,
-0.03068803809583187,
-0.030811680480837822,
0.02574833482503891,
-0.10270044952630997,
0.026345016434788704,
0.020289557054638863,
-0.09687881916761398,
0.01718558929860592,
0.009106092154979706,
0.0060615395195782185,
-0.024271788075566292,
0.15641625225543976,
-0.08788266777992249,
0.0014634489780291915,
-0.08162752538919449,
-0.09560856223106384,
0.021436240524053574,
-0.07405019551515579,
0.0010499731870368123,
-0.07404778152704239,
-0.11458740383386612,
-0.023777499794960022,
0.02346990257501602,
-0.03924014791846275,
-0.08066751807928085,
-0.04199250042438507,
-0.11124864220619202,
0.04087849706411362,
-0.021402765065431595,
0.16444143652915955,
-0.051403194665908813,
0.12434171885251999,
0.06514739245176315,
0.05239494889974594,
0.004333093762397766,
0.058653585612773895,
-0.06101251393556595,
0.014287125319242477,
-0.16280214488506317,
0.037397246807813644,
-0.0671551525592804,
0.03265363350510597,
-0.09948714822530746,
-0.13313736021518707,
0.014024179428815842,
-0.006179457530379295,
0.09203996509313583,
0.0985306054353714,
-0.15726380050182343,
-0.11778269708156586,
0.14347153902053833,
-0.08256732672452927,
-0.09941092878580093,
0.131003275513649,
-0.007705003023147583,
-0.04258444532752037,
0.043519631028175354,
0.15424028038978577,
0.05593225359916687,
-0.11823870241641998,
-0.029444079846143723,
-0.03733080253005028,
0.10451710224151611,
-0.018732458353042603,
0.09487870335578918,
-0.03114345110952854,
0.04872952029109001,
0.017028138041496277,
-0.046435773372650146,
0.042472340166568756,
-0.10833649337291718,
-0.08845525979995728,
-0.0406520776450634,
-0.10095921158790588,
0.0498579777777195,
0.06796912848949432,
0.05462035536766052,
-0.09046047180891037,
-0.12578144669532776,
0.04509009048342705,
0.1246824637055397,
-0.0796007439494133,
0.030461933463811874,
-0.08298313617706299,
0.08581110090017319,
-0.04223538935184479,
-0.02472800575196743,
-0.18908946216106415,
-0.000891605915967375,
0.019422205165028572,
-0.028306489810347557,
0.02510339394211769,
-0.024088595062494278,
0.0720917135477066,
0.068015456199646,
-0.0667293518781662,
-0.06390045583248138,
-0.07980927079916,
-0.0204169899225235,
-0.08461867272853851,
-0.2333867996931076,
-0.08373759686946869,
-0.007762879598885775,
0.14002180099487305,
-0.18228060007095337,
0.014206786639988422,
0.015166372992098331,
0.1319885104894638,
0.03138233721256256,
-0.031345680356025696,
-0.018726136535406113,
0.09485471993684769,
-0.025082148611545563,
-0.05279373377561569,
0.02766169235110283,
0.0021918541751801968,
-0.09203945845365524,
-0.021940531209111214,
-0.11251591891050339,
0.15491946041584015,
0.13943533599376678,
-0.03223170340061188,
-0.07459168881177902,
0.023244718089699745,
-0.08656267821788788,
-0.051797494292259216,
-0.03679075092077255,
-0.003468794049695134,
0.1703980416059494,
0.021982019767165184,
0.13826850056648254,
-0.0827920064330101,
-0.05519290640950203,
0.04204718396067619,
0.007786380592733622,
0.006200646515935659,
0.11273801326751709,
0.08785626292228699,
-0.0036983699537813663,
0.12147510796785355,
0.08058816194534302,
-0.12289433926343918,
0.15056616067886353,
-0.07139269262552261,
-0.10378110408782959,
-0.023902669548988342,
-0.020378615707159042,
0.020104700699448586,
0.14549580216407776,
-0.13847798109054565,
-0.022736554965376854,
0.02777898870408535,
-0.004063621629029512,
0.02525661699473858,
-0.22378629446029663,
-0.00422125868499279,
0.01608525589108467,
-0.0630304291844368,
-0.02610471285879612,
-0.002175048226490617,
0.01362791657447815,
0.10731100291013718,
0.0009756771614775062,
-0.07230562716722488,
-0.003050507279112935,
0.0009286392596550286,
-0.05848591402173042,
0.19105041027069092,
-0.07953371107578278,
-0.15250326693058014,
-0.14639905095100403,
-0.003840862540528178,
-0.06186511740088463,
-0.009100460447371006,
0.04455901309847832,
-0.10588078945875168,
-0.027915121987462044,
-0.03735165670514107,
0.05686451122164726,
-0.030007269233465195,
0.05430154874920845,
0.012759031727910042,
0.013043083250522614,
0.08083555847406387,
-0.1205260157585144,
0.02255871146917343,
-0.05599571764469147,
-0.05635395646095276,
0.004335773177444935,
0.08601804077625275,
0.11381515860557556,
0.16184143722057343,
0.007338874973356724,
0.015548468567430973,
-0.03026202879846096,
0.16932499408721924,
-0.10600905865430832,
-0.04397071525454521,
0.13683278858661652,
0.0008527294266968966,
0.03608664125204086,
0.10521810501813889,
0.07070796191692352,
-0.07017906755208969,
-0.011907657608389854,
0.04059121757745743,
-0.019220983609557152,
-0.2405238151550293,
-0.04608703777194023,
-0.04474532604217529,
-0.004966967739164829,
0.09926498681306839,
0.035038407891988754,
0.03671753779053688,
0.03511819615960121,
-0.013717873953282833,
0.04175540804862976,
-0.04432979226112366,
0.06003948301076889,
0.09991718828678131,
0.050665829330682755,
0.13436496257781982,
-0.027825113385915756,
-0.05481980741024017,
0.026684002950787544,
-0.018350088968873024,
0.21337036788463593,
-0.029668135568499565,
0.15638402104377747,
0.042786743491888046,
0.17984677851200104,
0.015666460618376732,
0.08590662479400635,
0.008411807008087635,
-0.026088479906320572,
0.026646310463547707,
-0.06027120351791382,
-0.02922739088535309,
0.009763109497725964,
0.05170738324522972,
0.08514074236154556,
-0.12554442882537842,
-0.014519833028316498,
0.034689802676439285,
0.33927392959594727,
0.04974743723869324,
-0.3184763193130493,
-0.11776168644428253,
-0.02696377784013748,
-0.0635644719004631,
-0.03073921799659729,
0.030043654143810272,
0.1143796443939209,
-0.09061960875988007,
0.052070051431655884,
-0.06865931302309036,
0.0827779546380043,
-0.06018872931599617,
0.022977890446782112,
0.09776252508163452,
0.10354331880807877,
0.006199433468282223,
0.05182928591966629,
-0.2554526925086975,
0.28974631428718567,
0.005450000986456871,
0.08965322375297546,
-0.052514128386974335,
0.02580973692238331,
0.023285377770662308,
-0.0023509073071181774,
0.05212026089429855,
-0.02703964151442051,
-0.017590345814824104,
-0.181186243891716,
-0.08241279423236847,
0.020959902554750443,
0.12874935567378998,
-0.0596219040453434,
0.11875699460506439,
-0.024647586047649384,
-0.03157695755362511,
0.05369693413376808,
-0.08523385226726532,
-0.06734427064657211,
-0.0845024511218071,
0.020901571959257126,
0.04080598056316376,
0.030630800873041153,
-0.09031102061271667,
-0.1371113508939743,
-0.08457987010478973,
0.1307172328233719,
-0.09923464804887772,
-0.03541410714387894,
-0.1211928203701973,
0.09662708640098572,
0.17102190852165222,
-0.070717953145504,
0.05466172844171524,
0.018827980384230614,
0.11817203462123871,
0.024372786283493042,
-0.03767077252268791,
0.09016340970993042,
-0.07951834052801132,
-0.24091403186321259,
-0.042771924287080765,
0.1735285073518753,
0.01631646789610386,
0.06712012737989426,
-0.037392765283584595,
0.03797583654522896,
-0.029670020565390587,
-0.07645875215530396,
0.040800657123327255,
-0.0012469952926039696,
0.03899762034416199,
0.03300086036324501,
-0.02174973301589489,
-0.018957719206809998,
-0.06687857210636139,
-0.03890284523367882,
0.13943250477313995,
0.2540757358074188,
-0.09268444031476974,
0.019693441689014435,
0.06736748665571213,
-0.037446774542331696,
-0.16050472855567932,
0.019256796687841415,
0.11970335245132446,
0.031201915815472603,
-0.01242835447192192,
-0.19761554896831512,
0.09145405888557434,
0.08809024095535278,
-0.03222877159714699,
0.10846340656280518,
-0.30354809761047363,
-0.1450391560792923,
0.10988569259643555,
0.09732145816087723,
0.008234376087784767,
-0.1591678410768509,
-0.05509272962808609,
-0.014510472305119038,
-0.09954161942005157,
0.0883227288722992,
-0.04804350808262825,
0.12460599094629288,
-0.022306136786937714,
0.08076751232147217,
0.017152944579720497,
-0.05809056758880615,
0.12414281815290451,
0.0010012161219492555,
0.05379842221736908,
-0.00447270181030035,
-0.0019537967164069414,
0.05319485440850258,
-0.030928125604987144,
0.016309751197695732,
-0.06087614223361015,
0.036037374287843704,
-0.08224783092737198,
-0.023202599957585335,
-0.11025115847587585,
0.04285674914717674,
-0.044090837240219116,
-0.04543101042509079,
-0.015764307230710983,
0.015980858355760574,
0.010116567835211754,
-0.018549589440226555,
0.1375727355480194,
0.007020019926130772,
0.16960659623146057,
0.1136048436164856,
0.08039171993732452,
-0.02841794490814209,
-0.11117304861545563,
-0.005468560848385096,
-0.023946557193994522,
0.07457927614450455,
-0.11726284772157669,
0.009838408790528774,
0.1392662078142166,
0.0774727314710617,
0.10918190330266953,
0.07260729372501373,
-0.07122518867254257,
0.012524611316621304,
0.07095465809106827,
-0.14452886581420898,
-0.0910010039806366,
-0.024074802175164223,
-0.017058784142136574,
-0.13129377365112305,
0.0712641179561615,
0.10718825459480286,
-0.0704115629196167,
-0.018721239641308784,
0.011415933258831501,
0.00021762697724625468,
-0.045400455594062805,
0.24396519362926483,
0.059802472591400146,
0.08026827871799469,
-0.11953288316726685,
0.07563675940036774,
0.04405321925878525,
-0.14359663426876068,
0.020027024671435356,
0.07371249794960022,
-0.05983957275748253,
-0.008775644935667515,
0.013005800545215607,
0.08646755665540695,
-0.041166771203279495,
-0.06656743586063385,
-0.15669220685958862,
-0.13064338266849518,
0.08443146198987961,
0.14798064529895782,
0.06509225815534592,
0.02908931113779545,
-0.05214099586009979,
0.04803256690502167,
-0.14079678058624268,
0.10726795345544815,
0.06673253327608109,
0.08091549575328827,
-0.15689922869205475,
0.17029984295368195,
0.023935895413160324,
0.0345575176179409,
-0.006596750114113092,
0.019609183073043823,
-0.09478481858968735,
0.017320241779088974,
-0.10959211736917496,
-0.03726718947291374,
-0.02732883393764496,
-0.005113022867590189,
-0.00046311880578286946,
-0.06463763862848282,
-0.0699041485786438,
0.03725941479206085,
-0.11631692945957184,
-0.036469485610723495,
0.012125696986913681,
0.029512275010347366,
-0.12944737076759338,
0.006142112426459789,
0.03559108078479767,
-0.10494258999824524,
0.0980156734585762,
0.0865200012922287,
0.029294710606336594,
0.06824296712875366,
-0.07183065265417099,
-0.017347058281302452,
0.04805862158536911,
0.0017673125257715583,
0.05669211223721504,
-0.11597652733325958,
-0.00367078953422606,
-0.020087210461497307,
0.05821002274751663,
0.0014794028829783201,
0.06351419538259506,
-0.14372901618480682,
-0.009151746518909931,
-0.010686924681067467,
-0.044086676090955734,
-0.07072142511606216,
0.033732861280441284,
0.0941167026758194,
0.030305784195661545,
0.17339932918548584,
-0.0784546434879303,
0.034288689494132996,
-0.21985718607902527,
0.013044511899352074,
-0.04743547365069389,
-0.09544596821069717,
-0.10888603329658508,
-0.014193523675203323,
0.09100280702114105,
-0.06141923740506172,
0.09691236913204193,
-0.029233524575829506,
0.09220542013645172,
0.034219469875097275,
-0.044727303087711334,
-0.018758730962872505,
0.04756888374686241,
0.2134983092546463,
0.04413938149809837,
-0.021534012630581856,
0.06383755803108215,
0.01807166449725628,
0.0732528567314148,
0.13500316441059113,
0.1641928255558014,
0.1285676807165146,
0.030132170766592026,
0.09472090750932693,
0.09090780466794968,
-0.09640879929065704,
-0.17336584627628326,
0.0669543445110321,
-0.06381210684776306,
0.12765516340732574,
-0.013516922481358051,
0.22368866205215454,
0.10178548842668533,
-0.1650693267583847,
0.049573685973882675,
-0.04000604897737503,
-0.07721564918756485,
-0.09914858639240265,
-0.005207465961575508,
-0.06775680184364319,
-0.17300093173980713,
0.02105492167174816,
-0.12097836285829544,
0.03900793939828873,
0.08311974257230759,
0.026286916807293892,
0.009229768067598343,
0.15679731965065002,
0.04480776563286781,
0.008852562867105007,
0.09504584968090057,
0.035864125937223434,
-0.03132415562868118,
-0.06520480662584305,
-0.0781654492020607,
0.01854192279279232,
-0.017257964238524437,
0.050961580127477646,
-0.0627264603972435,
-0.1246282160282135,
0.060743771493434906,
0.014266452752053738,
-0.0997173935174942,
0.03899473324418068,
-0.011108524166047573,
0.09409964829683304,
0.062002040445804596,
0.009694531559944153,
0.006842858158051968,
-0.026191098615527153,
0.25200390815734863,
-0.11681530624628067,
-0.0747145563364029,
-0.1227383017539978,
0.24918848276138306,
0.0179207231849432,
-0.03200923651456833,
0.039316680282354355,
-0.08333232998847961,
-0.03956734761595726,
0.19130927324295044,
0.1805884689092636,
-0.01991787552833557,
-0.022147729992866516,
0.026356443762779236,
-0.016247162595391273,
-0.06771276891231537,
0.07284779101610184,
0.1393316686153412,
0.11748179793357849,
-0.0672231912612915,
-0.03846874460577965,
-0.05226126313209534,
-0.04068809747695923,
-0.015155710279941559,
0.08137750625610352,
0.005742533598095179,
-0.029792118817567825,
-0.03477512300014496,
0.07409818470478058,
-0.0584338903427124,
-0.1681341677904129,
0.053843576461076736,
-0.22450923919677734,
-0.18980683386325836,
-0.015809962525963783,
0.10916869342327118,
0.023837050423026085,
0.0620420016348362,
0.004792424384504557,
-0.02297748252749443,
0.09397339075803757,
-0.005550832021981478,
-0.06950653344392776,
-0.11235301941633224,
0.0949883684515953,
-0.09319143742322922,
0.18385165929794312,
-0.05129953473806381,
0.07002641260623932,
0.11431825160980225,
0.07402357459068298,
-0.08691295236349106,
0.03204638883471489,
0.06834536790847778,
-0.1667511910200119,
0.033005114644765854,
0.1959330290555954,
-0.027182303369045258,
0.10361621528863907,
0.018787089735269547,
-0.1306229680776596,
0.00341538293287158,
-0.07962311804294586,
-0.035919494926929474,
-0.05825630947947502,
-0.022399315610527992,
-0.04776730388402939,
0.12536916136741638,
0.21370498836040497,
-0.05192086845636368,
-0.011089588515460491,
-0.06315618008375168,
0.017814883962273598,
0.06854065507650375,
0.07798295468091965,
-0.03434344381093979,
-0.29626092314720154,
0.021716535091400146,
0.009155101142823696,
-0.012446005828678608,
-0.2540765702724457,
-0.07481039315462112,
0.05055376887321472,
-0.07091878354549408,
-0.08071395754814148,
0.07461689412593842,
0.05926287919282913,
0.045855578035116196,
-0.043593864887952805,
-0.026802657172083855,
-0.05473614111542702,
0.17100466787815094,
-0.20732155442237854,
-0.07553283125162125
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Dhivehi
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Dhivehi using the [Common Voice](https://huggingface.co/datasets/common_voice).
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "dv", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-dv")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-dv")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Dhivehi test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "dv", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-dv")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-dv")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“\%\‘\”\،\.\؟\–\'\’]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Run batched inference on the test set and decode the predictions
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 55.68 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
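As a minimal sketch (assuming the same `common_voice`/`dv` configuration used in the snippets above), those splits can be loaded and combined with:
```python
from datasets import load_dataset

# Combine the Common Voice Dhivehi train and validation splits
# (sketch only, not the original training script)
train_dataset = load_dataset("common_voice", "dv", split="train+validation")
print(train_dataset)
```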
|
{"language": "dv", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Anurag Singh XLSR Wav2Vec2 Large 53 Dhivehi", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice dv", "type": "common_voice", "args": "dv"}, "metrics": [{"type": "wer", "value": 55.68, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-large-xlsr-53-dv
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"dv",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"dv"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #dv #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Dhivehi
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Dhivehi using the Common Voice.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Dhivehi test data of Common Voice.
Test Result: 55.68 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Dhivehi\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Dhivehi using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Dhivehi test data of Common Voice.\n\nTest Result: 55.68 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #dv #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Dhivehi\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Dhivehi using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Dhivehi test data of Common Voice.\n\nTest Result: 55.68 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
80,
66,
20,
30,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #dv #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Dhivehi\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Dhivehi using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\nThe model can be used directly (without a language model) as follows:## Evaluation\nThe model can be evaluated as follows on the Dhivehi test data of Common Voice.\n\nTest Result: 55.68 %## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.16175693273544312,
0.02417694590985775,
-0.002375354990363121,
-0.02769365720450878,
0.07277286052703857,
-0.03990280255675316,
0.18423956632614136,
0.09542214870452881,
-0.005787815898656845,
-0.016645357012748718,
0.03777255117893219,
0.01980850100517273,
0.059402111917734146,
0.08505889773368835,
-0.01958378776907921,
-0.20978105068206787,
0.007716924883425236,
0.013921432197093964,
0.021558815613389015,
0.11027141660451889,
0.10574742406606674,
-0.09635264426469803,
0.0027157110162079334,
0.06642130017280579,
-0.14949268102645874,
0.026276201009750366,
0.02354768104851246,
-0.09657923877239227,
0.14594855904579163,
0.0583600178360939,
0.0652228370308876,
0.07299790531396866,
0.11110614985227585,
-0.20066361129283905,
0.02886774018406868,
0.04364593327045441,
0.03949590399861336,
0.03431348502635956,
0.03519497811794281,
0.013120527379214764,
0.07477588951587677,
0.09574700146913528,
-0.025676392018795013,
0.0730326771736145,
-0.04710282385349274,
-0.21979957818984985,
-0.02275465801358223,
0.01585756614804268,
0.10522906482219696,
0.1192345917224884,
-0.05879294499754906,
0.0917438492178917,
-0.13057927787303925,
0.08858776837587357,
0.08569317311048508,
-0.14905713498592377,
0.010268354788422585,
0.11487922072410583,
0.04246846213936806,
0.07435349375009537,
-0.06875079870223999,
0.03145863488316536,
0.04891392961144447,
0.03595985472202301,
0.0209964532405138,
-0.02994615212082863,
-0.15110069513320923,
-0.008905733935534954,
-0.12462862581014633,
-0.02900838851928711,
0.22515934705734253,
-0.0025664917193353176,
-0.07211781293153763,
-0.12221705168485641,
-0.03590187802910805,
0.011985533870756626,
-0.010729684494435787,
-0.08538499474525452,
-0.010876073502004147,
0.035466182976961136,
-0.017264101654291153,
-0.007303252816200256,
-0.11652496457099915,
-0.15831679105758667,
-0.0009087348007597029,
0.050920519977808,
0.02143821306526661,
0.01970057561993599,
-0.13414695858955383,
0.08038558810949326,
-0.05516178905963898,
-0.08008003234863281,
-0.016941385343670845,
0.008058087900280952,
-0.05170530825853348,
0.015630094334483147,
-0.08181429654359818,
-0.1805778294801712,
0.035465747117996216,
-0.03747856616973877,
0.05293356254696846,
0.023218780755996704,
-0.038701917976140976,
0.02911129966378212,
0.051867466419935226,
0.08534210175275803,
-0.07962901145219803,
-0.0159233920276165,
0.041821371763944626,
0.029763678088784218,
-0.03878701478242874,
-0.012919751927256584,
-0.05784580484032631,
-0.030354881659150124,
0.024434296414256096,
0.04702140763401985,
-0.0367860272526741,
0.012991176918148994,
-0.02986070327460766,
-0.05345957353711128,
0.04251013696193695,
-0.10368397831916809,
-0.0635535791516304,
0.08120523393154144,
0.0030655143782496452,
0.09048789739608765,
0.04854899272322655,
0.05425233766436577,
-0.051722146570682526,
-0.023939503356814384,
0.013020705431699753,
0.025261761620640755,
-0.017815403640270233,
-0.10611341893672943,
0.005045476835221052,
-0.02591153420507908,
-0.009283327497541904,
-0.0897175669670105,
-0.1325751543045044,
-0.05860624462366104,
0.0015187493991106749,
0.03345821052789688,
-0.013821649365127087,
-0.10115590691566467,
-0.0034786721225827932,
-0.021539228036999702,
-0.052187275141477585,
0.05227215588092804,
-0.037424951791763306,
0.08028598874807358,
0.02416132390499115,
0.062267884612083435,
0.0282160434871912,
0.0939699187874794,
-0.0603967048227787,
-0.029186327010393143,
0.044030796736478806,
0.14124643802642822,
-0.03579743951559067,
-0.06282288581132889,
-0.09495415538549423,
-0.07988084852695465,
-0.04230789840221405,
0.06123991310596466,
0.056709930300712585,
0.12638314068317413,
-0.29498183727264404,
-0.08537788689136505,
0.17186865210533142,
-0.10665414482355118,
-0.034499961882829666,
0.18510107696056366,
-0.028385262936353683,
0.1146121397614479,
0.14399364590644836,
0.2071060985326767,
0.10198129713535309,
-0.20117634534835815,
0.04321615397930145,
0.026554197072982788,
0.007730554323643446,
-0.03376302123069763,
0.062224891036748886,
-0.0421430766582489,
-0.006044235546141863,
0.03330657258629799,
-0.026424409821629524,
0.05159653350710869,
-0.03977154940366745,
-0.072829969227314,
-0.005999257788062096,
-0.0949874222278595,
0.0475662425160408,
0.055147022008895874,
0.014583045616745949,
-0.007305134553462267,
-0.03519126772880554,
0.059698686003685,
0.14321684837341309,
-0.14180046319961548,
0.03393379598855972,
-0.105522520840168,
0.04385221004486084,
-0.11536142230033875,
-0.009847869165241718,
-0.14073964953422546,
0.17259559035301208,
-0.005512997508049011,
0.08727555721998215,
0.03904624655842781,
0.21133995056152344,
0.019599370658397675,
0.009202660992741585,
-0.04196680709719658,
-0.024729574099183083,
0.002618200145661831,
-0.013670696876943111,
-0.024980109184980392,
-0.10421278327703476,
-0.019659601151943207,
-0.07560271769762039,
0.09063425660133362,
-0.17141227424144745,
-0.009917491115629673,
0.046081382781267166,
0.015609617345035076,
0.019606485962867737,
-0.009290268644690514,
0.07231982052326202,
0.07439712435007095,
0.0013083117082715034,
0.005065493285655975,
0.03516024351119995,
-0.004216572269797325,
-0.04250893369317055,
0.10232122242450714,
-0.11989551037549973,
0.0005791051662527025,
0.1016671359539032,
-0.04674580320715904,
0.007954961620271206,
0.03778858110308647,
-0.01975194364786148,
-0.024808185175061226,
-0.07814832031726837,
-0.005498853046447039,
0.23860076069831848,
-0.00431561516597867,
0.10917540639638901,
-0.0974322259426117,
-0.0002470405597705394,
0.026300711557269096,
-0.051702238619327545,
0.04346976801753044,
0.04439559206366539,
0.010502830147743225,
0.029178153723478317,
0.026101425290107727,
-0.036413758993148804,
-0.06311853975057602,
0.24033460021018982,
-0.03666131570935249,
-0.09142766147851944,
0.01602383889257908,
-0.03449093922972679,
-0.03569827601313591,
0.10219116508960724,
-0.15159904956817627,
-0.036262739449739456,
0.041276272386312485,
0.04912780597805977,
0.05049212649464607,
-0.14709261059761047,
0.019491443410515785,
0.015271330252289772,
-0.12518419325351715,
-0.17389893531799316,
0.07021480053663254,
-0.053048938512802124,
0.041800569742918015,
-0.09989640861749649,
-0.04193950444459915,
0.00424059946089983,
-0.05185089632868767,
-0.16963490843772888,
0.1317911446094513,
-0.07541295140981674,
-0.21972480416297913,
-0.09922152757644653,
-0.008118046447634697,
-0.004448363091796637,
0.03264550864696503,
0.08339078724384308,
-0.13489782810211182,
-0.016822071745991707,
-0.024817772209644318,
0.10607066750526428,
0.0001479390193708241,
-0.01924261823296547,
-0.050211746245622635,
0.012016644701361656,
0.06217417120933533,
-0.16745072603225708,
0.024597110226750374,
-0.06397821009159088,
-0.015739435330033302,
0.012085411697626114,
-0.001238789758644998,
0.0028494198340922594,
0.17545700073242188,
0.05502267926931381,
0.01793788932263851,
-0.022074313834309578,
0.20263263583183289,
-0.08094610273838043,
-0.021775195375084877,
0.19541063904762268,
-0.013436232693493366,
-0.026792876422405243,
0.1312023252248764,
0.02821626141667366,
-0.08725974708795547,
-0.00023778632748872042,
-0.011106991209089756,
-0.06996551901102066,
-0.23998692631721497,
-0.10776156932115555,
-0.07545282691717148,
-0.06713271886110306,
-0.024463005363941193,
-0.00010797059803735465,
0.07250332832336426,
0.02277463674545288,
-0.032270122319459915,
-0.022306980565190315,
0.030435848981142044,
-0.019785992801189423,
0.11395803838968277,
-0.02262244187295437,
0.0964459776878357,
-0.04641774296760559,
-0.03465567156672478,
0.00834155548363924,
0.0458042174577713,
0.16179579496383667,
0.036656703799963,
0.07864419370889664,
0.08992721885442734,
0.10428492724895477,
0.10198594629764557,
0.08416163921356201,
-0.06689437478780746,
-0.02413647435605526,
0.004341159947216511,
-0.06142006069421768,
-0.05318897217512131,
0.05620330199599266,
0.14444740116596222,
-0.03556279093027115,
-0.03492460772395134,
-0.018902448937296867,
-0.009132497943937778,
0.22255221009254456,
0.09433379024267197,
-0.19873306155204773,
-0.07731729745864868,
-0.019395234063267708,
-0.046164873987436295,
0.0036223470233380795,
0.056633319705724716,
0.14509393274784088,
-0.10124145448207855,
0.014745957218110561,
-0.0026304719503968954,
0.08401200920343399,
-0.03025968372821808,
0.0340786837041378,
-0.09446170181035995,
0.031263988465070724,
-0.0010616177460178733,
0.08405519276857376,
-0.29982706904411316,
0.20242983102798462,
0.016033239662647247,
0.1212940588593483,
-0.05552135035395622,
-0.004619861952960491,
0.00010420052421977744,
0.05286041647195816,
0.09680728614330292,
0.006090803071856499,
0.06681166589260101,
-0.08187425881624222,
-0.06869103014469147,
0.05972742661833763,
-0.00550069147720933,
0.03854523226618767,
0.03211448714137077,
0.01983882114291191,
0.0010110355215147138,
0.02789083868265152,
-0.03231549635529518,
-0.16745643317699432,
-0.041207440197467804,
0.022210394963622093,
0.13376763463020325,
0.10918407142162323,
-0.027367690578103065,
-0.08107609301805496,
-0.09159838408231735,
0.03236306086182594,
-0.0967414602637291,
-0.058726705610752106,
-0.0636654645204544,
-0.029537444934248924,
0.08419715613126755,
-0.04668476805090904,
0.0053373780101537704,
0.09090568870306015,
0.10248826444149017,
-0.03362508490681648,
-0.05434208735823631,
0.04754449054598808,
-0.11792310327291489,
-0.11921326816082001,
-0.03471006080508232,
0.17805343866348267,
0.11325943470001221,
0.07373026013374329,
0.05593649297952652,
-0.004444989841431379,
-0.003772533731535077,
-0.032205354422330856,
0.015728818252682686,
0.13609641790390015,
-0.09852267056703568,
-0.0053654806688427925,
0.021491315215826035,
-0.13446004688739777,
-0.10142160952091217,
-0.05442585423588753,
0.1480206847190857,
0.10011474788188934,
-0.0479523241519928,
0.20779907703399658,
0.22119766473770142,
-0.0804613009095192,
-0.21576781570911407,
-0.02322249300777912,
0.09434066712856293,
0.11940250545740128,
-0.02723209746181965,
-0.20836827158927917,
0.07951905578374863,
-0.034147556871175766,
-0.028356920927762985,
-0.039547212421894073,
-0.26134443283081055,
-0.14584657549858093,
0.17152869701385498,
-0.016960378736257553,
0.15786728262901306,
0.015340473502874374,
-0.024908047169446945,
-0.03660375624895096,
-0.048882611095905304,
0.035691361874341965,
-0.08875586837530136,
0.09556411951780319,
0.0020737918093800545,
0.08615231513977051,
0.045247882604599,
-0.01957850530743599,
0.08747554570436478,
0.09705917537212372,
0.010042907670140266,
-0.004009980242699385,
0.0682440847158432,
0.041137758642435074,
0.022965332493185997,
0.1243402287364006,
-0.0816330760717392,
0.04925686493515968,
-0.09084047377109528,
-0.0973743349313736,
-0.07969946414232254,
0.03770166262984276,
0.01744883507490158,
-0.07346523553133011,
0.011400165036320686,
-0.04461154714226723,
0.03240751847624779,
-0.0038505112752318382,
-0.03482362627983093,
-0.1131126657128334,
0.03769838437438011,
0.1446886956691742,
0.19640971720218658,
-0.054055433720350266,
-0.0809101089835167,
-0.03236983343958855,
-0.032814860343933105,
0.14265935122966766,
-0.12785515189170837,
0.03627625107765198,
0.06543920189142227,
0.05997784063220024,
0.13695617020130157,
0.018809616565704346,
-0.10590272396802902,
0.07179103046655655,
0.0241104606539011,
-0.05777124688029289,
-0.1357080191373825,
-0.022191163152456284,
-0.03374675661325455,
-0.03800695016980171,
0.049076370894908905,
0.11674076318740845,
-0.08741596341133118,
-0.0144589738920331,
-0.021796029061079025,
0.023539163172245026,
-0.12527810037136078,
0.22931961715221405,
0.040976639837026596,
0.07784653455018997,
-0.11589738726615906,
0.021711336448788643,
-0.004164688289165497,
-0.0462542399764061,
0.03594418242573738,
-0.05548999458551407,
-0.08245262503623962,
-0.05998237431049347,
-0.03241163119673729,
0.0815594494342804,
0.066973976790905,
-0.11372107267379761,
-0.06459582597017288,
-0.10979101061820984,
0.012503772042691708,
0.06850665807723999,
0.04961346834897995,
0.035637401044368744,
-0.1315823793411255,
-0.052131734788417816,
-0.11550968885421753,
0.07848489284515381,
0.06556303799152374,
-0.012504247017204762,
-0.12199568003416061,
0.15320368111133575,
0.05260854586958885,
0.04807219281792641,
-0.05108516290783882,
-0.07657310366630554,
-0.00947050005197525,
0.08452580124139786,
-0.1581396460533142,
-0.011478138156235218,
-0.047850772738456726,
0.008747531101107597,
-0.006101880222558975,
-0.05574755370616913,
-0.006677648518234491,
0.08488019555807114,
-0.08983280509710312,
0.0731804147362709,
0.0025255344808101654,
0.061646729707717896,
-0.08244331926107407,
0.012417194433510303,
0.005314078647643328,
-0.05696441978216171,
0.09324272722005844,
0.11918386071920395,
-0.10679672658443451,
0.10496218502521515,
-0.1614316701889038,
-0.05594898387789726,
0.07528172433376312,
0.08064109086990356,
-0.025226987898349762,
-0.09918814897537231,
0.02044150047004223,
0.08163375407457352,
0.03205636143684387,
-0.01905241049826145,
0.10082826018333435,
-0.04593215882778168,
-0.008977944031357765,
-0.0839981660246849,
0.017680441960692406,
-0.047633327543735504,
0.03484693914651871,
0.05387340858578682,
0.15804041922092438,
0.14847201108932495,
-0.08936763554811478,
0.0928959995508194,
-0.14431995153427124,
-0.004984559956938028,
-0.0496048741042614,
-0.018437853083014488,
-0.15875166654586792,
-0.07510050386190414,
0.07499736547470093,
-0.05729399994015694,
0.11676649749279022,
0.0015805413713678718,
-0.00884236115962267,
-0.036344047635793686,
-0.08067107945680618,
0.027148280292749405,
-0.017355073243379593,
0.28200986981391907,
0.041926778852939606,
0.035674504935741425,
-0.03674178570508957,
-0.006139340344816446,
0.03192022442817688,
0.1773025393486023,
-0.012993994168937206,
0.15258336067199707,
0.03640708327293396,
0.06075765937566757,
0.08745111525058746,
-0.06514342874288559,
-0.024708058685064316,
-0.0035749985836446285,
-0.11722461879253387,
0.04069824144244194,
-0.04820563644170761,
0.12903591990470886,
0.1277204304933548,
-0.09709750860929489,
0.07758988440036774,
-0.0007525946130044758,
-0.08608394116163254,
-0.1443929374217987,
-0.10795792192220688,
-0.059526603668928146,
-0.16084596514701843,
0.032247938215732574,
-0.10028323531150818,
0.00866568274796009,
0.04461988806724548,
0.04054806008934975,
-0.028622258454561234,
0.15281380712985992,
0.017837781459093094,
-0.09004569053649902,
0.0689033642411232,
-0.07462584227323532,
-0.04564438760280609,
-0.06684184074401855,
0.04975973814725876,
0.16773612797260284,
-0.008773942478001118,
0.044416751712560654,
-0.006127006374299526,
-0.05855361372232437,
0.027390025556087494,
-0.051417019218206406,
-0.06865351647138596,
-0.005793070886284113,
-0.021481899544596672,
0.07163787633180618,
0.18053942918777466,
0.12442722171545029,
-0.05628114193677902,
0.010233771055936813,
0.07699675112962723,
-0.03680220618844032,
-0.1489989012479782,
-0.14979764819145203,
0.18634067475795746,
0.029939131811261177,
0.03286042436957359,
0.025335416197776794,
-0.04643799364566803,
-0.013663902878761292,
0.21164144575595856,
0.2333940863609314,
0.022716613486409187,
0.026406271383166313,
-0.02101009152829647,
-0.010784738697111607,
-0.06931377202272415,
0.07079360634088516,
0.0849086195230484,
0.23106402158737183,
-0.01664690114557743,
0.006494596600532532,
-0.07472841441631317,
-0.09157001972198486,
0.02293066494166851,
0.060180019587278366,
-0.04655393958091736,
-0.10753485560417175,
0.015549112111330032,
0.1207035481929779,
-0.05991000309586525,
-0.10929139703512192,
-0.0890355259180069,
-0.0720275267958641,
-0.07752566039562225,
-0.02589181438088417,
-0.021917246282100677,
0.10359127819538116,
-0.008131072856485844,
-0.0894344225525856,
0.04737585783004761,
0.2103521227836609,
-0.020212162286043167,
-0.07381723076105118,
-0.033921655267477036,
0.07680626213550568,
-0.09391187876462936,
-0.017049171030521393,
-0.014578321017324924,
0.18380853533744812,
0.010947419330477715,
0.10370714962482452,
-0.013121671974658966,
0.14419017732143402,
-0.0030206479132175446,
-0.07249646633863449,
0.0015898171113803983,
0.15622477233409882,
-0.027430221438407898,
0.11380630731582642,
0.018211662769317627,
-0.11731377989053726,
0.04002466797828674,
-0.09063985198736191,
-0.041687242686748505,
-0.09358744323253632,
0.06300166994333267,
-0.03370640054345131,
0.0871201753616333,
0.05671490356326103,
-0.06513175368309021,
-0.03814948722720146,
-0.05940919741988182,
0.07952933013439178,
0.03318953886628151,
-0.05251472070813179,
-0.04823002219200134,
-0.24191346764564514,
-0.0055909366346895695,
-0.08557315915822983,
-0.014541229233145714,
-0.20676814019680023,
-0.039291899651288986,
0.0011044034035876393,
-0.08503155410289764,
0.01599309965968132,
0.03491472080349922,
0.07565958052873611,
0.01989564299583435,
-0.008500659838318825,
0.026933453977108,
0.024629151448607445,
0.10817858576774597,
-0.166174054145813,
-0.1137155070900917
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Sorbian, Upper
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Sorbian, Upper using the [Common Voice](https://huggingface.co/datasets/common_voice).
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "hsb", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-hsb")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-hsb")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Sorbian, Upper test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "hsb", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-hsb")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-hsb")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“\%\”\„\–\…\«\»]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Run batched inference on the test set and decode the predictions
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 65.05 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
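As with the Dhivehi model above, a minimal sketch for loading those splits (assuming the `common_voice`/`hsb` configuration used earlier):
```python
from datasets import load_dataset

# Combined train + validation split for Upper Sorbian (sketch only)
train_dataset = load_dataset("common_voice", "hsb", split="train+validation")
print(f"{len(train_dataset)} training examples")
```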
|
{"language": "hsb", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Anurag Singh XLSR Wav2Vec2 Large 53 Sorbian, Upper", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice hsb", "type": "common_voice", "args": "hsb"}, "metrics": [{"type": "wer", "value": 65.05, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-large-xlsr-53-hsb
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"hsb",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"hsb"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #hsb #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Sorbian, Upper
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Sorbian, Upper using the Common Voice.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Sorbian, Upper test data of Common Voice.
Test Result: 65.05 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Sorbian, Upper\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Sorbian, Upper using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Sorbian, Upper test data of Common Voice.\n\nTest Result: 65.05 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #hsb #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Sorbian, Upper\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Sorbian, Upper using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Sorbian, Upper test data of Common Voice.\n\nTest Result: 65.05 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
82,
69,
20,
32,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #hsb #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Sorbian, Upper\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Sorbian, Upper using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\nThe model can be used directly (without a language model) as follows:## Evaluation\nThe model can be evaluated as follows on the Sorbian, Upper test data of Common Voice.\n\nTest Result: 65.05 %## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.16388091444969177,
0.06452203541994095,
-0.0031005863565951586,
-0.04436875134706497,
0.07510839402675629,
-0.04708290472626686,
0.16945378482341766,
0.09715107828378677,
-0.04344925284385681,
0.016287723556160927,
0.003629008075222373,
0.005968957673758268,
0.05897187814116478,
0.10922052711248398,
-0.013449369929730892,
-0.18017175793647766,
0.015543361194431782,
0.00864274799823761,
0.06678768247365952,
0.09873828291893005,
0.10990000516176224,
-0.06052134558558464,
-0.006124185398221016,
0.07991807907819748,
-0.15716585516929626,
0.019187619909644127,
0.07121127098798752,
-0.11127714067697525,
0.14127007126808167,
0.07753755152225494,
0.046684373170137405,
0.06705377250909805,
0.11282763630151749,
-0.2167271226644516,
0.025934875011444092,
0.046794719994068146,
0.06512400507926941,
0.059367191046476364,
0.039579324424266815,
-0.027946967631578445,
0.06295885145664215,
0.10663014650344849,
-0.020464053377509117,
0.08837354183197021,
-0.050066299736499786,
-0.19775892794132233,
-0.025157475844025612,
0.007713876664638519,
0.14007671177387238,
0.08500666171312332,
-0.05912763997912407,
0.0761738047003746,
-0.11782380938529968,
0.09319161623716354,
0.063376784324646,
-0.1655655950307846,
0.014036289416253567,
0.09438028931617737,
0.046564981341362,
0.11039028316736221,
-0.08015729486942291,
0.020647745579481125,
0.06245173513889313,
0.002611891133710742,
0.0152291189879179,
-0.02581276372075081,
-0.1924157738685608,
-0.019726110622286797,
-0.1334773302078247,
-0.023651182651519775,
0.2344440221786499,
0.025253668427467346,
-0.09490738064050674,
-0.10375023633241653,
-0.04564415663480759,
0.009088878519833088,
0.03119739145040512,
-0.08385632932186127,
-0.014697296544909477,
0.04858919233083725,
-0.006883945781737566,
-0.058217670768499374,
-0.12458641082048416,
-0.16553528606891632,
-0.0010976254707202315,
0.044105008244514465,
0.026682578027248383,
0.016853364184498787,
-0.0953352227807045,
0.07850321382284164,
-0.11182527244091034,
-0.060296833515167236,
0.012434235773980618,
0.0035018459893763065,
-0.1073073297739029,
-0.008949508890509605,
-0.08135964721441269,
-0.136243999004364,
0.06588482856750488,
0.01134808361530304,
0.05267718434333801,
0.021534860134124756,
-0.06321520358324051,
0.039360612630844116,
0.05143812671303749,
0.09618805348873138,
-0.05388857424259186,
-0.008790001273155212,
0.057829346507787704,
0.010792690329253674,
-0.03501257672905922,
-0.024147851392626762,
-0.03960251435637474,
-0.03637569397687912,
0.04133055359125137,
0.033799488097429276,
-0.021241245791316032,
-0.03637002408504486,
-0.03353096917271614,
-0.02089088410139084,
0.017856519669294357,
-0.13014689087867737,
-0.019668515771627426,
0.088838130235672,
0.000766246987041086,
0.06791886687278748,
0.02281997725367546,
0.09471405297517776,
-0.07914521545171738,
0.027845965698361397,
0.025117255747318268,
0.03569239377975464,
0.005766028538346291,
-0.07338149845600128,
0.015802733600139618,
-0.0634317621588707,
0.0014534140937030315,
-0.06679517030715942,
-0.10814103484153748,
-0.09745769202709198,
-0.0020228552166372538,
0.03364349156618118,
-0.02096097357571125,
-0.08617735654115677,
-0.007087757345288992,
-0.009166483767330647,
-0.08072319626808167,
0.04957069829106331,
-0.05303327739238739,
0.07258699834346771,
0.043588366359472275,
0.03156653419137001,
0.002143162302672863,
0.0715724304318428,
-0.09148027002811432,
-0.04737533628940582,
0.01960347592830658,
0.1322557032108307,
-0.03029077686369419,
-0.06705527007579803,
-0.080471470952034,
-0.059530314058065414,
-0.05581167712807655,
0.08302265405654907,
0.03386411815881729,
0.09995215386152267,
-0.2596622705459595,
-0.09658023715019226,
0.1996585577726364,
-0.12948106229305267,
0.003534262767061591,
0.22179244458675385,
-0.02542378008365631,
0.09012482315301895,
0.17226561903953552,
0.2479255199432373,
0.0838417261838913,
-0.1551523208618164,
0.01400666031986475,
0.009067879058420658,
-0.002747100545093417,
-0.003785032546147704,
0.08610886335372925,
-0.051133058965206146,
0.013587134890258312,
0.024655545130372047,
-0.04378264397382736,
0.03481636196374893,
-0.016053881496191025,
-0.05291523039340973,
-0.04089117422699928,
-0.07173813879489899,
0.02742081508040428,
0.038376811891794205,
-0.009434050880372524,
-0.04507838934659958,
-0.07642698287963867,
0.09984488040208817,
0.1196557879447937,
-0.15493206679821014,
0.049932051450014114,
-0.13080675899982452,
0.0537138395011425,
-0.10200764983892441,
-0.015528660267591476,
-0.15630334615707397,
0.1821095198392868,
-0.014370914548635483,
0.04736446961760521,
0.06377309560775757,
0.22050140798091888,
0.038237348198890686,
0.039975304156541824,
-0.04434674233198166,
-0.007701681461185217,
0.019904732704162598,
-0.00807536207139492,
-0.01115383580327034,
-0.11080302298069,
-0.02226068824529648,
-0.07659568637609482,
0.16993001103401184,
-0.19147537648677826,
0.01608358696103096,
0.060034990310668945,
0.06775614619255066,
0.008635896258056164,
-0.022622542455792427,
0.062464967370033264,
0.0612700879573822,
-0.00757483160123229,
0.013041970320045948,
-0.0004024359805043787,
0.011689281091094017,
-0.056243978440761566,
0.08415620774030685,
-0.14678728580474854,
-0.015807144343852997,
0.12544845044612885,
-0.003844859777018428,
0.012923668138682842,
0.024511253461241722,
-0.019035687670111656,
0.0015465683536604047,
-0.08363258093595505,
-0.04124864935874939,
0.22795315086841583,
-0.005540069658309221,
0.09595844894647598,
-0.1119503453373909,
0.021747196093201637,
0.008041631430387497,
-0.0948544442653656,
0.06661105155944824,
0.04086169973015785,
-0.03532503917813301,
0.014890345744788647,
0.01940794102847576,
-0.04558491334319115,
-0.09405577182769775,
0.18509723246097565,
-0.04408230632543564,
-0.10163774341344833,
0.029350658878684044,
-0.01717562973499298,
-0.029528141021728516,
0.03804049640893936,
-0.14199897646903992,
-0.022513777017593384,
0.03487972542643547,
0.05591860041022301,
0.0636805072426796,
-0.17948997020721436,
0.05007333308458328,
0.00817771628499031,
-0.12687283754348755,
-0.12762656807899475,
0.03406120836734772,
-0.026674378663301468,
0.041381917893886566,
-0.12474938482046127,
-0.02588021568953991,
0.0040183463133871555,
-0.04427245259284973,
-0.1709408015012741,
0.11009915918111801,
-0.0880047082901001,
-0.2061925083398819,
-0.18520432710647583,
-0.0009781282860785723,
-0.020989440381526947,
0.04129650816321373,
0.10032740235328674,
-0.10362144559621811,
0.002747984603047371,
-0.03172015771269798,
0.10992047190666199,
0.0353253036737442,
0.0037083837669342756,
-0.05349249020218849,
0.044403452426195145,
0.08167269825935364,
-0.14312605559825897,
0.027264023199677467,
-0.054300229996442795,
-0.06291691958904266,
0.002362169325351715,
0.004407086409628391,
0.0019801000598818064,
0.16442371904850006,
0.038588423281908035,
-0.004967903718352318,
-0.011269815266132355,
0.18701596558094025,
-0.0748824030160904,
-0.03119800239801407,
0.2205503135919571,
-0.06699434667825699,
-0.028886057436466217,
0.1096145436167717,
0.011077689938247204,
-0.06041949242353439,
0.002032680669799447,
-0.006431369110941887,
-0.0840965062379837,
-0.25990206003189087,
-0.09052145481109619,
-0.05456876382231712,
-0.08370149880647659,
-0.015712693333625793,
0.026022866368293762,
0.030062463134527206,
0.015872333198785782,
-0.056861404329538345,
-0.09229514002799988,
0.09037036448717117,
-0.02311396412551403,
0.1069234311580658,
-0.00982741080224514,
0.0982026755809784,
-0.0400363989174366,
-0.03297389671206474,
0.037495747208595276,
0.05557896941900253,
0.10641170293092728,
0.05031358823180199,
0.04229153320193291,
0.08423450589179993,
0.09398244321346283,
0.1038174256682396,
0.07077240943908691,
-0.03881816938519478,
-0.000763666641432792,
0.005832490045577288,
-0.07566113024950027,
-0.06983882188796997,
0.06676293909549713,
0.13825660943984985,
-0.026817742735147476,
-0.028635617345571518,
-0.017269792035222054,
-0.0029643154703080654,
0.17551089823246002,
0.11467071622610092,
-0.2268199622631073,
-0.07179075479507446,
0.015493386425077915,
-0.07486137002706528,
-0.014377499930560589,
0.04831235483288765,
0.18638797104358673,
-0.10384044796228409,
0.04978838935494423,
0.03506116569042206,
0.0890914797782898,
-0.012708877213299274,
0.04299847036600113,
-0.1054132804274559,
0.01298047136515379,
0.015281155705451965,
0.09803539514541626,
-0.24286550283432007,
0.16640952229499817,
0.010017218999564648,
0.14341497421264648,
-0.027876706793904305,
0.013333362527191639,
-0.02043243683874607,
0.07741090655326843,
0.0925544798374176,
-0.017041411250829697,
-0.024953298270702362,
-0.08600930869579315,
-0.0931209996342659,
0.05942041799426079,
-0.008958390913903713,
0.0232407096773386,
0.036121174693107605,
0.004906287416815758,
0.005705179646611214,
0.006060156039893627,
-0.08664820343255997,
-0.15108853578567505,
-0.057261351495981216,
-0.0011380291543900967,
0.14510497450828552,
0.14717629551887512,
-0.018126605078577995,
-0.058768268674612045,
-0.06557280570268631,
0.07931270450353622,
-0.11481926590204239,
-0.026084808632731438,
-0.06899939477443695,
-0.028795765712857246,
0.12113388627767563,
-0.040941376239061356,
-0.034705549478530884,
0.09297173470258713,
0.1474589854478836,
-0.012475470080971718,
-0.03698123246431351,
0.04185698553919792,
-0.08876311033964157,
-0.1517438441514969,
-0.019921526312828064,
0.20949077606201172,
0.08874985575675964,
0.0874224305152893,
0.08473306894302368,
0.022010911256074905,
0.005912761203944683,
-0.033066049218177795,
0.055839233100414276,
0.1480252742767334,
-0.08951196074485779,
0.004753382410854101,
-0.01817844994366169,
-0.15287484228610992,
-0.10926201939582825,
-0.040190163999795914,
0.1266937404870987,
0.11328569054603577,
-0.03047538362443447,
0.19483231008052826,
0.24222378432750702,
-0.09987423568964005,
-0.15307165682315826,
-0.02709640935063362,
0.09139309078454971,
0.14366431534290314,
-0.015902534127235413,
-0.1792280524969101,
0.1266607642173767,
0.025749096646904945,
-0.038872819393873215,
-0.11991114169359207,
-0.20311018824577332,
-0.15124911069869995,
0.1390933394432068,
-0.06731530278921127,
0.1004883348941803,
-0.03478274866938591,
-0.03838108107447624,
-0.06741786003112793,
-0.028463149443268776,
0.06133642792701721,
-0.1225416511297226,
0.10931965708732605,
0.03906753286719322,
0.030204283073544502,
0.047382909804582596,
-0.013147457502782345,
0.11364546418190002,
0.0868111252784729,
-0.02952904999256134,
0.014067846350371838,
0.1029161587357521,
0.05844084173440933,
0.021719668060541153,
0.11806893348693848,
-0.0748082771897316,
0.02301223948597908,
-0.11760398000478745,
-0.09937897324562073,
-0.046693380922079086,
0.05901552736759186,
0.02439325675368309,
-0.03297916799783707,
0.032375507056713104,
-0.07854939252138138,
0.027004359290003777,
0.009112043306231499,
-0.03814028948545456,
-0.1371595412492752,
0.0921221524477005,
0.1619652509689331,
0.17093922197818756,
-0.098739854991436,
-0.07927562296390533,
-0.016560504212975502,
-0.024588551372289658,
0.1058565005660057,
-0.09656450152397156,
0.07572145760059357,
0.06289459019899368,
0.037265367805957794,
0.14163479208946228,
0.002632443793118,
-0.10550978779792786,
0.08079400658607483,
0.036248959600925446,
-0.06617433577775955,
-0.105789415538311,
-0.0011961419368162751,
-0.0209161676466465,
-0.049744561314582825,
0.062148742377758026,
0.16248327493667603,
-0.06674070656299591,
-0.018428733572363853,
-0.02705269120633602,
0.056393854320049286,
-0.16272681951522827,
0.22091332077980042,
0.022171195596456528,
0.08055414259433746,
-0.11978562921285629,
-0.012780359946191311,
0.02883310243487358,
-0.04084277153015137,
0.03507479652762413,
-0.08558660000562668,
-0.07832195609807968,
-0.029245370998978615,
-0.05454922467470169,
0.03071349486708641,
0.04862881824374199,
-0.16608428955078125,
-0.07122234255075455,
-0.10640306770801544,
-0.007702999282628298,
0.04324674233794212,
0.04771576449275017,
0.052774250507354736,
-0.09560036659240723,
-0.0712345615029335,
-0.10706961154937744,
0.0374239943921566,
0.0790061354637146,
0.00026853338931687176,
-0.1246117576956749,
0.1713716834783554,
0.04001576080918312,
0.014004969969391823,
-0.04936036467552185,
-0.06895611435174942,
-0.027691740542650223,
0.09977002441883087,
-0.11930077522993088,
0.01620827615261078,
-0.03378956392407417,
0.016779154539108276,
0.01780436746776104,
-0.09376917779445648,
-0.022409658879041672,
0.06704694032669067,
-0.09149912744760513,
0.07558681815862656,
-0.015155111439526081,
0.0536554679274559,
-0.11490825563669205,
0.0435086265206337,
0.038593143224716187,
-0.0784192755818367,
0.07807537913322449,
0.12079868465662003,
-0.11763370782136917,
0.129954993724823,
-0.20402792096138,
-0.10721642524003983,
0.10170898586511612,
0.08514805883169174,
-0.04837080091238022,
-0.09162373840808868,
0.03450533747673035,
0.11369289457798004,
0.039638932794332504,
-0.005219945218414068,
0.10907050967216492,
-0.04557614028453827,
-0.03612088784575462,
-0.07680108398199081,
-0.03095785714685917,
-0.024554410949349403,
0.018299080431461334,
0.10705406963825226,
0.15731337666511536,
0.1845165640115738,
-0.07214714586734772,
0.050022803246974945,
-0.1340784877538681,
0.002985479077324271,
-0.06193160265684128,
-0.027118433266878128,
-0.17110835015773773,
-0.09279973059892654,
0.07008951902389526,
-0.011694636195898056,
0.13018928468227386,
0.0004119507211726159,
0.008587166666984558,
-0.0006603974616155028,
-0.07766000926494598,
0.02323947101831436,
0.00996082928031683,
0.2545696198940277,
0.059037111699581146,
0.023960798978805542,
0.008589484728872776,
-0.016969500109553337,
0.032642100006341934,
0.07175037264823914,
0.002822795184329152,
0.14932776987552643,
0.0015539234736934304,
0.07692857831716537,
0.10155826061964035,
-0.04307040199637413,
-0.008479120209813118,
0.05205458402633667,
-0.12745961546897888,
0.012395784258842468,
-0.06103873997926712,
0.13289079070091248,
0.15224435925483704,
-0.08375189453363419,
0.046373963356018066,
0.004693818278610706,
-0.08714888989925385,
-0.21738201379776,
-0.08814479410648346,
-0.09234225749969482,
-0.12801459431648254,
0.00665591936558485,
-0.09413633495569229,
0.017651379108428955,
0.09321561455726624,
0.030907509848475456,
-0.007822277955710888,
0.09455810487270355,
-0.0038033993914723396,
-0.07805619388818741,
0.019857823848724365,
-0.08578499406576157,
-0.0013424183707684278,
-0.057554300874471664,
0.008844512514770031,
0.19835416972637177,
0.031027590855956078,
0.07814563810825348,
-0.010628078132867813,
-0.012338070198893547,
0.046897415071725845,
-0.09664463251829147,
-0.0438113696873188,
-0.0017355283489450812,
-0.03552233427762985,
0.08245030045509338,
0.10593869537115097,
0.1442645788192749,
-0.08398257941007614,
0.024134162813425064,
0.16918255388736725,
-0.04338039085268974,
-0.12853777408599854,
-0.17992046475410461,
0.1238924190402031,
0.05917026847600937,
0.06080283969640732,
0.011596111580729485,
-0.07253585755825043,
-0.014186225831508636,
0.1500200629234314,
0.19519026577472687,
0.04229309782385826,
0.02562699466943741,
-0.023375287652015686,
-0.007955769076943398,
-0.07106143981218338,
0.05259895324707031,
0.0793759897351265,
0.20676760375499725,
0.010781961493194103,
0.04442426189780235,
-0.037674080580472946,
-0.08656536042690277,
0.0002528943296056241,
0.030199259519577026,
-0.11456909775733948,
-0.09386229515075684,
0.022588206455111504,
0.14303992688655853,
-0.01953277178108692,
-0.18055546283721924,
-0.12496878951787949,
-0.07221066951751709,
-0.10700391978025436,
-0.014853326603770256,
0.04581261798739433,
0.10018830746412277,
-0.0036122710444033146,
-0.07800238579511642,
0.01308292057365179,
0.15432783961296082,
-0.007039891090244055,
-0.048991527408361435,
-0.031553320586681366,
0.047699231654405594,
-0.11533460021018982,
0.005415629129856825,
0.004099003970623016,
0.20226751267910004,
0.005689018871635199,
0.05554281920194626,
-0.016849031671881676,
0.1187899187207222,
-0.014484276995062828,
-0.12098734080791473,
0.03236266225576401,
0.15165556967258453,
-0.028150277212262154,
0.12876024842262268,
0.043920498341321945,
-0.1073244959115982,
0.027039235457777977,
-0.15553389489650726,
-0.04419597610831261,
-0.09440924227237701,
0.05516098812222481,
-0.07608938962221146,
0.0847565308213234,
0.0663088858127594,
-0.0849851667881012,
-0.013093688525259495,
-0.052855752408504486,
0.0860796719789505,
0.041225433349609375,
-0.05592305585741997,
-0.01103580929338932,
-0.2922951579093933,
-0.011944906786084175,
-0.055721037089824677,
-0.004854540340602398,
-0.18726925551891327,
-0.019347473978996277,
-0.0020496321376413107,
-0.07051370292901993,
-0.004888125695288181,
0.05020416900515556,
0.09678944945335388,
0.01012477744370699,
-0.02504108101129532,
-0.04967581480741501,
0.04831741750240326,
0.10653223842382431,
-0.16303475201129913,
-0.11275587230920792
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Interlingua
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Interlingua using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "ia", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-ia")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-ia")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
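The example above runs on Common Voice samples; as a minimal sketch, the same pipeline can be applied to a local recording. The file name `sample.wav` below is a placeholder and not part of the original example:
```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-ia")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-ia")

# "sample.wav" is a hypothetical mono recording; replace it with your own file
speech_array, sampling_rate = torchaudio.load("sample.wav")
# Resample from the file's native rate to the 16kHz expected by the model
speech = torchaudio.transforms.Resample(sampling_rate, 16_000)(speech_array).squeeze().numpy()

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
print("Prediction:", processor.batch_decode(torch.argmax(logits, dim=-1)))
```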
## Evaluation
The model can be evaluated as follows on the Interlingua test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "ia", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-ia")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-ia")
model.to("cuda")
chars_to_ignore_regex = '[\.\,\!\?\-\"\:\;\'\“\”]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Run inference on the preprocessed speech arrays and decode the predictions
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 22.08 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
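The full fine-tuning script and hyperparameters are not part of this card; as a hedged sketch, that data could be loaded with 🤗 Datasets as follows:
```python
from datasets import load_dataset

# Combined train + validation portion of Common Voice Interlingua used for fine-tuning
train_dataset = load_dataset("common_voice", "ia", split="train+validation")
print(train_dataset)
```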
|
{"language": "ia", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Anurag Singh XLSR Wav2Vec2 Large 53 Interlingua", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice ia", "type": "common_voice", "args": "ia"}, "metrics": [{"type": "wer", "value": 22.08, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-large-xlsr-53-ia
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"ia",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"ia"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ia #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Interlingua
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Interlingua using the Common Voice.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Interlingua test data of Common Voice.
Test Result: 22.08 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Interlingua\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Interlingua using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Interlingua test data of Common Voice.\n\nTest Result: 22.08 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ia #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Interlingua\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Interlingua using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Interlingua test data of Common Voice.\n\nTest Result: 22.08 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
80,
65,
20,
29,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ia #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Interlingua\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Interlingua using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\nThe model can be used directly (without a language model) as follows:## Evaluation\nThe model can be evaluated as follows on the Interlingua test data of Common Voice.\n\nTest Result: 22.08 %## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.14575889706611633,
0.01337492000311613,
-0.002648536581546068,
-0.016223935410380363,
0.06666181981563568,
-0.031272340565919876,
0.1859171837568283,
0.09520970284938812,
0.005507904104888439,
-0.029721179977059364,
0.027914809063076973,
0.022075528278946877,
0.05327004939317703,
0.06872399151325226,
-0.01562855951488018,
-0.20376546680927277,
0.00022981023357715458,
0.008494765497744083,
0.07438455522060394,
0.12052503973245621,
0.10402389615774155,
-0.06748835742473602,
0.00040862770401872694,
0.09754471480846405,
-0.15367797017097473,
0.028960177674889565,
0.034422777593135834,
-0.09515146166086197,
0.14406952261924744,
0.08109407126903534,
0.07757055759429932,
0.04722423106431961,
0.08776992559432983,
-0.2050832062959671,
0.023911084979772568,
0.044488247483968735,
0.030826032161712646,
0.03346525505185127,
0.0570373572409153,
-0.000653308816254139,
0.0717177465558052,
0.09160661697387695,
-0.0224239993840456,
0.07406499236822128,
-0.07154868543148041,
-0.20664730668067932,
-0.02036861889064312,
0.014016631059348583,
0.09757410734891891,
0.12376030534505844,
-0.06483208388090134,
0.10997547209262848,
-0.13749536871910095,
0.09574303776025772,
0.08954798430204391,
-0.18966889381408691,
0.00019418008741922677,
0.07303999364376068,
0.046438783407211304,
0.08002878725528717,
-0.05676720291376114,
0.030348366126418114,
0.04486910253763199,
0.028426025062799454,
-0.003996877931058407,
-0.0365038700401783,
-0.19160878658294678,
-0.01053280383348465,
-0.14025089144706726,
-0.03717758134007454,
0.2001706063747406,
-0.00844627059996128,
-0.08437114953994751,
-0.1338834911584854,
-0.014356388710439205,
0.006567118223756552,
-0.013799174688756466,
-0.07202533632516861,
0.0005536187090910971,
0.03955206647515297,
-0.002854859223589301,
-0.031047934666275978,
-0.11611312627792358,
-0.15404613316059113,
-0.031328506767749786,
0.08293760567903519,
0.02371196635067463,
0.03136884421110153,
-0.1200486272573471,
0.07368171215057373,
-0.07722268998622894,
-0.06231648474931717,
-0.021999645978212357,
0.0036130815278738737,
-0.06967243552207947,
0.02259979024529457,
-0.08572519570589066,
-0.14310944080352783,
0.021510062739253044,
-0.02979392185807228,
0.026645276695489883,
0.025529608130455017,
-0.02471807599067688,
0.05944307520985603,
0.0514625646173954,
0.09181375056505203,
-0.06167490407824516,
-0.021950503811240196,
0.01618788205087185,
-0.0029279212467372417,
-0.05610838904976845,
-0.02610485814511776,
-0.06618625670671463,
-0.0475919172167778,
0.005922030657529831,
0.045857083052396774,
-0.01887999102473259,
-0.00039369214209727943,
-0.048707541078329086,
-0.054436229169368744,
-0.004021518398076296,
-0.08999591320753098,
-0.03799727186560631,
0.06420827656984329,
0.005493600387126207,
0.1169273853302002,
0.04610545188188553,
0.07091652601957321,
-0.07048432528972626,
-0.048111941665410995,
0.0222933292388916,
0.04336688667535782,
-0.032185882329940796,
-0.0937580019235611,
0.004678067285567522,
-0.018186112865805626,
-0.01496335119009018,
-0.0943518653512001,
-0.12212002277374268,
-0.0772290825843811,
0.01813294179737568,
0.01982813887298107,
-0.011095335707068443,
-0.09887481480836868,
0.007623123470693827,
-0.014943458139896393,
-0.05968473479151726,
0.07099591940641403,
-0.03993070498108864,
0.05285939574241638,
0.010468780994415283,
0.057771723717451096,
0.019773263484239578,
0.08482784777879715,
-0.07640178501605988,
-0.04180047661066055,
0.06377293914556503,
0.1555662602186203,
-0.021986329928040504,
-0.0859033614397049,
-0.0880938172340393,
-0.06986740976572037,
-0.06363832205533981,
0.0788586288690567,
0.049920789897441864,
0.1198093369603157,
-0.2891855835914612,
-0.10015660524368286,
0.20205458998680115,
-0.1204306036233902,
-0.02729395590722561,
0.2143324911594391,
-0.02021297998726368,
0.10839339345693588,
0.13640739023685455,
0.22727778553962708,
0.08806274831295013,
-0.19231946766376495,
0.0432777926325798,
0.04819623380899429,
-0.007211979012936354,
-0.03614005073904991,
0.07116283476352692,
-0.033306386321783066,
-0.01454179547727108,
0.030352720990777016,
-0.030657319352030754,
0.06953345239162445,
-0.047946397215127945,
-0.06593915075063705,
-0.020627006888389587,
-0.07969173043966293,
0.052496619522571564,
0.057597242295742035,
0.011927996762096882,
-0.013238064013421535,
-0.052091944962739944,
0.055091358721256256,
0.11698482185602188,
-0.12432818114757538,
0.04585615545511246,
-0.10767115652561188,
0.05011045187711716,
-0.06934989243745804,
-0.024615924805402756,
-0.14815199375152588,
0.15847119688987732,
-0.019057497382164,
0.07914005219936371,
0.05650690570473671,
0.20226061344146729,
0.012378958985209465,
0.01554177701473236,
-0.032848652452230453,
-0.008988028392195702,
0.021626563742756844,
-0.02700398676097393,
-0.046611279249191284,
-0.10705084353685379,
-0.027022304013371468,
-0.07321812212467194,
0.09168338775634766,
-0.16941866278648376,
-0.010889491066336632,
-0.011727039702236652,
-0.0027651656419038773,
0.012308184988796711,
-0.017791451886296272,
0.0588623508810997,
0.09248601645231247,
-0.0014980726409703493,
0.016894174739718437,
0.059491485357284546,
0.0029818026814609766,
-0.04630390182137489,
0.1462942361831665,
-0.09885772317647934,
0.01074486505240202,
0.09984441101551056,
-0.06607753038406372,
0.0012676507467404008,
0.013643044047057629,
-0.01522976066917181,
-0.02026986889541149,
-0.08015591651201248,
0.004444227088242769,
0.3222762942314148,
-0.002774478169158101,
0.10541539639234543,
-0.1053444966673851,
-0.0042242370545864105,
0.01569880172610283,
-0.08289080113172531,
0.03994378820061684,
0.07108787447214127,
0.006627315189689398,
0.0016363331815227866,
0.02411029301583767,
-0.030411602929234505,
-0.06836773455142975,
0.2449643462896347,
-0.0283780787140131,
-0.09389077126979828,
0.009837868623435497,
-0.031123647466301918,
-0.030464570969343185,
0.06411653012037277,
-0.20773351192474365,
-0.03455594182014465,
0.042540885508060455,
0.0657067596912384,
0.07131097465753555,
-0.15108542144298553,
0.012635501101613045,
0.01699618063867092,
-0.12139246612787247,
-0.1705048829317093,
0.06321848183870316,
-0.028855295851826668,
0.04176564887166023,
-0.10906187444925308,
-0.04027470201253891,
-0.0031766740139573812,
-0.05462278798222542,
-0.17484627664089203,
0.13555733859539032,
-0.0650961622595787,
-0.23557397723197937,
-0.13156969845294952,
-0.043011538684368134,
0.014327279292047024,
0.009060238488018513,
0.08072972297668457,
-0.10989423841238022,
-0.004689723253250122,
-0.03768175467848778,
0.09954128414392471,
-0.004431507550179958,
-0.022915849462151527,
-0.0568002387881279,
0.015472049824893475,
0.05553710088133812,
-0.15600118041038513,
0.02004965767264366,
-0.062121015042066574,
-0.02261439338326454,
0.010494868271052837,
-0.011279665865004063,
-0.00046351898345164955,
0.18385091423988342,
0.0434938445687294,
0.015910780057311058,
-0.01769844815135002,
0.21886716783046722,
-0.08374445140361786,
-0.0462619923055172,
0.19492851197719574,
-0.017253728583455086,
-0.019138481467962265,
0.08944288641214371,
0.034632038325071335,
-0.07661183178424835,
-0.015691297128796577,
-0.012006228789687157,
-0.07257229834794998,
-0.24566881358623505,
-0.11295759677886963,
-0.07308665663003922,
-0.06094444543123245,
-0.035247210413217545,
-0.020132869482040405,
0.0490981824696064,
0.013395863585174084,
-0.008577845059335232,
-0.03689718618988991,
0.02713802270591259,
-0.022927969694137573,
0.10887211561203003,
-0.023597439751029015,
0.10840385407209396,
-0.03978315740823746,
-0.033444907516241074,
0.024646515026688576,
0.015316082164645195,
0.15674927830696106,
0.0472203753888607,
0.07364998012781143,
0.09636621922254562,
0.12851235270500183,
0.12037636339664459,
0.09113848954439163,
-0.0739983469247818,
-0.030409246683120728,
-0.00030898183467797935,
-0.050003938376903534,
-0.04735491797327995,
0.03403117507696152,
0.14287389814853668,
-0.04214073345065117,
-0.03512118011713028,
-0.03212821111083031,
-0.008039084263145924,
0.2047692984342575,
0.10474788397550583,
-0.1982436180114746,
-0.0630606859922409,
-0.016775162890553474,
-0.07017513364553452,
-0.018459748476743698,
0.060269780457019806,
0.17523197829723358,
-0.1220284178853035,
0.012070593424141407,
-0.0005768610863015056,
0.09551497548818588,
-0.029541267082095146,
0.024946222081780434,
-0.07943098992109299,
0.0336814783513546,
0.0036035352386534214,
0.09423946589231491,
-0.26025643944740295,
0.21165931224822998,
0.014062791131436825,
0.10434728860855103,
-0.06227540597319603,
-0.006732862442731857,
0.0013833744451403618,
0.07727152854204178,
0.10894712060689926,
0.018302742391824722,
0.03513915464282036,
-0.08849751204252243,
-0.07845617830753326,
0.051297448575496674,
0.00770434970036149,
0.045355863869190216,
0.040202025324106216,
0.013006645254790783,
0.008402944542467594,
0.022532103583216667,
-0.02008432149887085,
-0.15599790215492249,
-0.0822553038597107,
0.008661293424665928,
0.15497589111328125,
0.099189892411232,
-0.01749780960381031,
-0.09024932235479355,
-0.08340983837842941,
0.07000529021024704,
-0.08581138402223587,
-0.049934662878513336,
-0.06937333941459656,
-0.033246610313653946,
0.10779798030853271,
-0.049207624047994614,
-0.0027844649739563465,
0.08960763365030289,
0.10423687845468521,
-0.030841918662190437,
-0.05185095593333244,
0.04247107729315758,
-0.12977777421474457,
-0.0916980654001236,
-0.030323626473546028,
0.1633765995502472,
0.1116245836019516,
0.08316707611083984,
0.0625089555978775,
-0.004262082744389772,
-0.015530955977737904,
-0.0476742684841156,
0.01677238941192627,
0.11311520636081696,
-0.09509490430355072,
0.009366982616484165,
0.017682231962680817,
-0.13278192281723022,
-0.09981965273618698,
-0.05439062789082527,
0.1820487231016159,
0.08300147205591202,
-0.0559690035879612,
0.19146911799907684,
0.20993104577064514,
-0.10083691030740738,
-0.19859221577644348,
-0.03247953578829765,
0.10958857089281082,
0.13026438653469086,
-0.0034076322335749865,
-0.18663199245929718,
0.0675949901342392,
-0.003509771078824997,
-0.021085690706968307,
-0.019228776916861534,
-0.2692415416240692,
-0.1431730091571808,
0.15291176736354828,
-0.021329211071133614,
0.16749095916748047,
0.012744766660034657,
-0.03263326734304428,
-0.006668392568826675,
-0.008958147838711739,
0.04851728677749634,
-0.09649229049682617,
0.11429829895496368,
0.014487707056105137,
0.07810665667057037,
0.04346970468759537,
-0.013874277472496033,
0.07996491342782974,
0.08215116709470749,
0.004552672617137432,
0.005705729126930237,
0.07842563837766647,
0.013206294737756252,
0.03148648515343666,
0.11566879600286484,
-0.09118042141199112,
0.0394429937005043,
-0.1058594360947609,
-0.0986519455909729,
-0.0784892663359642,
0.05131130665540695,
0.018307451158761978,
-0.05217357352375984,
0.02491268329322338,
-0.04001057893037796,
0.01309401448816061,
-0.0015475881518796086,
-0.040830571204423904,
-0.11666065454483032,
0.045164305716753006,
0.17317087948322296,
0.19210736453533173,
-0.08174705505371094,
-0.09282946586608887,
-0.03737691417336464,
-0.02511221170425415,
0.11942677199840546,
-0.08549554646015167,
0.022413427010178566,
0.052329372614622116,
0.05998310074210167,
0.12676562368869781,
0.02235533483326435,
-0.09941165894269943,
0.06852869689464569,
0.030300363898277283,
-0.05858520790934563,
-0.12510620057582855,
-0.036480870097875595,
-0.026981933042407036,
-0.027032408863306046,
0.026377150788903236,
0.11921317130327225,
-0.08135102689266205,
-0.012704703025519848,
-0.029962757602334023,
0.016113342717289925,
-0.1221318393945694,
0.20844101905822754,
0.03233320638537407,
0.07186727225780487,
-0.11794168502092361,
0.010067305527627468,
-0.006256199441850185,
-0.03821214661002159,
0.029771771281957626,
-0.03629324212670326,
-0.062395889312028885,
-0.06556357443332672,
-0.014851239509880543,
0.09458614885807037,
0.062071558088064194,
-0.13472092151641846,
-0.04724004492163658,
-0.10741837322711945,
0.016310112550854683,
0.0709771066904068,
0.05718426778912544,
0.020584512501955032,
-0.12368224561214447,
-0.07534470409154892,
-0.09096057713031769,
0.0667567029595375,
0.0641094520688057,
-0.02585996501147747,
-0.10830043256282806,
0.16898438334465027,
0.06696756184101105,
0.03805195167660713,
-0.05248042568564415,
-0.0844903439283371,
-0.01877238042652607,
0.08817350119352341,
-0.10469428449869156,
-0.001753800199367106,
-0.04018252715468407,
0.013134552165865898,
-0.007791061885654926,
-0.0620148703455925,
-0.011807207949459553,
0.06700465828180313,
-0.08801037073135376,
0.07113353908061981,
-0.002060274826362729,
0.06268317252397537,
-0.0702899843454361,
0.029207244515419006,
0.021420331671833992,
-0.05374835059046745,
0.08162182569503784,
0.12610693275928497,
-0.1010097786784172,
0.11053557693958282,
-0.21600374579429626,
-0.033683713525533676,
0.06613651663064957,
0.08015484362840652,
-0.015760326758027077,
-0.10947918891906738,
0.033824872225522995,
0.08145137876272202,
0.053739339113235474,
-0.002858762862160802,
0.07734548300504684,
-0.04883699491620064,
0.007338494993746281,
-0.05147126317024231,
-0.01620093174278736,
-0.04266112670302391,
0.03072698600590229,
0.05715658888220787,
0.14494919776916504,
0.16587714850902557,
-0.08839781582355499,
0.10943735390901566,
-0.14043749868869781,
0.010084726847708225,
-0.02916686423122883,
-0.003339875489473343,
-0.10107368975877762,
-0.10397569835186005,
0.06546986103057861,
-0.02965414524078369,
0.1614917367696762,
0.035337094217538834,
0.021413752809166908,
-0.04717492312192917,
-0.10102636367082596,
0.027546361088752747,
-0.02297656051814556,
0.2358561009168625,
0.052405644208192825,
0.03398382291197777,
-0.030885936692357063,
0.0024479064159095287,
0.023768510669469833,
0.15054407715797424,
0.015052341856062412,
0.17242774367332458,
0.0146583067253232,
0.0810428187251091,
0.07594351470470428,
-0.05665221065282822,
-0.044042546302080154,
-0.02495027519762516,
-0.11196373403072357,
0.033054742962121964,
-0.06864134967327118,
0.17499741911888123,
0.10456525534391403,
-0.10974419862031937,
0.08652500808238983,
0.017146002501249313,
-0.08139818161725998,
-0.17055685818195343,
-0.12323129177093506,
-0.052345652133226395,
-0.16126354038715363,
0.02120247669517994,
-0.1035795584321022,
0.045486241579055786,
0.05300239473581314,
0.046103738248348236,
-0.019480600953102112,
0.11614229530096054,
0.007153904531151056,
-0.09497206658124924,
0.08224568516016006,
-0.07917232811450958,
-0.012023168615996838,
-0.10346083343029022,
0.04833241179585457,
0.1536625325679779,
-0.018377572298049927,
0.05511751398444176,
-0.0033235885202884674,
-0.05734620615839958,
0.030307918787002563,
-0.06396852433681488,
-0.07507532089948654,
-0.01815604791045189,
-0.02194816805422306,
0.08922279626131058,
0.13318166136741638,
0.12464964389801025,
-0.053841568529605865,
-0.002482916694134474,
0.10032046586275101,
-0.04016230255365372,
-0.14677190780639648,
-0.15127860009670258,
0.13003017008304596,
0.000995249138213694,
0.02523600123822689,
0.0073378197848796844,
-0.04051467403769493,
-0.010569398291409016,
0.2624581754207611,
0.20282979309558868,
0.01819017343223095,
0.020055267959833145,
-0.001047025783918798,
-0.009479542262852192,
-0.04385644197463989,
0.07466231286525726,
0.08570621907711029,
0.2550978362560272,
-0.022647593170404434,
-0.0009995698928833008,
-0.07714462280273438,
-0.07831213623285294,
0.02083219774067402,
0.07803326100111008,
-0.060186970978975296,
-0.10455658286809921,
0.0010630039032548666,
0.12984436750411987,
-0.08362291008234024,
-0.09212829917669296,
-0.07542066276073456,
-0.08971436321735382,
-0.08224613964557648,
-0.026578379794955254,
0.025134453549981117,
0.10375005006790161,
0.02305154874920845,
-0.07732756435871124,
0.024194147437810898,
0.15383855998516083,
-0.013673893176019192,
-0.0853152796626091,
-0.05774020776152611,
0.05538495257496834,
-0.07052486389875412,
0.004283009096980095,
-0.004597546998411417,
0.19609332084655762,
0.014454751275479794,
0.09569596499204636,
-0.02116824872791767,
0.12554462254047394,
-0.025339672341942787,
-0.058107346296310425,
0.010144791565835476,
0.1390143483877182,
-0.019371245056390762,
0.10883034020662308,
0.01652943156659603,
-0.1364269107580185,
0.06275726854801178,
-0.11527149379253387,
-0.03989969566464424,
-0.08879614621400833,
0.03841102495789528,
-0.03756304457783699,
0.08927847445011139,
0.09188855439424515,
-0.05138974264264107,
-0.04648158699274063,
-0.04821896553039551,
0.07811028510332108,
0.01082718838006258,
-0.03610672429203987,
-0.051765959709882736,
-0.2302180826663971,
-0.008529710583388805,
-0.05174313858151436,
-0.02095869742333889,
-0.2205324023962021,
-0.029409214854240417,
0.0020272191613912582,
-0.0960567370057106,
0.009806746616959572,
0.030206581577658653,
0.1148737445473671,
0.012486455962061882,
0.006249657366424799,
-0.015297732315957546,
0.04965991526842117,
0.1132112666964531,
-0.14177000522613525,
-0.0957556813955307
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Odia
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Odia using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "or", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-odia")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-odia")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
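Alternatively, recent versions of 🤗 Transformers expose a high-level ASR pipeline that wraps the same steps; this is a sketch rather than part of the original card, and `sample.wav` is a placeholder for your own 16kHz recording:
```python
from transformers import pipeline

# The pipeline loads the processor and model and handles decoding internally
asr = pipeline("automatic-speech-recognition", model="anuragshas/wav2vec2-large-xlsr-53-odia")

print(asr("sample.wav"))
```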
## Evaluation
The model can be evaluated as follows on the Odia test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "or", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-odia")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-odia")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Run inference on the preprocessed speech arrays and decode the predictions
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 57.10 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
|
{"language": "or", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Anurag Singh XLSR Wav2Vec2 Large 53 Odia", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice or", "type": "common_voice", "args": "or"}, "metrics": [{"type": "wer", "value": 57.1, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-large-xlsr-53-odia
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"or",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"or"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #or #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-Large-XLSR-53-Odia
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Odia using the Common Voice.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Odia test data of Common Voice.
Test Result: 57.10 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Odia\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Odia using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Odia test data of Common Voice.\n\nTest Result: 57.10 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #or #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Odia\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Odia using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Odia test data of Common Voice.\n\nTest Result: 57.10 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
84,
63,
20,
29,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #or #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n# Wav2Vec2-Large-XLSR-53-Odia\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Odia using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\nThe model can be used directly (without a language model) as follows:## Evaluation\nThe model can be evaluated as follows on the Odia test data of Common Voice.\n\nTest Result: 57.10 %## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.15244583785533905,
0.04627031460404396,
-0.002021031454205513,
-0.029944201931357384,
0.0675351545214653,
-0.04737871140241623,
0.1856345385313034,
0.09810973703861237,
-0.03274126723408699,
-0.005627760663628578,
0.028766164556145668,
-0.012952188029885292,
0.05195757746696472,
0.09644632786512375,
-0.042784370481967926,
-0.2021886706352234,
0.0004221545241307467,
0.022233495488762856,
0.047466106712818146,
0.11203217506408691,
0.09524896740913391,
-0.08260424435138702,
-0.004295530263334513,
0.0944327637553215,
-0.1352541297674179,
0.03947915509343147,
0.036629579961299896,
-0.12193267047405243,
0.13292758166790009,
0.08668539673089981,
0.07234810292720795,
0.06065180152654648,
0.0837620422244072,
-0.2016950398683548,
0.025719767436385155,
0.044092532247304916,
0.0351622998714447,
0.03160344064235687,
0.05021699517965317,
0.01055401936173439,
0.06982562690973282,
0.08962973207235336,
-0.015674922615289688,
0.07988061010837555,
-0.06791109591722488,
-0.2245272547006607,
-0.005846753250807524,
-0.007163770496845245,
0.10248032212257385,
0.1396551877260208,
-0.07626841217279434,
0.10144210606813431,
-0.13373541831970215,
0.08809852600097656,
0.10761574655771255,
-0.1826266050338745,
-0.0011039449600502849,
0.08216972649097443,
0.06136683374643326,
0.07876018434762955,
-0.0723518505692482,
0.02166564017534256,
0.04788140580058098,
0.029590509831905365,
0.026120096445083618,
-0.03425266221165657,
-0.1880459040403366,
-0.00010007613309426233,
-0.13703148066997528,
-0.026348093524575233,
0.23424813151359558,
0.013560925610363483,
-0.0729905217885971,
-0.1427631825208664,
-0.01934177242219448,
-0.01417718455195427,
0.002521524904295802,
-0.060379769653081894,
0.006385788321495056,
0.035755787044763565,
-0.0340728759765625,
-0.02594149485230446,
-0.13051259517669678,
-0.12566301226615906,
-0.05053010210394859,
0.06075316295027733,
0.0006065491470508277,
0.017034783959388733,
-0.11984294652938843,
0.07762641459703445,
-0.11044324189424515,
-0.07950261980295181,
-0.013348229229450226,
0.030740082263946533,
-0.07391797006130219,
0.007691407110542059,
-0.09660989046096802,
-0.16597267985343933,
0.040547046810388565,
-0.020217876881361008,
0.04989980533719063,
0.009471716359257698,
-0.052351247519254684,
0.050452060997486115,
0.05837259069085121,
0.11837480962276459,
-0.06965731084346771,
-0.02415957860648632,
0.025750992819666862,
0.011003194376826286,
-0.05649354308843613,
-0.024390120059251785,
-0.06423158198595047,
-0.042391810566186905,
0.014047148637473583,
0.03609764575958252,
-0.005868761800229549,
0.0055015552788972855,
-0.04804464802145958,
-0.041428226977586746,
-0.012506038881838322,
-0.105650395154953,
-0.028232643380761147,
0.08062127977609634,
-0.00637179845944047,
0.07965423911809921,
0.04564477130770683,
0.07680410891771317,
-0.0644221156835556,
-0.014307893812656403,
0.013820873573422432,
0.02976008504629135,
-0.02141796052455902,
-0.09648088365793228,
0.00279177981428802,
-0.03564729914069176,
0.0003085699863731861,
-0.09093136340379715,
-0.0906437560915947,
-0.09583095461130142,
0.005555500276386738,
0.03570941090583801,
-0.04439829662442207,
-0.10112205147743225,
-0.02576070837676525,
-0.009385504759848118,
-0.07345137745141983,
0.06543898582458496,
-0.04577773064374924,
0.06763404607772827,
0.031586162745952606,
0.045868370682001114,
0.016769208014011383,
0.09167863428592682,
-0.08590857684612274,
-0.03781350329518318,
0.0607425794005394,
0.134469136595726,
-0.011649921536445618,
-0.05347880348563194,
-0.08679238706827164,
-0.07427962124347687,
-0.06293973326683044,
0.06688156723976135,
0.04959562048316002,
0.0994887426495552,
-0.2480037957429886,
-0.08279934525489807,
0.15036451816558838,
-0.10094431787729263,
-0.02485642395913601,
0.19703319668769836,
-0.01533400546759367,
0.09122388809919357,
0.1256805956363678,
0.2476527988910675,
0.10977989435195923,
-0.19625802338123322,
0.027643559500575066,
0.012893247418105602,
0.00027773858164437115,
-0.03631306439638138,
0.07730916142463684,
-0.04021570459008217,
0.012373542413115501,
0.025834839791059494,
-0.028458058834075928,
0.05617658048868179,
-0.027947645634412766,
-0.06942030787467957,
-0.019007744267582893,
-0.08135984092950821,
0.05212954431772232,
0.04802396520972252,
0.0073134866543114185,
-0.01765470579266548,
-0.06432696431875229,
0.06548912823200226,
0.12158399820327759,
-0.13778550922870636,
0.0376371406018734,
-0.12170173227787018,
0.05343593657016754,
-0.1061829924583435,
-0.024401405826210976,
-0.15037503838539124,
0.13773660361766815,
-0.022853484377264977,
0.08569895476102829,
0.036147456616163254,
0.22427307069301605,
0.019775548949837685,
-0.00404326431453228,
-0.03421451151371002,
-0.02516583353281021,
-0.0005865339771844447,
-0.021258335560560226,
-0.03801250830292702,
-0.08471361547708511,
-0.023523405194282532,
-0.07271131128072739,
0.08519187569618225,
-0.18644782900810242,
-0.0011115047382190824,
0.045840825885534286,
0.0015631041023880243,
0.03278758376836777,
-0.009515267796814442,
0.06637142598628998,
0.08609753102064133,
0.0012199500342831016,
0.004270429722964764,
0.033784106373786926,
0.013447294011712074,
-0.04665990173816681,
0.13461363315582275,
-0.17163719236850739,
0.050784897059202194,
0.11265593022108078,
-0.02688473090529442,
0.016664205119013786,
0.03531137481331825,
-0.03434966877102852,
-0.018140332773327827,
-0.09718846529722214,
-0.0007976970518939197,
0.3080293536186218,
-0.012448671273887157,
0.09068932384252548,
-0.10353302955627441,
-0.004124265164136887,
0.021021995693445206,
-0.06535351276397705,
0.03667223080992699,
0.05759815499186516,
0.002338665770366788,
0.034346625208854675,
0.03531648591160774,
-0.02694340981543064,
-0.08122773468494415,
0.23993274569511414,
-0.03320292383432388,
-0.09375521540641785,
0.014174471609294415,
-0.043060675263404846,
-0.029764266684651375,
0.0772782489657402,
-0.1643814891576767,
-0.03070986457169056,
0.03762448579072952,
0.05736512690782547,
0.05985475331544876,
-0.15754929184913635,
0.028034651651978493,
0.026588793843984604,
-0.12361735105514526,
-0.1492052674293518,
0.05415555089712143,
-0.03137443587183952,
0.03902226686477661,
-0.09707564860582352,
-0.036032456904649734,
0.007620782125741243,
-0.04377839341759682,
-0.16606539487838745,
0.1269155740737915,
-0.05590314790606499,
-0.2074950933456421,
-0.12455829977989197,
0.01041344553232193,
0.009924300946295261,
0.029395999386906624,
0.09751828014850616,
-0.0835617184638977,
-0.014303535223007202,
-0.05785644054412842,
0.10202959924936295,
0.005978884641081095,
-0.019743602722883224,
-0.05355961248278618,
0.024186018854379654,
0.0760354995727539,
-0.14757750928401947,
0.015620254911482334,
-0.04797929897904396,
-0.04803517088294029,
0.003428555326536298,
0.0026784336660057306,
0.009482012130320072,
0.17476466298103333,
0.04046318307518959,
0.007458785083144903,
-0.027425389736890793,
0.18147213757038116,
-0.10089647769927979,
-0.045628685504198074,
0.2006899118423462,
-0.016192136332392693,
-0.031161611899733543,
0.1279052197933197,
0.016737304627895355,
-0.07658597826957703,
0.004502367693930864,
0.009825103916227818,
-0.07062874734401703,
-0.23983584344387054,
-0.1050136461853981,
-0.08197131007909775,
-0.05200653895735741,
-0.012301545590162277,
0.006918723229318857,
0.04944577440619469,
0.020906580612063408,
-0.007229275535792112,
-0.05347626656293869,
0.03357420861721039,
-0.023226432502269745,
0.14363837242126465,
-0.00807166937738657,
0.10666308552026749,
-0.033829037100076675,
-0.04247763752937317,
0.02624751813709736,
0.024996422231197357,
0.1547413319349289,
0.04285145178437233,
0.058962382376194,
0.07704295963048935,
0.10874686390161514,
0.11150764673948288,
0.053282853215932846,
-0.04476640000939369,
-0.022376464679837227,
0.008918815292418003,
-0.05585378780961037,
-0.034288108348846436,
0.0539512038230896,
0.1716526746749878,
-0.05240009352564812,
-0.052254434674978256,
-0.059606973081827164,
-0.008131029084324837,
0.2252168506383896,
0.10320797562599182,
-0.23011505603790283,
-0.06352624297142029,
-0.02112550288438797,
-0.08175403624773026,
-0.009304196573793888,
0.06699977815151215,
0.1910221427679062,
-0.1138230413198471,
0.0073159378953278065,
0.0042662289924919605,
0.09728140383958817,
-0.024576913565397263,
0.032442688941955566,
-0.07599674165248871,
0.01907282881438732,
-0.01165533997118473,
0.08789486438035965,
-0.27136823534965515,
0.19708284735679626,
0.003586202161386609,
0.12008526176214218,
-0.04887497052550316,
-0.014191698282957077,
-0.0016608876176178455,
0.05787646025419235,
0.10124915838241577,
-0.0032175309024751186,
0.013645654544234276,
-0.08533014357089996,
-0.08550269156694412,
0.07039391994476318,
-0.02186451107263565,
0.01909555494785309,
0.04414365813136101,
0.008553626015782356,
0.010514783672988415,
0.025910818949341774,
-0.0090052904561162,
-0.13120873272418976,
-0.05549969896674156,
0.0014407472917810082,
0.19097571074962616,
0.1219877302646637,
-0.020034709945321083,
-0.08481074869632721,
-0.06640401482582092,
0.054522108286619186,
-0.06639286875724792,
-0.045417796820402145,
-0.08491125702857971,
-0.0333290696144104,
0.1140168085694313,
-0.04212086275219917,
-0.0030978957656770945,
0.10935984551906586,
0.12308941781520844,
-0.026148207485675812,
-0.0486779622733593,
0.06735114753246307,
-0.10717006772756577,
-0.1105620339512825,
-0.023572364822030067,
0.16741789877414703,
0.09843458235263824,
0.08501403778791428,
0.05369476228952408,
0.011773692443966866,
-0.022055136039853096,
-0.03304561227560043,
0.022359367460012436,
0.11602744460105896,
-0.07129133492708206,
-0.003934893291443586,
0.03673829138278961,
-0.15652801096439362,
-0.09412316232919693,
-0.0481160506606102,
0.1620006263256073,
0.07913584262132645,
-0.052835747599601746,
0.18555524945259094,
0.1835131198167801,
-0.09421535581350327,
-0.19940106570720673,
-0.008451094850897789,
0.11602944135665894,
0.142533540725708,
0.017877286300063133,
-0.17331399023532867,
0.07671241462230682,
-0.016814764589071274,
-0.036763064563274384,
-0.026846325024962425,
-0.2703457772731781,
-0.1435384601354599,
0.16774113476276398,
-0.04126782715320587,
0.15792427957057953,
-0.012876763939857483,
-0.014081302098929882,
-0.014594024047255516,
-0.028000086545944214,
0.0560145378112793,
-0.11354480683803558,
0.10961680859327316,
0.025600342079997063,
0.10048383474349976,
0.05400127172470093,
-0.022416625171899796,
0.08115484565496445,
0.08001304417848587,
0.0022679169196635485,
-0.009941634722054005,
0.06385724246501923,
0.025125132873654366,
0.010797734372317791,
0.11583637446165085,
-0.0875532254576683,
0.031660012900829315,
-0.1217852234840393,
-0.108236163854599,
-0.06226152926683426,
0.06868397444486618,
0.029610125347971916,
-0.0507856160402298,
0.015265529043972492,
-0.03978060558438301,
0.022763226181268692,
-0.0012089826632291079,
-0.02628348395228386,
-0.13997742533683777,
0.052607931196689606,
0.15297669172286987,
0.19823883473873138,
-0.04386483505368233,
-0.10822376608848572,
-0.021009892225265503,
-0.019955724477767944,
0.13095669448375702,
-0.10956426709890366,
0.038012683391571045,
0.07581444084644318,
0.03857859596610069,
0.1352073848247528,
0.027863427996635437,
-0.08744575083255768,
0.08426857739686966,
0.02514275535941124,
-0.06528981029987335,
-0.09684734046459198,
-0.03873913362622261,
-0.0344897024333477,
-0.054772891104221344,
0.021786898374557495,
0.1315985918045044,
-0.0892585813999176,
-0.007559188641607761,
-0.015186872333288193,
0.025359107181429863,
-0.13246439397335052,
0.2129543125629425,
0.024395866319537163,
0.07256674766540527,
-0.09430408477783203,
-0.0038299215957522392,
0.01799958199262619,
-0.0651279166340828,
0.04946579039096832,
-0.05502060055732727,
-0.055693112313747406,
-0.058640334755182266,
-0.010702508501708508,
0.07648033648729324,
0.06666862219572067,
-0.13481780886650085,
-0.059475090354681015,
-0.10798486322164536,
0.018268581479787827,
0.06354374438524246,
0.0454523079097271,
0.03125811368227005,
-0.10790891200304031,
-0.07156093418598175,
-0.10364806652069092,
0.05892759934067726,
0.057708416134119034,
-0.018261436372995377,
-0.11718746274709702,
0.1703345775604248,
0.04949796199798584,
0.03369147330522537,
-0.04880877584218979,
-0.06883767992258072,
-0.03819038346409798,
0.08670031279325485,
-0.0832459032535553,
0.007131294347345829,
-0.03376944735646248,
0.0215178020298481,
0.0030209815595299006,
-0.07668693363666534,
-0.03282865881919861,
0.07942204177379608,
-0.09306911379098892,
0.06631121039390564,
-0.0020395638421177864,
0.06199827790260315,
-0.07370112091302872,
0.015177986584603786,
0.01625845581293106,
-0.0533730573952198,
0.08318905532360077,
0.1078299805521965,
-0.11839296668767929,
0.07964927703142166,
-0.19308209419250488,
-0.05613645911216736,
0.06588999927043915,
0.07287673652172089,
-0.03272443637251854,
-0.10827651619911194,
0.03181321546435356,
0.10285099595785141,
0.052482083439826965,
-0.016020668670535088,
0.07892908900976181,
-0.0696646198630333,
-0.01728515513241291,
-0.07234252989292145,
-0.016110675409436226,
-0.039681173861026764,
0.013917985372245312,
0.04294291511178017,
0.15095388889312744,
0.1633806824684143,
-0.0881577730178833,
0.10044445097446442,
-0.1385081559419632,
0.00531215313822031,
-0.035503193736076355,
-0.010777972638607025,
-0.08958619087934494,
-0.08468829840421677,
0.07770968228578568,
-0.04238682985305786,
0.13235411047935486,
0.04927347972989082,
0.015742886811494827,
-0.023678338155150414,
-0.10145871341228485,
0.047234904021024704,
-0.010212665423750877,
0.26071417331695557,
0.03793865814805031,
0.021749502047896385,
-0.007517586462199688,
-0.0050869593396782875,
0.025826752185821533,
0.15000759065151215,
0.009796707890927792,
0.13762198388576508,
0.02058500610291958,
0.07516214996576309,
0.09218999743461609,
-0.06163036823272705,
-0.05073785036802292,
0.0024044658057391644,
-0.12828756868839264,
0.05835771933197975,
-0.07261322438716888,
0.14085404574871063,
0.1326269656419754,
-0.09360207617282867,
0.08347759395837784,
0.005119910463690758,
-0.07540164887905121,
-0.16688689589500427,
-0.12865407764911652,
-0.07433730363845825,
-0.16649623215198517,
0.026820193976163864,
-0.1000090166926384,
0.024873774498701096,
0.07180074602365494,
0.033502694219350815,
-0.018221698701381683,
0.14324340224266052,
0.0059367818757891655,
-0.09522421658039093,
0.08058210462331772,
-0.08721636980772018,
-0.0037235445342957973,
-0.10101928561925888,
0.04056936874985695,
0.16876788437366486,
0.0005809171707369387,
0.05729313939809799,
-0.007214548531919718,
-0.05888865143060684,
0.0346733033657074,
-0.07169467210769653,
-0.05974110588431358,
-0.018547888845205307,
-0.0210930947214365,
0.08281257003545761,
0.12199810892343521,
0.12547819316387177,
-0.0645822063088417,
0.009680312126874924,
0.13036109507083893,
-0.05266358330845833,
-0.149215430021286,
-0.17157350480556488,
0.10904955118894577,
0.026124760508537292,
0.009033960290253162,
0.0018545391503721476,
-0.058674462139606476,
-0.0005953318905085325,
0.2228459119796753,
0.2083844542503357,
0.034950803965330124,
0.026455074548721313,
-0.009185059927403927,
-0.010081111453473568,
-0.03852170333266258,
0.06123240664601326,
0.07646413892507553,
0.25907400250434875,
-0.01973293535411358,
-0.011122983880341053,
-0.06356065720319748,
-0.06790035963058472,
0.015354864299297333,
0.06340422481298447,
-0.06702961772680283,
-0.09507665038108826,
0.009134969674050808,
0.12976936995983124,
-0.044479213654994965,
-0.10332491248846054,
-0.08069676160812378,
-0.08490993082523346,
-0.08458908647298813,
-0.02162141725420952,
0.03521665558218956,
0.09546764194965363,
0.014666535891592503,
-0.08589082211256027,
0.021421078592538834,
0.14210116863250732,
-0.01806446723639965,
-0.06991872191429138,
-0.021792452782392502,
0.05152342468500137,
-0.10117059201002121,
0.0071034422144293785,
-0.011659989133477211,
0.16810187697410583,
0.018163951113820076,
0.08719291538000107,
-0.018101204186677933,
0.14325737953186035,
-0.01843932829797268,
-0.08968304842710495,
0.022562842816114426,
0.13111373782157898,
-0.03254786133766174,
0.12188699841499329,
0.022905435413122177,
-0.12374421209096909,
0.05170386657118797,
-0.13959428668022156,
-0.004763004835695028,
-0.08746891468763351,
0.039640624076128006,
-0.04103202000260353,
0.08551262319087982,
0.060599952936172485,
-0.06637831777334213,
-0.05136902630329132,
-0.0438297800719738,
0.07643010467290878,
0.010734086856245995,
-0.044682037085294724,
-0.038305122405290604,
-0.22456477582454681,
-0.008776376955211163,
-0.05950659513473511,
-0.02791055664420128,
-0.2271912544965744,
-0.023782208561897278,
0.003324788762256503,
-0.08817431330680847,
0.0008596532861702144,
0.020264118909835815,
0.10262294113636017,
-0.0023797075264155865,
-0.024173088371753693,
-0.01575026661157608,
0.0478181391954422,
0.1183948963880539,
-0.1612883359193802,
-0.10646908730268478
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Romansh Sursilv
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Romansh Sursilv using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "rm-sursilv", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-rm-sursilv")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-rm-sursilv")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Romansh Sursilv test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "rm-sursilv", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-rm-sursilv")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-rm-sursilv")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“\%\”\„\–\…\«\»]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 25.78 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
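Below is a minimal sketch of how these two splits can be loaded and combined with the `datasets` library. This is an illustration only, not the original training script; the exact fine-tuning recipe and hyperparameters are not documented in this card.
```python
from datasets import load_dataset, concatenate_datasets

# Hypothetical sketch: assemble the training data from the Common Voice
# Romansh Sursilv train and validation splits, as described above.
train_split = load_dataset("common_voice", "rm-sursilv", split="train")
valid_split = load_dataset("common_voice", "rm-sursilv", split="validation")
training_data = concatenate_datasets([train_split, valid_split])
print(len(training_data))
```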
|
{"language": "rm-sursilv", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Anurag Singh XLSR Wav2Vec2 Large 53 Romansh Sursilv", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice rm-sursilv", "type": "common_voice", "args": "rm-sursilv"}, "metrics": [{"type": "wer", "value": 25.78, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-large-xlsr-53-rm-sursilv
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"rm-sursilv"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Romansh Sursilv
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Romansh Sursilv using the Common Voice.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Romansh Sursilv test data of Common Voice.
Test Result: 25.78 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Romansh Sursilv\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Romansh Sursilv using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Romansh Sursilv test data of Common Voice.\n\nTest Result: 25.78 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Romansh Sursilv\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Romansh Sursilv using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Romansh Sursilv test data of Common Voice.\n\nTest Result: 25.78 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
78,
69,
20,
31,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Romansh Sursilv\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Romansh Sursilv using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\nThe model can be used directly (without a language model) as follows:## Evaluation\nThe model can be evaluated as follows on the Romansh Sursilv test data of Common Voice.\n\nTest Result: 25.78 %## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.16233162581920624,
0.0732659175992012,
-0.0024941933806985617,
-0.008846630342304707,
0.08197333663702011,
-0.024080824106931686,
0.1904284805059433,
0.10861900448799133,
-0.03494419530034065,
-0.007029484026134014,
0.04517785459756851,
-0.05403323099017143,
0.06866621226072311,
0.08098418265581131,
0.004024519119411707,
-0.17879998683929443,
0.010812557302415371,
-0.01861214078962803,
0.06821367144584656,
0.11099717020988464,
0.08791235834360123,
-0.08990182727575302,
0.016335517168045044,
0.05915601924061775,
-0.10752873122692108,
0.046839434653520584,
0.038702305406332016,
-0.07615267485380173,
0.15060687065124512,
0.06234367936849594,
0.0499829426407814,
0.05543223023414612,
0.11418788135051727,
-0.20554272830486298,
0.02486386150121689,
0.04998897761106491,
0.020850008353590965,
0.014904454350471497,
0.05271260067820549,
-0.006493026856333017,
0.08280523121356964,
0.06722404807806015,
-0.03857513517141342,
0.0505828894674778,
-0.06401985138654709,
-0.19745123386383057,
-0.04091372713446617,
-0.0050915624015033245,
0.033067550510168076,
0.12907041609287262,
-0.06343024969100952,
0.12595221400260925,
-0.12821555137634277,
0.08918267488479614,
0.11549247056245804,
-0.18993718922138214,
-0.019021296873688698,
0.06700734794139862,
0.03370446711778641,
0.0812641829252243,
-0.04948556050658226,
0.05941788852214813,
0.03483416512608528,
0.022088665515184402,
-0.017690030857920647,
-0.01308486983180046,
-0.19439980387687683,
0.004097333177924156,
-0.1313030868768692,
-0.056229837238788605,
0.2521927058696747,
-0.02770553156733513,
-0.07434898614883423,
-0.11111655086278915,
-0.029094120487570763,
0.01980915665626526,
-0.0016689180629327893,
-0.056433144956827164,
-0.0038677097763866186,
0.04503105953335762,
-0.024810481816530228,
-0.0292114969342947,
-0.11161690205335617,
-0.15938757359981537,
0.008032289333641529,
0.06603096425533295,
0.04352083057165146,
0.026472603902220726,
-0.10480672121047974,
0.09681975096464157,
-0.1414182186126709,
-0.0708777979016304,
0.0030424119904637337,
0.021610070019960403,
-0.09040650725364685,
0.017111938446760178,
-0.08849795162677765,
-0.14373479783535004,
0.04772809147834778,
0.035431452095508575,
0.07644857466220856,
0.013455378822982311,
-0.04973902180790901,
0.04724033549427986,
0.006893839221447706,
0.10750225931406021,
-0.07399572432041168,
-0.01857885904610157,
0.024398736655712128,
0.002642863430082798,
-0.01930226758122444,
0.0015020330902189016,
-0.05616595596075058,
-0.06640937179327011,
0.024628419429063797,
0.07684945315122604,
-0.02194344624876976,
0.004857283551245928,
-0.033781569451093674,
-0.021151699125766754,
0.0020570335909724236,
-0.10107412934303284,
-0.037460118532180786,
0.05425112694501877,
-0.008235868997871876,
0.11572176963090897,
0.04200997203588486,
0.045367706567049026,
-0.07956992834806442,
0.0029506846331059933,
0.027405107393860817,
0.034423064440488815,
-0.015830853953957558,
-0.10880999267101288,
0.020438967272639275,
-0.05684702843427658,
-0.03402017056941986,
-0.08833430707454681,
-0.1639503836631775,
-0.06641720980405807,
-0.003921213559806347,
0.011186813935637474,
0.00818572472780943,
-0.09006023406982422,
-0.002929317532107234,
-0.02786828950047493,
-0.04217826947569847,
0.06304009258747101,
-0.041566140949726105,
0.07162372022867203,
0.019362743943929672,
0.057522427290678024,
0.02653382532298565,
0.07852431386709213,
-0.09843987971544266,
-0.04545970633625984,
0.09248358756303787,
0.12357553094625473,
-0.03232792019844055,
-0.06874709576368332,
-0.10218052566051483,
-0.06039593368768692,
-0.08888811618089676,
0.050269924104213715,
0.08941453695297241,
0.1307394951581955,
-0.34802165627479553,
-0.0739261731505394,
0.22236648201942444,
-0.1550380140542984,
-0.02946907840669155,
0.22776445746421814,
0.0027043672744184732,
0.08803758770227432,
0.15662306547164917,
0.21628035604953766,
0.0810275673866272,
-0.20345482230186462,
0.03430456668138504,
0.004731607623398304,
0.000009555696124152746,
-0.014348224736750126,
0.08221261203289032,
-0.0490742065012455,
0.028341639786958694,
0.03412901610136032,
-0.08418945968151093,
0.052983105182647705,
-0.05040353536605835,
-0.0819699615240097,
-0.042290203273296356,
-0.08970919996500015,
0.09114129841327667,
0.04676558077335358,
0.008461767807602882,
-0.029629670083522797,
-0.06463088095188141,
-0.018881557509303093,
0.1326369047164917,
-0.1412862092256546,
0.05646015331149101,
-0.10564596951007843,
0.0856579914689064,
-0.06870099902153015,
0.010198364965617657,
-0.13660424947738647,
0.1349964290857315,
0.0066552008502185345,
0.10722500085830688,
0.03711693733930588,
0.16886606812477112,
0.025524690747261047,
0.0023725200444459915,
-0.055369630455970764,
-0.010694884695112705,
0.01284799911081791,
-0.03216307982802391,
-0.037998929619789124,
-0.10760336369276047,
-0.03225897252559662,
-0.04925640672445297,
0.08780184388160706,
-0.1771903932094574,
0.004556757397949696,
0.05227043852210045,
0.014509780332446098,
0.000490362464915961,
-0.019181206822395325,
0.03623506426811218,
0.08462166041135788,
0.001514541101641953,
-0.007125833537429571,
0.03923456370830536,
-0.0025417462456971407,
-0.015018711797893047,
0.1118064820766449,
-0.09853915870189667,
0.0025363387539982796,
0.10602208226919174,
-0.02374209836125374,
-0.005207960028201342,
0.05870853364467621,
-0.002670115791261196,
-0.028775667771697044,
-0.10725262761116028,
-0.023983491584658623,
0.2427765429019928,
0.0011770341079682112,
0.1054120883345604,
-0.0992058739066124,
0.014506557956337929,
0.029144367203116417,
-0.06052158772945404,
0.037881333380937576,
0.07623129338026047,
0.015317988581955433,
0.04482250288128853,
0.028997426852583885,
-0.033237896859645844,
-0.11510036140680313,
0.20731490850448608,
-0.03884895518422127,
-0.09848091006278992,
0.020237082615494728,
-0.016258204355835915,
-0.011174230836331844,
0.06506090611219406,
-0.18887890875339508,
-0.04139237478375435,
0.02826138399541378,
0.050858333706855774,
0.06103498488664627,
-0.15838700532913208,
0.018384519964456558,
0.00689929723739624,
-0.14183121919631958,
-0.16691258549690247,
0.06413877010345459,
-0.06798932701349258,
0.05683625862002373,
-0.11964710056781769,
-0.013001305982470512,
-0.010960151441395283,
-0.04902193695306778,
-0.17560669779777527,
0.11224415153265,
-0.06697884947061539,
-0.16141757369041443,
-0.13015541434288025,
0.004362260922789574,
0.0064621055498719215,
0.03337665647268295,
0.0908317044377327,
-0.109112448990345,
-0.010603949427604675,
-0.035862602293491364,
0.07319588959217072,
-0.018253426998853683,
-0.003078829264268279,
-0.060122162103652954,
0.019380338490009308,
0.0572674535214901,
-0.13285435736179352,
0.014905762858688831,
-0.06166182830929756,
-0.06031451374292374,
0.004994252696633339,
-0.02351277507841587,
0.03575177118182182,
0.14545173943042755,
0.016370199620723724,
0.025551078841090202,
-0.02369057387113571,
0.18274334073066711,
-0.07199519127607346,
-0.03809889405965805,
0.23715640604496002,
-0.015915267169475555,
-0.022829655557870865,
0.09202977269887924,
0.02560393139719963,
-0.06448273360729218,
-0.013119188137352467,
-0.021265102550387383,
-0.0904085785150528,
-0.2527765929698944,
-0.12842005491256714,
-0.07287194579839706,
-0.07231145352125168,
-0.004321928136050701,
0.00004938506390317343,
0.018769649788737297,
0.01536228321492672,
-0.041247498244047165,
-0.048914097249507904,
0.058039676398038864,
-0.01214800588786602,
0.14965330064296722,
-0.000323869549902156,
0.10050028562545776,
-0.049845244735479355,
-0.026949889957904816,
0.034754086285829544,
0.02365567535161972,
0.1846621036529541,
0.04400986433029175,
0.10693715512752533,
0.09128013998270035,
0.12636864185333252,
0.09755323082208633,
0.0647812932729721,
-0.039804838597774506,
-0.02494526281952858,
0.015216647647321224,
-0.07441017031669617,
-0.04841945320367813,
0.04263071343302727,
0.12774419784545898,
-0.046281617134809494,
-0.027747858315706253,
-0.019437024369835854,
0.028410928323864937,
0.21484249830245972,
0.09490801393985748,
-0.21803325414657593,
-0.07163093239068985,
-0.02529098652303219,
-0.04497876390814781,
0.02729800157248974,
0.03202657401561737,
0.18134374916553497,
-0.12479668110609055,
0.025548774749040604,
-0.002541664056479931,
0.09108899533748627,
-0.033034343272447586,
0.032239705324172974,
-0.07219204306602478,
0.03391573205590248,
-0.004319585394114256,
0.10556267201900482,
-0.2839242219924927,
0.2274220734834671,
0.014951227232813835,
0.1313999891281128,
-0.06308892369270325,
-0.0017950866604223847,
0.004653025418519974,
0.07898484915494919,
0.11499053239822388,
-0.0001540890516480431,
0.028181837871670723,
-0.06214551255106926,
-0.06944695115089417,
0.06549456715583801,
0.016296857967972755,
0.03938474506139755,
0.04025137796998024,
-0.006292571313679218,
0.011434195563197136,
0.021976623684167862,
0.0339982807636261,
-0.1682850420475006,
-0.06361452490091324,
0.016935210675001144,
0.17510637640953064,
0.09254147857427597,
-0.01800535060465336,
-0.10328362882137299,
-0.12233088165521622,
0.07393482327461243,
-0.08731697499752045,
-0.04492931067943573,
-0.06304509192705154,
-0.0034957490861415863,
0.10683261603116989,
-0.05221521481871605,
-0.021782930940389633,
0.08855190873146057,
0.1102994903922081,
-0.04963267222046852,
-0.030674124136567116,
0.0432610847055912,
-0.11860545724630356,
-0.1444156914949417,
-0.03950253874063492,
0.1874355524778366,
0.10104547441005707,
0.07571575045585632,
0.05280742049217224,
0.015460503287613392,
-0.016117651015520096,
-0.03922605514526367,
0.02336074784398079,
0.12742741405963898,
-0.12001632899045944,
-0.0021541526075452566,
0.015542812645435333,
-0.15159989893436432,
-0.09596690535545349,
-0.06388247013092041,
0.16539350152015686,
0.08689597249031067,
-0.06522218137979507,
0.1845201849937439,
0.19483064115047455,
-0.10564533621072769,
-0.22973787784576416,
-0.023602889850735664,
0.08952425420284271,
0.10565050691366196,
0.0009668687707744539,
-0.18702135980129242,
0.051206763833761215,
0.0025034481659531593,
-0.023828299716114998,
-0.08324310183525085,
-0.2874293029308319,
-0.1449604481458664,
0.10848113894462585,
-0.04496043175458908,
0.1301429271697998,
-0.003499915823340416,
-0.030637264251708984,
-0.04277874529361725,
-0.013643876649439335,
-0.004056902602314949,
-0.11064892262220383,
0.08832698315382004,
0.020915929228067398,
0.058824677020311356,
0.034152500331401825,
-0.025346893817186356,
0.09816086292266846,
0.0907876268029213,
-0.002974393777549267,
0.009342560544610023,
0.07665300369262695,
0.09613250195980072,
0.02994951792061329,
0.12461257725954056,
-0.08663266897201538,
0.04398428648710251,
-0.08023710548877716,
-0.08782794326543808,
-0.061130937188863754,
0.04573258012533188,
0.024157458916306496,
-0.02737506665289402,
0.038788292557001114,
-0.04848563298583031,
0.01176432240754366,
0.013216841965913773,
-0.05911029502749443,
-0.10884993523359299,
0.04084927216172218,
0.12833517789840698,
0.18726693093776703,
-0.03412856534123421,
-0.07198740541934967,
-0.017008034512400627,
-0.012661239132285118,
0.09993700683116913,
-0.07509994506835938,
0.04492460936307907,
0.07413918524980545,
0.05820141360163689,
0.11911934614181519,
0.016842342913150787,
-0.10353946685791016,
0.07884630560874939,
0.016590682789683342,
-0.08425398170948029,
-0.13096089661121368,
-0.025557667016983032,
-0.03611036017537117,
-0.05674399435520172,
0.05753700062632561,
0.1202324703335762,
-0.0923689603805542,
-0.006202611606568098,
-0.03223186731338501,
0.023776469752192497,
-0.10991314798593521,
0.26765912771224976,
0.015504051931202412,
0.061558663845062256,
-0.12507471442222595,
0.043171077966690063,
-0.007365616038441658,
-0.02674134075641632,
0.025243140757083893,
-0.04830658808350563,
-0.10005564242601395,
-0.06486223638057709,
-0.03845994547009468,
0.05025915801525116,
0.07956390827894211,
-0.15251798927783966,
-0.05269405245780945,
-0.08223777264356613,
0.017839621752500534,
0.024720001965761185,
0.0661718100309372,
0.018424153327941895,
-0.13086049258708954,
-0.08271891623735428,
-0.09024673700332642,
0.06976710259914398,
0.10119599103927612,
-0.01763940043747425,
-0.08862939476966858,
0.1651044487953186,
0.05898124352097511,
0.014863832853734493,
-0.04498928412795067,
-0.075935497879982,
-0.011475641280412674,
0.1071535125374794,
-0.08778101950883865,
0.002912819618359208,
-0.04298548400402069,
0.00715399906039238,
-0.0014786932151764631,
-0.0775298997759819,
-0.02022605761885643,
0.08216717094182968,
-0.09495950490236282,
0.05917688459157944,
-0.016253605484962463,
0.07080402225255966,
-0.10145196318626404,
0.03522225841879845,
0.019185932353138924,
-0.050654709339141846,
0.09207605570554733,
0.11613650619983673,
-0.10221002250909805,
0.12020626664161682,
-0.2474808543920517,
-0.047450944781303406,
0.044085532426834106,
0.07688497006893158,
-0.04962276294827461,
-0.07663092762231827,
0.03936097025871277,
0.09877750277519226,
0.02399720810353756,
0.012508577667176723,
0.0921960175037384,
-0.04163793846964836,
0.03533271700143814,
-0.04182913899421692,
-0.0070520322769880295,
-0.012715570628643036,
0.04774787276983261,
0.04697558656334877,
0.15934988856315613,
0.14522546529769897,
-0.09501700103282928,
0.08568450808525085,
-0.1377454698085785,
0.010211307555437088,
-0.04459276795387268,
-0.023019568994641304,
-0.15568232536315918,
-0.0736159160733223,
0.08858252316713333,
-0.02855784073472023,
0.1363580971956253,
0.03871073201298714,
0.07879946380853653,
-0.035357240587472916,
-0.02722889743745327,
0.006293743848800659,
-0.015008024871349335,
0.2455357164144516,
0.048464395105838776,
0.04571229964494705,
-0.05598035454750061,
-0.014183493331074715,
0.0035113124176859856,
0.12830086052417755,
0.0223325714468956,
0.1519538313150406,
0.007363669108599424,
0.07354702055454254,
0.079420305788517,
-0.05950894579291344,
-0.050006624311208725,
-0.006633509881794453,
-0.1366700828075409,
0.05814849212765694,
-0.07562027126550674,
0.18011388182640076,
0.10915547609329224,
-0.09155580401420593,
0.054674796760082245,
0.0030296521726995707,
-0.07633843272924423,
-0.15210534632205963,
-0.17576918005943298,
-0.060745131224393845,
-0.1660785973072052,
0.02606338821351528,
-0.09446311742067337,
0.042694900184869766,
0.09158551692962646,
0.03541632369160652,
-0.03687480837106705,
0.11292784661054611,
0.019798722118139267,
-0.10236922651529312,
0.062223874032497406,
-0.08776923269033432,
-0.013253968209028244,
-0.06738904118537903,
0.02196495421230793,
0.16034634411334991,
-0.03006349503993988,
0.06461938470602036,
0.008835862390697002,
-0.01999032124876976,
0.003070027567446232,
-0.07519208639860153,
-0.05554461106657982,
-0.015645155683159828,
-0.023179689422249794,
0.0667964443564415,
0.11819449812173843,
0.11611849069595337,
-0.06031123548746109,
0.01162513718008995,
0.12078768759965897,
-0.03499223291873932,
-0.1454775482416153,
-0.16356606781482697,
0.13205696642398834,
0.027621902525424957,
0.03538825362920761,
0.0031388294883072376,
-0.05193565785884857,
0.023530729115009308,
0.2504448890686035,
0.2332506626844406,
0.057385433465242386,
0.03163697198033333,
-0.009233788587152958,
-0.006162415258586407,
-0.019935710355639458,
0.05409958213567734,
0.055950094014406204,
0.18026281893253326,
-0.03321587294340134,
0.03313658386468887,
-0.07891387492418289,
-0.07096734642982483,
0.0013621975667774677,
0.06192568689584732,
-0.0843508169054985,
-0.10736798495054245,
-0.003483691718429327,
0.14440782368183136,
-0.01572318933904171,
-0.12523676455020905,
-0.08218981325626373,
-0.10868428647518158,
-0.08152849972248077,
-0.009390073828399181,
0.014019967056810856,
0.10676687955856323,
0.011370882391929626,
-0.09520769864320755,
0.03235989436507225,
0.15580743551254272,
-0.005436637904495001,
-0.08190342783927917,
-0.08574935048818588,
0.06931280344724655,
-0.08982119709253311,
-0.002428047824651003,
-0.029741035774350166,
0.18922878801822662,
0.019839970394968987,
0.10298079252243042,
-0.012285859324038029,
0.14303134381771088,
-0.013134198263287544,
-0.07658567279577255,
0.009430314414203167,
0.11299344152212143,
-0.04266898334026337,
0.12232135981321335,
0.03329777345061302,
-0.08666134625673294,
0.061377204954624176,
-0.08816619217395782,
-0.004745778627693653,
-0.08405160158872604,
0.06408191472291946,
-0.04220094159245491,
0.08716125786304474,
0.06679921597242355,
-0.06370070576667786,
-0.029702257364988327,
-0.056877583265304565,
0.06280972063541412,
0.015931131318211555,
-0.014893004670739174,
-0.02949291653931141,
-0.26032713055610657,
-0.004169720225036144,
-0.06996166706085205,
-0.008245892822742462,
-0.16240714490413666,
-0.03361903503537178,
-0.012307940982282162,
-0.07015429437160492,
0.017977198585867882,
0.049830444157123566,
0.10965617746114731,
0.002002844586968422,
-0.0045959679409861565,
-0.055457014590501785,
0.060738276690244675,
0.10775165259838104,
-0.14338204264640808,
-0.09205823391675949
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Romansh Vallader
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Romansh Vallader using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "rm-vallader", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-rm-vallader")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-rm-vallader")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Romansh Vallader test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "rm-vallader", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-rm-vallader")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-rm-vallader")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“\%\”\„\–\…\«\»]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["sentence"] = re.sub('’ ',' ',batch["sentence"])
batch["sentence"] = re.sub(' ‘',' ',batch["sentence"])
batch["sentence"] = re.sub('’|‘','\'',batch["sentence"])
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 32.89 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
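Below is a minimal sketch of how these two splits can be loaded and combined with the `datasets` library. This is an illustration only, not the original training script; the exact fine-tuning recipe and hyperparameters are not documented in this card.
```python
from datasets import load_dataset, concatenate_datasets

# Hypothetical sketch: assemble the training data from the Common Voice
# Romansh Vallader train and validation splits, as described above.
train_split = load_dataset("common_voice", "rm-vallader", split="train")
valid_split = load_dataset("common_voice", "rm-vallader", split="validation")
training_data = concatenate_datasets([train_split, valid_split])
print(len(training_data))
```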
|
{"language": "rm-vallader", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Anurag Singh XLSR Wav2Vec2 Large 53 Romansh Vallader", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice rm-vallader", "type": "common_voice", "args": "rm-vallader"}, "metrics": [{"type": "wer", "value": 32.89, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-large-xlsr-53-rm-vallader
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"rm-vallader"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Romansh Vallader
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Romansh Vallader using the Common Voice.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Romansh Vallader test data of Common Voice.
Test Result: 32.89 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Romansh Vallader\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Romansh Vallader using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Romansh Vallader test data of Common Voice.\n\nTest Result: 32.89 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Romansh Vallader\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Romansh Vallader using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Romansh Vallader test data of Common Voice.\n\nTest Result: 32.89 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
78,
67,
20,
30,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Romansh Vallader\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Romansh Vallader using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\nThe model can be used directly (without a language model) as follows:## Evaluation\nThe model can be evaluated as follows on the Romansh Vallader test data of Common Voice.\n\nTest Result: 32.89 %## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.1597229689359665,
0.041027866303920746,
-0.002078109420835972,
0.026504915207624435,
0.09443273395299911,
-0.02012704499065876,
0.21224118769168854,
0.10002502799034119,
-0.03504389896988869,
-0.006141535937786102,
0.05592513456940651,
-0.018639639019966125,
0.056049615144729614,
0.06776710599660873,
0.016184361651539803,
-0.2254311442375183,
-0.004623016808182001,
-0.0005525043816305697,
0.09567948430776596,
0.09197176247835159,
0.09486863017082214,
-0.08750393986701965,
0.013206284493207932,
0.06454455852508545,
-0.09784731268882751,
0.06056563928723335,
0.03450964018702507,
-0.08139505982398987,
0.1677100658416748,
0.07210585474967957,
0.04636119306087494,
0.030355310067534447,
0.11906203627586365,
-0.19237026572227478,
0.026164786890149117,
0.044302888214588165,
0.020114358514547348,
0.00648669246584177,
0.02581949718296528,
-0.00912392046302557,
0.10872261971235275,
0.025565318763256073,
-0.014748819172382355,
0.051250480115413666,
-0.06239558756351471,
-0.2366812825202942,
-0.051603175699710846,
-0.02433638460934162,
0.04724383354187012,
0.12877075374126434,
-0.0526924729347229,
0.1580769419670105,
-0.10293712466955185,
0.10164699703454971,
0.12549921870231628,
-0.24115145206451416,
-0.03987559676170349,
0.058055367320775986,
0.04933455213904381,
0.1316109448671341,
-0.053887609392404556,
0.0741061195731163,
0.03331384062767029,
0.04969893395900726,
0.013769273646175861,
-0.009334033355116844,
-0.19782805442810059,
0.00018017822003457695,
-0.13595739006996155,
-0.03660361468791962,
0.2541656792163849,
-0.0411611944437027,
-0.06526403874158859,
-0.12763579189777374,
-0.02548445574939251,
-0.006012238096445799,
0.011287221685051918,
-0.06223643943667412,
-0.01218237541615963,
0.03495902195572853,
0.02239879034459591,
-0.02878890000283718,
-0.11713230609893799,
-0.14212700724601746,
-0.0005833602626807988,
0.06330538541078568,
0.04087277874350548,
0.019472189247608185,
-0.13594605028629303,
0.1003635823726654,
-0.071219302713871,
-0.08283621817827225,
0.028978582471609116,
0.038002341985702515,
-0.06757181137800217,
0.014940236695110798,
-0.08816990256309509,
-0.2095952033996582,
0.031250208616256714,
-0.0002832107711583376,
0.06826119124889374,
0.010020842775702477,
-0.045874085277318954,
0.04991251975297928,
0.0037910109385848045,
0.13643617928028107,
-0.06220163032412529,
-0.036714740097522736,
0.04937359318137169,
0.002196617191657424,
-0.0022280002012848854,
-0.008910810574889183,
-0.06876497715711594,
-0.07160038501024246,
-0.00475301593542099,
0.0781247615814209,
-0.030325716361403465,
-0.01180450338870287,
-0.018298985436558723,
-0.008608993142843246,
-0.03513364866375923,
-0.10478795319795609,
-0.02492714487016201,
0.057468995451927185,
0.029390720650553703,
0.16904297471046448,
0.053116071969270706,
0.025166558101773262,
-0.07937683910131454,
0.004921148996800184,
0.030258607119321823,
0.019724799320101738,
-0.019475242123007774,
-0.11776784062385559,
0.03079134412109852,
-0.06327752768993378,
-0.023380275815725327,
-0.08745507150888443,
-0.09312250465154648,
-0.06466346979141235,
0.006748249754309654,
0.015876274555921555,
0.00218522222712636,
-0.09663470834493637,
0.009669341146945953,
-0.009386365301907063,
-0.02811184525489807,
0.068865567445755,
-0.03627485781908035,
0.0631626695394516,
0.01708337664604187,
0.07024288177490234,
-0.011909602209925652,
0.07450360804796219,
-0.06617236882448196,
-0.04214879870414734,
0.1043202131986618,
0.11043405532836914,
-0.019431672990322113,
-0.05500614643096924,
-0.0845220535993576,
-0.05795320123434067,
-0.046355366706848145,
0.05504104867577553,
0.09502455592155457,
0.10471079498529434,
-0.3497263789176941,
-0.0640832781791687,
0.24481265246868134,
-0.1514572948217392,
-0.039704661816358566,
0.2189861387014389,
-0.007675104308873415,
0.09946710616350174,
0.17361992597579956,
0.1571081131696701,
0.11744408309459686,
-0.2209300547838211,
0.06256360560655594,
0.01368371769785881,
0.006653227377682924,
-0.04748757556080818,
0.07056764513254166,
-0.03325430676341057,
0.015065383166074753,
0.0464470349252224,
-0.08299820870161057,
0.08275940269231796,
-0.0443912111222744,
-0.07329805195331573,
-0.04484889656305313,
-0.08067259192466736,
0.046646974980831146,
0.03038395568728447,
0.026918454095721245,
-0.025983484461903572,
-0.07159102708101273,
-0.011875471100211143,
0.10624666512012482,
-0.14418163895606995,
0.05266871303319931,
-0.08816682547330856,
0.09762926399707794,
-0.05993009731173515,
0.0029527698643505573,
-0.12365496158599854,
0.12274891883134842,
-0.0009174890001304448,
0.0784047394990921,
0.024586830288171768,
0.19483616948127747,
0.025861959904432297,
0.016132252290844917,
-0.04699737951159477,
0.011167680844664574,
0.013305963948369026,
-0.020219922065734863,
-0.05154387280344963,
-0.12006369978189468,
-0.025949733331799507,
-0.06703882664442062,
0.035831090062856674,
-0.1411781758069992,
0.0000736375295673497,
0.05881959944963455,
0.017574472352862358,
-0.005340976640582085,
-0.007301079574972391,
0.02193373069167137,
0.10090681910514832,
0.011447129771113396,
-0.012302054092288017,
0.04222499579191208,
0.002054469892755151,
-0.01555085089057684,
0.09882910549640656,
-0.13474462926387787,
-0.010005357675254345,
0.10605943202972412,
-0.01023699901998043,
0.010176330804824829,
0.048415809869766235,
0.0015242801746353507,
-0.013534448109567165,
-0.10361290723085403,
-0.02534344233572483,
0.2668340504169464,
0.014243172481656075,
0.10733955353498459,
-0.08919396251440048,
0.016602151095867157,
0.04287540167570114,
-0.04522227123379707,
0.03287027031183243,
0.07351403683423996,
0.03687264025211334,
0.039774395525455475,
0.009644721634685993,
-0.03350302204489708,
-0.11500518769025803,
0.21320199966430664,
-0.037516821175813675,
-0.09532550722360611,
0.004882803186774254,
-0.021135009825229645,
-0.007255587261170149,
0.08147847652435303,
-0.18914823234081268,
-0.05179416388273239,
0.015965860337018967,
0.040004268288612366,
0.05136517062783241,
-0.15998908877372742,
0.010266401804983616,
0.015962012112140656,
-0.10871328413486481,
-0.16290447115898132,
0.08444322645664215,
-0.06664051115512848,
0.05374722555279732,
-0.1112309917807579,
-0.01675577089190483,
-0.011724106036126614,
-0.03999941051006317,
-0.16484127938747406,
0.11240651458501816,
-0.06501410901546478,
-0.1579170674085617,
-0.14448752999305725,
-0.0019390980014577508,
0.01712796464562416,
0.011403379030525684,
0.0991227999329567,
-0.11303607374429703,
-0.016666414216160774,
-0.02710578963160515,
0.10020943731069565,
-0.007588651962578297,
0.005858820863068104,
-0.06107172742486,
-0.00024115131236612797,
0.057383373379707336,
-0.1528312861919403,
0.003079693065956235,
-0.058816809207201004,
-0.03298955038189888,
-0.011973355896770954,
-0.0038904889952391386,
0.026458222419023514,
0.12145876884460449,
0.019587602466344833,
0.045071907341480255,
-0.022984623908996582,
0.2001536786556244,
-0.0579197071492672,
-0.026323135942220688,
0.22357235848903656,
-0.020094159990549088,
-0.01953069493174553,
0.07351605594158173,
0.038837797939777374,
-0.07771770656108856,
-0.03107835352420807,
-0.008691402152180672,
-0.10000015795230865,
-0.2546522617340088,
-0.11445935815572739,
-0.08406972140073776,
-0.10388742387294769,
-0.0030828500166535378,
0.006348040886223316,
-0.02509070374071598,
0.017510684207081795,
-0.018618935719132423,
-0.04033578932285309,
0.01739712990820408,
-0.018176477402448654,
0.19552654027938843,
-0.015571324154734612,
0.07842735946178436,
-0.061851345002651215,
-0.02922782301902771,
0.02873941883444786,
0.028518863022327423,
0.1653979867696762,
0.059188082814216614,
0.08745511621236801,
0.09247329831123352,
0.11439984291791916,
0.08066103607416153,
0.09062749147415161,
-0.03606933727860451,
-0.028195271268486977,
0.016031047329306602,
-0.07344164699316025,
-0.029075972735881805,
0.03536007180809975,
0.0978134274482727,
-0.028591327369213104,
-0.045356351882219315,
0.022922858595848083,
0.009394722059369087,
0.2510027587413788,
0.06006341427564621,
-0.22678592801094055,
-0.05605510249733925,
-0.024829216301441193,
-0.03582226112484932,
0.005921808537095785,
0.04509685933589935,
0.17326857149600983,
-0.10793697088956833,
0.02249085158109665,
-0.026012688875198364,
0.08737217634916306,
-0.05323558300733566,
0.032259196043014526,
-0.059417515993118286,
0.05443516746163368,
0.0083384457975626,
0.09062645584344864,
-0.26145288348197937,
0.2372371107339859,
0.010837906040251255,
0.14574293792247772,
-0.062348898500204086,
0.010628811083734035,
0.005782528314739466,
0.06219208985567093,
0.11601261794567108,
0.0010782888857647777,
0.04296031966805458,
-0.048706237226724625,
-0.06781100481748581,
0.07430601865053177,
-0.00975126400589943,
0.04290641099214554,
0.04764522239565849,
-0.004582896828651428,
0.009399505332112312,
0.022153854370117188,
0.010182753205299377,
-0.14250104129314423,
-0.09253432601690292,
0.008871727623045444,
0.19977281987667084,
0.06008453667163849,
-0.01779865473508835,
-0.12332721054553986,
-0.1423443704843521,
0.02072300761938095,
-0.04095662385225296,
-0.044274382293224335,
-0.06582648307085037,
0.013087067753076553,
0.10791993886232376,
-0.061065834015607834,
-0.021373873576521873,
0.07349652796983719,
0.10719704627990723,
-0.08199074864387512,
-0.020545458421111107,
0.04066527634859085,
-0.10562939941883087,
-0.11148466914892197,
-0.031746767461299896,
0.19532491266727448,
0.11069800704717636,
0.053180448710918427,
0.06428317725658417,
0.0019083552761003375,
-0.009725832380354404,
-0.05211344733834267,
0.026900405064225197,
0.12395709753036499,
-0.16310961544513702,
-0.07335295528173447,
-0.0031067209783941507,
-0.11288250237703323,
-0.09222926944494247,
-0.04806463420391083,
0.202274352312088,
0.07389627397060394,
-0.08610764890909195,
0.17162272334098816,
0.15266506373882294,
-0.12164291739463806,
-0.17504486441612244,
-0.02785695157945156,
0.07872722297906876,
0.12907786667346954,
0.023271329700946808,
-0.19086839258670807,
0.06260447949171066,
-0.02911398373544216,
-0.01489508431404829,
-0.06514965742826462,
-0.2915838956832886,
-0.13180501759052277,
0.12645700573921204,
-0.013364781625568867,
0.16121035814285278,
0.006184480618685484,
-0.01702073961496353,
-0.02135274186730385,
-0.013081859797239304,
-0.015461353585124016,
-0.12191235274076462,
0.09410431981086731,
0.009184733033180237,
0.06284637004137039,
0.03246380761265755,
-0.027995090931653976,
0.0932832583785057,
0.06347332149744034,
-0.0022567156702280045,
0.00997005496174097,
0.05562443286180496,
0.08689918369054794,
0.023331208154559135,
0.12282086163759232,
-0.13117937743663788,
0.027418091893196106,
-0.09211232513189316,
-0.09846454858779907,
-0.06325449049472809,
0.04077368974685669,
0.007497341837733984,
-0.027835585176944733,
0.05208855867385864,
-0.06507150083780289,
0.030960572883486748,
-0.003010178217664361,
-0.07559050619602203,
-0.09898141771554947,
0.04296388104557991,
0.07613913714885712,
0.17343932390213013,
-0.04918282851576805,
-0.09491249173879623,
-0.015370262786746025,
-0.012874851934611797,
0.11566881090402603,
-0.07884038239717484,
0.03165370225906372,
0.08262106031179428,
0.041177667677402496,
0.10257069766521454,
0.019066767767071724,
-0.09777845442295074,
0.08444082736968994,
0.022005537524819374,
-0.07076933234930038,
-0.15094506740570068,
-0.02898324280977249,
-0.037701062858104706,
-0.03455031290650368,
0.04120982810854912,
0.13776937127113342,
-0.08562412858009338,
-0.015160501934587955,
-0.03671004995703697,
0.005385715514421463,
-0.11619432270526886,
0.216708242893219,
0.0049008559435606,
0.053175728768110275,
-0.1116061732172966,
0.017234129831194878,
-0.00971804279834032,
-0.05123262479901314,
0.036615628749132156,
-0.014948111958801746,
-0.08642236143350601,
-0.055827438831329346,
0.001077128341421485,
0.06197468936443329,
0.05707940086722374,
-0.13946828246116638,
-0.0718151107430458,
-0.11914891004562378,
0.005016457289457321,
0.058058418333530426,
0.06212829425930977,
0.0184677354991436,
-0.17253834009170532,
-0.08760425448417664,
-0.08719287812709808,
0.06622251123189926,
0.08438102155923843,
-0.030177759006619453,
-0.10091736912727356,
0.17077437043190002,
0.06825028359889984,
-0.011417447589337826,
-0.03471328690648079,
-0.0828590914607048,
0.0023969002068042755,
0.09927309304475784,
-0.11405259370803833,
-0.008529304526746273,
-0.033275049179792404,
0.02171708084642887,
-0.0031155957840383053,
-0.07367543876171112,
-0.01398751512169838,
0.08538196980953217,
-0.08777303993701935,
0.05243535339832306,
-0.010312802158296108,
0.08858033269643784,
-0.08443012088537216,
0.02702605538070202,
0.002449135761708021,
-0.050869882106781006,
0.08854340761899948,
0.09784191846847534,
-0.1187884658575058,
0.11373982578516006,
-0.27446579933166504,
-0.028699224814772606,
0.05044984444975853,
0.07560443133115768,
-0.03259115293622017,
-0.09845064580440521,
0.04956529289484024,
0.0991676077246666,
0.034479450434446335,
0.016374140977859497,
0.06734964996576309,
-0.05885384604334831,
0.02952948585152626,
-0.002507640980184078,
-0.010040026158094406,
-0.0037949206307530403,
0.028582831844687462,
0.06350167095661163,
0.1400657743215561,
0.1302478015422821,
-0.09828721731901169,
0.10328847169876099,
-0.13804511725902557,
0.015636049211025238,
-0.04642580822110176,
-0.009324794635176659,
-0.15704970061779022,
-0.09324086457490921,
0.08516933768987656,
-0.02674267068505287,
0.13257843255996704,
0.04014253988862038,
0.07067006826400757,
-0.03105936013162136,
-0.010302829556167126,
-0.018556516617536545,
-0.022684941068291664,
0.24142558872699738,
0.018585020676255226,
0.05551363155245781,
-0.05111018568277359,
-0.024534815922379494,
0.003556361421942711,
0.1546766459941864,
0.02327463962137699,
0.13066712021827698,
0.02994300238788128,
0.09265708178281784,
0.11160899698734283,
-0.029880890622735023,
-0.07385723292827606,
-0.037442419677972794,
-0.0982937291264534,
0.07530909031629562,
-0.08963002264499664,
0.10866802930831909,
0.10900983959436417,
-0.07875233888626099,
0.04546311870217323,
0.006675833370536566,
-0.07609860599040985,
-0.1583710014820099,
-0.1495189368724823,
-0.05673627182841301,
-0.13360825181007385,
0.021770909428596497,
-0.08480940014123917,
0.027680089697241783,
0.03836173936724663,
0.03802412003278732,
-0.04310302808880806,
0.13498488068580627,
0.05736003816127777,
-0.08989479392766953,
0.07248688489198685,
-0.06706378608942032,
-0.0014834176981821656,
-0.09355705976486206,
0.042268816381692886,
0.12343708425760269,
-0.04343222454190254,
0.027417168021202087,
0.009608401916921139,
-0.022065453231334686,
-0.0001268215710297227,
-0.088821180164814,
-0.05036482587456703,
-0.01852400414645672,
-0.03277624398469925,
0.06525790691375732,
0.15503646433353424,
0.12282822281122208,
-0.06994866579771042,
0.0023800090420991182,
0.07685550302267075,
-0.013452483341097832,
-0.13491027057170868,
-0.16456235945224762,
0.09152856469154358,
0.022849343717098236,
0.02768557146191597,
-0.010797911323606968,
-0.017334192991256714,
0.009798014536499977,
0.23872148990631104,
0.23842845857143402,
0.08231710642576218,
0.023578796535730362,
-0.005897212773561478,
-0.0014181267470121384,
-0.029459821060299873,
0.06090234965085983,
0.06046488508582115,
0.20695938169956207,
-0.042796023190021515,
0.047781992703676224,
-0.11674968153238297,
-0.06177685782313347,
0.021188586950302124,
0.04437418282032013,
-0.07101933658123016,
-0.08806135505437851,
0.008595637045800686,
0.14778748154640198,
-0.004818026442080736,
-0.067156121134758,
-0.07606442272663116,
-0.11605353653430939,
-0.08608774095773697,
-0.017910435795783997,
0.02019813284277916,
0.09653449803590775,
0.014409917406737804,
-0.09120257943868637,
0.019788166508078575,
0.21290339529514313,
-0.017571479082107544,
-0.07928138971328735,
-0.0934314876794815,
0.06882442533969879,
-0.10716794431209564,
0.0016016523586586118,
-0.02887670136988163,
0.17071694135665894,
0.03160437196493149,
0.1142755076289177,
-0.006329026538878679,
0.1314222365617752,
-0.018459104001522064,
-0.06426548957824707,
0.002181063173338771,
0.0628233328461647,
-0.040131669491529465,
0.13919594883918762,
0.006723940838128328,
-0.040432561188936234,
0.07953352481126785,
-0.09608880430459976,
-0.02497945912182331,
-0.06702403724193573,
0.06613342463970184,
-0.04322843626141548,
0.09581203013658524,
0.07561373710632324,
-0.057921089231967926,
-0.05131303891539574,
-0.04041368514299393,
0.0694383829832077,
0.025563891977071762,
-0.051595933735370636,
-0.03409232571721077,
-0.24829480051994324,
-0.007952039130032063,
-0.05864819511771202,
-0.01196450088173151,
-0.1623772829771042,
-0.06235243380069733,
-0.012153824791312218,
-0.07953879237174988,
0.01097953598946333,
0.02543819695711136,
0.10952028632164001,
0.012211043387651443,
-0.009390264749526978,
-0.05444170534610748,
0.0410328283905983,
0.1105840653181076,
-0.16209325194358826,
-0.0914081409573555
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Sakha
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Sakha using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "sah", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-sah")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-sah")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Sakha test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "sah", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-sah")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-sah")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“\%\”\„\–\…\«\»]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 38.04 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
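Below is a minimal sketch of how these two splits can be loaded and combined with the `datasets` library. This is an illustration only, not the original training script; the exact fine-tuning recipe and hyperparameters are not documented in this card.
```python
from datasets import load_dataset, concatenate_datasets

# Hypothetical sketch: assemble the training data from the Common Voice
# Sakha train and validation splits, as described above.
train_split = load_dataset("common_voice", "sah", split="train")
valid_split = load_dataset("common_voice", "sah", split="validation")
training_data = concatenate_datasets([train_split, valid_split])
print(len(training_data))
```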
|
{"language": "sah", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Anurag Singh XLSR Wav2Vec2 Large 53 Sakha", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice sah", "type": "common_voice", "args": "sah"}, "metrics": [{"type": "wer", "value": 38.04, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-large-xlsr-53-sah
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"sah",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"sah"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #sah #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Sakha
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Sakha using the Common Voice.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Sakha test data of Common Voice.
Test Result: 38.04 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Sakha\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Sakha using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Sakha test data of Common Voice.\n\nTest Result: 38.04 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #sah #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Sakha\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Sakha using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Sakha test data of Common Voice.\n\nTest Result: 38.04 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
80,
63,
20,
29,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #sah #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Sakha\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Sakha using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\nThe model can be used directly (without a language model) as follows:## Evaluation\nThe model can be evaluated as follows on the Sakha test data of Common Voice.\n\nTest Result: 38.04 %## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.14663080871105194,
-0.020214613527059555,
-0.0020358769688755274,
-0.03905520215630531,
0.09952420741319656,
-0.047060731798410416,
0.19943295419216156,
0.07287074625492096,
0.01852632500231266,
-0.01608305051922798,
0.02875857800245285,
0.011639580130577087,
0.05685831233859062,
0.12504006922245026,
0.005122622940689325,
-0.21793684363365173,
0.010277412831783295,
-0.011937728151679039,
0.028750477358698845,
0.1183612272143364,
0.1027112752199173,
-0.05729283392429352,
-0.007951803505420685,
0.09636565297842026,
-0.150577113032341,
0.031161120161414146,
0.06862976402044296,
-0.13214321434497833,
0.1495974361896515,
0.08226192742586136,
0.07788042724132538,
0.05348564684391022,
0.09677080065011978,
-0.1647282987833023,
0.03185897320508957,
0.027233485132455826,
0.05428469553589821,
0.02754407562315464,
0.0328032411634922,
-0.009190034121274948,
0.08398916572332382,
0.0917184129357338,
-0.04029543697834015,
0.08213446289300919,
-0.06184672564268112,
-0.17896097898483276,
-0.014582731761038303,
0.012127846479415894,
0.12090418487787247,
0.1301329880952835,
-0.06847841292619705,
0.06992830336093903,
-0.13843339681625366,
0.07741451263427734,
0.07864634692668915,
-0.168783038854599,
-0.006661434657871723,
0.10664305835962296,
0.024091141298413277,
0.09130537509918213,
-0.04916335269808769,
0.05579109489917755,
0.0392175018787384,
0.024337071925401688,
0.04378887265920639,
-0.041374463587999344,
-0.17929084599018097,
-0.004707483574748039,
-0.13212311267852783,
-0.03550628945231438,
0.20659399032592773,
-0.010680644772946835,
-0.07803283631801605,
-0.14176522195339203,
-0.03009069710969925,
0.010960211977362633,
-0.017559312283992767,
-0.05558844283223152,
-0.00283588538877666,
0.02642069198191166,
-0.0061969514936208725,
-0.030637741088867188,
-0.11190897226333618,
-0.17599865794181824,
-0.005618066992610693,
0.09671538323163986,
0.016913220286369324,
0.0269931610673666,
-0.13753418624401093,
0.04805419594049454,
-0.11062915623188019,
-0.07170592993497849,
0.0033013976644724607,
0.0053590997122228146,
-0.09483299404382706,
0.04054299741983414,
-0.09060846269130707,
-0.20843574404716492,
0.02966614067554474,
-0.05925321206450462,
0.030018651857972145,
0.04270332679152489,
-0.02578805200755596,
0.04903440549969673,
0.026521293446421623,
0.10639603435993195,
-0.0626753717660904,
0.009492394514381886,
0.021385084837675095,
0.026031967252492905,
-0.03776784986257553,
-0.032431554049253464,
-0.05490541085600853,
-0.06662868708372116,
-0.02761441469192505,
0.028593268245458603,
-0.05476049333810806,
0.00666737649589777,
-0.026790114119648933,
-0.03220023214817047,
-0.017651870846748352,
-0.10641621053218842,
-0.06563618034124374,
0.058279380202293396,
0.026066739112138748,
0.11205677688121796,
0.05694320425391197,
0.047840338200330734,
-0.0572962760925293,
-0.019045056775212288,
0.018946228548884392,
0.03386986628174782,
-0.029137132689356804,
-0.057192612439394,
-0.012761702761054039,
0.0034696096554398537,
-0.02291817031800747,
-0.10337785631418228,
-0.12761922180652618,
-0.07889090478420258,
-0.000045433440391207114,
0.03972742706537247,
-0.05026284605264664,
-0.06201249733567238,
-0.015136045403778553,
-0.021833321079611778,
-0.07413579523563385,
0.06587140262126923,
-0.03697628155350685,
0.06416240334510803,
0.04679449647665024,
0.052971240133047104,
0.06101761385798454,
0.08575591444969177,
-0.06550299376249313,
-0.02734357677400112,
-0.0569271557033062,
0.1545880138874054,
-0.05809180811047554,
-0.06133606284856796,
-0.08588206022977829,
-0.06760971248149872,
-0.07743192464113235,
0.09538302570581436,
0.048190437257289886,
0.10357503592967987,
-0.2528035342693329,
-0.10416818410158157,
0.21109016239643097,
-0.14283686876296997,
-0.027832411229610443,
0.20560938119888306,
-0.011266825720667839,
0.10270757228136063,
0.1262325793504715,
0.24978189170360565,
0.09553191810846329,
-0.15276192128658295,
0.03985030949115753,
0.020223595201969147,
0.019837884232401848,
-0.0542452447116375,
0.07746455818414688,
-0.04424786940217018,
-0.06099126860499382,
0.041956909000873566,
-0.09241019189357758,
0.10445409268140793,
-0.02219359576702118,
-0.06516171246767044,
-0.018340200185775757,
-0.09796994924545288,
0.07917877286672592,
0.03960398957133293,
0.03840058669447899,
-0.002819466171786189,
-0.05191038176417351,
0.05072804167866707,
0.13732679188251495,
-0.11197223514318466,
0.035089731216430664,
-0.11391100287437439,
0.048575688153505325,
-0.08054827153682709,
-0.009177770465612411,
-0.14728350937366486,
0.18374912440776825,
-0.03672230616211891,
0.008080106228590012,
0.06761494278907776,
0.1742313951253891,
0.010708208195865154,
-0.00032799120526760817,
-0.04299482703208923,
-0.016989488154649734,
0.010789749212563038,
-0.021802131086587906,
-0.023312972858548164,
-0.09288989007472992,
-0.017090389505028725,
-0.052160441875457764,
0.13981029391288757,
-0.14789333939552307,
0.0032732996623963118,
-0.03368925675749779,
-0.00828289333730936,
-0.008017400279641151,
-0.0011728814570233226,
0.09591968357563019,
0.08636430650949478,
0.030639030039310455,
0.030028171837329865,
0.03435041382908821,
0.004903770051896572,
-0.07067622244358063,
0.14728441834449768,
-0.12681372463703156,
-0.01852606050670147,
0.09835665673017502,
-0.039268527179956436,
-0.005710605997592211,
0.07528798282146454,
0.0015141888288781047,
-0.018796004354953766,
-0.10848388820886612,
-0.0005936281522735953,
0.301090270280838,
0.00903368927538395,
0.10912182182073593,
-0.07621415704488754,
0.023313386365771294,
0.026930052787065506,
-0.09551944583654404,
0.07201298326253891,
0.04555952921509743,
0.003139757551252842,
0.03718830645084381,
0.02667294628918171,
-0.0351945199072361,
-0.11455783247947693,
0.22785310447216034,
-0.05421736463904381,
-0.08277735114097595,
0.03113938681781292,
-0.043112192302942276,
-0.04801924526691437,
0.07000041753053665,
-0.13704918324947357,
-0.06177419424057007,
0.038739126175642014,
0.05217824503779411,
0.04977188631892204,
-0.1287684291601181,
0.020022405311465263,
0.01329098455607891,
-0.11743170022964478,
-0.161586195230484,
0.07455726712942123,
-0.051707182079553604,
0.0366520993411541,
-0.09434133768081665,
-0.03230801597237587,
-0.003024461679160595,
-0.042024292051792145,
-0.16383463144302368,
0.13581515848636627,
-0.04547371715307236,
-0.20045305788516998,
-0.13641522824764252,
0.023395491763949394,
0.0005593686364591122,
0.019260572269558907,
0.08739934861660004,
-0.13545645773410797,
-0.005627841223031282,
-0.008159955032169819,
0.09705612063407898,
0.02550049126148224,
-0.035031985491514206,
-0.03805878013372421,
-0.019467126578092575,
0.08232906460762024,
-0.1269800364971161,
0.01179414801299572,
-0.053742196410894394,
-0.022607382386922836,
0.006083282176405191,
-0.004069024231284857,
-0.005519406404346228,
0.18644265830516815,
0.041941363364458084,
0.011451355181634426,
-0.02464466542005539,
0.16226457059383392,
-0.07865706086158752,
-0.02993103861808777,
0.24581418931484222,
-0.011475715786218643,
-0.024406036362051964,
0.10077648609876633,
0.018410760909318924,
-0.055244605988264084,
-0.01230610255151987,
-0.009905822575092316,
-0.10241130739450455,
-0.2505033612251282,
-0.1115759015083313,
-0.07033949345350266,
-0.05014519765973091,
-0.052041806280612946,
-0.006259530317038298,
0.04240356385707855,
0.05003704875707626,
0.0034669346641749144,
-0.06817397475242615,
0.03738967329263687,
-0.02106150984764099,
0.134211465716362,
-0.01213062647730112,
0.11854950338602066,
-0.06063663586974144,
-0.02193518355488777,
0.006807313300669193,
0.011552561074495316,
0.1154288724064827,
0.04355437308549881,
0.07506286352872849,
0.09979785233736038,
0.09487469494342804,
0.13803067803382874,
0.049302566796541214,
-0.04318298026919365,
-0.026569686830043793,
-0.001725200330838561,
-0.04814601317048073,
-0.09907485544681549,
0.05938010290265083,
0.0858900398015976,
-0.03408847376704216,
-0.030589478090405464,
-0.00015043151506688446,
0.009221869520843029,
0.21879786252975464,
0.06725040823221207,
-0.19199977815151215,
-0.10230771452188492,
-0.006453943904489279,
-0.07506053149700165,
0.00011038941011065617,
0.07537993043661118,
0.13673584163188934,
-0.12954208254814148,
0.023692594841122627,
-0.00646292744204402,
0.09190613776445389,
0.01550962682813406,
0.040738169103860855,
-0.10615641623735428,
0.03671450912952423,
-0.00011751458805520087,
0.07708165794610977,
-0.2755826711654663,
0.21089161932468414,
-0.001059772097505629,
0.10475508868694305,
-0.034977950155735016,
0.003884349251165986,
0.03376604616641998,
0.08006937056779861,
0.1059660017490387,
0.022560682147741318,
-0.04321765899658203,
-0.055164635181427,
-0.039836376905441284,
0.07064909487962723,
-0.041161488741636276,
0.02050829865038395,
0.018281592056155205,
-0.0041303448379039764,
0.003768410999327898,
0.005538685712963343,
-0.012358671985566616,
-0.11720315366983414,
-0.03270873427391052,
0.0039443387649953365,
0.16247281432151794,
0.1268092542886734,
-0.03751996532082558,
-0.07310925424098969,
-0.09076443314552307,
0.016966290771961212,
-0.03875388205051422,
-0.06707917153835297,
-0.07120629400014877,
-0.05074935778975487,
0.09040401130914688,
-0.06349306553602219,
0.005208911839872599,
0.09554684907197952,
0.11040738224983215,
-0.02236776053905487,
-0.04625461623072624,
0.041962865740060806,
-0.11386117339134216,
-0.10230956226587296,
0.0010561288800090551,
0.16265550255775452,
0.10091113299131393,
0.08034657686948776,
0.04515986517071724,
0.010169683024287224,
0.0011984173906967044,
-0.036640048027038574,
0.013789328746497631,
0.1504143625497818,
-0.10901595652103424,
0.01663614623248577,
0.04002801328897476,
-0.18100780248641968,
-0.0991879403591156,
-0.057687908411026,
0.15083882212638855,
0.06349298357963562,
-0.057698093354701996,
0.1987062692642212,
0.24459241330623627,
-0.0808449536561966,
-0.20476196706295013,
-0.037595394998788834,
0.10032521188259125,
0.1366541087627411,
0.027137691155076027,
-0.201624795794487,
0.09680783748626709,
-0.026926390826702118,
-0.051732517778873444,
-0.13847100734710693,
-0.25723353028297424,
-0.14094582200050354,
0.17951087653636932,
-0.03413049131631851,
0.1850268542766571,
-0.012734195217490196,
-0.03110230155289173,
-0.029332371428608894,
-0.0158014427870512,
-0.01468019187450409,
-0.11792352050542831,
0.10718641430139542,
0.025807637721300125,
0.13635221123695374,
0.04767641797661781,
-0.022831851616501808,
0.08809611201286316,
0.0968543291091919,
-0.02034536376595497,
0.008546695113182068,
0.0974707305431366,
0.0022400273010134697,
0.0156471598893404,
0.125130295753479,
-0.10009726881980896,
0.052737101912498474,
-0.10748646408319473,
-0.09973954409360886,
-0.0915611982345581,
0.05504899471998215,
0.0475119985640049,
-0.04951361194252968,
0.03475513309240341,
-0.046455807983875275,
0.014965137466788292,
-0.0050763534381985664,
-0.040635980665683746,
-0.15236417949199677,
0.09354255348443985,
0.13711701333522797,
0.19279639422893524,
-0.01935652829706669,
-0.10995146632194519,
-0.03693787381052971,
-0.03405090421438217,
0.12473345547914505,
-0.16276273131370544,
0.036495573818683624,
0.036074768751859665,
0.05653732642531395,
0.11293254792690277,
0.013426239602267742,
-0.09117120504379272,
0.08426293730735779,
0.02351006120443344,
-0.0447111576795578,
-0.1283455640077591,
-0.022383445873856544,
0.006283354014158249,
-0.0365942157804966,
0.03259596601128578,
0.11832106113433838,
-0.09930719435214996,
-0.027021577581763268,
-0.017692439258098602,
0.016657434403896332,
-0.1336720734834671,
0.20776569843292236,
0.022426022216677666,
0.08118556439876556,
-0.10563261061906815,
0.006753826979547739,
-0.012421919032931328,
-0.04545425623655319,
0.03603256866335869,
-0.018619321286678314,
-0.08431500196456909,
-0.0688251480460167,
-0.037917610257864,
0.07382870465517044,
0.029923392459750175,
-0.1216718778014183,
-0.05200579762458801,
-0.1271282583475113,
-0.011741460300981998,
0.0813705325126648,
0.04932554066181183,
-0.011752449907362461,
-0.11561623215675354,
-0.051844727247953415,
-0.09493532031774521,
0.053480908274650574,
0.07179772108793259,
-0.0026138003449887037,
-0.12868048250675201,
0.15310098230838776,
0.06319990009069443,
0.06787378340959549,
-0.05460105836391449,
-0.05729803442955017,
-0.01842600665986538,
0.10023852437734604,
-0.1466529667377472,
-0.004075993318110704,
-0.0549839586019516,
-0.0009188064141198993,
-0.005354746710509062,
-0.07536354660987854,
-0.00601242994889617,
0.07905831933021545,
-0.09922966361045837,
0.08428478240966797,
-0.0011977177346125245,
0.08700931072235107,
-0.06292149424552917,
0.04544954001903534,
0.029124952852725983,
-0.048122186213731766,
0.07868237793445587,
0.15023790299892426,
-0.10417111217975616,
0.1330726593732834,
-0.16569268703460693,
-0.040954187512397766,
0.05545439571142197,
0.05198930948972702,
-0.03947458043694496,
-0.10689783841371536,
0.04279882460832596,
0.06750845909118652,
0.05047157034277916,
-0.016608908772468567,
0.07572908699512482,
-0.04709513485431671,
0.02699318528175354,
-0.04939977452158928,
0.02637392468750477,
-0.0525403693318367,
0.014240778051316738,
0.07376040518283844,
0.17551931738853455,
0.1313313990831375,
-0.10807303339242935,
0.09811031818389893,
-0.12279325723648071,
0.01396252866834402,
-0.06580150127410889,
-0.020142413675785065,
-0.13990366458892822,
-0.12083215266466141,
0.06354337185621262,
-0.06312888860702515,
0.11831258982419968,
0.03617202863097191,
0.029551483690738678,
-0.02310313656926155,
-0.08809015899896622,
0.06676188111305237,
-0.011636864393949509,
0.24381454288959503,
0.04292190074920654,
0.04261215031147003,
-0.003242444945499301,
-0.00705878296867013,
-0.010514441877603531,
0.14856170117855072,
-0.016988754272460938,
0.16688098013401031,
-0.023729246109724045,
0.06704477965831757,
0.0770714208483696,
-0.04733968898653984,
-0.05174391716718674,
-0.018886245787143707,
-0.15956051647663116,
0.020563818514347076,
-0.07074739038944244,
0.15885205566883087,
0.13982905447483063,
-0.08316226303577423,
0.10229813307523727,
-0.02005845308303833,
-0.0774228572845459,
-0.16376329958438873,
-0.11075834184885025,
-0.051602765917778015,
-0.1706830859184265,
0.030161594972014427,
-0.0707172304391861,
0.015773937106132507,
0.12996183335781097,
0.019697705283761024,
-0.003896973794326186,
0.1781659871339798,
0.010640079155564308,
-0.09261389076709747,
0.06317223608493805,
-0.08523501455783844,
-0.017742671072483063,
-0.09832411259412766,
0.037700653076171875,
0.1849227398633957,
0.008903048001229763,
0.04675471782684326,
-0.017877988517284393,
-0.07225847989320755,
0.030153317376971245,
-0.06538897007703781,
-0.06496316194534302,
-0.015223825350403786,
-0.02126169763505459,
0.09902623295783997,
0.10892143100500107,
0.10870381444692612,
-0.07440602779388428,
-0.00003617290349211544,
0.14798444509506226,
-0.06026599556207657,
-0.14654026925563812,
-0.16222167015075684,
0.17375892400741577,
0.009779759682714939,
0.005258411634713411,
-0.012233353219926357,
-0.02717767283320427,
-0.013733026571571827,
0.22734875977039337,
0.20100422203540802,
0.03570704907178879,
0.03159042447805405,
-0.04228132218122482,
-0.008683017455041409,
-0.06046343222260475,
0.08106841892004013,
0.06650474667549133,
0.22800616919994354,
-0.012673651799559593,
0.05031949281692505,
-0.07638842612504959,
-0.10110502690076828,
0.009581421501934528,
0.08452144265174866,
-0.07609719038009644,
-0.10848192870616913,
0.02889254316687584,
0.1394658088684082,
-0.0721374899148941,
-0.12241315096616745,
-0.10177662968635559,
-0.06465873122215271,
-0.0786762684583664,
-0.02574727311730385,
0.015215088613331318,
0.12650586664676666,
0.002777125919237733,
-0.08747819066047668,
0.03202320262789726,
0.15245841443538666,
-0.002071842784062028,
-0.03842100501060486,
-0.04172690957784653,
0.064410001039505,
-0.05894507095217705,
-0.022435689345002174,
0.008956512436270714,
0.15580321848392487,
0.00485320994630456,
0.08431997895240784,
-0.012373629957437515,
0.16121579706668854,
-0.02382557839155197,
-0.09080309420824051,
0.008551583625376225,
0.1588054746389389,
-0.03173896670341492,
0.12067116796970367,
0.00788700208067894,
-0.1454918384552002,
0.04411781579256058,
-0.1267414689064026,
-0.03458799421787262,
-0.07817868143320084,
0.051055483520030975,
-0.04769982397556305,
0.09228348731994629,
0.06628871709108353,
-0.061542827636003494,
-0.05700675770640373,
-0.06189851835370064,
0.07876908034086227,
0.027540884912014008,
-0.06217978522181511,
-0.046335119754076004,
-0.23502786457538605,
-0.017897894605994225,
-0.07508093863725662,
-0.01794542372226715,
-0.23337794840335846,
-0.01966175250709057,
-0.01628970168530941,
-0.06199974566698074,
0.026028083637356758,
0.024930784478783607,
0.11087971925735474,
0.014617213979363441,
0.0038103372789919376,
0.0020409333519637585,
0.043992768973112106,
0.13451960682868958,
-0.16541877388954163,
-0.11414444446563721
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Telugu
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Telugu using the [OpenSLR SLR66](http://openslr.org/66/) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import Dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import pandas as pd
# Evaluation notebook contains the procedure to download the data
df = pd.read_csv("/content/te/test.tsv", sep="\t")
df["path"] = "/content/te/clips/" + df["path"]
test_dataset = Dataset.from_pandas(df)
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-telugu")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-telugu")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
```python
import torch
import torchaudio
from datasets import Dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
from sklearn.model_selection import train_test_split
import pandas as pd
# Evaluation notebook contains the procedure to download the data
df = pd.read_csv("/content/te/test.tsv", sep="\t")
df["path"] = "/content/te/clips/" + df["path"]
test_dataset = Dataset.from_pandas(df)
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-telugu")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-telugu")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\_\;\:\"\“\%\‘\”\।\’\'\&]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
def normalizer(text):
    # Custom normalizer for the Telugu transcripts
    text = text.replace("\\n","\n")  # turn escaped newlines into real newlines
    text = ' '.join(text.split())  # collapse repeated whitespace
    text = re.sub(r'''([a-z]+)''','',text,flags=re.IGNORECASE)  # drop stray Latin-script characters
    text = re.sub(r'''%'''," శాతం ", text)  # spell out "%" as the Telugu word for percent
    text = re.sub(r'''(/|-|_)'''," ", text)  # treat slashes, hyphens and underscores as spaces
text = re.sub("ై","ై", text)
text = text.strip()
return text
def speech_file_to_array_fn(batch):
batch["sentence"] = normalizer(batch["sentence"])
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()+ " "
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Run batched inference on the test set and decode the predictions
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 44.98%
## Training
70% of the OpenSLR Telugu dataset was used for training.
Train Split of annotations is [here](https://www.dropbox.com/s/xqc0wtour7f9h4c/train.tsv)
Test Split of annotations is [here](https://www.dropbox.com/s/qw1uy63oj4qdiu4/test.tsv)
Training Data Preparation notebook can be found [here](https://colab.research.google.com/drive/1_VR1QtY9qoiabyXBdJcOI29-xIKGdIzU?usp=sharing)
Training notebook can be found [here](https://colab.research.google.com/drive/14N-j4m0Ng_oktPEBN5wiUhDDbyrKYt8I?usp=sharing)
Evaluation notebook is [here](https://colab.research.google.com/drive/1SLEvbTWBwecIRTNqpQ0fFTqmr1-7MnSI?usp=sharing)
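The linked notebooks are the authoritative record of how the split was made. As a rough sketch only (the annotation file name `line_index.tsv`, its column names, and the random seed below are assumptions, not details taken from the notebooks), the 70/30 split can be reproduced with scikit-learn's `train_test_split`, which the evaluation snippet above already imports:
```python
# Hypothetical reconstruction of the 70/30 split; the file name, column names
# and random seed are assumptions, see the linked notebooks for the real steps.
import pandas as pd
from sklearn.model_selection import train_test_split

annotations = pd.read_csv("line_index.tsv", sep="\t", names=["path", "sentence"])
train_df, test_df = train_test_split(annotations, train_size=0.7, random_state=42)
train_df.to_csv("train.tsv", sep="\t", index=False)
test_df.to_csv("test.tsv", sep="\t", index=False)
```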
|
{"language": "te", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["openslr"], "metrics": ["wer"], "model-index": [{"name": "Anurag Singh XLSR Wav2Vec2 Large 53 Telugu", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "OpenSLR te", "type": "openslr", "args": "te"}, "metrics": [{"type": "wer", "value": 44.98, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-large-xlsr-53-telugu
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"te",
"dataset:openslr",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"te"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #te #dataset-openslr #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-Large-XLSR-53-Telugu
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Telugu using the OpenSLR SLR66 dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
Test Result: 44.98%
## Training
70% of the OpenSLR Telugu dataset was used for training.
Train Split of annotations is here
Test Split of annotations is here
Training Data Preparation notebook can be found here
Training notebook can be found here
Evaluation notebook is here
|
[
"# Wav2Vec2-Large-XLSR-53-Telugu\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Telugu using the OpenSLR SLR66 dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\n\nTest Result: 44.98%",
"## Training\n70% of the OpenSLR Telugu dataset was used for training.\n\nTrain Split of annotations is here\n\nTest Split of annotations is here\n\nTraining Data Preparation notebook can be found here\n\nTraining notebook can be foundhere\n\nEvaluation notebook is here"
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #te #dataset-openslr #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Telugu\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Telugu using the OpenSLR SLR66 dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\n\nTest Result: 44.98%",
"## Training\n70% of the OpenSLR Telugu dataset was used for training.\n\nTrain Split of annotations is here\n\nTest Split of annotations is here\n\nTraining Data Preparation notebook can be found here\n\nTraining notebook can be foundhere\n\nEvaluation notebook is here"
] |
[
82,
67,
20,
9,
52
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #te #dataset-openslr #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n# Wav2Vec2-Large-XLSR-53-Telugu\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Telugu using the OpenSLR SLR66 dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\nThe model can be used directly (without a language model) as follows:## Evaluation\n\n\nTest Result: 44.98%## Training\n70% of the OpenSLR Telugu dataset was used for training.\n\nTrain Split of annotations is here\n\nTest Split of annotations is here\n\nTraining Data Preparation notebook can be found here\n\nTraining notebook can be foundhere\n\nEvaluation notebook is here"
] |
[
-0.16720885038375854,
0.04552572965621948,
0.0014215491246432066,
0.053019311279058456,
0.09556087106466293,
-0.05716618523001671,
0.16100214421749115,
0.12254591286182404,
-0.07651954889297485,
-0.03564898297190666,
-0.016900423914194107,
0.025599956512451172,
0.07854192703962326,
0.13892509043216705,
0.007638723589479923,
-0.23878218233585358,
-0.024222156032919884,
0.007127443328499794,
-0.024695493280887604,
0.1046723946928978,
0.12000772356987,
-0.0781315267086029,
-0.02886742353439331,
0.07355272024869919,
-0.17076411843299866,
0.04214906319975853,
-0.03868305683135986,
-0.11915426701307297,
0.11763881146907806,
0.04654920473694801,
0.05988435074687004,
0.04635760188102722,
0.0727512389421463,
-0.139530211687088,
0.04606202617287636,
0.032461896538734436,
0.04048275202512741,
0.028242412954568863,
0.06937582045793533,
-0.0005976263200864196,
0.0610945038497448,
0.0603661835193634,
-0.0022363592870533466,
0.09770939499139786,
-0.07391564548015594,
-0.2991240918636322,
-0.037347953766584396,
-0.010394277051091194,
0.06316148489713669,
0.17033414542675018,
-0.06447160989046097,
0.1557631641626358,
-0.09899581223726273,
0.08496104180812836,
0.09936005622148514,
-0.21708281338214874,
-0.04060472920536995,
0.05132047459483147,
0.04313165694475174,
0.0765838772058487,
-0.09822319447994232,
0.0004877750761806965,
0.036487627774477005,
0.01330997608602047,
0.03516010940074921,
-0.017627138644456863,
-0.2739216089248657,
0.006423088256269693,
-0.10865575820207596,
0.03502625226974487,
0.2759912610054016,
-0.01960255764424801,
-0.04536837711930275,
-0.10463225841522217,
0.0021291375160217285,
-0.01235488522797823,
0.006483644247055054,
0.0015760835958644748,
-0.012492740526795387,
0.03872796520590782,
0.00007077925693010911,
-0.06389618664979935,
-0.15017636120319366,
-0.07178551703691483,
0.028089238330721855,
0.019304804503917694,
0.006412763614207506,
-0.007469288073480129,
-0.11198990046977997,
0.08102167397737503,
-0.030766278505325317,
-0.08181803673505783,
-0.020430883392691612,
-0.009329008869826794,
-0.09525908529758453,
-0.004499775357544422,
-0.08808670938014984,
-0.21153908967971802,
-0.03231172263622284,
0.005568333435803652,
0.06311964988708496,
0.03383868187665939,
-0.03156363591551781,
0.05490018427371979,
0.025260239839553833,
0.09292937070131302,
-0.0341578908264637,
-0.04516468942165375,
0.03799913451075554,
0.010414354503154755,
-0.0027456977404654026,
-0.007041464559733868,
-0.05247025936841965,
-0.11806590110063553,
-0.00032275941339321434,
0.04649798944592476,
-0.026144538074731827,
0.03995370492339134,
-0.0430239737033844,
-0.04357820376753807,
-0.006278742104768753,
-0.10781510174274445,
-0.012332228012382984,
0.04461681470274925,
-0.03664795309305191,
0.001797748962417245,
0.06106797605752945,
0.038367900997400284,
-0.07084350287914276,
-0.06793604046106339,
-0.022510051727294922,
0.06830891966819763,
-0.08307728171348572,
-0.060868944972753525,
0.007915399968624115,
0.0031260792165994644,
-0.018664073199033737,
-0.09581071138381958,
-0.18652129173278809,
-0.0518653430044651,
-0.0023516155779361725,
-0.013499578461050987,
0.005503159482032061,
-0.0741892009973526,
0.00415393291041255,
-0.030627988278865814,
-0.04606380686163902,
0.10895828157663345,
-0.03480752184987068,
0.07221569865942001,
0.04635443910956383,
0.07299888879060745,
-0.01104632206261158,
0.09198224544525146,
-0.04686054587364197,
-0.06049466133117676,
0.03871752694249153,
0.07490681111812592,
-0.09235912561416626,
0.04065223038196564,
-0.08107641339302063,
-0.041500307619571686,
-0.105528824031353,
0.06491792947053909,
0.07418076694011688,
0.08498351275920868,
-0.27899783849716187,
-0.046187542378902435,
0.14978346228599548,
-0.1320556402206421,
-0.05824780464172363,
0.1355232149362564,
-0.009003271348774433,
0.18516971170902252,
0.03810465708374977,
0.2424689084291458,
0.12264369428157806,
-0.1251879185438156,
-0.01063087210059166,
0.017042579129338264,
-0.02617095224559307,
-0.11930082738399506,
0.06459489464759827,
-0.002192136598750949,
0.03055676259100437,
0.05102166905999184,
0.09090230613946915,
0.05957730486989021,
-0.004674872383475304,
-0.07672654092311859,
-0.049866821616888046,
-0.07158639281988144,
-0.007891467772424221,
0.04420599713921547,
0.08415208756923676,
-0.06454604119062424,
-0.06474048644304276,
0.08953120559453964,
0.14078259468078613,
-0.06357187032699585,
0.02412821166217327,
-0.10360196232795715,
0.07358692586421967,
-0.015881678089499474,
-0.030886808410286903,
-0.1695495992898941,
0.1395106464624405,
-0.004692580550909042,
0.055712852627038956,
0.04982641711831093,
0.17271392047405243,
0.03493696451187134,
-0.002839710796251893,
-0.02902400866150856,
0.0007995254709385335,
0.00030837682425044477,
0.012670455500483513,
-0.06561882793903351,
-0.0699971467256546,
0.020365122705698013,
-0.060002245008945465,
0.15006408095359802,
-0.208571195602417,
0.04251554608345032,
0.133071631193161,
0.05598115175962448,
0.05002698674798012,
-0.0462321862578392,
0.1033882200717926,
0.07558026909828186,
-0.00791817158460617,
-0.0235074982047081,
0.06136664003133774,
0.06947002559900284,
0.0469786562025547,
0.052295561879873276,
-0.09205468744039536,
-0.038809411227703094,
0.11465444415807724,
-0.04541245847940445,
-0.060785215348005295,
-0.03018062561750412,
-0.039502210915088654,
-0.027485745027661324,
-0.09279540926218033,
0.022772368043661118,
0.18991719186306,
-0.0212419256567955,
0.14138977229595184,
-0.04394901916384697,
-0.026166260242462158,
-0.02171100303530693,
-0.05991550162434578,
0.026648083701729774,
0.044260770082473755,
0.041819386184215546,
-0.007333493325859308,
0.06774032860994339,
0.01375378854572773,
-0.0762077048420906,
0.159233957529068,
-0.02390100061893463,
-0.08445919305086136,
0.005814943928271532,
-0.01707981899380684,
-0.00476950453594327,
0.08752947300672531,
-0.1982087343931198,
0.0019029052928090096,
0.023900231346488,
0.03569495305418968,
0.08423236012458801,
-0.14402219653129578,
-0.002448650076985359,
0.018320303410291672,
-0.03754290193319321,
-0.11724284291267395,
0.09683048725128174,
-0.03210676461458206,
0.03705976903438568,
-0.09441877901554108,
0.060196928679943085,
-0.005424202885478735,
-0.028763992711901665,
-0.11687492579221725,
0.12455598264932632,
-0.12827463448047638,
-0.2774476408958435,
-0.1694091558456421,
0.020150521770119667,
0.0857527107000351,
-0.02158738672733307,
0.1236165314912796,
-0.10160480439662933,
-0.04797329008579254,
-0.0449579693377018,
0.07154618948698044,
0.008842109702527523,
-0.030662942677736282,
0.00541777303442359,
0.06586908549070358,
0.07233106344938278,
-0.1419118195772171,
0.05202305689454079,
0.02926572784781456,
-0.028518972918391228,
0.051533281803131104,
-0.010902468115091324,
-0.02768927998840809,
0.0822204202413559,
0.0024974080733954906,
0.020834030583500862,
-0.016327930614352226,
0.1746569275856018,
-0.08369291573762894,
0.016327006742358208,
0.17263993620872498,
0.007890350185334682,
-0.014295827597379684,
0.0993524044752121,
0.009336531162261963,
-0.07383333891630173,
0.023470399901270866,
-0.01367128174751997,
-0.035154081881046295,
-0.26752033829689026,
-0.06924662739038467,
-0.06916353106498718,
-0.020825745537877083,
-0.038736119866371155,
0.03235780820250511,
0.038085535168647766,
0.04128752648830414,
0.02825181372463703,
-0.01234013494104147,
0.0029859752394258976,
0.009696384891867638,
0.15881037712097168,
-0.03173769265413284,
0.0689387172460556,
-0.08785218745470047,
0.04972267150878906,
0.07311376929283142,
0.04290223866701126,
0.1701703816652298,
-0.018784454092383385,
0.11556251347064972,
0.05297386273741722,
0.1053943857550621,
0.11341647803783417,
0.02318553999066353,
-0.021322032436728477,
-0.030343789607286453,
0.001374171581119299,
-0.026702264323830605,
-0.015095367096364498,
0.06471183896064758,
0.0610559917986393,
-0.04101110249757767,
-0.050279401242733,
-0.09268371760845184,
0.03894520550966263,
0.15843357145786285,
0.08421970158815384,
-0.17941632866859436,
-0.10433498024940491,
-0.007784239016473293,
-0.06213116645812988,
-0.039408136159181595,
0.04042894393205643,
0.1573004126548767,
-0.102115198969841,
-0.030614323914051056,
-0.010097207501530647,
0.07583360373973846,
-0.036829039454460144,
-0.011341994628310204,
-0.07277753204107285,
0.028487829491496086,
-0.01918732561171055,
0.09168383479118347,
-0.1961694359779358,
0.19930626451969147,
0.01889882981777191,
0.09576162695884705,
-0.006087443791329861,
-0.006076558027416468,
0.028804192319512367,
0.06689684838056564,
0.08620757609605789,
-0.013815236277878284,
-0.06383813172578812,
-0.0687214583158493,
-0.0750967413187027,
0.007481712847948074,
-0.009842809289693832,
0.09356949478387833,
0.029718251898884773,
-0.03907018527388573,
0.024650100618600845,
0.020136944949626923,
-0.08289454132318497,
-0.156342014670372,
-0.10430383682250977,
0.029240205883979797,
0.13781806826591492,
0.08968581259250641,
-0.07909788191318512,
-0.10844849050045013,
-0.07048764824867249,
0.14846636354923248,
-0.021200237795710564,
-0.019169464707374573,
-0.09829378128051758,
0.05698495730757713,
0.04626630246639252,
-0.040142253041267395,
0.01650603674352169,
0.07590051740407944,
0.1062619611620903,
0.006738157942891121,
-0.0012468136847019196,
0.05722834914922714,
-0.07490909844636917,
-0.04030079022049904,
-0.04580838233232498,
0.09242729842662811,
0.045917853713035583,
0.026732392609119415,
0.033302802592515945,
0.00300995004363358,
-0.029493672773241997,
-0.00722054298967123,
0.027939654886722565,
0.14996615052223206,
-0.037812091410160065,
-0.03110184706747532,
-0.07328387349843979,
-0.19983471930027008,
-0.03171378746628761,
-0.11668086796998978,
0.21214742958545685,
0.14700967073440552,
-0.07320789247751236,
0.15074975788593292,
0.12141191214323044,
-0.08080069720745087,
-0.23232530057430267,
0.09170617163181305,
0.057448286563158035,
0.10474404692649841,
0.08077883720397949,
-0.2169790118932724,
0.04970567300915718,
0.0456739105284214,
-0.02955050580203533,
0.029178928583860397,
-0.23121775686740875,
-0.11945343762636185,
0.1107884868979454,
0.046383004635572433,
0.11835747957229614,
-0.04043782874941826,
-0.035357002168893814,
0.006240661721676588,
-0.03163626790046692,
-0.03506143018603325,
-0.10616115480661392,
0.13589344918727875,
0.0000013746793001701008,
0.023201895877718925,
0.030854592099785805,
-0.05388224497437477,
0.07467299699783325,
-0.009504441171884537,
0.003420507302507758,
-0.057237088680267334,
0.07546847313642502,
0.03247825428843498,
0.0033911392092704773,
0.08400703221559525,
-0.04957260936498642,
0.04285475239157677,
-0.14076390862464905,
-0.08897913992404938,
-0.008904863148927689,
0.0939965769648552,
0.013953305780887604,
-0.047753896564245224,
0.019715307280421257,
0.05132240429520607,
-0.019452329725027084,
-0.03548216074705124,
-0.09948120266199112,
-0.09622351825237274,
0.03337371349334717,
0.17427925765514374,
0.14831213653087616,
0.05317428708076477,
-0.0120261050760746,
-0.02004668116569519,
-0.026455983519554138,
0.06798579543828964,
-0.11816181987524033,
-0.00439602043479681,
0.03901371359825134,
0.028349721804261208,
0.11832936853170395,
0.0024352045729756355,
-0.09254140406847,
0.09610332548618317,
0.10715873539447784,
-0.019700132310390472,
-0.09486302733421326,
-0.06246580928564072,
0.06503640115261078,
-0.011892604641616344,
0.012084712274372578,
0.10204070806503296,
-0.07936912029981613,
-0.027193643152713776,
0.011892768554389477,
0.0024038448464125395,
-0.05324115976691246,
0.13690611720085144,
0.030973520129919052,
0.049218568950891495,
-0.036539651453495026,
0.07330503314733505,
0.03657791018486023,
-0.05189836025238037,
0.03299194574356079,
0.0786944255232811,
-0.10862071067094803,
-0.1021677702665329,
0.027149444445967674,
0.0013733680825680494,
0.11614114791154861,
-0.07152248173952103,
-0.06106289476156235,
-0.03574246168136597,
-0.012155482545495033,
0.12394401431083679,
0.05031629651784897,
-0.015927011147141457,
-0.08501944690942764,
-0.017830993980169296,
-0.07781090587377548,
0.0786733478307724,
0.04375670477747917,
0.0009473064565099776,
-0.0855112224817276,
0.1672765165567398,
0.03683827444911003,
-0.005744400434195995,
-0.031930241733789444,
-0.08342740684747696,
-0.034990616142749786,
0.025629661977291107,
-0.13415418565273285,
-0.045437756925821304,
-0.05674605071544647,
-0.0034916060976684093,
-0.009505004622042179,
-0.07241741567850113,
0.013237682171165943,
0.07819633930921555,
-0.08233039081096649,
0.03801649063825607,
-0.02681240811944008,
0.09297949820756912,
-0.07693634927272797,
-0.005421899259090424,
0.056802984327077866,
-0.04255023971199989,
0.0780450701713562,
0.10262050479650497,
-0.08494146913290024,
0.08987433463335037,
-0.13547922670841217,
-0.05072100833058357,
0.012194805778563023,
0.06990252435207367,
0.031909286975860596,
-0.10796564817428589,
0.033202189952135086,
0.025079144164919853,
0.08633426576852798,
-0.002432367065921426,
0.13160043954849243,
-0.07138172537088394,
-0.022354017943143845,
-0.07654738426208496,
-0.014333362691104412,
-0.06120641529560089,
0.03310644254088402,
0.04830073565244675,
0.1217481717467308,
0.13811711966991425,
-0.09578180313110352,
0.07093121111392975,
-0.10864509642124176,
0.026252517476677895,
-0.041422322392463684,
-0.04914971441030502,
-0.07362382113933563,
-0.07535164803266525,
0.06469018012285233,
-0.04988134279847145,
0.08022366464138031,
0.022335132583975792,
-0.05489867925643921,
0.015979962423443794,
-0.22203785181045532,
-0.006397333461791277,
-0.014494822360575199,
0.2568746507167816,
0.058838531374931335,
0.03521505370736122,
-0.004286184906959534,
0.014154210686683655,
0.0186032485216856,
0.10094935446977615,
0.019986052066087723,
0.18008968234062195,
-0.12644916772842407,
0.06203019246459007,
0.03362543508410454,
-0.08595921099185944,
-0.09872962534427643,
0.027812547981739044,
-0.12054312974214554,
0.0289426539093256,
-0.13577218353748322,
0.16633930802345276,
0.13923080265522003,
-0.06638412177562714,
0.05921443551778793,
-0.036036524921655655,
-0.09056354314088821,
-0.10691201686859131,
-0.08456052094697952,
-0.07714786380529404,
-0.15194550156593323,
0.027817923575639725,
-0.0717090517282486,
0.04218320548534393,
0.0005055537912994623,
0.07138419896364212,
-0.011153781786561012,
0.20343191921710968,
0.02061723917722702,
-0.0281387846916914,
0.04441400244832039,
-0.08660247176885605,
-0.0385512113571167,
-0.17674435675144196,
0.031148921698331833,
0.03414902836084366,
0.014313294552266598,
0.018308747559785843,
0.008646183647215366,
-0.09607204794883728,
-0.010617204010486603,
-0.05463503673672676,
-0.03862328827381134,
-0.028738223016262054,
0.0124238021671772,
0.02303924225270748,
0.10657632350921631,
0.1274121105670929,
-0.058790337294340134,
0.012489963322877884,
0.22732405364513397,
-0.05472801253199577,
-0.13443325459957123,
-0.1444406509399414,
0.08319748193025589,
0.030588047578930855,
-0.03025810979306698,
0.005296378396451473,
-0.052176181226968765,
0.019632982090115547,
0.2629421055316925,
0.16951735317707062,
0.012850707396864891,
0.042311228811740875,
0.010174200870096684,
-0.010579808615148067,
-0.07876639068126678,
0.10155850648880005,
0.1054338812828064,
0.13321374356746674,
-0.03872699663043022,
-0.03867834806442261,
-0.06408257782459259,
-0.06540800631046295,
-0.05676857754588127,
0.04419541731476784,
-0.07371595501899719,
-0.05482414737343788,
-0.01953902840614319,
0.1477941870689392,
-0.023603137582540512,
-0.14217132329940796,
-0.09789887070655823,
-0.047328535467386246,
-0.12367967516183853,
-0.005961921531707048,
0.013219122774899006,
0.04294172674417496,
-0.0008006620337255299,
-0.053757425397634506,
0.03674345836043358,
0.15860815346240997,
-0.0002647709916345775,
-0.025351570919156075,
-0.04149385914206505,
0.08904966711997986,
-0.16039860248565674,
0.04065166413784027,
-0.047676682472229004,
0.09609998017549515,
0.04470442607998848,
0.08329740166664124,
-0.05767542123794556,
0.14918152987957,
-0.0035255495458841324,
-0.04317719489336014,
0.01141970232129097,
0.11876174807548523,
-0.049885138869285583,
0.14471179246902466,
0.016819972544908524,
-0.059667475521564484,
0.07891944795846939,
-0.11361512541770935,
-0.008960926905274391,
-0.08171966671943665,
0.06607579439878464,
-0.029748957604169846,
0.11752412468194962,
0.12106098979711533,
-0.09601033478975296,
-0.07829926162958145,
-0.03817540034651756,
0.004011277109384537,
0.012630571611225605,
-0.05947086587548256,
-0.04245764762163162,
-0.21851970255374908,
-0.013456009328365326,
-0.06579481810331345,
0.036685407161712646,
-0.21327714622020721,
-0.059649448841810226,
-0.012404640205204487,
-0.07848107069730759,
-0.013781176880002022,
0.0869288221001625,
0.13569942116737366,
0.01466367021203041,
-0.015994984656572342,
0.039388108998537064,
0.01916368119418621,
0.11909066140651703,
-0.15730640292167664,
-0.12401440739631653
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Vietnamese
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Vietnamese using the [Common Voice](https://huggingface.co/datasets/common_voice).
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "vi", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-vietnamese")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-vietnamese")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Vietnamese test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "vi", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-vietnamese")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-53-vietnamese")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Run batched inference on the test set and decode the predictions
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 66.78 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
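A minimal sketch of assembling that training data is shown below, assuming the same `common_voice` "vi" configuration used in the snippets above; the actual fine-tuning hyperparameters are not part of this card.
```python
from datasets import load_dataset

# Combine the Common Voice Vietnamese train and validation splits into one
# training set, mirroring the description above (illustrative sketch only).
train_data = load_dataset("common_voice", "vi", split="train+validation")
print(train_data)
```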
|
{"language": "vi", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Anurag Singh XLSR Wav2Vec2 Large 53 Vietnamese", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice vi", "type": "common_voice", "args": "vi"}, "metrics": [{"type": "wer", "value": 66.78, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-large-xlsr-53-vietnamese
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"vi",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"vi"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #vi #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Vietnamese
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Vietnamese using the Common Voice.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Vietnamese test data of Common Voice.
Test Result: 66.78 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Vietnamese\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Vietnamese using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Vietnamese test data of Common Voice.\n\nTest Result: 66.78 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #vi #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Vietnamese\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Vietnamese using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Vietnamese test data of Common Voice.\n\nTest Result: 66.78 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
80,
65,
20,
29,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #vi #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Vietnamese\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Vietnamese using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\nThe model can be used directly (without a language model) as follows:## Evaluation\nThe model can be evaluated as follows on the Vietnamese test data of Common Voice.\n\nTest Result: 66.78 %## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.1114252433180809,
0.05609160661697388,
-0.0020430528093129396,
-0.037966400384902954,
0.07581824064254761,
-0.08668576925992966,
0.15414440631866455,
0.1046571284532547,
-0.018076468259096146,
-0.022855544462800026,
0.029498454183340073,
0.001037940732203424,
0.05758604779839516,
0.07643257081508636,
-0.0018036251422017813,
-0.23137569427490234,
0.031805671751499176,
0.05452987924218178,
0.06665428727865219,
0.11227262765169144,
0.08607033640146255,
-0.07549356669187546,
0.05000454932451248,
0.11994326114654541,
-0.11051476746797562,
0.054013464599847794,
0.0005853540496900678,
-0.11033247411251068,
0.12960532307624817,
0.0474981926381588,
0.05089239031076431,
0.04578736796975136,
0.08507879823446274,
-0.18950888514518738,
0.016035811975598335,
0.0504513643682003,
0.02464768849313259,
0.013874679803848267,
0.03325602784752846,
0.03845468536019325,
0.16626080870628357,
0.06246841698884964,
-0.010804915800690651,
0.05787482485175133,
-0.057064495980739594,
-0.23782698810100555,
-0.023312920704483986,
-0.008652816526591778,
0.133214071393013,
0.1285363733768463,
-0.07106626778841019,
0.13095685839653015,
-0.17286573350429535,
0.07263735681772232,
0.07660655677318573,
-0.17879560589790344,
0.0004277811967767775,
0.09599780291318893,
0.08200771361589432,
0.05312013998627663,
-0.08199943602085114,
0.030662324279546738,
0.05386527627706528,
0.04237642139196396,
-0.008256726898252964,
-0.055669087916612625,
-0.19304139912128448,
-0.002258959459140897,
-0.12756621837615967,
-0.01333511434495449,
0.21461477875709534,
0.014656096696853638,
-0.06823141127824783,
-0.12120533734560013,
-0.019534951075911522,
-0.039562489837408066,
-0.018351569771766663,
-0.11294551193714142,
0.023592261597514153,
0.04384544864296913,
0.042064081877470016,
-0.03266453742980957,
-0.10724564641714096,
-0.14620615541934967,
-0.007675677537918091,
-0.05745718255639076,
0.020224031060934067,
0.007186142262071371,
-0.13259649276733398,
0.045108769088983536,
-0.12956055998802185,
-0.04389719292521477,
-0.020607557147741318,
0.026650618761777878,
-0.06902318447828293,
0.02723337896168232,
-0.033865973353385925,
-0.1274121105670929,
0.014975599944591522,
-0.07865039259195328,
0.026735592633485794,
0.04165087267756462,
-0.07770150154829025,
0.049552980810403824,
0.024510851129889488,
0.1380443572998047,
-0.10345546901226044,
0.017013536766171455,
-0.006264431867748499,
0.004068558569997549,
-0.06840066611766815,
-0.003365768352523446,
-0.07867086678743362,
-0.03792045637965202,
0.013995730318129063,
0.00732994731515646,
-0.07350751012563705,
0.0062909857369959354,
-0.04299232363700867,
-0.052239805459976196,
0.08256343752145767,
-0.06087016686797142,
-0.04000604897737503,
0.05853097885847092,
0.010366824455559254,
0.16615410149097443,
0.08342845737934113,
0.08641064167022705,
-0.06441409140825272,
-0.023432090878486633,
0.027148393914103508,
0.04649297520518303,
-0.02261260524392128,
-0.09306856244802475,
0.006971045397222042,
-0.040695011615753174,
-0.031074801459908485,
-0.04052709415555,
-0.11629225313663483,
-0.07397075742483139,
0.009545057080686092,
0.024999773129820824,
-0.025495415553450584,
-0.11200251430273056,
-0.035812705755233765,
-0.04356705769896507,
-0.019383400678634644,
0.07234262675046921,
-0.02179512195289135,
0.05523538216948509,
0.00712399510666728,
0.08167506754398346,
0.022093217819929123,
0.10266797989606857,
-0.07425820827484131,
-0.022332578897476196,
-0.0025232050102204084,
0.13659854233264923,
-0.01899143122136593,
-0.07475518435239792,
-0.07622089236974716,
-0.097711481153965,
-0.049678921699523926,
0.05205150321125984,
0.024331776425242424,
0.07561767101287842,
-0.28459420800209045,
-0.08160080760717392,
0.18093465268611908,
-0.12314730137586594,
-0.015632931143045425,
0.22598126530647278,
-0.004388090223073959,
0.09561053663492203,
0.11252299696207047,
0.22147585451602936,
0.1040073037147522,
-0.22315572202205658,
0.07548707723617554,
0.03146876022219658,
-0.007800054736435413,
-0.04328230023384094,
0.07830537110567093,
-0.026462364941835403,
-0.038968268781900406,
0.03437171131372452,
-0.05523611977696419,
0.0454181507229805,
-0.04988310858607292,
-0.06583300232887268,
-0.00836124550551176,
-0.09500282257795334,
0.08647362142801285,
0.05626997351646423,
0.029987281188368797,
0.005613433662801981,
-0.03410990163683891,
0.07803071290254593,
0.12942896783351898,
-0.10812915861606598,
0.02083810791373253,
-0.12903980910778046,
0.03599129244685173,
-0.07236862927675247,
-0.021885370835661888,
-0.11196814477443695,
0.21851614117622375,
0.010248934850096703,
0.08572597801685333,
0.028827141970396042,
0.19557072222232819,
0.0002978170814458281,
0.029687274247407913,
-0.04306695610284805,
-0.0029098745435476303,
0.015517204999923706,
-0.006450507324188948,
-0.028204819187521935,
-0.0722518116235733,
-0.002507197204977274,
-0.05548573657870293,
0.05384679511189461,
-0.20951546728610992,
-0.0347004309296608,
0.04700100049376488,
-0.003964725881814957,
0.013749522157013416,
0.007284657098352909,
0.07524510473012924,
0.10627184808254242,
0.014875630848109722,
0.029532968997955322,
0.04450998082756996,
-0.002209064783528447,
-0.00702692149206996,
0.15925852954387665,
-0.1059713363647461,
-0.008229663595557213,
0.09910018742084503,
-0.042544808238744736,
-0.006943418178707361,
0.04979013651609421,
-0.014743765816092491,
-0.035751163959503174,
-0.09970827400684357,
0.04428050294518471,
0.26574692130088806,
-0.009618373587727547,
0.14365415275096893,
-0.10964522510766983,
0.021087616682052612,
0.04216698929667473,
-0.06816063076257706,
0.04725855588912964,
0.08633383363485336,
0.04063615947961807,
0.007099908776581287,
0.046369753777980804,
-0.047528062015771866,
-0.0688905417919159,
0.23144926130771637,
-0.019332677125930786,
-0.06354205310344696,
-0.0012133027194067836,
0.02073843590915203,
-0.03673585504293442,
0.07013411074876785,
-0.23260338604450226,
-0.016720706596970558,
0.027474215254187584,
0.06895053386688232,
0.04415401816368103,
-0.1667248159646988,
0.005922309122979641,
-0.00228683790192008,
-0.13322649896144867,
-0.1397334784269333,
0.09241421520709991,
-0.016526948660612106,
0.03804188594222069,
-0.09345024824142456,
-0.026129277423024178,
-0.015236588194966316,
-0.047701530158519745,
-0.1627940833568573,
0.13194988667964935,
-0.07227104157209396,
-0.27357226610183716,
-0.09472832828760147,
-0.03164784610271454,
0.007738121785223484,
0.009222020395100117,
0.07720958441495895,
-0.1697424352169037,
-0.02119969204068184,
-0.028019296005368233,
0.04235644266009331,
0.017918379977345467,
-0.03583251312375069,
-0.04022505134344101,
0.028747815638780594,
0.051291801035404205,
-0.11156751960515976,
0.00013761621084995568,
-0.08990170061588287,
-0.04556041210889816,
-0.006734214257448912,
-0.05555468425154686,
-0.03682565689086914,
0.16776196658611298,
0.01678897999227047,
0.01808750443160534,
-0.0025023657362908125,
0.1760178953409195,
-0.08128059655427933,
-0.0228068046271801,
0.2042907327413559,
0.03308610990643501,
-0.002934261690825224,
0.09058429300785065,
0.01098156813532114,
-0.0765647292137146,
0.028539378196001053,
0.03257974237203598,
-0.08061612397432327,
-0.21542994678020477,
-0.12189752608537674,
-0.0771002322435379,
-0.05586329475045204,
-0.02209748886525631,
0.006214868277311325,
0.07787040621042252,
0.01375084649771452,
0.015489021316170692,
0.020648036152124405,
0.02649267204105854,
-0.041350800544023514,
0.14019601047039032,
-0.022594867274165154,
0.07104368507862091,
-0.06796238571405411,
-0.03746456280350685,
0.03359465301036835,
0.03513197973370552,
0.14069779217243195,
0.062435027211904526,
0.038490451872348785,
0.12162744998931885,
0.15431343019008636,
0.13222016394138336,
0.04853017255663872,
-0.11427710205316544,
-0.03253018856048584,
-0.010972593910992146,
-0.07877662032842636,
0.009726549498736858,
0.05305146425962448,
0.1261233389377594,
-0.0476602241396904,
-0.024365240707993507,
0.014312292449176311,
-0.0063037509098649025,
0.22160688042640686,
0.0802990272641182,
-0.18720807135105133,
-0.03253532201051712,
-0.02740192413330078,
-0.02409677393734455,
0.008589680306613445,
0.07730940729379654,
0.1888771504163742,
-0.15784482657909393,
0.046021636575460434,
0.011597628705203533,
0.09121830761432648,
-0.013037349097430706,
0.03325623273849487,
-0.059137456119060516,
0.003549209563061595,
0.028214851394295692,
0.08720922470092773,
-0.2703288495540619,
0.2190844714641571,
-0.01563393510878086,
0.06612388044595718,
-0.0669654980301857,
-0.020180044695734978,
-0.0019799680449068546,
0.09551560878753662,
0.13017408549785614,
0.03146897628903389,
0.058738648891448975,
-0.07429497689008713,
-0.08122839033603668,
0.05102497711777687,
-0.04297671094536781,
0.08761228621006012,
-0.010559983551502228,
0.013812641613185406,
-0.02831423282623291,
0.0034634724725037813,
-0.04366561397910118,
-0.12099223583936691,
-0.03493546321988106,
-0.011267028748989105,
0.15497848391532898,
0.10231118649244308,
-0.005462266970425844,
-0.08238536864519119,
-0.07493717968463898,
0.09699947386980057,
-0.06272891163825989,
-0.015666581690311432,
-0.03800215199589729,
-0.0773061141371727,
0.10813742876052856,
-0.03309336677193642,
0.0023820437490940094,
0.05864030495285988,
0.10040141642093658,
-0.016526905819773674,
-0.016540123149752617,
0.07486369460821152,
-0.10457824170589447,
-0.11280030012130737,
-0.018175886943936348,
0.15713731944561005,
0.11019521206617355,
0.06319759786128998,
0.06930571049451828,
-0.008311355486512184,
0.0036361010279506445,
-0.06812795996665955,
-0.033402763307094574,
0.14554141461849213,
-0.09748408943414688,
-0.0017096323426812887,
-0.009491345845162868,
-0.16591009497642517,
-0.08584868162870407,
-0.08326727151870728,
0.13354776799678802,
0.08284930139780045,
-0.06268997490406036,
0.22710832953453064,
0.21732456982135773,
-0.09711142629384995,
-0.180171400308609,
-0.02722773887217045,
0.10267925262451172,
0.11115389317274094,
-0.006486088968813419,
-0.17589963972568512,
0.06527968496084213,
-0.011844871565699577,
-0.02832944504916668,
-0.10221127420663834,
-0.27779334783554077,
-0.16424691677093506,
0.15357883274555206,
-0.03221328184008598,
0.14392264187335968,
0.021661579608917236,
-0.02639591693878174,
-0.014830521307885647,
-0.02250431664288044,
0.09401523321866989,
-0.06279108673334122,
0.08526583015918732,
0.017875347286462784,
0.06539516896009445,
0.0396457277238369,
-0.0047919861972332,
0.10285888612270355,
0.08307253569364548,
0.0002179094881284982,
-0.0015408290782943368,
0.028123576194047928,
0.018572380766272545,
0.04277162626385689,
0.15356753766536713,
-0.05448434129357338,
0.0352570116519928,
-0.13297785818576813,
-0.09320968389511108,
-0.1003267765045166,
0.006780774332582951,
0.02856454811990261,
-0.03906470909714699,
0.006533891893923283,
-0.013384082354605198,
0.017896533012390137,
0.008869683369994164,
-0.06350371986627579,
-0.09626226872205734,
0.028496267274022102,
0.16368131339550018,
0.1754484623670578,
-0.05898410081863403,
-0.05000939965248108,
-0.028462931513786316,
-0.0014256710419431329,
0.13157619535923004,
-0.09657188504934311,
0.033875856548547745,
0.06605052947998047,
0.045448411256074905,
0.15857456624507904,
-0.0017347338143736124,
-0.09462040662765503,
0.0796322375535965,
0.026303166523575783,
-0.05335678532719612,
-0.1440018117427826,
-0.031734615564346313,
0.010615129955112934,
0.02516697533428669,
0.009997322224080563,
0.055408354848623276,
-0.10607331991195679,
-0.022091761231422424,
-0.014776612631976604,
0.022906716912984848,
-0.12878507375717163,
0.1617821604013443,
-0.0033509996719658375,
0.08593525737524033,
-0.10995426028966904,
0.01948409341275692,
0.041073136031627655,
-0.09048692882061005,
0.0379914827644825,
0.01627230830490589,
-0.08297988027334213,
-0.048544544726610184,
0.014855182729661465,
0.12851884961128235,
0.08655869215726852,
-0.1454603672027588,
-0.02931891195476055,
-0.10627629607915878,
0.025452565401792526,
0.10015011578798294,
0.07274626940488815,
0.026706328615546227,
-0.11620231717824936,
-0.10120463371276855,
-0.10818605124950409,
0.04662413150072098,
0.07836868613958359,
-0.04340745881199837,
-0.11256664991378784,
0.07441163808107376,
0.11486614495515823,
0.1037769764661789,
-0.05152424797415733,
-0.10308096557855606,
-0.027811286970973015,
0.08472567051649094,
-0.10927551984786987,
0.03500596433877945,
-0.050687599927186966,
0.004753113258630037,
-0.018336625769734383,
-0.08792822062969208,
-0.02349156327545643,
0.0686962828040123,
-0.09441661834716797,
0.09209493547677994,
0.004751857835799456,
0.07361326366662979,
-0.08428732305765152,
0.01844744384288788,
0.06398861110210419,
-0.03624281659722328,
0.0662277489900589,
0.07063516229391098,
-0.11582392454147339,
0.09777814149856567,
-0.18983806669712067,
-0.061639755964279175,
0.1006203219294548,
0.07409536093473434,
-0.0027031260542571545,
-0.11645746976137161,
0.03908642753958702,
0.09431178122758865,
0.09643889963626862,
-0.017734359949827194,
0.11840180307626724,
-0.07263237237930298,
-0.05048811808228493,
-0.08884621411561966,
-0.041843052953481674,
-0.032766904681921005,
0.04190424829721451,
0.04263262078166008,
0.12998554110527039,
0.12135419994592667,
-0.10150129348039627,
0.08253657817840576,
-0.11266271024942398,
-0.012729840353131294,
-0.07538696378469467,
-0.01762459985911846,
-0.1057404950261116,
-0.07775373756885529,
0.05435893312096596,
-0.015937326475977898,
0.14208568632602692,
0.03151640295982361,
0.04402399808168411,
-0.03107120655477047,
-0.09374237805604935,
-0.025649700313806534,
-0.008792294189333916,
0.2226640284061432,
0.04281344264745712,
0.041130583733320236,
-0.020113345235586166,
0.0023148979526013136,
-0.017076898366212845,
0.20063208043575287,
-0.02498524822294712,
0.15175719559192657,
0.00483528571203351,
0.04929061606526375,
0.07633720338344574,
-0.023639241233468056,
-0.06714978814125061,
0.029319999739527702,
-0.18284033238887787,
0.05230487138032913,
-0.10041274130344391,
0.13721652328968048,
0.10390110313892365,
-0.11827065050601959,
0.11404378712177277,
0.013769901357591152,
-0.0983930453658104,
-0.1666048765182495,
-0.16408702731132507,
-0.07337754219770432,
-0.19485078752040863,
0.015149692073464394,
-0.06928764283657074,
0.0659552663564682,
0.04232475534081459,
0.08752357214689255,
-0.03814927488565445,
0.1463927924633026,
-0.03246748819947243,
-0.08104819059371948,
0.06635263562202454,
-0.08812516182661057,
-0.007778864353895187,
-0.10336380451917648,
0.08972229063510895,
0.18385326862335205,
-0.020972592756152153,
0.04393840208649635,
0.012481310404837132,
-0.06525552272796631,
0.033201124519109726,
-0.04939660802483559,
-0.04294872656464577,
-0.012119943276047707,
-0.057591915130615234,
0.07397160679101944,
0.10992361605167389,
0.12496514618396759,
-0.06440398842096329,
0.011669496074318886,
0.09755367040634155,
-0.04247864708304405,
-0.16094815731048584,
-0.1733635514974594,
0.18318529427051544,
0.003748884191736579,
0.014789508655667305,
0.002686138032004237,
0.007851018570363522,
-0.04020484909415245,
0.2825332283973694,
0.18324927985668182,
0.059563614428043365,
0.012826792895793915,
-0.029786676168441772,
-0.005901603493839502,
-0.048689913004636765,
0.034100450575351715,
0.11386211216449738,
0.25146931409835815,
-0.03594476357102394,
-0.009292087517678738,
-0.10207930952310562,
-0.09511157125234604,
0.0021017794497311115,
0.011546476744115353,
-0.011822310276329517,
-0.0885186716914177,
0.006961950566619635,
0.12794771790504456,
-0.08269244432449341,
-0.04384868964552879,
-0.07361382991075516,
-0.11277551203966141,
-0.09129603207111359,
-0.021338189020752907,
0.010098461993038654,
0.12366598844528198,
-0.035816412419080734,
-0.07941066473722458,
0.023716086521744728,
0.16693784296512604,
-0.00869529414921999,
-0.07424300909042358,
-0.0690951943397522,
0.061985697597265244,
-0.06259357184171677,
-0.0019082570215687156,
0.00015548350347671658,
0.14287817478179932,
-0.0028697282541543245,
0.09470538049936295,
-0.005948425270617008,
0.14739985764026642,
-0.03776947408914566,
-0.04769185557961464,
-0.01156795583665371,
0.11805087327957153,
-0.025735987350344658,
0.13217996060848236,
-0.01920563355088234,
-0.1134713739156723,
0.05648990347981453,
-0.1599419265985489,
-0.004886130336672068,
-0.06939271092414856,
0.07645191252231598,
-0.04673000052571297,
0.07969128340482712,
0.08436287939548492,
-0.05261359363794327,
-0.017997117713093758,
-0.06884175539016724,
0.09227371215820312,
-0.032701823860406876,
-0.11301802843809128,
-0.04129788279533386,
-0.22161373496055603,
0.027931496500968933,
-0.05310283973813057,
-0.005926064681261778,
-0.22467835247516632,
-0.035297978669404984,
-0.012028783559799194,
-0.07324603945016861,
0.005950221326202154,
0.016210509464144707,
0.06027805432677269,
0.04335206001996994,
-0.02177397720515728,
-0.06581316143274307,
0.027470489963889122,
0.10712505131959915,
-0.13764086365699768,
-0.11751430481672287
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Assamese
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Assamese using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "as", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-as")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-as")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Assamese test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "as", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-large-xlsr-as")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-large-xlsr-as")
model.to("cuda")
chars_to_ignore_regex = '[\\,\\?\\.\\!\\-\\;\\:\\"\\“\\%\\”\\়\\।]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["sentence"] = re.sub('’ ',' ',batch["sentence"])
batch["sentence"] = re.sub(' ‘',' ',batch["sentence"])
batch["sentence"] = re.sub('’|‘','\'',batch["sentence"])
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Evaluation: run the model on the test set in batches and decode the predictions
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 69.63 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
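The fine-tuning script itself is not included in this card. As a minimal sketch (not the original training code), the splits mentioned above can be loaded with 🤗 Datasets as follows:
```python
from datasets import load_dataset

# Load the Assamese Common Voice splits used for fine-tuning and evaluation
train_dataset = load_dataset("common_voice", "as", split="train+validation")
test_dataset = load_dataset("common_voice", "as", split="test")

print(train_dataset)
print(test_dataset)
```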
|
{"language": "as", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Anurag Singh XLSR Wav2Vec2 Large 53 Assamese", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice as", "type": "common_voice", "args": "as"}, "metrics": [{"type": "wer", "value": 69.63, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-large-xlsr-as
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"as",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"as"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #as #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Assamese
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Assamese using the Common Voice.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Assamese test data of Common Voice.
Test Result: 69.63 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Assamese\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Assamese using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Assamese test data of Common Voice.\n\nTest Result: 69.63 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #as #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Assamese\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Assamese using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Assamese test data of Common Voice.\n\nTest Result: 69.63 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
80,
65,
20,
30,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #as #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Assamese\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Assamese using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\nThe model can be used directly (without a language model) as follows:## Evaluation\nThe model can be evaluated as follows on the Assamese test data of Common Voice.\n\nTest Result: 69.63 %## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.1528865098953247,
0.020260432735085487,
-0.0017886146670207381,
-0.005690727848559618,
0.08979082852602005,
-0.057358674705028534,
0.1790434718132019,
0.07189574837684631,
-0.03507162258028984,
-0.017394760623574257,
0.05292031913995743,
-0.009757526218891144,
0.06373300403356552,
0.0515424981713295,
-0.015378850512206554,
-0.2178380787372589,
-0.01432091649621725,
0.016820687800645828,
0.06228099763393402,
0.1262441873550415,
0.11585337668657303,
-0.08664283901453018,
-0.0007018813630566001,
0.1104540005326271,
-0.14766906201839447,
0.03620205819606781,
0.02603572979569435,
-0.12277321517467499,
0.12568987905979156,
0.07152490317821503,
0.079363152384758,
0.05288267880678177,
0.07152804732322693,
-0.19412142038345337,
0.023359470069408417,
0.016635311767458916,
0.05234520137310028,
0.017490554600954056,
0.028045671060681343,
0.017408546060323715,
-0.007232074625790119,
0.14135144650936127,
-0.025943923741579056,
0.08287325501441956,
-0.06332875043153763,
-0.17109714448451996,
-0.015722285956144333,
0.01635301113128662,
0.11816159635782242,
0.13702431321144104,
-0.0687611773610115,
0.04557722434401512,
-0.14763323962688446,
0.07327130436897278,
0.07240466773509979,
-0.14189279079437256,
0.007416407577693462,
0.024919122457504272,
0.01751246303319931,
0.11938255280256271,
-0.046566810458898544,
0.02522088587284088,
0.04313522204756737,
0.026541385799646378,
0.009311980567872524,
-0.03486761823296547,
-0.16820521652698517,
-0.011741985566914082,
-0.10090058296918869,
-0.013385290279984474,
0.22505955398082733,
-0.0008768679108470678,
-0.08077634871006012,
-0.12408826500177383,
0.006731460336595774,
-0.0303454902023077,
-0.012429463677108288,
-0.06403519213199615,
0.0007599001401104033,
0.042389094829559326,
0.005185065325349569,
-0.019017523154616356,
-0.1314234733581543,
-0.12248200923204422,
-0.00015247921692207456,
0.046990033239126205,
0.005101483315229416,
0.023190641775727272,
-0.14180652797222137,
0.040284596383571625,
-0.08530206233263016,
-0.07211856544017792,
-0.02116214670240879,
0.03144606202840805,
-0.07402253150939941,
0.05345472693443298,
-0.0780114084482193,
-0.18749664723873138,
0.02555232122540474,
-0.056327689439058304,
0.05231170356273651,
0.0422687754034996,
-0.02080175280570984,
0.05770977959036827,
0.036131639033555984,
0.12957896292209625,
-0.04217536002397537,
-0.00727463373914361,
0.0252279881387949,
0.01083722710609436,
-0.04176942631602287,
-0.04057322442531586,
-0.0666130930185318,
-0.026793697848916054,
0.016990533098578453,
0.047429557889699936,
-0.06327628344297409,
-0.031842220574617386,
-0.04461776465177536,
-0.060855623334646225,
0.02481052652001381,
-0.1108313798904419,
-0.06256694346666336,
0.056080058217048645,
0.03040575049817562,
0.10642775148153305,
0.042367011308670044,
0.07294642180204391,
-0.04795949161052704,
-0.01940869726240635,
-0.003588175866752863,
0.03673647716641426,
-0.011781281791627407,
-0.05781007185578346,
-0.017149854451417923,
-0.12075792998075485,
-0.0057595269754529,
-0.10663400590419769,
-0.11602967232465744,
-0.06798261404037476,
-0.007676869630813599,
0.04180121794342995,
-0.0392058789730072,
-0.08792048692703247,
0.009436379186809063,
-0.003867949591949582,
-0.08324064314365387,
0.06053158640861511,
-0.033702053129673004,
0.053295619785785675,
0.037664175033569336,
0.033557746559381485,
0.04093535989522934,
0.10534372925758362,
-0.06876421719789505,
-0.02663162723183632,
0.0367388017475605,
0.1646232306957245,
-0.01970057189464569,
-0.015283148735761642,
-0.08515842258930206,
-0.06195169314742088,
-0.04425406828522682,
0.05043192580342293,
0.05689665302634239,
0.11074375361204147,
-0.2667922377586365,
-0.10453736782073975,
0.17295053601264954,
-0.11830927431583405,
-0.039814021438360214,
0.20478495955467224,
-0.03327362611889839,
0.10067959129810333,
0.13096687197685242,
0.21378949284553528,
0.0937143936753273,
-0.15249976515769958,
0.04967048391699791,
-0.020021051168441772,
0.01931128092110157,
-0.05299922823905945,
0.06885231286287308,
-0.05606035143136978,
-0.05698729678988457,
0.04088254272937775,
-0.047541335225105286,
0.06382954865694046,
-0.031987689435482025,
-0.07366833090782166,
-0.01776810921728611,
-0.09820881485939026,
0.024805823341012,
0.04396974667906761,
0.024864139035344124,
-0.02851978875696659,
-0.04267467185854912,
0.051805224269628525,
0.11248105019330978,
-0.12917551398277283,
0.04214855656027794,
-0.1378500610589981,
0.0548013299703598,
-0.11675289273262024,
-0.03610771894454956,
-0.13476315140724182,
0.18936549127101898,
0.00036417884984984994,
0.10085548460483551,
0.0754648745059967,
0.1325220763683319,
0.015298238955438137,
-0.0035351887345314026,
-0.027378467842936516,
-0.02106180042028427,
0.007186553906649351,
-0.026441438123583794,
-0.04788468778133392,
-0.07725110650062561,
0.010705976746976376,
-0.06310664117336273,
0.12584301829338074,
-0.15145304799079895,
-0.017632056027650833,
-0.013771915808320045,
-0.004045529291033745,
0.011285022832453251,
-0.011547939851880074,
0.057723261415958405,
0.08269187808036804,
0.002647163812071085,
0.021064069122076035,
0.03189460188150406,
0.017246339470148087,
-0.0936114564538002,
0.17622153460979462,
-0.12398022413253784,
-0.014538658782839775,
0.07833771407604218,
-0.05838409438729286,
0.007342614233493805,
0.0031336757820099592,
-0.01487428043037653,
-0.02707335725426674,
-0.07530653476715088,
0.01631791517138481,
0.37442147731781006,
-0.012888775207102299,
0.12664268910884857,
-0.1101643443107605,
0.02023315615952015,
0.03659552335739136,
-0.07234515994787216,
0.04354606568813324,
0.05300574377179146,
0.004612714983522892,
0.015523376874625683,
0.0336417481303215,
-0.029779275879263878,
-0.07455863058567047,
0.23311685025691986,
-0.02822546288371086,
-0.09706805646419525,
0.006779889110475779,
-0.04827474057674408,
-0.03497973829507828,
0.06499073654413223,
-0.1869633048772812,
-0.05180186778306961,
0.03460775315761566,
0.04840761795639992,
0.05776555836200714,
-0.14660879969596863,
0.00980275496840477,
0.03120821714401245,
-0.09896446019411087,
-0.12933935225009918,
0.044391077011823654,
-0.05630970746278763,
0.029315372928977013,
-0.100652776658535,
-0.02381516806781292,
0.01588483527302742,
-0.027376355603337288,
-0.17276768386363983,
0.1379462331533432,
-0.06444849073886871,
-0.2973591089248657,
-0.136410653591156,
-0.007979482412338257,
0.0322263166308403,
0.015058694407343864,
0.09193520992994308,
-0.1565529704093933,
-0.02323203906416893,
-0.06045759469270706,
0.0415896438062191,
0.00032015712349675596,
-0.002555272076278925,
-0.054611336439847946,
-0.007460849359631538,
0.06904413551092148,
-0.1373969167470932,
0.016503024846315384,
-0.06337348371744156,
-0.012854835949838161,
0.00496655935421586,
0.0034506837837398052,
-0.0030886793974786997,
0.20735274255275726,
0.03683307394385338,
0.011948945000767708,
-0.004347220994532108,
0.16855975985527039,
-0.08935487270355225,
-0.028707070276141167,
0.1962030827999115,
-0.027907798066735268,
-0.016012340784072876,
0.10729584842920303,
0.01692749373614788,
-0.057549264281988144,
0.004783019423484802,
0.013515224680304527,
-0.08078311383724213,
-0.26025891304016113,
-0.08394558727741241,
-0.07371428608894348,
-0.09176217019557953,
-0.02870308980345726,
0.01812869682908058,
0.08674918115139008,
0.04993211477994919,
0.00784691795706749,
-0.06008899211883545,
0.007110745180398226,
-0.030764872208237648,
0.08808586001396179,
-0.01657198928296566,
0.12005404382944107,
-0.05161192640662193,
-0.03253816440701485,
0.011515569873154163,
0.017878880724310875,
0.19709312915802002,
0.0700986236333847,
0.0986996442079544,
0.114571712911129,
0.10489597171545029,
0.14963188767433167,
0.03024984709918499,
-0.073182612657547,
-0.024903979152441025,
-0.0028110432904213667,
-0.04895278438925743,
-0.05308323726058006,
0.06556874513626099,
0.15194976329803467,
-0.03702573850750923,
-0.026340443640947342,
-0.01259987149387598,
-0.005768688395619392,
0.24797143042087555,
0.07176461815834045,
-0.20004872977733612,
-0.09308810532093048,
-0.004781767725944519,
-0.06277171522378922,
-0.001199413905851543,
0.06604160368442535,
0.1419733315706253,
-0.08132798969745636,
0.0036296152975410223,
0.0023908622097223997,
0.11118023842573166,
-0.014376522973179817,
-0.0013689928455278277,
-0.10322113335132599,
0.020130015909671783,
-0.0056926957331597805,
0.07689058035612106,
-0.26443377137184143,
0.20203056931495667,
0.01356145367026329,
0.10647766292095184,
-0.04057657718658447,
-0.004244635812938213,
0.012138518504798412,
0.05307544395327568,
0.09264413267374039,
0.02366260252892971,
0.010297437198460102,
-0.06683086603879929,
-0.11444800347089767,
0.06964027136564255,
-0.00023712844995316118,
0.05401415750384331,
0.02654625102877617,
0.017234282568097115,
0.014758662320673466,
0.020978469401597977,
-0.07723776251077652,
-0.17221485078334808,
-0.06548171490430832,
0.018091492354869843,
0.18203264474868774,
0.06919482350349426,
-0.0167482178658247,
-0.09231200069189072,
-0.08180221915245056,
0.04830740764737129,
-0.09796277433633804,
-0.07119333744049072,
-0.06700127571821213,
-0.07148703187704086,
0.11418471485376358,
-0.058140821754932404,
0.02361345663666725,
0.0773995891213417,
0.11218070238828659,
-0.01776326447725296,
-0.01222969964146614,
0.04325694218277931,
-0.1089300811290741,
-0.10120904445648193,
-0.04174872860312462,
0.1651785671710968,
0.12777146697044373,
0.06855551898479462,
0.0530751533806324,
0.004054194316267967,
-0.009056267328560352,
-0.04768858850002289,
0.017634019255638123,
0.11742134392261505,
-0.12092505395412445,
0.027150265872478485,
0.016526680439710617,
-0.14025773108005524,
-0.12084311246871948,
-0.07511624693870544,
0.16712656617164612,
0.05812883749604225,
-0.05644240975379944,
0.2017536461353302,
0.2473340481519699,
-0.10563679784536362,
-0.16658423840999603,
-0.021964024752378464,
0.10801014304161072,
0.13811931014060974,
-0.01486497838050127,
-0.19174300134181976,
0.09225322306156158,
0.0018171954434365034,
-0.03729700669646263,
-0.023167917504906654,
-0.23994985222816467,
-0.15069307386875153,
0.17027004063129425,
-0.03059273585677147,
0.15327130258083344,
-0.018790416419506073,
-0.047634322196245193,
-0.028024567291140556,
0.007858602330088615,
0.03042682074010372,
-0.04461739584803581,
0.11904428899288177,
0.0039715226739645,
0.08526802808046341,
0.04061787202954292,
-0.03371887281537056,
0.09062547981739044,
0.09107239544391632,
-0.012055574916303158,
-0.009519191458821297,
0.0659397691488266,
0.007086535915732384,
0.031740717589855194,
0.11934150010347366,
-0.08823739737272263,
0.05283531919121742,
-0.10516543686389923,
-0.10581111162900925,
-0.06721147894859314,
0.0419272854924202,
0.02623843029141426,
-0.04907679557800293,
0.012464240193367004,
-0.03607174754142761,
0.024726415053009987,
0.014783157967031002,
-0.039324723184108734,
-0.13739825785160065,
0.015996627509593964,
0.12580128014087677,
0.1790490597486496,
-0.0976584404706955,
-0.11534920334815979,
-0.0693245604634285,
-0.02883782796561718,
0.11484652757644653,
-0.14314688742160797,
0.045930612832307816,
0.05925402417778969,
0.048719875514507294,
0.17527706921100616,
0.01138302031904459,
-0.08396743983030319,
0.11915318667888641,
0.04002327099442482,
-0.02877998910844326,
-0.10603334754705429,
-0.05143396556377411,
-0.01259009912610054,
-0.03257916867733002,
0.026020098477602005,
0.11210227757692337,
-0.0931219756603241,
-0.011579860933125019,
-0.0401751846075058,
0.03317514806985855,
-0.1123485118150711,
0.1978043168783188,
0.04761297255754471,
0.057245347648859024,
-0.0995187982916832,
0.027860935777425766,
0.015961239114403725,
-0.04375043138861656,
0.03932415693998337,
-0.021111009642481804,
-0.05365142971277237,
-0.04403000697493553,
-0.07731115818023682,
0.0667734295129776,
0.030540764331817627,
-0.13932864367961884,
-0.07248762995004654,
-0.1107562929391861,
0.018460504710674286,
0.050475768744945526,
0.024532822892069817,
0.006458446383476257,
-0.12168329209089279,
-0.07306098192930222,
-0.09173773229122162,
0.05970705673098564,
0.08282693475484848,
-0.006975740194320679,
-0.11059314012527466,
0.12394474446773529,
0.10690449923276901,
0.05954694002866745,
-0.04143563285470009,
-0.10069718956947327,
0.02284405566751957,
0.09286488592624664,
-0.10252116620540619,
0.037389375269412994,
-0.028239091858267784,
0.008077112026512623,
-0.0100192716345191,
-0.07788123935461044,
-0.03130508214235306,
0.08048994839191437,
-0.07969357818365097,
0.07779571413993835,
0.0046715582720935345,
0.06986957788467407,
-0.0519782230257988,
-0.0010471359128132463,
0.0479152649641037,
-0.04196701943874359,
0.08874505013227463,
0.11150168627500534,
-0.13419830799102783,
0.0978405624628067,
-0.19579622149467468,
-0.02785567194223404,
0.0715692937374115,
0.07291348278522491,
-0.025155695155262947,
-0.12879064679145813,
0.041453178972005844,
0.08584504574537277,
0.0513407401740551,
-0.0290467981249094,
0.07417825609445572,
-0.06681197881698608,
-0.03767896071076393,
-0.06581057608127594,
-0.013424582779407501,
-0.05401257053017616,
0.009044907055795193,
0.0562409870326519,
0.14870725572109222,
0.14591214060783386,
-0.08626382797956467,
0.11037477850914001,
-0.12415945529937744,
0.015128097496926785,
-0.05822628736495972,
-0.01056626532226801,
-0.141254261136055,
-0.08011167496442795,
0.05015005171298981,
-0.05960975959897041,
0.15654559433460236,
0.005455750040709972,
0.009170631878077984,
-0.027676241472363472,
-0.09360373020172119,
0.050505660474300385,
-0.01453846599906683,
0.2792084515094757,
0.04977536201477051,
0.017079250887036324,
-0.006767837796360254,
-0.009533442556858063,
0.01566295139491558,
0.16601724922657013,
-0.03606346994638443,
0.15834903717041016,
0.01609359122812748,
0.0899529755115509,
0.0815599262714386,
-0.04207992926239967,
-0.012771053239703178,
-0.0013391587417572737,
-0.1528712511062622,
0.035770535469055176,
-0.022754602134227753,
0.19559957087039948,
0.16971029341220856,
-0.0909537598490715,
0.09562081098556519,
0.015123049728572369,
-0.10440236330032349,
-0.13906624913215637,
-0.09519307315349579,
-0.08050616830587387,
-0.1169448271393776,
0.023922031745314598,
-0.09566369652748108,
0.016438553109765053,
0.09229305386543274,
0.04027017951011658,
-0.03853776678442955,
0.13845224678516388,
0.05037877708673477,
-0.07731943577528,
0.0936097651720047,
-0.0853651687502861,
-0.0030367490835487843,
-0.08606201410293579,
0.02346675843000412,
0.17897294461727142,
0.004039281513541937,
0.035353757441043854,
-0.00701440405100584,
-0.04384344071149826,
0.037633802741765976,
-0.04707203060388565,
-0.0681428611278534,
-0.013034391216933727,
-0.0348830446600914,
0.08834930509328842,
0.12938182055950165,
0.11558081954717636,
-0.04551037400960922,
0.004511467646807432,
0.06079765409231186,
-0.0211170744150877,
-0.10571524500846863,
-0.1452852338552475,
0.11291768401861191,
0.021224096417427063,
-0.00005983981827739626,
0.01240194495767355,
-0.03560919314622879,
0.0032607833854854107,
0.24494518339633942,
0.1929761916399002,
0.013829121366143227,
0.02469283901154995,
-0.04692947119474411,
-0.009000616148114204,
-0.05146024376153946,
0.08116765320301056,
0.05932862311601639,
0.2263362854719162,
-0.005631448235362768,
0.009079650975763798,
-0.10831484198570251,
-0.08132946491241455,
0.0064275749027729034,
0.06159081310033798,
-0.05229870602488518,
-0.079600490629673,
0.00745621370151639,
0.13824914395809174,
-0.07599670439958572,
-0.06086757779121399,
-0.07574959844350815,
-0.08906073123216629,
-0.105278879404068,
-0.027948100119829178,
-0.0019826788920909166,
0.10388290137052536,
-0.005552858114242554,
-0.07016020268201828,
0.04220694676041603,
0.15495385229587555,
0.0013359930599108338,
-0.07216008007526398,
-0.046930644661188126,
0.06437757611274719,
-0.07567022740840912,
0.035795051604509354,
0.006431409623473883,
0.15758350491523743,
0.019990630447864532,
0.12352640181779861,
-0.015155293978750706,
0.1664123833179474,
-0.01580808311700821,
-0.05213727802038193,
0.02685542404651642,
0.1411142647266388,
-0.010789183899760246,
0.10707572102546692,
0.019708141684532166,
-0.08805841207504272,
0.06020065024495125,
-0.1401873230934143,
-0.030366964638233185,
-0.0934026837348938,
0.05027956888079643,
-0.04602816328406334,
0.08591405302286148,
0.05045216903090477,
-0.06168360263109207,
-0.09128052741289139,
-0.05484192445874214,
0.06029944494366646,
0.006322074681520462,
-0.07285039871931076,
-0.04287486523389816,
-0.24580247700214386,
0.005419491790235043,
-0.002226631622761488,
-0.008115067146718502,
-0.2617309093475342,
-0.01891142688691616,
-0.00029221110162325203,
-0.09329834580421448,
0.037887636572122574,
0.03203235939145088,
0.08594132214784622,
0.034781452268362045,
-0.007252182811498642,
0.007204648107290268,
0.04082803800702095,
0.10686931014060974,
-0.15164650976657867,
-0.1070776879787445
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-xls-r-1b-hi-cv8
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - HI dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6780
- Wer: 0.3670
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1500
- num_epochs: 50.0
- mixed_precision_training: Native AMP
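The original training script is not part of this card; the snippet below is only a sketch of how the hyperparameters listed above map onto 🤗 `TrainingArguments` (the `output_dir` is a placeholder):
```python
from transformers import TrainingArguments

# Sketch only: reconstructs the listed hyperparameters, not the original script
training_args = TrainingArguments(
    output_dir="./wav2vec2-xls-r-1b-hi-cv8",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,  # 8 x 4 = 32 effective train batch size
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1500,
    num_train_epochs=50.0,
    fp16=True,  # mixed precision (native AMP)
)
```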
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 2.514 | 2.07 | 400 | 1.4589 | 0.8531 |
| 1.4289 | 4.15 | 800 | 0.8940 | 0.6475 |
| 1.276 | 6.22 | 1200 | 0.7743 | 0.6089 |
| 1.2213 | 8.29 | 1600 | 0.6919 | 0.4973 |
| 1.1522 | 10.36 | 2000 | 0.6635 | 0.4588 |
| 1.0914 | 12.44 | 2400 | 0.6839 | 0.4586 |
| 1.0499 | 14.51 | 2800 | 0.7151 | 0.4467 |
| 1.0238 | 16.58 | 3200 | 0.6824 | 0.4436 |
| 0.9963 | 18.65 | 3600 | 0.6872 | 0.4437 |
| 0.9728 | 20.73 | 4000 | 0.7047 | 0.4244 |
| 0.9373 | 22.8 | 4400 | 0.6569 | 0.4189 |
| 0.9028 | 24.87 | 4800 | 0.6623 | 0.4094 |
| 0.8759 | 26.94 | 5200 | 0.6723 | 0.4152 |
| 0.8824 | 29.02 | 5600 | 0.6467 | 0.4017 |
| 0.8371 | 31.09 | 6000 | 0.6911 | 0.4080 |
| 0.8205 | 33.16 | 6400 | 0.7145 | 0.4063 |
| 0.7837 | 35.23 | 6800 | 0.7037 | 0.3930 |
| 0.7708 | 37.31 | 7200 | 0.6925 | 0.3840 |
| 0.7359 | 39.38 | 7600 | 0.7034 | 0.3829 |
| 0.7153 | 41.45 | 8000 | 0.7030 | 0.3794 |
| 0.7127 | 43.52 | 8400 | 0.6823 | 0.3761 |
| 0.6884 | 45.6 | 8800 | 0.6854 | 0.3711 |
| 0.6835 | 47.67 | 9200 | 0.6723 | 0.3665 |
| 0.6703 | 49.74 | 9600 | 0.6773 | 0.3668 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2.dev0
- Tokenizers 0.11.0
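### Usage
The card does not ship an inference example; the snippet below is a minimal sketch. It assumes the repository contains the usual `Wav2Vec2Processor` files, and `"audio.wav"` is a placeholder path for any Hindi speech recording:
```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "anuragshas/wav2vec2-xls-r-1b-hi-cv8"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# "audio.wav" is a placeholder for a local Hindi speech recording
speech_array, sampling_rate = torchaudio.load("audio.wav")
speech = torchaudio.functional.resample(speech_array, sampling_rate, 16_000).squeeze().numpy()

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```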
|
{"language": ["hi"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_8_0", "generated_from_trainer"], "datasets": ["common_voice"], "model-index": [{"name": "", "results": []}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-xls-r-1b-hi-cv8
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_8_0",
"generated_from_trainer",
"hi",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"hi"
] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #hi #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us
|
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on the MOZILLA-FOUNDATION/COMMON\_VOICE\_8\_0 - HI dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6780
* Wer: 0.3670
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 8
* eval\_batch\_size: 16
* seed: 42
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1500
* num\_epochs: 50.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2+cu102
* Datasets 1.18.2.dev0
* Tokenizers 0.11.0
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1500\n* num\\_epochs: 50.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #hi #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1500\n* num\\_epochs: 50.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0"
] |
[
79,
160,
4,
39
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #hi #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1500\n* num\\_epochs: 50.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0"
] |
[
-0.12581504881381989,
0.14905059337615967,
-0.003934110514819622,
0.027692783623933792,
0.10925710201263428,
0.00827640201896429,
0.09612789005041122,
0.14952579140663147,
-0.06800848245620728,
0.12426180392503738,
0.09360598027706146,
0.0914701446890831,
0.10014741867780685,
0.1452479064464569,
-0.017437748610973358,
-0.2655702531337738,
0.03411179035902023,
-0.03401593118906021,
-0.09617383778095245,
0.09846083074808121,
0.08823911845684052,
-0.10713690519332886,
0.03694476559758186,
0.01118109654635191,
-0.09319514036178589,
-0.01701459474861622,
-0.039509501308202744,
-0.05962064862251282,
0.08587362617254257,
0.05197112262248993,
0.03511153161525726,
0.028677335008978844,
0.0717841237783432,
-0.27888762950897217,
0.00935172475874424,
0.061223942786455154,
0.03710348159074783,
0.05555365979671478,
0.10353883355855942,
-0.006810595281422138,
0.10032983869314194,
-0.08222223818302155,
0.033733949065208435,
0.05504303798079491,
-0.08175729215145111,
-0.26845699548721313,
-0.10299486666917801,
0.014164460822939873,
0.12938176095485687,
0.0905122235417366,
-0.040858250111341476,
0.043630316853523254,
-0.04855421185493469,
0.07684806734323502,
0.23515523970127106,
-0.2293991595506668,
-0.06157630681991577,
-0.017677387222647667,
0.03625115379691124,
0.03484153375029564,
-0.10228870809078217,
0.008522499352693558,
0.030953917652368546,
0.01066527795046568,
0.07502961158752441,
0.017668886110186577,
0.07458353787660599,
0.00424508610740304,
-0.1302337795495987,
-0.05087660998106003,
0.146489217877388,
0.09462551027536392,
-0.01824060268700123,
-0.11358329653739929,
-0.02533036284148693,
-0.1700238734483719,
-0.05062482878565788,
0.009484353475272655,
0.022608252242207527,
-0.025082336738705635,
-0.06840257346630096,
0.017336638644337654,
-0.03604864701628685,
-0.05795776844024658,
0.030423803254961967,
0.12616893649101257,
0.039468903094530106,
-0.04305490851402283,
0.025570353493094444,
0.06303299963474274,
0.06311339884996414,
-0.15923640131950378,
-0.018997708335518837,
0.04364264756441116,
-0.09239739179611206,
-0.0016307441983371973,
-0.021089239045977592,
0.02019280567765236,
0.0655391737818718,
0.12718138098716736,
0.015441779047250748,
0.09439007937908173,
0.007363465614616871,
0.006135901436209679,
-0.0655495896935463,
0.1441919356584549,
-0.07838049530982971,
-0.12607057392597198,
-0.03200290724635124,
0.10354776680469513,
0.025290491059422493,
-0.02882816456258297,
-0.08662065863609314,
0.0319378562271595,
0.10140345245599747,
0.05023004859685898,
0.016413697972893715,
0.0058157965540885925,
-0.07448060065507889,
-0.028420135378837585,
0.0010833953274413943,
-0.11569973826408386,
0.05312207341194153,
0.04119520261883736,
-0.05853760242462158,
0.01597541943192482,
-0.03985518962144852,
0.030007224529981613,
-0.03939478099346161,
0.08848990499973297,
-0.05202032998204231,
-0.011624563485383987,
-0.06721510738134384,
-0.09571133553981781,
0.03969340771436691,
-0.04937057942152023,
0.0001256803807336837,
-0.06975574046373367,
-0.07911141961812973,
-0.07076894491910934,
0.04728969931602478,
-0.06802212446928024,
-0.06908112019300461,
-0.10039342194795609,
-0.0784250944852829,
0.06948903948068619,
-0.02504405379295349,
0.1456148624420166,
-0.06442836672067642,
0.09157911688089371,
0.010858055204153061,
0.06082522124052048,
0.07267927378416061,
0.07130599766969681,
-0.013464485295116901,
0.04257640615105629,
-0.1387316882610321,
0.10447097569704056,
-0.10356266051530838,
0.05452712997794151,
-0.14753331243991852,
-0.0851534754037857,
-0.003502574283629656,
-0.004514055326581001,
0.11199415475130081,
0.12963594496250153,
-0.1753421127796173,
-0.09776894003152847,
0.14452378451824188,
-0.05272311344742775,
-0.09668293595314026,
0.1428692638874054,
-0.006008296739310026,
-0.05441448837518692,
0.03547825291752815,
0.19063593447208405,
0.10712683945894241,
-0.09997307509183884,
-0.022873716428875923,
-0.05933411791920662,
0.133822500705719,
0.028267258778214455,
0.1073017418384552,
-0.06406284868717194,
0.015566526912152767,
-0.00451172050088644,
-0.006486411672085524,
0.07742992043495178,
-0.08180268108844757,
-0.07147642225027084,
-0.00976675283163786,
-0.07451014220714569,
-0.008743446320295334,
0.0458659827709198,
0.008457114920020103,
-0.0930481031537056,
-0.11895338445901871,
-0.028147900477051735,
0.11125416308641434,
-0.10361798852682114,
0.028691349551081657,
-0.08711935579776764,
0.09659415483474731,
-0.03124961070716381,
0.005002248100936413,
-0.13877259194850922,
0.02308833785355091,
0.04254821315407753,
-0.039972156286239624,
-0.005438150838017464,
-0.06130080670118332,
0.055491864681243896,
0.03490818664431572,
-0.03337996080517769,
-0.06126442179083824,
-0.032790444791316986,
-0.0005042622215114534,
-0.052727241069078445,
-0.24441006779670715,
-0.055412717163562775,
-0.023308025673031807,
0.1656447798013687,
-0.15925295650959015,
-0.010501056909561157,
0.04185132682323456,
0.14472590386867523,
0.024176910519599915,
-0.0582449808716774,
0.01110982894897461,
0.06738446652889252,
-0.03367409482598305,
-0.07736597955226898,
0.025458097457885742,
0.011654617264866829,
-0.10835547000169754,
0.017315154895186424,
-0.11591049283742905,
0.07070028781890869,
0.11165636777877808,
0.02058008313179016,
-0.043016038835048676,
-0.05995895341038704,
-0.03897559642791748,
-0.043356191366910934,
-0.032971229404211044,
-0.003057964611798525,
0.1369948387145996,
0.011403273791074753,
0.09487702697515488,
-0.07771925628185272,
-0.02727639116346836,
0.048679959028959274,
0.026347298175096512,
-0.03451219201087952,
0.13731756806373596,
0.07559805363416672,
-0.03930789604783058,
0.10726011544466019,
0.07299815118312836,
-0.02467450313270092,
0.14038507640361786,
-0.06593499332666397,
-0.08881551772356033,
-0.04092011600732803,
0.014086165465414524,
0.02276713401079178,
0.10213962942361832,
-0.18744419515132904,
-0.024348191916942596,
0.02674541249871254,
0.03342365846037865,
0.01710008829832077,
-0.1622191220521927,
0.009328161366283894,
0.04089365154504776,
-0.07606382668018341,
-0.004771600477397442,
-0.0040010311640799046,
-0.029408326372504234,
0.0795903131365776,
0.024032708257436752,
-0.07207445055246353,
-0.032772406935691833,
-0.03953678160905838,
-0.09414177387952805,
0.15023884177207947,
-0.12607114017009735,
-0.14197255671024323,
-0.10368593037128448,
-0.05342302843928337,
-0.03148120269179344,
-0.005179875995963812,
0.05516936257481575,
-0.10692816227674484,
-0.042845096439123154,
-0.06649941205978394,
0.01694362238049507,
-0.062228865921497345,
0.043400369584560394,
0.054233454167842865,
-0.0021113366819918156,
0.03170151263475418,
-0.08420497924089432,
0.009142422117292881,
-0.004707103129476309,
0.017538055777549744,
-0.011176797561347485,
0.00770838325843215,
0.11346650123596191,
0.15857148170471191,
0.06759239733219147,
0.04499517008662224,
-0.03705725073814392,
0.18755197525024414,
-0.12895949184894562,
-0.0037214860785752535,
0.10580220073461533,
0.03864862397313118,
0.038294509053230286,
0.15653222799301147,
0.03166833519935608,
-0.09483618289232254,
0.01828158088028431,
0.022870054468512535,
-0.012471123598515987,
-0.2180265635251999,
-0.04515431448817253,
-0.08286940306425095,
-0.03253341093659401,
0.09380245953798294,
0.022449851036071777,
-0.008951438590884209,
0.018277423456311226,
-0.027007564902305603,
0.01016679685562849,
0.02555617317557335,
0.051561757922172546,
0.07750115543603897,
0.04613948240876198,
0.10901237279176712,
-0.017140069976449013,
-0.00979834608733654,
0.03887724131345749,
-0.012035186402499676,
0.24092617630958557,
0.01923581212759018,
0.20016001164913177,
0.038417719304561615,
0.15878379344940186,
-0.002503807656466961,
0.03237353637814522,
0.023386338725686073,
0.0028687529265880585,
0.011424405500292778,
-0.05902785807847977,
-0.050325751304626465,
0.04089159891009331,
0.1360708475112915,
0.01704351231455803,
-0.11478762328624725,
0.029007719829678535,
0.028506232425570488,
0.34717974066734314,
0.09608349204063416,
-0.27406349778175354,
-0.06824000924825668,
0.0191474799066782,
-0.05574844405055046,
-0.03666860982775688,
0.03990606591105461,
0.11294405162334442,
-0.05628609284758568,
0.09118422865867615,
-0.038143355399370193,
0.08938492089509964,
-0.0818822905421257,
-0.003534322138875723,
0.05240721255540848,
0.0909726470708847,
0.0011048214510083199,
0.05530429631471634,
-0.2738761007785797,
0.260628879070282,
-0.002648521913215518,
0.0855729877948761,
-0.05257410556077957,
0.04174289107322693,
0.03655915707349777,
-0.022254731506109238,
0.07876192778348923,
0.0014175742398947477,
-0.14045684039592743,
-0.13431952893733978,
-0.11124680936336517,
0.010178442113101482,
0.1212388277053833,
-0.057480502873659134,
0.1081179827451706,
-0.04002949595451355,
-0.03939030319452286,
0.029323343187570572,
-0.04458753764629364,
-0.11911781877279282,
-0.11971033364534378,
0.030961735174059868,
0.04661884158849716,
0.08807901293039322,
-0.07610861212015152,
-0.09256012737751007,
-0.06989738345146179,
0.1508016139268875,
-0.10219478607177734,
-0.010548204183578491,
-0.1310572773218155,
0.07404955476522446,
0.1349087506532669,
-0.06499839574098587,
0.044327571988105774,
0.0127911027520895,
0.1282724142074585,
0.020412368699908257,
-0.015840675681829453,
0.11317571252584457,
-0.08234748989343643,
-0.1797455996274948,
-0.06833518296480179,
0.15930980443954468,
0.025264844298362732,
0.06331528723239899,
-0.017142338678240776,
0.0263588298112154,
-0.00842652190476656,
-0.06959303468465805,
0.09400750696659088,
0.08359963446855545,
0.027967305853962898,
0.06107066199183464,
0.0016361629823222756,
-0.01607203297317028,
-0.0890718400478363,
-0.08616652339696884,
0.12648113071918488,
0.2677333354949951,
-0.07711860537528992,
0.04703358933329582,
0.053465280681848526,
-0.05788277089595795,
-0.15420259535312653,
-0.012736290693283081,
0.11134079098701477,
0.04733389988541603,
-0.0376502200961113,
-0.2108263522386551,
-0.0006320054526440799,
0.063303641974926,
-0.022663865238428116,
0.06593220680952072,
-0.33349668979644775,
-0.12967857718467712,
0.07851514965295792,
0.07014768570661545,
0.00714392215013504,
-0.16012786328792572,
-0.07754949480295181,
-0.04184558615088463,
-0.09456895291805267,
0.033734191209077835,
-0.0030628249514847994,
0.11965636163949966,
0.014246608130633831,
0.004541939124464989,
0.013677691109478474,
-0.044735830277204514,
0.1585366576910019,
0.0061348057352006435,
0.02365930564701557,
-0.015890855342149734,
0.04529735445976257,
-0.026218852028250694,
-0.06556408107280731,
-0.010625110939145088,
-0.06813202798366547,
0.029315387830138206,
-0.1343410760164261,
-0.030884165316820145,
-0.06788114458322525,
0.013533449731767178,
-0.029590468853712082,
-0.021956050768494606,
-0.0430942103266716,
0.04137898609042168,
0.10177896171808243,
0.012882595881819725,
0.13064973056316376,
-0.04947438836097717,
0.11410167813301086,
0.09359904378652573,
0.09355408698320389,
-0.036297962069511414,
-0.0717088058590889,
-0.010303293354809284,
-0.01888788305222988,
0.04197824373841286,
-0.11486804485321045,
0.031623173505067825,
0.1363300383090973,
0.03831484541296959,
0.14396920800209045,
0.04875745624303818,
-0.09008568525314331,
0.022545771673321724,
0.063722625374794,
-0.07212533056735992,
-0.16667766869068146,
-0.0278361514210701,
0.027663148939609528,
-0.10807929188013077,
-0.00984340813010931,
0.10665763169527054,
-0.0251329243183136,
-0.002307885093614459,
0.0048207491636276245,
0.04735575616359711,
-0.024159038439393044,
0.2160431444644928,
0.009318406693637371,
0.0753873735666275,
-0.10549835860729218,
0.08869287371635437,
0.050432778894901276,
-0.14446869492530823,
0.059088513255119324,
0.07245544344186783,
-0.06070573627948761,
-0.014247660525143147,
0.017245395109057426,
0.09767017513513565,
0.04305466637015343,
-0.06269815564155579,
-0.10803069174289703,
-0.14664693176746368,
0.1092926636338234,
0.050661347806453705,
0.02365150675177574,
0.015152832493185997,
-0.015506035648286343,
0.019198812544345856,
-0.09293124824762344,
0.09845227748155594,
0.0827622264623642,
0.0519576370716095,
-0.1299198865890503,
0.07509087026119232,
0.007079576142132282,
-0.00024699512869119644,
-0.00018132540571969002,
-0.01102518755942583,
-0.09840351343154907,
0.023129234090447426,
-0.12364227324724197,
-0.00980220828205347,
-0.08728724718093872,
-0.007719465531408787,
0.01384106557816267,
-0.06430460512638092,
-0.06624289602041245,
0.01715121418237686,
-0.10637122392654419,
-0.06040092185139656,
-0.031787920743227005,
0.06747683137655258,
-0.09262665361166,
-0.014303470961749554,
0.025680290535092354,
-0.12747949361801147,
0.09722111374139786,
0.04737047106027603,
0.027910172939300537,
0.002261996502056718,
-0.07180466502904892,
-0.016093095764517784,
0.03203333169221878,
0.0073216878809034824,
0.035256218165159225,
-0.2019205093383789,
-0.008542400784790516,
-0.00834133941680193,
0.005911712069064379,
-0.013630809262394905,
0.041906263679265976,
-0.1113128811120987,
-0.025089949369430542,
-0.05428434535861015,
-0.03409541770815849,
-0.04685330390930176,
0.06985504180192947,
0.1004861369729042,
0.016336074098944664,
0.1508035808801651,
-0.07607194036245346,
0.047731418162584305,
-0.20187368988990784,
0.01232083234935999,
-0.03323311358690262,
-0.06440754979848862,
-0.05168928578495979,
-0.027506927028298378,
0.10014837235212326,
-0.05022364482283592,
0.07227945327758789,
-0.045320894569158554,
0.05545607581734657,
0.027002064511179924,
-0.10943286865949631,
0.01692396216094494,
0.050389278680086136,
0.19154568016529083,
0.05965016037225723,
-0.021036595106124878,
0.08124502748250961,
0.004013242665678263,
0.08446609228849411,
0.17888487875461578,
0.12371518462896347,
0.14327850937843323,
0.10241007059812546,
0.10841245949268341,
0.06264303624629974,
-0.14294815063476562,
-0.14689625799655914,
0.15865196287631989,
-0.06991815567016602,
0.15442143380641937,
-0.0042677102610468864,
0.17906172573566437,
0.10398240387439728,
-0.19822414219379425,
0.050952110439538956,
-0.03647950664162636,
-0.07540193945169449,
-0.10499171167612076,
-0.07856034487485886,
-0.08974621444940567,
-0.19734086096286774,
0.010872294194996357,
-0.10841754823923111,
0.06447672843933105,
0.03479371219873428,
0.04562890902161598,
0.03926800936460495,
0.0652977004647255,
0.04513201862573624,
-0.010583343915641308,
0.12224241346120834,
0.004384253174066544,
-0.03943861648440361,
-0.04778779670596123,
-0.12330316007137299,
0.05095669627189636,
-0.04303640127182007,
0.0748099610209465,
-0.01737038791179657,
-0.11030595749616623,
0.08083135634660721,
0.018894385546445847,
-0.10703282058238983,
0.026478959247469902,
-0.026489300653338432,
0.05355824902653694,
0.12077385187149048,
0.04599975794553757,
-0.01346792932599783,
0.004659808240830898,
0.20342841744422913,
-0.0935494601726532,
-0.03769427165389061,
-0.1385430544614792,
0.16725599765777588,
0.006327938288450241,
0.015545872040092945,
0.022354794666171074,
-0.08143932372331619,
-0.026106655597686768,
0.16423948109149933,
0.12874431908130646,
-0.017451297491788864,
-0.034201931208372116,
0.02590087242424488,
-0.010588441044092178,
-0.04148639366030693,
0.07197766751050949,
0.1246812716126442,
0.0398096963763237,
-0.03708750754594803,
-0.025761863216757774,
-0.05016811564564705,
-0.06788484752178192,
-0.02771211601793766,
0.07126988470554352,
0.008796901442110538,
-0.024275582283735275,
-0.0035039950162172318,
0.11834751814603806,
-0.05732842534780502,
-0.1426493525505066,
0.05211793631315231,
-0.18205153942108154,
-0.1829376220703125,
-0.026342177763581276,
0.055856913328170776,
0.054158151149749756,
0.05021965503692627,
0.0026170266792178154,
-0.017858756706118584,
0.1333499699831009,
0.006475887261331081,
-0.05468737334012985,
-0.10582215338945389,
0.061342500150203705,
-0.10951730608940125,
0.16212047636508942,
-0.03883736580610275,
0.011556899175047874,
0.13232235610485077,
0.08549455553293228,
-0.09539875388145447,
0.03418298810720444,
0.08547583967447281,
-0.09866963326931,
0.06572014838457108,
0.17176000773906708,
-0.04584519937634468,
0.15161144733428955,
0.06051815673708916,
-0.06280618906021118,
0.022671755403280258,
-0.08830307424068451,
-0.029095035046339035,
-0.05641356483101845,
0.007540278136730194,
-0.05458102375268936,
0.14192405343055725,
0.16128596663475037,
-0.07245856523513794,
-0.016836371272802353,
-0.02598516270518303,
0.013203429989516735,
0.014974771998822689,
0.14247030019760132,
-0.03662915527820587,
-0.2756684124469757,
0.013248711824417114,
-0.0014249883824959397,
0.03514883294701576,
-0.2033657282590866,
-0.07109708338975906,
0.014132922515273094,
-0.053082894533872604,
-0.06812240183353424,
0.12493054568767548,
0.05524551123380661,
0.028439387679100037,
-0.07684466987848282,
-0.12213516235351562,
-0.01863153465092182,
0.18384167551994324,
-0.1643519401550293,
-0.06306464970111847
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# XLS-R-1B - Hindi
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - HI dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6921
- Wer: 0.3547
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1500
- num_epochs: 50.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 2.0674 | 2.07 | 400 | 1.3411 | 0.8835 |
| 1.324 | 4.15 | 800 | 0.9311 | 0.7142 |
| 1.2023 | 6.22 | 1200 | 0.8060 | 0.6170 |
| 1.1573 | 8.29 | 1600 | 0.7415 | 0.4972 |
| 1.1117 | 10.36 | 2000 | 0.7248 | 0.4588 |
| 1.0672 | 12.44 | 2400 | 0.6729 | 0.4350 |
| 1.0336 | 14.51 | 2800 | 0.7117 | 0.4346 |
| 1.0025 | 16.58 | 3200 | 0.7019 | 0.4272 |
| 0.9578 | 18.65 | 3600 | 0.6792 | 0.4118 |
| 0.9272 | 20.73 | 4000 | 0.6863 | 0.4156 |
| 0.9321 | 22.8 | 4400 | 0.6535 | 0.3972 |
| 0.8802 | 24.87 | 4800 | 0.6766 | 0.3906 |
| 0.844 | 26.94 | 5200 | 0.6782 | 0.3949 |
| 0.8387 | 29.02 | 5600 | 0.6916 | 0.3921 |
| 0.8042 | 31.09 | 6000 | 0.6806 | 0.3797 |
| 0.793 | 33.16 | 6400 | 0.7120 | 0.3831 |
| 0.7567 | 35.23 | 6800 | 0.6862 | 0.3808 |
| 0.7463 | 37.31 | 7200 | 0.6893 | 0.3709 |
| 0.7053 | 39.38 | 7600 | 0.7096 | 0.3701 |
| 0.6906 | 41.45 | 8000 | 0.6921 | 0.3676 |
| 0.6891 | 43.52 | 8400 | 0.7167 | 0.3663 |
| 0.658 | 45.6 | 8800 | 0.6833 | 0.3580 |
| 0.6576 | 47.67 | 9200 | 0.6914 | 0.3569 |
| 0.6358 | 49.74 | 9600 | 0.6922 | 0.3551 |
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.1.dev0
- Tokenizers 0.11.0
#### Evaluation Commands
1. To evaluate on `mozilla-foundation/common_voice_8_0` with split `test`
```bash
python eval.py --model_id anuragshas/wav2vec2-xls-r-1b-hi-with-lm --dataset mozilla-foundation/common_voice_8_0 --config hi --split test
```
### Inference With LM
```python
import torch
from datasets import load_dataset
from transformers import AutoModelForCTC, AutoProcessor
import torchaudio.functional as F
model_id = "anuragshas/wav2vec2-xls-r-1b-hi-with-lm"
sample_iter = iter(load_dataset("mozilla-foundation/common_voice_8_0", "hi", split="test", streaming=True, use_auth_token=True))
sample = next(sample_iter)
resampled_audio = F.resample(torch.tensor(sample["audio"]["array"]), 48_000, 16_000).numpy()
model = AutoModelForCTC.from_pretrained(model_id)
processor = AutoProcessor.from_pretrained(model_id)
input_values = processor(resampled_audio, return_tensors="pt").input_values
with torch.no_grad():
logits = model(input_values).logits
transcription = processor.batch_decode(logits.numpy()).text
# => "तुम्हारे पास तीन महीने बचे हैं"
```
### Eval results on Common Voice 8 "test" (WER):
| Without LM | With LM (run `./eval.py`) |
|---|---|
| 26.209 | 15.899 |
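As a toy illustration of the metric behind these numbers, the snippet below computes a word error rate with the `jiwer` package; `jiwer` is one common choice for this metric and is not necessarily what `eval.py` uses internally, and the reference/hypothesis pair is invented for demonstration only.
```python
# Toy WER computation; the sentences are made up and not taken from the test set.
from jiwer import wer

reference = "तुम्हारे पास तीन महीने बचे हैं"
hypothesis = "तुम्हारे पास तीन महिने बचे है"  # two word-level errors out of six words

print(f"WER: {wer(reference, hypothesis):.3f}")  # -> 0.333
```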
|
{"language": ["hi"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "generated_from_trainer", "hf-asr-leaderboard", "mozilla-foundation/common_voice_8_0", "robust-speech-event"], "datasets": ["mozilla-foundation/common_voice_8_0"], "metrics": ["wer"], "model-index": [{"name": "XLS-R-1B - Hindi", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 8", "type": "mozilla-foundation/common_voice_8_0", "args": "hi"}, "metrics": [{"type": "wer", "value": 15.899, "name": "Test WER"}, {"type": "cer", "value": 5.83, "name": "Test CER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-xls-r-1b-hi-with-lm
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"hf-asr-leaderboard",
"mozilla-foundation/common_voice_8_0",
"robust-speech-event",
"hi",
"dataset:mozilla-foundation/common_voice_8_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"hi"
] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #hi #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us
|
XLS-R-1B - Hindi
================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on the MOZILLA-FOUNDATION/COMMON\_VOICE\_8\_0 - HI dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6921
* Wer: 0.3547
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 8
* eval\_batch\_size: 16
* seed: 42
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1500
* num\_epochs: 50.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.16.0.dev0
* Pytorch 1.10.1+cu102
* Datasets 1.17.1.dev0
* Tokenizers 0.11.0
#### Evaluation Commands
1. To evaluate on 'mozilla-foundation/common\_voice\_8\_0' with split 'test'
### Inference With LM
### Eval results on Common Voice 8 "test" (WER):
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1500\n* num\\_epochs: 50.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'",
"### Inference With LM",
"### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #hi #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1500\n* num\\_epochs: 50.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'",
"### Inference With LM",
"### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
115,
160,
4,
41,
36,
8,
15
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #hi #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1500\n* num\\_epochs: 50.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'### Inference With LM### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
-0.11647762358188629,
0.11110296845436096,
-0.006923345848917961,
0.034047938883304596,
0.09928243607282639,
0.02202165313065052,
0.09480245411396027,
0.15545539557933807,
-0.060491133481264114,
0.11799735575914383,
0.06128925457596779,
0.07180213183164597,
0.08363039791584015,
0.07988263666629791,
-0.015501366928219795,
-0.261199414730072,
0.017859619110822678,
-0.05098573490977287,
-0.10831557959318161,
0.10183307528495789,
0.1041400283575058,
-0.09117501229047775,
0.02301456220448017,
0.011693247593939304,
-0.05534813553094864,
0.0032506275456398726,
-0.04843370243906975,
-0.03854483738541603,
0.07240414619445801,
0.043465353548526764,
0.032242294400930405,
0.039313264191150665,
0.07212073355913162,
-0.28213733434677124,
0.008361371234059334,
0.06818490475416183,
0.035288628190755844,
0.05075554922223091,
0.13146652281284332,
-0.015571888536214828,
0.10911215841770172,
-0.07055559754371643,
0.027710044756531715,
0.07006652653217316,
-0.09858459234237671,
-0.24161306023597717,
-0.09080450981855392,
0.03206489607691765,
0.14063866436481476,
0.0823233351111412,
-0.04733194783329964,
0.023710984736680984,
-0.09653705358505249,
0.0901661142706871,
0.2166670560836792,
-0.20294521749019623,
-0.06978266686201096,
-0.021449146792292595,
0.043808672577142715,
0.0066373529843986034,
-0.10768360644578934,
-0.010494884103536606,
0.010401592589914799,
0.015944750979542732,
0.05689998343586922,
0.0028659470845013857,
0.023947857320308685,
0.0082553057000041,
-0.13465768098831177,
-0.06045537814497948,
0.11751498281955719,
0.08152364939451218,
-0.02391749806702137,
-0.10779877752065659,
-0.0019828493241220713,
-0.17210830748081207,
-0.038107823580503464,
0.02508772350847721,
0.014411249198019505,
-0.032916534692049026,
-0.010374133475124836,
0.04423966258764267,
-0.04327473044395447,
-0.07510995119810104,
0.06734825670719147,
0.10656572133302689,
0.05088093876838684,
-0.05363153666257858,
0.020566755905747414,
0.09236497431993484,
0.0628383532166481,
-0.17884580790996552,
-0.03943093121051788,
0.03309369459748268,
-0.12569129467010498,
-0.013705002143979073,
-0.02359125018119812,
0.007837231270968914,
0.08054941147565842,
0.15493011474609375,
-0.01667224057018757,
0.096016064286232,
0.00751467514783144,
0.0106339817866683,
-0.06439631432294846,
0.15262678265571594,
-0.07058734446763992,
-0.08256474137306213,
-0.05359964817762375,
0.12712569534778595,
-0.025495214387774467,
-0.0035534861963242292,
-0.04053448885679245,
0.023907095193862915,
0.10581803321838379,
0.07386089116334915,
0.006365598179399967,
0.02092876471579075,
-0.0744224265217781,
-0.015834569931030273,
-0.011130208149552345,
-0.14136290550231934,
0.051878783851861954,
0.08540559560060501,
-0.07258440554141998,
-0.018083950504660606,
-0.01654413342475891,
0.0005420800880528986,
-0.05932315066456795,
0.08479722589254379,
-0.03385292738676071,
-0.002378486329689622,
-0.0798955038189888,
-0.08965600281953812,
0.04248184710741043,
-0.024650540202856064,
-0.005340518895536661,
-0.051340874284505844,
-0.07085785269737244,
-0.08740025758743286,
0.048799820244312286,
-0.07077101618051529,
-0.0576806515455246,
-0.06760768592357635,
-0.10347606986761093,
0.043039288371801376,
-0.01959429308772087,
0.17339642345905304,
-0.04821953549981117,
0.07579614222049713,
0.03886047378182411,
0.0357997789978981,
0.14743591845035553,
0.06915412843227386,
-0.00979479681700468,
0.06767749786376953,
-0.11852330714464188,
0.10880259424448013,
-0.12957918643951416,
0.05358633026480675,
-0.14459629356861115,
-0.10384159535169601,
0.0018503571627661586,
0.0037678293883800507,
0.10866637527942657,
0.14189332723617554,
-0.15879733860492706,
-0.07265021651983261,
0.13507460057735443,
-0.04225463047623634,
-0.0879383310675621,
0.13404375314712524,
-0.00501563074067235,
-0.05469875410199165,
0.009165181778371334,
0.1648467630147934,
0.1483316719532013,
-0.08845855295658112,
-0.0034220486413687468,
-0.06294505298137665,
0.11060217767953873,
0.10090433806180954,
0.09949095547199249,
-0.05260999873280525,
0.04370517283678055,
-0.0014537777751684189,
-0.04027233645319939,
0.040160611271858215,
-0.0792226642370224,
-0.07907677441835403,
-0.00032812292920425534,
-0.061753012239933014,
0.007646523416042328,
0.056490615010261536,
-0.01194713357836008,
-0.08259012550115585,
-0.14607161283493042,
-0.013259642757475376,
0.0998392328619957,
-0.08203838765621185,
0.010059148073196411,
-0.0951838418841362,
0.06889910250902176,
0.009776161052286625,
0.010460028424859047,
-0.1464136838912964,
-0.009339631535112858,
0.03345361351966858,
-0.057516228407621384,
0.010743601247668266,
-0.03354448452591896,
0.06663545221090317,
0.03500281274318695,
-0.024537069723010063,
-0.0691337138414383,
-0.05132218077778816,
-0.000957897980697453,
-0.03846295177936554,
-0.23286959528923035,
-0.08878638595342636,
-0.014671297743916512,
0.16723552346229553,
-0.20024539530277252,
0.011847707442939281,
0.07875996083021164,
0.13034652173519135,
0.0004152171895839274,
-0.0410451702773571,
0.027237797155976295,
0.04980245232582092,
-0.02093580923974514,
-0.06599194556474686,
0.01481319684535265,
-0.002472040941938758,
-0.08035305142402649,
0.004002427216619253,
-0.11733344197273254,
0.04807571694254875,
0.07104503363370895,
-0.006064616143703461,
-0.07761926203966141,
-0.024873604997992516,
-0.05835062265396118,
-0.053928349167108536,
-0.027408210560679436,
-0.030727539211511612,
0.16439662873744965,
0.02636038511991501,
0.09232380986213684,
-0.07019060105085373,
-0.052269354462623596,
0.02534809522330761,
0.015799498185515404,
-0.0070631252601742744,
0.17365948855876923,
0.04281000792980194,
-0.02798451855778694,
0.08460505306720734,
0.022823728621006012,
-0.055311620235443115,
0.12297601252794266,
-0.08424200117588043,
-0.09398337453603745,
-0.04978204146027565,
0.046271707862615585,
0.039141107350587845,
0.08731885999441147,
-0.18637016415596008,
-0.010262965224683285,
0.029275694862008095,
0.020066341385245323,
0.027684299275279045,
-0.16486364603042603,
0.005620242562144995,
0.0377538688480854,
-0.08656872808933258,
-0.008486640639603138,
0.015557992272078991,
-0.0035978262312710285,
0.07106947898864746,
0.0038053945172578096,
-0.07150530070066452,
-0.04084936901926994,
-0.06280015408992767,
-0.09725449979305267,
0.15856751799583435,
-0.08743088692426682,
-0.12921321392059326,
-0.11667412519454956,
-0.0033594712149351835,
-0.02236935868859291,
-0.024123279377818108,
0.04777410626411438,
-0.0951492190361023,
-0.04363385960459709,
-0.07283996045589447,
0.010916763916611671,
-0.018340446054935455,
0.009761841036379337,
0.03373793140053749,
0.011088619939982891,
0.07344081997871399,
-0.09584421664476395,
0.012899907305836678,
0.005115651059895754,
-0.029327286407351494,
-0.0029734105337411165,
0.027349799871444702,
0.09363618493080139,
0.1557529866695404,
0.058000851422548294,
0.04936061426997185,
-0.010215147398412228,
0.1985761970281601,
-0.1401386857032776,
0.006212166510522366,
0.11974051594734192,
0.0036830510944128036,
0.04089096933603287,
0.14109618961811066,
0.03600287809967995,
-0.0756692886352539,
0.013519384898245335,
0.04809551313519478,
-0.0025474121794104576,
-0.25276148319244385,
-0.018201295286417007,
-0.07771214097738266,
-0.019221680238842964,
0.08462589979171753,
0.0350445993244648,
0.0053499191999435425,
0.00844530202448368,
-0.024970319122076035,
-0.04388945922255516,
0.06178152188658714,
0.043984659016132355,
0.06607785075902939,
0.038927093148231506,
0.0930766612291336,
-0.01591932587325573,
-0.026424212381243706,
0.011233486235141754,
-0.008812782354652882,
0.21036764979362488,
-0.0029292376711964607,
0.19361454248428345,
0.06802715361118317,
0.12897635996341705,
-0.024600980803370476,
0.03210270404815674,
-0.00497515220195055,
0.003604692639783025,
0.04271433874964714,
-0.061871856451034546,
-0.013695143163204193,
0.026401212438941002,
0.1175965890288353,
0.017747322097420692,
-0.09250850230455399,
0.006718344520777464,
0.04458778724074364,
0.3370321989059448,
0.08257509022951126,
-0.24631856381893158,
-0.06331708282232285,
0.013699449598789215,
-0.07173454016447067,
-0.029990138486027718,
0.019631536677479744,
0.11336640268564224,
-0.07872918993234634,
0.0803058072924614,
-0.04084627702832222,
0.09625759720802307,
-0.04639337211847305,
0.016283947974443436,
0.09539805352687836,
0.10080847889184952,
0.015049793757498264,
0.07409783452749252,
-0.2627180218696594,
0.234893336892128,
-0.01822741888463497,
0.06906187534332275,
-0.04959675297141075,
0.057489026337862015,
0.03961324691772461,
-0.028483020141720772,
0.08527370542287827,
-0.006029720418155193,
-0.09775390475988388,
-0.13907983899116516,
-0.07970892637968063,
0.005104864947497845,
0.1244339719414711,
-0.06493892520666122,
0.12383944541215897,
-0.03941692039370537,
-0.054195377975702286,
0.02536998875439167,
-0.035974808037281036,
-0.12826970219612122,
-0.08608845621347427,
0.060877725481987,
-0.00844448059797287,
0.08387000113725662,
-0.08212588727474213,
-0.08221045881509781,
-0.1021670252084732,
0.1531876027584076,
-0.14597657322883606,
-0.02044171653687954,
-0.1316269040107727,
0.05117063969373703,
0.16382409632205963,
-0.06340215355157852,
0.018246451392769814,
0.027996163815259933,
0.13666723668575287,
0.04132372513413429,
0.002748785074800253,
0.09739097207784653,
-0.077512726187706,
-0.21256113052368164,
-0.037155866622924805,
0.1829882711172104,
0.022702883929014206,
0.06771624833345413,
-0.014963651075959206,
0.014758342877030373,
-0.005939824506640434,
-0.0782305896282196,
0.08576299250125885,
0.02640390954911709,
-0.003877979703247547,
0.06771010160446167,
-0.03463493287563324,
-0.0016528385458514094,
-0.09789476543664932,
-0.0529034286737442,
0.08484657853841782,
0.23678049445152283,
-0.07243049889802933,
0.028968332335352898,
0.0023553436622023582,
-0.0761498212814331,
-0.15143004059791565,
-0.009028592146933079,
0.12991254031658173,
0.04662467911839485,
-0.00613498454913497,
-0.18605975806713104,
0.004156759474426508,
0.060525406152009964,
-0.014199813827872276,
0.08616559952497482,
-0.33119329810142517,
-0.12784834206104279,
0.07765407115221024,
0.048174045979976654,
-0.10048414021730423,
-0.18027746677398682,
-0.07616501301527023,
-0.015995802357792854,
-0.08487933874130249,
0.009133781306445599,
-0.02596924640238285,
0.11052098870277405,
0.010915243998169899,
0.002356699900701642,
0.027419958263635635,
-0.04800541698932648,
0.16561275720596313,
0.027381010353565216,
0.04003608971834183,
-0.01711115427315235,
0.021241167560219765,
0.014356784522533417,
-0.06582298129796982,
0.025438182055950165,
-0.08078301697969437,
0.017318084836006165,
-0.15481950342655182,
-0.018759259954094887,
-0.08876743912696838,
0.012922627851366997,
-0.06310369819402695,
0.0021924974862486124,
-0.01147044450044632,
0.05974915623664856,
0.09756194055080414,
0.03131881728768349,
0.09663305431604385,
-0.0817272886633873,
0.12000973522663116,
0.16387711465358734,
0.11881626397371292,
0.0346781387925148,
-0.11286625266075134,
0.0065210130997002125,
0.0274571031332016,
0.020836759358644485,
-0.11011964827775955,
0.053559496998786926,
0.1477217674255371,
0.04371440410614014,
0.14814245700836182,
0.03865966573357582,
-0.10275976359844208,
-0.00979517586529255,
0.06396012008190155,
-0.061040472239255905,
-0.15165230631828308,
-0.025674523785710335,
0.01518646813929081,
-0.1253003478050232,
-0.02646065503358841,
0.10709124803543091,
-0.02756846509873867,
0.0021430475171655416,
0.02504311315715313,
0.0653160810470581,
-0.04519207030534744,
0.22819483280181885,
0.02328546531498432,
0.11477954685688019,
-0.082282654941082,
0.06434325128793716,
0.05239809304475784,
-0.08450532704591751,
0.036486752331256866,
0.12499581277370453,
-0.03430072218179703,
-0.04100104048848152,
0.0008081356645561755,
0.07766479253768921,
0.07895767688751221,
-0.05328414961695671,
-0.12857817113399506,
-0.16024120151996613,
0.09764266014099121,
0.05606260150671005,
0.023146426305174828,
0.021346522495150566,
0.005821682047098875,
0.025137515738606453,
-0.0843423381447792,
0.12348805367946625,
0.10365880280733109,
0.05863189697265625,
-0.114360012114048,
0.09766152501106262,
-0.0019310207571834326,
0.01166612096130848,
0.008421674370765686,
-0.010813776403665543,
-0.10600307583808899,
0.04005070775747299,
-0.14169257879257202,
-0.004651534836739302,
-0.05244933068752289,
0.0001900489442050457,
0.019338039681315422,
-0.05902847275137901,
-0.05386154726147652,
0.028730805963277817,
-0.11084499955177307,
-0.05528993159532547,
-0.03239773213863373,
0.08571421355009079,
-0.09532357007265091,
-0.01659972220659256,
0.04090670868754387,
-0.14417831599712372,
0.09313816577196121,
0.024660412222146988,
0.003611529478803277,
0.0007911360589787364,
-0.0885658785700798,
-0.01617063209414482,
0.0016462508356198668,
0.017738938331604004,
0.02295679785311222,
-0.21649210155010223,
0.0025029487442225218,
-0.03504330292344093,
0.0001308078208239749,
-0.01875961385667324,
-0.006154153496026993,
-0.11901108920574188,
0.009579923003911972,
-0.033963337540626526,
-0.04671879857778549,
-0.04685882478952408,
0.06257182359695435,
0.06764866411685944,
0.015316043049097061,
0.15262065827846527,
-0.06689246743917465,
0.08623069524765015,
-0.22349222004413605,
0.005585895851254463,
-0.0060888067819178104,
-0.051538702100515366,
-0.024722278118133545,
-0.02170197293162346,
0.11136071383953094,
-0.06730424612760544,
0.06858479231595993,
-0.02558407001197338,
0.05024787038564682,
0.029297469183802605,
-0.11233416199684143,
0.024789800867438316,
0.07084046304225922,
0.12070348113775253,
0.030358608812093735,
-0.01908240094780922,
0.05994490161538124,
-0.043166857212781906,
0.035077523440122604,
0.09600260108709335,
0.14119966328144073,
0.16741468012332916,
0.0847867801785469,
0.06297095865011215,
0.10641064494848251,
-0.14922918379306793,
-0.10400957614183426,
0.1522679328918457,
-0.06324773281812668,
0.1421859711408615,
-0.03978312015533447,
0.15864653885364532,
0.10524784028530121,
-0.20306512713432312,
0.09025786817073822,
-0.059550583362579346,
-0.08631882816553116,
-0.08555137366056442,
-0.09123209863901138,
-0.07244787365198135,
-0.15670588612556458,
0.017945509403944016,
-0.09925758093595505,
0.07354458421468735,
0.04879934713244438,
0.04775020480155945,
0.021313905715942383,
0.09657827019691467,
0.051745809614658356,
-0.007025516126304865,
0.12385986000299454,
-0.005004288163036108,
-0.01619274541735649,
-0.02959013544023037,
-0.0809750184416771,
0.05477948486804962,
-0.024782981723546982,
0.0681968405842781,
-0.01758175529539585,
-0.09653957933187485,
0.0445031002163887,
0.01304691657423973,
-0.10389399528503418,
0.03725782781839371,
-0.035755813121795654,
0.044772762805223465,
0.10020934045314789,
0.04224482178688049,
-0.010042982175946236,
-0.006538036745041609,
0.17798547446727753,
-0.07744908332824707,
-0.06721528619527817,
-0.13270384073257446,
0.16285602748394012,
0.01033739186823368,
0.007652033120393753,
0.020321257412433624,
-0.07395162433385849,
-0.015460502356290817,
0.16563084721565247,
0.13051657378673553,
-0.02113177441060543,
-0.01957862079143524,
0.021576229482889175,
0.0005060231196694076,
-0.01823762059211731,
0.053973693400621414,
0.1156570091843605,
0.04928091540932655,
-0.027558736503124237,
0.0032240834552794695,
-0.02258364111185074,
-0.073183573782444,
-0.034395020455121994,
0.08182282745838165,
0.025052424520254135,
-0.0018695570761337876,
-0.008876234292984009,
0.12905268371105194,
-0.08525194972753525,
-0.17129109799861908,
0.0378132127225399,
-0.16359177231788635,
-0.19029048085212708,
-0.05124805122613907,
0.06074310466647148,
0.04918394237756729,
0.06518236547708511,
-0.00042001742986030877,
-0.048280440270900726,
0.11825579404830933,
0.0093064671382308,
-0.005786054767668247,
-0.0926087275147438,
0.06837614625692368,
-0.13425278663635254,
0.17195387184619904,
-0.03575555607676506,
0.04076836258172989,
0.13347174227237701,
0.02748221904039383,
-0.09970042109489441,
0.0237639881670475,
0.10155518352985382,
-0.1376573145389557,
0.057490572333335876,
0.19100916385650635,
-0.028994984924793243,
0.12743046879768372,
0.04106254130601883,
-0.06962849199771881,
0.010425216518342495,
-0.054869066923856735,
-0.019304467365145683,
-0.07416830211877823,
0.009569709189236164,
-0.04734070599079132,
0.11946567893028259,
0.21823418140411377,
-0.07731697708368301,
-0.00774413300678134,
-0.0449831448495388,
0.013776847161352634,
0.00392957404255867,
0.14441797137260437,
-0.04609157145023346,
-0.25919151306152344,
0.037962719798088074,
-0.011308208107948303,
0.042222268879413605,
-0.18306802213191986,
-0.08088912814855576,
0.04073181375861168,
-0.0564042329788208,
-0.055761147290468216,
0.12383685261011124,
0.07873521745204926,
0.04676463454961777,
-0.05240265280008316,
-0.1271238476037979,
-0.018489427864551544,
0.18991026282310486,
-0.16693902015686035,
-0.054538339376449585
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-xls-r-1b-hi-cv7
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the MOZILLA-FOUNDATION/COMMON_VOICE_7_0 - HI dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5878
- Wer: 0.3419
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
Fine-tuning and evaluation used the Hindi (`hi`) subset of the Mozilla Common Voice 7.0 dataset (`mozilla-foundation/common_voice_7_0`).
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7.5e-05
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- num_epochs: 100.0
- mixed_precision_training: Native AMP
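A hedged sketch of how these values could be expressed with `transformers.TrainingArguments` follows; it is an illustration rather than the author's actual training script, and the output directory is a placeholder.
```python
# Illustrative mapping of the hyperparameters listed above onto TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./wav2vec2-xls-r-1b-hi-cv7",  # placeholder path
    learning_rate=7.5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,            # 8 x 4 = effective batch size of 32
    seed=42,
    lr_scheduler_type="linear",               # Adam betas/epsilon at the library defaults
    warmup_steps=2000,
    num_train_epochs=100.0,
    fp16=True,                                # "Native AMP" mixed precision
)
```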
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 1.9859 | 2.72 | 400 | 1.1663 | 0.7948 |
| 1.2969 | 5.44 | 800 | 0.7725 | 0.6562 |
| 1.1954 | 8.16 | 1200 | 0.5940 | 0.4904 |
| 1.164 | 10.88 | 1600 | 0.5338 | 0.4316 |
| 1.1464 | 13.6 | 2000 | 0.5432 | 0.4226 |
| 1.1553 | 16.33 | 2400 | 0.5471 | 0.4260 |
| 1.0985 | 19.05 | 2800 | 0.5290 | 0.4076 |
| 1.0421 | 21.77 | 3200 | 0.5672 | 0.4181 |
| 0.9831 | 24.49 | 3600 | 0.5741 | 0.4141 |
| 0.9827 | 27.21 | 4000 | 0.5754 | 0.4179 |
| 0.9669 | 29.93 | 4400 | 0.5310 | 0.3889 |
| 0.9496 | 32.65 | 4800 | 0.5649 | 0.4062 |
| 0.9112 | 35.37 | 5200 | 0.5738 | 0.3926 |
| 0.8838 | 38.1 | 5600 | 0.5232 | 0.3768 |
| 0.8666 | 40.81 | 6000 | 0.5510 | 0.3852 |
| 0.8366 | 43.54 | 6400 | 0.5436 | 0.3837 |
| 0.7957 | 46.26 | 6800 | 0.5337 | 0.3775 |
| 0.7834 | 48.98 | 7200 | 0.5611 | 0.3844 |
| 0.7685 | 51.7 | 7600 | 0.5710 | 0.4008 |
| 0.7431 | 54.42 | 8000 | 0.5636 | 0.3726 |
| 0.7353 | 57.14 | 8400 | 0.5937 | 0.3836 |
| 0.7001 | 59.86 | 8800 | 0.5815 | 0.3858 |
| 0.6799 | 62.58 | 9200 | 0.5862 | 0.3696 |
| 0.6459 | 65.31 | 9600 | 0.6181 | 0.3762 |
| 0.6121 | 68.03 | 10000 | 0.5637 | 0.3590 |
| 0.5942 | 70.75 | 10400 | 0.6374 | 0.3882 |
| 0.5769 | 73.47 | 10800 | 0.6015 | 0.3640 |
| 0.5689 | 76.19 | 11200 | 0.5669 | 0.3508 |
| 0.5461 | 78.91 | 11600 | 0.5967 | 0.3621 |
| 0.5286 | 81.63 | 12000 | 0.5840 | 0.3605 |
| 0.5057 | 84.35 | 12400 | 0.5848 | 0.3489 |
| 0.482 | 87.07 | 12800 | 0.5860 | 0.3488 |
| 0.4655 | 89.79 | 13200 | 0.5780 | 0.3453 |
| 0.4523 | 92.52 | 13600 | 0.6150 | 0.3532 |
| 0.4422 | 95.24 | 14000 | 0.5930 | 0.3452 |
| 0.4436 | 97.96 | 14400 | 0.5867 | 0.3428 |
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.1.dev0
- Tokenizers 0.11.0
#### Evaluation Commands
1. To evaluate on `mozilla-foundation/common_voice_7_0` with split `test`
```bash
python eval.py --model_id anuragshas/wav2vec2-xls-r-1b-hi --dataset mozilla-foundation/common_voice_7_0 --config hi --split test
```
### Inference With LM
```python
import torch
from datasets import load_dataset
from transformers import AutoModelForCTC, AutoProcessor
import torchaudio.functional as F
model_id = "anuragshas/wav2vec2-xls-r-1b-hi"
sample_iter = iter(load_dataset("mozilla-foundation/common_voice_7_0", "hi", split="test", streaming=True, use_auth_token=True))
sample = next(sample_iter)
resampled_audio = F.resample(torch.tensor(sample["audio"]["array"]), 48_000, 16_000).numpy()
model = AutoModelForCTC.from_pretrained(model_id)
processor = AutoProcessor.from_pretrained(model_id)
input_values = processor(resampled_audio, return_tensors="pt").input_values
with torch.no_grad():
    logits = model(input_values).logits
transcription = processor.batch_decode(logits.numpy()).text
# => "तुम्हारे पास तीन महीने बचे हैं"
```
### Eval results on Common Voice 7 "test" (WER):
| Without LM | With LM (run `./eval.py`) |
|---|---|
| 28.942 | 18.504 |
|
{"language": ["hi"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "generated_from_trainer", "hf-asr-leaderboard", "mozilla-foundation/common_voice_7_0", "robust-speech-event"], "datasets": ["mozilla-foundation/common_voice_7_0"], "metrics": ["wer"], "model-index": [{"name": "wav2vec2-xls-r-1b-hi-cv7", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice 7", "type": "mozilla-foundation/common_voice_7_0", "args": "hi"}, "metrics": [{"type": "wer", "value": 18.504, "name": "Test WER"}, {"type": "cer", "value": 6.655, "name": "Test CER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-xls-r-1b-hi
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"hf-asr-leaderboard",
"mozilla-foundation/common_voice_7_0",
"robust-speech-event",
"hi",
"dataset:mozilla-foundation/common_voice_7_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"hi"
] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_7_0 #robust-speech-event #hi #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
wav2vec2-xls-r-1b-hi-cv7
========================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on the MOZILLA-FOUNDATION/COMMON\_VOICE\_7\_0 - HI dataset.
It achieves the following results on the evaluation set:
* Loss: 0.5878
* Wer: 0.3419
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7.5e-05
* train\_batch\_size: 8
* eval\_batch\_size: 16
* seed: 42
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 2000
* num\_epochs: 100.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.16.0.dev0
* Pytorch 1.10.1+cu102
* Datasets 1.17.1.dev0
* Tokenizers 0.11.0
#### Evaluation Commands
1. To evaluate on 'mozilla-foundation/common\_voice\_7\_0' with split 'test'
### Inference With LM
### Eval results on Common Voice 7 "test" (WER):
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 100.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_7\\_0' with split 'test'",
"### Inference With LM",
"### Eval results on Common Voice 7 \"test\" (WER):"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_7_0 #robust-speech-event #hi #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 100.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_7\\_0' with split 'test'",
"### Inference With LM",
"### Eval results on Common Voice 7 \"test\" (WER):"
] |
[
111,
160,
4,
41,
36,
8,
15
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_7_0 #robust-speech-event #hi #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 100.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_7\\_0' with split 'test'### Inference With LM### Eval results on Common Voice 7 \"test\" (WER):"
] |
[
-0.1261293888092041,
0.11707478761672974,
-0.00637839874252677,
0.038008321076631546,
0.10004960000514984,
0.02172403782606125,
0.10867225378751755,
0.1630273163318634,
-0.05854635685682297,
0.10391257703304291,
0.053083181381225586,
0.0864708349108696,
0.0786508247256279,
0.07813471555709839,
-0.013902106322348118,
-0.2805381119251251,
0.015009494498372078,
-0.034140463918447495,
-0.08211314678192139,
0.1171901673078537,
0.10099446773529053,
-0.08742373436689377,
0.01674528419971466,
0.026938753202557564,
-0.06142932549118996,
-0.0015686945989727974,
-0.04313536733388901,
-0.04438157007098198,
0.07972133159637451,
0.038540054112672806,
0.04020345211029053,
0.042247168719768524,
0.06289159506559372,
-0.26748159527778625,
0.008732558228075504,
0.0664428249001503,
0.03934742137789726,
0.0570002943277359,
0.13942942023277283,
-0.033931881189346313,
0.11373703926801682,
-0.043317414820194244,
0.03231315687298775,
0.07440999150276184,
-0.08761142939329147,
-0.2582537829875946,
-0.09351903945207596,
0.056034933775663376,
0.14285072684288025,
0.0878838300704956,
-0.04724782332777977,
0.03547852858901024,
-0.10683480650186539,
0.10267411172389984,
0.23489180207252502,
-0.213027685880661,
-0.06407120078802109,
-0.024112313985824585,
0.04216775670647621,
-0.004550404846668243,
-0.10413547605276108,
-0.028612563386559486,
0.0026251468807458878,
0.013369954191148281,
0.053916025906801224,
-0.001685835886746645,
-0.0006531492690555751,
-0.000045469601900549605,
-0.14288921654224396,
-0.06477085500955582,
0.1158900111913681,
0.06904205679893494,
-0.0240163616836071,
-0.1181468516588211,
-0.010837242007255554,
-0.20394837856292725,
-0.04173742234706879,
0.026583323255181313,
0.010882367379963398,
-0.038448285311460495,
-0.01946749910712242,
0.03853416442871094,
-0.048904795199632645,
-0.08322750777006149,
0.056701574474573135,
0.11872829496860504,
0.06117679178714752,
-0.0486878827214241,
0.017269717529416084,
0.11975359916687012,
0.046578794717788696,
-0.17438672482967377,
-0.03987828642129898,
0.0392581969499588,
-0.12300504744052887,
-0.007215464487671852,
-0.01866338402032852,
-0.005588276777416468,
0.06859903782606125,
0.1577158272266388,
-0.024938318878412247,
0.08929499983787537,
0.00851595588028431,
0.018663275986909866,
-0.06814061850309372,
0.17320692539215088,
-0.07639003545045853,
-0.06724586337804794,
-0.04348163306713104,
0.13178040087223053,
-0.019505124539136887,
0.000983086065389216,
-0.04572254791855812,
0.0251929871737957,
0.11997153609991074,
0.08026225864887238,
0.014331341721117496,
0.02817048877477646,
-0.06873904168605804,
-0.02697279304265976,
0.011460752226412296,
-0.13857042789459229,
0.04324786365032196,
0.08520667254924774,
-0.08444982767105103,
-0.018632326275110245,
-0.007020824588835239,
0.0007887910469435155,
-0.06353524327278137,
0.10245954245328903,
-0.039658863097429276,
0.007512690033763647,
-0.08190778642892838,
-0.08806615322828293,
0.040830712765455246,
-0.018009496852755547,
-0.017504248768091202,
-0.042960673570632935,
-0.08936504274606705,
-0.08220844715833664,
0.03888843208551407,
-0.059855904430150986,
-0.06700392067432404,
-0.0640224739909172,
-0.10482525825500488,
0.03874046355485916,
-0.024779515340924263,
0.15908920764923096,
-0.05678331479430199,
0.08985074609518051,
0.037264853715896606,
0.025147899985313416,
0.1491752564907074,
0.07194546610116959,
-0.030969340354204178,
0.06223113462328911,
-0.13992682099342346,
0.11762254685163498,
-0.12799975275993347,
0.04354667663574219,
-0.12984535098075867,
-0.11679089069366455,
0.0029178657568991184,
-0.004005243070423603,
0.09494990855455399,
0.13959619402885437,
-0.15602420270442963,
-0.08424464613199234,
0.14971159398555756,
-0.05068488046526909,
-0.08093827962875366,
0.13862080872058868,
-0.0028650725726038218,
-0.05275808647274971,
0.016551148146390915,
0.176937073469162,
0.1623377501964569,
-0.0903322622179985,
0.0013336723204702139,
-0.05576314777135849,
0.09444048255681992,
0.08914504945278168,
0.09556885063648224,
-0.0533997043967247,
0.04674949124455452,
0.00603087805211544,
-0.04669070988893509,
0.044437311589717865,
-0.08020885288715363,
-0.07615390419960022,
-0.0019782709423452616,
-0.06297822296619415,
0.019194655120372772,
0.06327956169843674,
-0.009825803339481354,
-0.08159905672073364,
-0.153669074177742,
-0.009882042184472084,
0.11316145956516266,
-0.08461141586303711,
0.01312922965735197,
-0.09501196444034576,
0.08213382959365845,
0.012666295282542706,
0.003873721696436405,
-0.16111120581626892,
-0.017987897619605064,
0.0419025681912899,
-0.07348006963729858,
0.017921147868037224,
-0.03477170690894127,
0.05967191606760025,
0.03378340229392052,
-0.02964983880519867,
-0.08034002780914307,
-0.039708830416202545,
-0.018061945214867592,
-0.03687554597854614,
-0.2210928201675415,
-0.09380663931369781,
-0.008675708435475826,
0.17979906499385834,
-0.1799650490283966,
0.01870371215045452,
0.06976305693387985,
0.12326733767986298,
0.000006482929165940732,
-0.042889829725027084,
0.03423911705613136,
0.05874870717525482,
-0.017209503799676895,
-0.058775659650564194,
0.02245517447590828,
0.00372649566270411,
-0.07850735634565353,
0.007644413039088249,
-0.12482515722513199,
0.04984306916594505,
0.0723346546292305,
0.009845531545579433,
-0.08189862966537476,
-0.024918505921959877,
-0.0685657188296318,
-0.05872292444109917,
-0.025103673338890076,
-0.023271847516298294,
0.18007732927799225,
0.03299267962574959,
0.10251761972904205,
-0.08002364635467529,
-0.060808390378952026,
0.013929907232522964,
0.009985877200961113,
0.006765172351151705,
0.16717679798603058,
0.03804576024413109,
-0.02519233152270317,
0.09201294183731079,
-0.017130671069025993,
-0.0728345513343811,
0.14052081108093262,
-0.08361857384443283,
-0.09240073710680008,
-0.043982598930597305,
0.05673642084002495,
0.03355257213115692,
0.08196477591991425,
-0.1969560980796814,
-0.003609142266213894,
0.03468422591686249,
0.014853236265480518,
0.03381756693124771,
-0.18321779370307922,
0.004826354794204235,
0.02782776951789856,
-0.09336750954389572,
-0.011089800857007504,
0.028788050636649132,
0.011332951486110687,
0.07606659829616547,
-0.003307875245809555,
-0.08263479173183441,
-0.03955909609794617,
-0.05968111753463745,
-0.10070302337408066,
0.16921895742416382,
-0.08074967563152313,
-0.15134736895561218,
-0.12881454825401306,
-0.017103271558880806,
-0.0451081246137619,
-0.029228273779153824,
0.05631472170352936,
-0.0838775709271431,
-0.036262642592191696,
-0.07040036469697952,
0.018602605909109116,
-0.009454983286559582,
0.0011657841969281435,
-0.0006276954081840813,
0.00780128687620163,
0.07307452708482742,
-0.10885711014270782,
0.013057938776910305,
-0.006637414917349815,
-0.03634590283036232,
0.0005475946818478405,
0.03844403475522995,
0.08518277853727341,
0.18591386079788208,
0.06208715960383415,
0.05011671036481857,
-0.016540082171559334,
0.19232116639614105,
-0.14184758067131042,
-0.0034241736866533756,
0.12591983377933502,
-0.0008561916183680296,
0.04852011799812317,
0.14644268155097961,
0.03907868266105652,
-0.064580038189888,
0.003316447837278247,
0.05345558747649193,
-0.002928201574832201,
-0.2564150094985962,
-0.028108619153499603,
-0.07464661449193954,
-0.005915592424571514,
0.0726311057806015,
0.03902025148272514,
0.03980952873826027,
0.009396156296133995,
-0.03420140966773033,
-0.059001486748456955,
0.0644293874502182,
0.048000581562519073,
0.07678564637899399,
0.036406729370355606,
0.10605219751596451,
-0.01815151982009411,
-0.030215714126825333,
0.006100092548877001,
-0.00817917287349701,
0.21555468440055847,
-0.01951454021036625,
0.1880548894405365,
0.07730648666620255,
0.14485542476177216,
-0.005117911845445633,
0.03221764788031578,
-0.01353844441473484,
0.006925211753696203,
0.05373992770910263,
-0.060169536620378494,
-0.013592417351901531,
0.011919303797185421,
0.11502823233604431,
0.026914061978459358,
-0.0908270850777626,
-0.014596809633076191,
0.04032832756638527,
0.36336156725883484,
0.07631389051675797,
-0.25802159309387207,
-0.07211195677518845,
0.0015605274820700288,
-0.08426724374294281,
-0.03183078393340111,
0.015538671985268593,
0.09248729795217514,
-0.0985705778002739,
0.0794735997915268,
-0.04992802441120148,
0.09606362879276276,
-0.06661295890808105,
0.006779099814593792,
0.10187800973653793,
0.09995125234127045,
0.01825002208352089,
0.07590239495038986,
-0.23766623437404633,
0.24341097474098206,
-0.019978133961558342,
0.062038175761699677,
-0.045057475566864014,
0.061009377241134644,
0.04347332939505577,
-0.013713536784052849,
0.08155589550733566,
-0.012538379989564419,
-0.0380215123295784,
-0.14898914098739624,
-0.08865857869386673,
-0.006978804245591164,
0.11715047061443329,
-0.06575583666563034,
0.12044893205165863,
-0.036068305373191833,
-0.059942636638879776,
0.01841248758137226,
-0.06041033938527107,
-0.1136835515499115,
-0.07936941832304001,
0.06939256936311722,
-0.0022077863104641438,
0.0906635969877243,
-0.08469048887491226,
-0.09518757462501526,
-0.09337178617715836,
0.16222767531871796,
-0.13773389160633087,
-0.04277826473116875,
-0.12070275843143463,
0.0374053493142128,
0.17182718217372894,
-0.06987971812486649,
0.022201647982001305,
0.014381947927176952,
0.13562120497226715,
0.04252851381897926,
-0.0019387712236493826,
0.08830856531858444,
-0.07389946281909943,
-0.21840323507785797,
-0.029710719361901283,
0.20766957104206085,
0.0210868027061224,
0.06949100643396378,
-0.02629946731030941,
0.008834782987833023,
-0.005669636186212301,
-0.0756130963563919,
0.058275461196899414,
0.024420328438282013,
-0.010032642632722855,
0.0716601088643074,
-0.03767796978354454,
-0.015601718798279762,
-0.11125055700540543,
-0.0394965298473835,
0.09937019646167755,
0.23065632581710815,
-0.07315453886985779,
0.037672076374292374,
0.011599074117839336,
-0.06776012480258942,
-0.14715568721294403,
-0.01112213172018528,
0.143909752368927,
0.03475426882505417,
-0.022328605875372887,
-0.1866733878850937,
0.004607328213751316,
0.06885860860347748,
-0.020147064700722694,
0.10200227051973343,
-0.35638731718063354,
-0.13367073237895966,
0.08581183105707169,
0.04401353374123573,
-0.10742741078138351,
-0.17463842034339905,
-0.08087731897830963,
-0.0001803580962587148,
-0.08407757431268692,
0.01134869921952486,
0.0004910766147077084,
0.12098187208175659,
0.004021850414574146,
0.0196810495108366,
0.029869481921195984,
-0.048542387783527374,
0.15251962840557098,
0.030929800122976303,
0.03933292254805565,
-0.013460684567689896,
0.02190445363521576,
0.017553282901644707,
-0.0614926815032959,
0.04131700098514557,
-0.06822652369737625,
0.01762021891772747,
-0.15293613076210022,
-0.023167122155427933,
-0.10375689715147018,
0.021178942173719406,
-0.057475220412015915,
0.0002135486574843526,
-0.016858944669365883,
0.050091564655303955,
0.08194240927696228,
0.026251230388879776,
0.09544848650693893,
-0.08265107870101929,
0.12843601405620575,
0.1590811312198639,
0.11433945596218109,
0.013843463733792305,
-0.1499493271112442,
-0.005555939394980669,
0.02466689608991146,
0.01972898095846176,
-0.10094240307807922,
0.04461102932691574,
0.14342878758907318,
0.050904855132102966,
0.14958351850509644,
0.03906533494591713,
-0.1039426401257515,
-0.004677079617977142,
0.0578923225402832,
-0.07317349314689636,
-0.14533951878547668,
-0.031188521534204483,
-0.002703820588067174,
-0.1270124763250351,
-0.009121943265199661,
0.09630788117647171,
-0.024654820561408997,
-0.006462517194449902,
0.029949726536870003,
0.054720908403396606,
-0.04007642716169357,
0.23055902123451233,
0.019088273867964745,
0.11656099557876587,
-0.09507651627063751,
0.059587977826595306,
0.05499579757452011,
-0.08425729721784592,
0.017040474340319633,
0.1308424323797226,
-0.04397771134972572,
-0.04093456640839577,
-0.019497869536280632,
0.07665888220071793,
0.06451909989118576,
-0.054475799202919006,
-0.1346447765827179,
-0.1525888890028,
0.09331482648849487,
0.0752546563744545,
0.029394686222076416,
0.02346086874604225,
-0.004610549192875624,
0.02556937374174595,
-0.0896996483206749,
0.12817105650901794,
0.11549761146306992,
0.06124730780720711,
-0.12090793997049332,
0.10864308476448059,
0.01643792726099491,
0.02134622260928154,
0.011245446279644966,
-0.02404003217816353,
-0.09091512113809586,
0.04479698836803436,
-0.13188061118125916,
-0.017430560663342476,
-0.052636198699474335,
0.002862381050363183,
0.014588752761483192,
-0.056792423129081726,
-0.05328189209103584,
0.036057643592357635,
-0.11906153708696365,
-0.054584212601184845,
-0.031080398708581924,
0.07711243629455566,
-0.1018141582608223,
-0.008971065282821655,
0.04295332357287407,
-0.13879691064357758,
0.10161301493644714,
0.046074479818344116,
-0.001666210824623704,
0.00586398970335722,
-0.07890846580266953,
-0.02107171341776848,
0.01601671800017357,
0.01984858512878418,
0.021095359697937965,
-0.21078015863895416,
0.012027274817228317,
-0.028564002364873886,
-0.003028491046279669,
-0.014298975467681885,
0.006229595746845007,
-0.131602481007576,
0.007746330462396145,
-0.022139495238661766,
-0.04414639249444008,
-0.051813747733831406,
0.058042194694280624,
0.07337643206119537,
0.0234531257301569,
0.15982972085475922,
-0.06492874026298523,
0.09068673849105835,
-0.22773408889770508,
0.0009790081530809402,
-0.005537423305213451,
-0.05753077194094658,
-0.03008902445435524,
-0.016001004725694656,
0.10723717510700226,
-0.06285800039768219,
0.0714832991361618,
-0.017180008813738823,
0.05622877925634384,
0.022521477192640305,
-0.08680613338947296,
0.05102775618433952,
0.06491982936859131,
0.10942203551530838,
0.026971066370606422,
-0.019915558397769928,
0.07617708295583725,
-0.03990177437663078,
0.027913188561797142,
0.07235430926084518,
0.14155972003936768,
0.14573156833648682,
0.08005262911319733,
0.05272864177823067,
0.10919545590877533,
-0.13622888922691345,
-0.12022079527378082,
0.11807336658239365,
-0.06669013202190399,
0.14402681589126587,
-0.035977624356746674,
0.1934157758951187,
0.10568560659885406,
-0.19067765772342682,
0.10623956471681595,
-0.0381670817732811,
-0.08725760132074356,
-0.0884300246834755,
-0.1027771383523941,
-0.06759189814329147,
-0.166054829955101,
0.018682066351175308,
-0.09785163402557373,
0.0654999166727066,
0.04933418333530426,
0.04360009729862213,
0.014551151543855667,
0.11301320046186447,
0.04039294272661209,
-0.017035167664289474,
0.11923598498106003,
-0.0018144057830795646,
-0.01673227734863758,
-0.036548126488924026,
-0.07194402068853378,
0.059901464730501175,
-0.025563854724168777,
0.07071024179458618,
-0.021230418235063553,
-0.09902848303318024,
0.048499833792448044,
0.001712856930680573,
-0.10094700008630753,
0.03699998930096626,
-0.03054680861532688,
0.056118108332157135,
0.10352456569671631,
0.04025917127728462,
-0.003994214814156294,
-0.009511752054095268,
0.18853797018527985,
-0.07802432030439377,
-0.07799796760082245,
-0.12983882427215576,
0.19439451396465302,
0.006534167565405369,
-0.00820245873183012,
0.028260517865419388,
-0.07374241203069687,
-0.02152557298541069,
0.18094956874847412,
0.1445324867963791,
-0.0041080620139837265,
-0.020157720893621445,
0.022077854722738266,
-0.003723687259480357,
-0.02277051843702793,
0.05254369601607323,
0.12611675262451172,
0.0919649675488472,
-0.026513051241636276,
0.005545536521822214,
-0.024545952677726746,
-0.07031707465648651,
-0.03433309867978096,
0.0896398201584816,
0.01947048120200634,
-0.008317526429891586,
-0.017346732318401337,
0.12954829633235931,
-0.10071618854999542,
-0.15447036921977997,
0.03219841048121452,
-0.16502203047275543,
-0.19649304449558258,
-0.04767460748553276,
0.07154404371976852,
0.04106166958808899,
0.06278639286756516,
0.005002337042242289,
-0.049775391817092896,
0.1257769614458084,
0.005562067497521639,
-0.019254550337791443,
-0.11105631291866302,
0.06610177457332611,
-0.1377331167459488,
0.17794077098369598,
-0.03453165292739868,
0.05355637148022652,
0.1316116899251938,
0.02929413877427578,
-0.09658455103635788,
0.03774302080273628,
0.09804505109786987,
-0.15835827589035034,
0.05382581427693367,
0.19918552041053772,
-0.017295299097895622,
0.1224069595336914,
0.029024209827184677,
-0.07509126514196396,
0.015419253148138523,
-0.07294028997421265,
-0.014791717752814293,
-0.0853281319141388,
0.00794987753033638,
-0.04747932031750679,
0.11090364307165146,
0.22488273680210114,
-0.07088860124349594,
-0.00869623851031065,
-0.05607958883047104,
0.00019757906557060778,
0.017283644527196884,
0.12665244936943054,
-0.041867971420288086,
-0.2751823365688324,
0.043902330100536346,
-0.016708407551050186,
0.03570779412984848,
-0.19519861042499542,
-0.08061248064041138,
0.0442165732383728,
-0.06761720776557922,
-0.055950865149497986,
0.11426270753145218,
0.055953964591026306,
0.04906381294131279,
-0.04167528077960014,
-0.10338317602872849,
-0.021553777158260345,
0.1897404044866562,
-0.1833736002445221,
-0.055102188140153885
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# XLS-R-300M - Latvian
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - LV dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1660
- Wer: 0.1705
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
Fine-tuning and evaluation used the Latvian (`lv`) subset of the Mozilla Common Voice 8.0 dataset (`mozilla-foundation/common_voice_8_0`).
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7.5e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 50.0
- mixed_precision_training: Native AMP
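The sketch below shows one possible `transformers.TrainingArguments` equivalent of this configuration; it is only an illustration (the output directory is a placeholder, and no gradient accumulation is used here).
```python
# Illustrative mapping of the hyperparameters listed above onto TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./wav2vec2-xls-r-300m-lv",  # placeholder path
    learning_rate=7.5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",             # Adam betas/epsilon at the library defaults
    warmup_steps=1000,
    num_train_epochs=50.0,
    fp16=True,                              # "Native AMP" mixed precision
)
```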
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 3.489 | 2.56 | 400 | 3.3590 | 1.0 |
| 2.9903 | 5.13 | 800 | 2.9704 | 1.0001 |
| 1.6712 | 7.69 | 1200 | 0.6179 | 0.6566 |
| 1.2635 | 10.26 | 1600 | 0.3176 | 0.4531 |
| 1.0819 | 12.82 | 2000 | 0.2517 | 0.3508 |
| 1.0136 | 15.38 | 2400 | 0.2257 | 0.3124 |
| 0.9625 | 17.95 | 2800 | 0.1975 | 0.2311 |
| 0.901 | 20.51 | 3200 | 0.1986 | 0.2097 |
| 0.8842 | 23.08 | 3600 | 0.1904 | 0.2039 |
| 0.8542 | 25.64 | 4000 | 0.1847 | 0.1981 |
| 0.8244 | 28.21 | 4400 | 0.1805 | 0.1847 |
| 0.7689 | 30.77 | 4800 | 0.1736 | 0.1832 |
| 0.7825 | 33.33 | 5200 | 0.1698 | 0.1821 |
| 0.7817 | 35.9 | 5600 | 0.1758 | 0.1803 |
| 0.7488 | 38.46 | 6000 | 0.1663 | 0.1760 |
| 0.7171 | 41.03 | 6400 | 0.1636 | 0.1721 |
| 0.7222 | 43.59 | 6800 | 0.1663 | 0.1729 |
| 0.7156 | 46.15 | 7200 | 0.1633 | 0.1715 |
| 0.7121 | 48.72 | 7600 | 0.1666 | 0.1718 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2.dev0
- Tokenizers 0.11.0
#### Evaluation Commands
1. To evaluate on `mozilla-foundation/common_voice_8_0` with split `test`
```bash
python eval.py --model_id anuragshas/wav2vec2-xls-r-300m-lv-cv8-with-lm --dataset mozilla-foundation/common_voice_8_0 --config lv --split test
```
2. To evaluate on `speech-recognition-community-v2/dev_data`
```bash
python eval.py --model_id anuragshas/wav2vec2-xls-r-300m-lv-cv8-with-lm --dataset speech-recognition-community-v2/dev_data --config lv --split validation --chunk_length_s 5.0 --stride_length_s 1.0
```
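The `--chunk_length_s 5.0 --stride_length_s 1.0` options correspond to chunked inference over long recordings. A hedged sketch of the equivalent call through the `transformers` ASR pipeline is shown below; the audio file name is a placeholder, and this illustrates the chunking parameters rather than the internals of `eval.py`.
```python
# Sketch of chunked long-audio inference matching the dev_data options above.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="anuragshas/wav2vec2-xls-r-300m-lv-cv8-with-lm",
)
# Decode in 5 s chunks with 1 s of striding on each side, as in the eval command.
result = asr("example_audio.wav", chunk_length_s=5.0, stride_length_s=1.0)  # placeholder file
print(result["text"])
```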
### Inference With LM
```python
import torch
from datasets import load_dataset
from transformers import AutoModelForCTC, AutoProcessor
import torchaudio.functional as F
model_id = "anuragshas/wav2vec2-xls-r-300m-lv-cv8-with-lm"
sample_iter = iter(load_dataset("mozilla-foundation/common_voice_8_0", "lv", split="test", streaming=True, use_auth_token=True))
sample = next(sample_iter)
resampled_audio = F.resample(torch.tensor(sample["audio"]["array"]), 48_000, 16_000).numpy()
model = AutoModelForCTC.from_pretrained(model_id)
processor = AutoProcessor.from_pretrained(model_id)
input_values = processor(resampled_audio, return_tensors="pt").input_values
with torch.no_grad():
    logits = model(input_values).logits
transcription = processor.batch_decode(logits.numpy()).text
# => "domāju ka viņam viss labi"
```
### Eval results on Common Voice 8 "test" (WER):
| Without LM | With LM (run `./eval.py`) |
|---|---|
| 16.997 | 9.633 |
|
{"language": ["lv"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_8_0", "generated_from_trainer", "robust-speech-event", "hf-asr-leaderboard"], "datasets": ["mozilla-foundation/common_voice_8_0"], "model-index": [{"name": "XLS-R-300M - Latvian", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 8", "type": "mozilla-foundation/common_voice_8_0", "args": "lv"}, "metrics": [{"type": "wer", "value": 9.633, "name": "Test WER"}, {"type": "cer", "value": 2.614, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "lv"}, "metrics": [{"type": "wer", "value": 36.11, "name": "Test WER"}, {"type": "cer", "value": 14.244, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Test Data", "type": "speech-recognition-community-v2/eval_data", "args": "lv"}, "metrics": [{"type": "wer", "value": 44.12, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-xls-r-300m-lv-cv8-with-lm
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_8_0",
"generated_from_trainer",
"robust-speech-event",
"hf-asr-leaderboard",
"lv",
"dataset:mozilla-foundation/common_voice_8_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"lv"
] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #lv #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
XLS-R-300M - Latvian
====================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the MOZILLA-FOUNDATION/COMMON\_VOICE\_8\_0 - LV dataset.
It achieves the following results on the evaluation set:
* Loss: 0.1660
* Wer: 0.1705
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7.5e-05
* train\_batch\_size: 32
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1000
* num\_epochs: 50.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2+cu102
* Datasets 1.18.2.dev0
* Tokenizers 0.11.0
#### Evaluation Commands
1. To evaluate on 'mozilla-foundation/common\_voice\_8\_0' with split 'test'
2. To evaluate on 'speech-recognition-community-v2/dev\_data'
### Inference With LM
### Eval results on Common Voice 8 "test" (WER):
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 50.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'",
"### Inference With LM",
"### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #lv #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 50.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'",
"### Inference With LM",
"### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
111,
132,
4,
39,
60,
8,
15
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #lv #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 50.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'### Inference With LM### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
-0.10391377657651901,
0.13912668824195862,
-0.006456209812313318,
0.013675927184522152,
0.10876055806875229,
0.03072018176317215,
0.10691812634468079,
0.16422303020954132,
-0.035633061081171036,
0.13999101519584656,
0.05331495404243469,
0.10876161605119705,
0.09155011177062988,
0.08730428665876389,
-0.03089730255305767,
-0.20925195515155792,
0.03905564546585083,
-0.07134860008955002,
-0.06900003552436829,
0.10152678936719894,
0.09289854764938354,
-0.08219046145677567,
0.027111846953630447,
-0.010704039596021175,
-0.05134036764502525,
-0.010895626619458199,
-0.0345572791993618,
-0.04242913797497749,
0.06091396510601044,
0.02936439774930477,
0.018507562577724457,
0.02874908037483692,
0.05396563187241554,
-0.3103705644607544,
-0.0009230549912899733,
0.0826307088136673,
0.02642812952399254,
0.039236217737197876,
0.09980002045631409,
-0.022720031440258026,
0.09878475219011307,
-0.0978572741150856,
0.035972803831100464,
0.08075336366891861,
-0.07644232362508774,
-0.23109783232212067,
-0.1119566485285759,
0.029638933017849922,
0.15198267996311188,
0.07746351510286331,
-0.04799271002411842,
0.039864055812358856,
-0.09251124411821365,
0.09214358776807785,
0.21083563566207886,
-0.20663462579250336,
-0.0430779792368412,
-0.00048734177835285664,
0.023948363959789276,
0.028770025819540024,
-0.08737828582525253,
-0.008183895610272884,
0.010360083542764187,
-0.0004718817654065788,
0.037257637828588486,
-0.004571273922920227,
0.04572850093245506,
-0.002880141604691744,
-0.1435059756040573,
-0.08033968508243561,
0.1505349576473236,
0.06596720963716507,
-0.03735719248652458,
-0.11564286798238754,
-0.009562620893120766,
-0.17003251612186432,
-0.04301426559686661,
0.01401333324611187,
0.015133447013795376,
-0.02384375035762787,
-0.004703011363744736,
0.03050404042005539,
-0.05350946635007858,
-0.0701645016670227,
0.06313338130712509,
0.10948244482278824,
0.048537325114011765,
-0.04522351175546646,
0.012446999549865723,
0.08445297181606293,
0.03344869241118431,
-0.1575969159603119,
-0.054165925830602646,
0.03801339492201805,
-0.14097632467746735,
-0.00653488514944911,
-0.024656808003783226,
0.01043062936514616,
0.09642372280359268,
0.18295784294605255,
0.025251608341932297,
0.09881237894296646,
-0.013699071481823921,
0.010799816809594631,
-0.04821638762950897,
0.15388479828834534,
-0.03390715271234512,
-0.0937059298157692,
-0.032831449061632156,
0.11848894506692886,
0.000022307885956251994,
-0.010887865908443928,
-0.030349256470799446,
0.03223327547311783,
0.11713044345378876,
0.10537172108888626,
0.030684469267725945,
0.007344780024141073,
-0.08383367210626602,
-0.017464779317378998,
-0.0292363204061985,
-0.15268878638744354,
0.06318383663892746,
0.08177470415830612,
-0.04459558054804802,
-0.013176298700273037,
-0.021470490843057632,
0.013774278573691845,
-0.068308986723423,
0.08837021887302399,
-0.04211626201868057,
-0.0007619297248311341,
-0.07231424748897552,
-0.1017327681183815,
0.05895713344216347,
-0.01934843510389328,
-0.03537578508257866,
-0.05112489312887192,
-0.0704849362373352,
-0.09154777228832245,
0.03961142525076866,
-0.05598200485110283,
-0.0457826666533947,
-0.08213172107934952,
-0.10095583647489548,
0.046848002821207047,
-0.013336405158042908,
0.1451364904642105,
-0.059028368443250656,
0.08443352580070496,
0.01875353418290615,
0.040967848151922226,
0.13671644032001495,
0.06693416088819504,
-0.014445235021412373,
0.06147490069270134,
-0.12047093361616135,
0.12220112979412079,
-0.1404944360256195,
0.04826454073190689,
-0.16097970306873322,
-0.08045017719268799,
0.013948481529951096,
-0.0008475296781398356,
0.09211350232362747,
0.148232102394104,
-0.1896667629480362,
-0.06767138838768005,
0.15701909363269806,
-0.05467658489942551,
-0.08639664202928543,
0.14069078862667084,
0.007295295596122742,
-0.047856226563453674,
0.025542650371789932,
0.1567143052816391,
0.139495849609375,
-0.10648949444293976,
-0.03241991996765137,
-0.06820392608642578,
0.07702242583036423,
0.06123879924416542,
0.1027650460600853,
-0.07657156884670258,
0.031330399215221405,
-0.005915681831538677,
-0.057007718831300735,
0.0047063822858035564,
-0.06341921538114548,
-0.07806053012609482,
-0.007500824052840471,
-0.0392598882317543,
-0.01877477765083313,
0.021672694012522697,
-0.03931213170289993,
-0.08950971066951752,
-0.1258188635110855,
-0.05226731300354004,
0.10493703931570053,
-0.08586651086807251,
0.029520850628614426,
-0.09608886390924454,
0.07598210871219635,
0.006190015003085136,
0.030581548810005188,
-0.1428961604833603,
-0.03458261117339134,
0.049414508044719696,
-0.07180767506361008,
-0.0014401815133169293,
-0.0352923758327961,
0.030746841803193092,
0.015887342393398285,
-0.002482715994119644,
-0.050163786858320236,
-0.03982856497168541,
-0.011434521526098251,
-0.04698684439063072,
-0.21680545806884766,
-0.06358243525028229,
-0.02239883504807949,
0.19548803567886353,
-0.18211862444877625,
0.01938360370695591,
0.10698229819536209,
0.10057148337364197,
0.0044569941237568855,
-0.049313709139823914,
0.016449235379695892,
0.05241376906633377,
-0.020822718739509583,
-0.058043550699949265,
0.01098608411848545,
-0.009019557386636734,
-0.08571898937225342,
-0.009880231693387032,
-0.14157377183437347,
0.0027804789133369923,
0.08092731982469559,
0.040397271513938904,
-0.05060029402375221,
-0.03807852789759636,
-0.06101485341787338,
-0.04251348227262497,
-0.05759148672223091,
-0.055856265127658844,
0.09572383016347885,
0.048348743468523026,
0.07967792451381683,
-0.06816711276769638,
-0.0601959265768528,
0.028418486937880516,
0.006141502410173416,
-0.018527209758758545,
0.157769113779068,
0.06448587775230408,
-0.04378844425082207,
0.08192519098520279,
0.024524161592125893,
-0.040621448308229446,
0.09629132598638535,
-0.0649096891283989,
-0.09333053976297379,
-0.05240043252706528,
0.07291698455810547,
0.04296213388442993,
0.07213981449604034,
-0.19351357221603394,
-0.013064800761640072,
0.03944388031959534,
0.0352800227701664,
0.01871238835155964,
-0.17023803293704987,
0.024862315505743027,
0.02449508011341095,
-0.09217581152915955,
-0.028662187978625298,
0.023665161803364754,
-0.00007116390770534053,
0.07419929653406143,
0.0064421528950333595,
-0.04945236071944237,
-0.03686907887458801,
-0.06886351108551025,
-0.12888656556606293,
0.14892643690109253,
-0.1028691977262497,
-0.14669398963451385,
-0.11317360401153564,
-0.03587041795253754,
-0.03809204325079918,
-0.030989062041044235,
0.07149975746870041,
-0.09846872091293335,
-0.06534843146800995,
-0.08022112399339676,
-0.012190314941108227,
-0.021255994215607643,
0.015356327407062054,
0.047121889889240265,
0.020240435376763344,
0.045272037386894226,
-0.10933947563171387,
-0.0108535410836339,
-0.00371982017531991,
-0.02272190898656845,
-0.0016209396999329329,
0.04795489460229874,
0.09054625034332275,
0.15807951986789703,
0.06571561098098755,
0.06463569402694702,
-0.016287654638290405,
0.21780619025230408,
-0.12059319019317627,
-0.005048835184425116,
0.10448199510574341,
0.007480723317712545,
0.06458143144845963,
0.15920786559581757,
0.02444886602461338,
-0.08646336197853088,
0.021775726228952408,
0.05772239714860916,
-0.008031479083001614,
-0.2621605694293976,
-0.033955663442611694,
-0.08097570389509201,
-0.01409087423235178,
0.07647491246461868,
0.04158225655555725,
0.014638804830610752,
-0.003143277484923601,
-0.02693040296435356,
-0.02179892733693123,
0.05808131769299507,
0.06347918510437012,
0.08850441128015518,
0.03426259756088257,
0.08742155134677887,
-0.02438449114561081,
0.0010468193795531988,
0.02004396729171276,
-0.002810909179970622,
0.23477715253829956,
0.019378110766410828,
0.1974148154258728,
0.08097191900014877,
0.13006751239299774,
-0.025248493999242783,
0.03658613562583923,
-0.005020443815737963,
0.024139052256941795,
0.04111531749367714,
-0.0702553242444992,
-0.04726218804717064,
0.03805428370833397,
0.13272890448570251,
-0.01046313438564539,
-0.08148147165775299,
0.031151432543992996,
0.05367143452167511,
0.3172183036804199,
0.08522675931453705,
-0.22513443231582642,
-0.03954223170876503,
0.030291389673948288,
-0.06649942696094513,
-0.01698463410139084,
0.006597487721592188,
0.10167098790407181,
-0.08214248716831207,
0.07033322006464005,
-0.04157978668808937,
0.0928562730550766,
-0.07425717264413834,
0.0015709513099864125,
0.06276760995388031,
0.10810280591249466,
0.01780865341424942,
0.0656711682677269,
-0.2607796788215637,
0.21454380452632904,
-0.0014679066371172667,
0.07750944048166275,
-0.0720440074801445,
0.059014059603214264,
0.02636127360165119,
-0.05799476057291031,
0.09924202412366867,
0.0017789943376556039,
-0.10643654316663742,
-0.1499384641647339,
-0.10383681207895279,
0.003106988500803709,
0.13904111087322235,
-0.06489338725805283,
0.13281042873859406,
-0.041691869497299194,
-0.060274139046669006,
0.009401465766131878,
-0.010587993077933788,
-0.13927893340587616,
-0.09448752552270889,
0.08097171783447266,
0.025431152433156967,
0.07534123957157135,
-0.07799766957759857,
-0.07124211639165878,
-0.062015000730752945,
0.14906135201454163,
-0.1556006819009781,
-0.02708418294787407,
-0.12828519940376282,
0.044403400272130966,
0.1540723443031311,
-0.06880935281515121,
0.020298369228839874,
0.012893165461719036,
0.1263405978679657,
0.03345726802945137,
0.0034912005066871643,
0.07943558692932129,
-0.08484265953302383,
-0.19631798565387726,
-0.039766617119312286,
0.20122560858726501,
0.009991965256631374,
0.06283524632453918,
-0.0069358679465949535,
0.008665908128023148,
0.006843362934887409,
-0.09288003295660019,
0.08650419116020203,
0.07830871641635895,
-0.0013814634876325727,
0.08230616897344589,
-0.05382402986288071,
-0.04318849369883537,
-0.11643778532743454,
-0.05641956999897957,
0.1183607205748558,
0.2525661587715149,
-0.05715460702776909,
0.04291515797376633,
0.028036106377840042,
-0.07038429379463196,
-0.13982592523097992,
-0.02335427701473236,
0.10638993978500366,
0.02588425576686859,
-0.02027205377817154,
-0.16902334988117218,
-0.0026314177084714174,
0.07833243906497955,
-0.012796892784535885,
0.10745593160390854,
-0.3431907594203949,
-0.13003034889698029,
0.05661401525139809,
0.055086590349674225,
-0.03180495649576187,
-0.17754264175891876,
-0.09661474823951721,
-0.029232386499643326,
-0.08715134114027023,
0.05897074192762375,
-0.018597256392240524,
0.12364116311073303,
0.0173557810485363,
0.011383069679141045,
0.023453401401638985,
-0.05573587492108345,
0.14902140200138092,
0.061016593128442764,
0.010056738741695881,
-0.016588885337114334,
0.01588296703994274,
0.02573580853641033,
-0.0722271278500557,
0.04881584271788597,
-0.0745045617222786,
0.014546768739819527,
-0.14852546155452728,
-0.018305398523807526,
-0.07145076990127563,
-0.0019468374084681273,
-0.06506434828042984,
0.0023065987043082714,
-0.022356266155838966,
0.0416003093123436,
0.11842487007379532,
0.012810993008315563,
0.07347823679447174,
-0.052205536514520645,
0.07542065531015396,
0.13241223990917206,
0.10965365171432495,
0.014152517542243004,
-0.12238802015781403,
0.0034373984672129154,
0.009994860738515854,
0.015119875781238079,
-0.10508591681718826,
0.055844224989414215,
0.12699469923973083,
0.041392769664525986,
0.16530722379684448,
0.03693074733018875,
-0.10730244964361191,
-0.00836984347552061,
0.06672999262809753,
-0.054998621344566345,
-0.18257977068424225,
-0.005039681680500507,
0.013813579455018044,
-0.13309872150421143,
-0.015094020403921604,
0.11705081164836884,
-0.01247962936758995,
0.006029326934367418,
0.01802198961377144,
0.07517296075820923,
-0.03225770220160484,
0.2247816026210785,
0.011440552771091461,
0.1161198616027832,
-0.0897572934627533,
0.07803584635257721,
0.045352451503276825,
-0.08792594075202942,
0.04322132095694542,
0.11811517179012299,
-0.05552151799201965,
-0.03589560464024544,
-0.035722944885492325,
0.06478077173233032,
0.07661335915327072,
-0.04876355454325676,
-0.10246941447257996,
-0.12987923622131348,
0.09759537875652313,
0.03194919228553772,
0.021512670442461967,
0.038597945123910904,
-0.00124570622574538,
0.02076707035303116,
-0.08915136009454727,
0.11541789770126343,
0.09619289636611938,
0.04219676926732063,
-0.10381962358951569,
0.06423917412757874,
0.004677704069763422,
0.0077987671829760075,
0.01639642007648945,
-0.01843404956161976,
-0.10096097737550735,
0.03679228946566582,
-0.0857514962553978,
-0.011211608536541462,
-0.07623744755983353,
-0.012394817546010017,
0.03281193971633911,
-0.06041457876563072,
-0.053226400166749954,
0.029851097613573074,
-0.10726810991764069,
-0.0845322459936142,
-0.04916048049926758,
0.09497324377298355,
-0.12126199901103973,
-0.006927268113940954,
0.03786772862076759,
-0.1538427472114563,
0.10361631959676743,
0.04066373035311699,
-0.0026313187554478645,
0.0004254514933563769,
-0.0821591168642044,
-0.02446109429001808,
0.0289579089730978,
0.024151315912604332,
0.03384535014629364,
-0.2318524420261383,
-0.00015331752365455031,
-0.0161654781550169,
0.0033496893011033535,
-0.0016525221290066838,
0.021978456526994705,
-0.12273546308279037,
-0.004495235159993172,
-0.04919895529747009,
-0.04845228046178818,
-0.04596787318587303,
0.04254427179694176,
0.0829884260892868,
0.013942424207925797,
0.17169569432735443,
-0.05093163996934891,
0.08469972759485245,
-0.19795839488506317,
0.0010400095488876104,
0.0030187577940523624,
-0.03323134034872055,
-0.008074876852333546,
-0.021433372050523758,
0.10639520734548569,
-0.05964916944503784,
0.05228133127093315,
-0.03990710526704788,
0.051110751926898956,
0.029125792905688286,
-0.0683191791176796,
0.013249666430056095,
0.04361477866768837,
0.13626667857170105,
0.059267327189445496,
-0.01709289848804474,
0.05639387294650078,
-0.056443192064762115,
0.04963461682200432,
0.046383410692214966,
0.162531778216362,
0.15272323787212372,
0.12638312578201294,
0.07462156563997269,
0.08421018719673157,
-0.14107853174209595,
-0.1300603300333023,
0.16115127503871918,
-0.08315525949001312,
0.14244423806667328,
-0.044848743826150894,
0.1724959760904312,
0.10476161539554596,
-0.19430094957351685,
0.08170773833990097,
-0.027293231338262558,
-0.08146364241838455,
-0.09925247728824615,
-0.1103590652346611,
-0.07848185300827026,
-0.1392667442560196,
0.013047424145042896,
-0.09787576645612717,
0.08247143030166626,
0.026583299040794373,
0.044789157807826996,
0.036423709243535995,
0.07718682289123535,
0.014662106521427631,
-0.014999808743596077,
0.11284272372722626,
-0.011356860399246216,
-0.02114225924015045,
0.00281250081025064,
-0.07448923587799072,
0.0542621836066246,
-0.022789767012000084,
0.10469892621040344,
0.01897648349404335,
-0.08503115177154541,
0.05935186520218849,
-0.010698551312088966,
-0.10108023881912231,
0.028985723853111267,
-0.027815764769911766,
0.02970709651708603,
0.11385449022054672,
0.05728960409760475,
-0.016777079552412033,
0.007375587243586779,
0.16871538758277893,
-0.07227195054292679,
-0.07568361610174179,
-0.13960787653923035,
0.12764334678649902,
0.02812172658741474,
0.01458886731415987,
0.016054626554250717,
-0.09530660510063171,
-0.030088093131780624,
0.17026151716709137,
0.11159203201532364,
0.0012474145041778684,
-0.01868831366300583,
0.03548791632056236,
-0.003575468435883522,
-0.0282943956553936,
0.06709492206573486,
0.11467850208282471,
0.09891559928655624,
-0.016351481899619102,
0.01337203849107027,
-0.021944092586636543,
-0.08309707790613174,
-0.04313097894191742,
0.07795941829681396,
0.005854628514498472,
-0.00968336034566164,
-0.006798222661018372,
0.13187378644943237,
-0.05291076749563217,
-0.15739116072654724,
0.022306788712739944,
-0.1393662691116333,
-0.1774263083934784,
-0.018901139497756958,
0.08604847639799118,
0.04156315699219704,
0.060427162796258926,
0.0018489948706701398,
-0.047294292598962784,
0.15447469055652618,
-0.007660249248147011,
-0.03051901049911976,
-0.09663227945566177,
0.055902350693941116,
-0.08119722455739975,
0.1682800054550171,
-0.0220944844186306,
0.06133463233709335,
0.1447814255952835,
0.031318988651037216,
-0.11541078239679337,
0.049351099878549576,
0.09599405527114868,
-0.13158604502677917,
0.0602627694606781,
0.18785981833934784,
-0.036394719034433365,
0.12062587589025497,
0.05122370272874832,
-0.07525861263275146,
0.006659861654043198,
-0.017531316727399826,
-0.002012707060202956,
-0.08588900417089462,
-0.0016449685208499432,
-0.06878705322742462,
0.11113028973340988,
0.1895512491464615,
-0.07428430020809174,
0.014177409000694752,
-0.04389408975839615,
0.012696108780801296,
-0.005607361905276775,
0.12581953406333923,
-0.0473795011639595,
-0.2679659426212311,
0.04808185622096062,
-0.0011036962969228625,
0.03728329390287399,
-0.17319442331790924,
-0.08513985574245453,
0.04078676924109459,
-0.04058181494474411,
-0.07077351957559586,
0.11623349785804749,
0.06021178513765335,
0.03125835210084915,
-0.048160966485738754,
-0.20396146178245544,
-0.007480072323232889,
0.19517748057842255,
-0.16745519638061523,
-0.05961376801133156
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-xls-r-300m-mr-cv8-with-lm
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - MR dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6693
- Wer: 0.5921
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7.5e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- num_epochs: 500.0
- mixed_precision_training: Native AMP
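
The settings listed above map roughly onto `transformers.TrainingArguments`. The sketch below is illustrative only: the actual training script is not part of this card, and `output_dir` is a hypothetical placeholder.

```python
from transformers import TrainingArguments

# Illustrative mapping of the hyperparameters listed above; not the original script.
training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-mr-cv8",  # hypothetical path
    learning_rate=7.5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=2000,
    num_train_epochs=500.0,
    fp16=True,  # "Native AMP" mixed-precision training
)
```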
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:------:|:-----:|:---------------:|:------:|
| 4.9504 | 18.18 | 400 | 4.6730 | 1.0 |
| 3.3766 | 36.36 | 800 | 3.3464 | 1.0 |
| 3.1128 | 54.55 | 1200 | 3.0177 | 0.9980 |
| 1.7966 | 72.73 | 1600 | 0.8733 | 0.8039 |
| 1.4085 | 90.91 | 2000 | 0.5555 | 0.6458 |
| 1.1731 | 109.09 | 2400 | 0.4930 | 0.6438 |
| 1.0271 | 127.27 | 2800 | 0.4780 | 0.6093 |
| 0.9045 | 145.45 | 3200 | 0.4647 | 0.6578 |
| 0.807 | 163.64 | 3600 | 0.4505 | 0.5925 |
| 0.741 | 181.82 | 4000 | 0.4746 | 0.6025 |
| 0.6706 | 200.0 | 4400 | 0.5004 | 0.5844 |
| 0.6186 | 218.18 | 4800 | 0.4984 | 0.5997 |
| 0.5508 | 236.36 | 5200 | 0.5298 | 0.5636 |
| 0.5123 | 254.55 | 5600 | 0.5410 | 0.5110 |
| 0.4623 | 272.73 | 6000 | 0.5591 | 0.5383 |
| 0.4281 | 290.91 | 6400 | 0.5775 | 0.5600 |
| 0.4045 | 309.09 | 6800 | 0.5924 | 0.5580 |
| 0.3651 | 327.27 | 7200 | 0.5671 | 0.5684 |
| 0.343 | 345.45 | 7600 | 0.6083 | 0.5945 |
| 0.3085 | 363.64 | 8000 | 0.6243 | 0.5728 |
| 0.2941 | 381.82 | 8400 | 0.6245 | 0.5580 |
| 0.2735 | 400.0 | 8800 | 0.6458 | 0.5804 |
| 0.262 | 418.18 | 9200 | 0.6566 | 0.5824 |
| 0.2578 | 436.36 | 9600 | 0.6558 | 0.5965 |
| 0.2388 | 454.55 | 10000 | 0.6598 | 0.5993 |
| 0.2328 | 472.73 | 10400 | 0.6700 | 0.6041 |
| 0.2286 | 490.91 | 10800 | 0.6684 | 0.5957 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.4.dev0
- Tokenizers 0.11.0
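
The card stops at the framework versions; below is a minimal inference sketch, modeled on the other XLS-R cards in this dump. The repository name suggests the processor ships with an n-gram LM decoder, but that is an assumption here.

```python
import torch
from datasets import load_dataset
from transformers import AutoModelForCTC, AutoProcessor
import torchaudio.functional as F

model_id = "anuragshas/wav2vec2-xls-r-300m-mr-cv8-with-lm"

# Stream one Marathi test sample from Common Voice 8 (requires a Hugging Face auth token)
sample = next(iter(load_dataset("mozilla-foundation/common_voice_8_0", "mr", split="test", streaming=True, use_auth_token=True)))
# Resample the 48 kHz Common Voice audio to the 16 kHz the model expects
resampled_audio = F.resample(torch.tensor(sample["audio"]["array"]), 48_000, 16_000).numpy()

model = AutoModelForCTC.from_pretrained(model_id)
processor = AutoProcessor.from_pretrained(model_id)

input_values = processor(resampled_audio, return_tensors="pt").input_values
with torch.no_grad():
    logits = model(input_values).logits
# Assumes the processor bundles an n-gram LM decoder (as the repo name suggests)
transcription = processor.batch_decode(logits.numpy()).text
```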
|
{"language": ["mr"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_8_0", "generated_from_trainer"], "datasets": ["common_voice"], "model-index": [{"name": "", "results": []}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-xls-r-300m-mr-cv8-with-lm
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_8_0",
"generated_from_trainer",
"mr",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"mr"
] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #mr #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us
|
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the MOZILLA-FOUNDATION/COMMON\_VOICE\_8\_0 - MR dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6693
* Wer: 0.5921
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7.5e-05
* train\_batch\_size: 32
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 2000
* num\_epochs: 500.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2+cu102
* Datasets 1.18.4.dev0
* Tokenizers 0.11.0
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 500.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #mr #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 500.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0"
] |
[
79,
132,
4,
39
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #mr #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 500.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0"
] |
[
-0.1374078392982483,
0.10635945945978165,
-0.0019521026406437159,
0.0285965483635664,
0.12841875851154327,
0.0039905160665512085,
0.132563516497612,
0.12138203531503677,
-0.10641589015722275,
0.06969692558050156,
0.09619876742362976,
0.07843878865242004,
0.060520563274621964,
0.08692926168441772,
-0.03461374714970589,
-0.28180304169654846,
0.032457731664180756,
0.021529855206608772,
-0.10286250710487366,
0.10708678513765335,
0.11792846024036407,
-0.10285685211420059,
0.052158910781145096,
0.05362135171890259,
-0.160005122423172,
-0.005288897547870874,
0.012198120355606079,
-0.09555735439062119,
0.10250097513198853,
0.05285231024026871,
0.0638347938656807,
0.013092027977108955,
0.07420288771390915,
-0.20306923985481262,
0.011975443921983242,
0.0599934458732605,
0.047473255544900894,
0.08398228883743286,
0.07592979818582535,
0.019790727645158768,
0.08400598913431168,
-0.0298011414706707,
0.05234043672680855,
0.0758572369813919,
-0.08225447684526443,
-0.3128928542137146,
-0.10021181404590607,
0.04834394156932831,
0.10265492647886276,
0.0986623540520668,
-0.014470580033957958,
0.09976406395435333,
-0.010968451388180256,
0.08593288064002991,
0.2136974334716797,
-0.22893235087394714,
-0.07916762679815292,
-0.06419909745454788,
0.07467735558748245,
0.024103285744786263,
-0.096588134765625,
0.0015018690610304475,
0.039297521114349365,
0.0378381609916687,
0.08432476222515106,
0.007166651077568531,
0.00384057080373168,
-0.014676198363304138,
-0.12515310943126678,
-0.05282540246844292,
0.22343961894512177,
0.07327276468276978,
-0.0597100630402565,
-0.07268965989351273,
-0.046297840774059296,
-0.14728158712387085,
-0.04093305766582489,
0.008999625220894814,
0.017745153978466988,
-0.047412190586328506,
-0.11142446100711823,
-0.03740575909614563,
-0.0685930848121643,
-0.11117734014987946,
0.009031863883137703,
0.25837671756744385,
0.027426516637206078,
-0.010356694459915161,
-0.027008339762687683,
0.07800160348415375,
0.05663801357150078,
-0.1671724021434784,
-0.03682975098490715,
0.04183993488550186,
-0.02123948000371456,
-0.0033676079474389553,
-0.061575617641210556,
-0.03655771166086197,
0.02696939744055271,
0.15558376908302307,
-0.020761482417583466,
0.061498552560806274,
-0.011870546266436577,
0.014530329965054989,
-0.10196919739246368,
0.20667175948619843,
-0.05335956811904907,
-0.03213707357645035,
0.020943310111761093,
0.09448257833719254,
0.051519691944122314,
-0.022589877247810364,
-0.09610144793987274,
0.017930738627910614,
0.11041678488254547,
0.0436495803296566,
0.00779005978256464,
0.02305668033659458,
-0.03958277776837349,
-0.03368692100048065,
0.03479835391044617,
-0.10206576436758041,
0.013467369601130486,
0.020454728975892067,
-0.07813680917024612,
0.011883703991770744,
-0.0005032587214373052,
0.027666624635457993,
-0.017560888081789017,
0.07039741426706314,
-0.08170519024133682,
-0.020649785175919533,
-0.08453010767698288,
-0.10371211171150208,
0.039809223264455795,
0.010176155716180801,
0.008836223743855953,
-0.10738715529441833,
-0.12159761786460876,
-0.020001458004117012,
0.0214113537222147,
-0.021074173972010612,
-0.07425060123205185,
-0.04108273610472679,
-0.10758813470602036,
0.05759759619832039,
-0.026249725371599197,
0.12566855549812317,
-0.0485253669321537,
0.10131949186325073,
0.09574855864048004,
0.027592912316322327,
-0.016784384846687317,
0.05535194277763367,
-0.03652443364262581,
0.03902947157621384,
-0.13259483873844147,
0.058154717087745667,
-0.0922047346830368,
0.05114838108420372,
-0.11195828020572662,
-0.11376442760229111,
0.029155582189559937,
-0.011349678970873356,
0.09991607069969177,
0.11014019697904587,
-0.11015456914901733,
-0.1054079681634903,
0.10375633090734482,
-0.08036722242832184,
-0.14651253819465637,
0.12255000323057175,
0.015106980688869953,
-0.03774314746260643,
0.05297103524208069,
0.14492473006248474,
0.12099569290876389,
-0.11375364661216736,
-0.022818440571427345,
-0.04611954092979431,
0.13643191754817963,
-0.01582639105618,
0.12346531450748444,
-0.028453638777136803,
-0.009980454109609127,
0.020005691796541214,
-0.06359663605690002,
0.07069635391235352,
-0.091573566198349,
-0.07416162639856339,
-0.04068610817193985,
-0.09860889613628387,
0.012434791773557663,
0.04133731126785278,
0.03142380714416504,
-0.09889952093362808,
-0.11594777554273605,
0.06145012006163597,
0.13717253506183624,
-0.0889219269156456,
0.02919754572212696,
-0.1106688380241394,
0.08360610902309418,
-0.06361737847328186,
-0.012099146842956543,
-0.15782777965068817,
-0.0327296108007431,
0.033967651426792145,
-0.05316495895385742,
0.014683635905385017,
-0.09059474617242813,
0.05978511646389961,
0.06732731312513351,
-0.04915124177932739,
-0.07098227739334106,
-0.09749963879585266,
-0.00531968055292964,
-0.04399346187710762,
-0.18980750441551208,
-0.09900473803281784,
-0.01599450595676899,
0.16438111662864685,
-0.13262812793254852,
0.0157588142901659,
0.036869265139102936,
0.13929109275341034,
0.028963087126612663,
-0.04174196720123291,
-0.00480475090444088,
0.06792233884334564,
-0.038097623735666275,
-0.059535447508096695,
0.01317017525434494,
0.03676951304078102,
-0.11067545413970947,
0.013010688126087189,
-0.13337039947509766,
0.14264114201068878,
0.12548348307609558,
-0.028781075030565262,
-0.01740546151995659,
0.02509978786110878,
-0.06071784719824791,
-0.041902583092451096,
-0.017740698531270027,
-0.04990650713443756,
0.10954299569129944,
0.014460373669862747,
0.13249868154525757,
-0.09187081456184387,
-0.03785533085465431,
0.04438517987728119,
0.005067803896963596,
-0.03199554234743118,
0.07982565462589264,
0.039619263261556625,
-0.03088853694498539,
0.10547612607479095,
0.06789594888687134,
-0.09782356023788452,
0.15586940944194794,
-0.07415106892585754,
-0.0727810189127922,
-0.03829609602689743,
-0.019698243588209152,
0.03421896696090698,
0.12477108836174011,
-0.1581445187330246,
-0.023093249648809433,
0.03145575523376465,
0.008568057790398598,
0.022376451641321182,
-0.19736142456531525,
0.011588575318455696,
0.03002483770251274,
-0.06846597790718079,
-0.05751464515924454,
0.023738591000437737,
-0.02536955662071705,
0.07490906119346619,
0.022260122001171112,
-0.047053005546331406,
0.007177104242146015,
-0.01569000817835331,
-0.07920287549495697,
0.1673690378665924,
-0.12212757766246796,
-0.14950692653656006,
-0.19072279334068298,
-0.03313637524843216,
-0.0940871387720108,
0.013451557606458664,
0.0496918149292469,
-0.09178555756807327,
-0.041905809193849564,
-0.03865601494908333,
0.05326852202415466,
-0.06361017376184464,
0.03446223586797714,
0.04712600260972977,
-0.00306497048586607,
0.08181305229663849,
-0.11941257119178772,
0.012892399914562702,
0.00689241848886013,
-0.011960992589592934,
-0.038951992988586426,
0.029431961476802826,
0.12090481072664261,
0.14109797775745392,
0.04030592367053032,
0.02773931249976158,
-0.023288315162062645,
0.22186577320098877,
-0.11785109341144562,
-0.03646469861268997,
0.18415555357933044,
0.021988095715641975,
0.027009736746549606,
0.08610687404870987,
0.05332859605550766,
-0.0795694887638092,
0.012368299067020416,
0.004839551169425249,
-0.01878073252737522,
-0.22713541984558105,
-0.038387641310691833,
-0.07416077703237534,
-0.031140051782131195,
0.06558627635240555,
0.027293726801872253,
0.055593010038137436,
0.039267417043447495,
-0.04629585146903992,
0.04262063652276993,
-0.00017710775136947632,
0.09382087737321854,
0.16065528988838196,
0.05232934281229973,
0.12129984050989151,
-0.023979097604751587,
-0.016850536689162254,
0.027892474085092545,
-0.00847607757896185,
0.1818619668483734,
0.03060176409780979,
0.2038903832435608,
0.035658009350299835,
0.1385471373796463,
0.007575454190373421,
0.07101678848266602,
0.038327571004629135,
0.003302318509668112,
0.027843719348311424,
-0.06399600207805634,
-0.06011063605546951,
0.018501564860343933,
0.053985126316547394,
0.08011631667613983,
-0.11873994022607803,
-0.010933002457022667,
0.006305191665887833,
0.365062952041626,
0.044860467314720154,
-0.3267408311367035,
-0.1401442289352417,
0.011657803319394588,
-0.06567636132240295,
-0.05661609768867493,
0.028025345876812935,
0.09421537816524506,
-0.07154473662376404,
0.08308722078800201,
-0.05471965670585632,
0.09623212367296219,
-0.041421785950660706,
0.014390604570508003,
0.07285550981760025,
0.11804406344890594,
0.013577943667769432,
0.05458715558052063,
-0.2573055624961853,
0.2490013986825943,
0.012361306697130203,
0.11143679171800613,
-0.03959351405501366,
0.04038676619529724,
0.034509483724832535,
0.02172243222594261,
0.02840248867869377,
-0.015522205270826817,
-0.06942916661500931,
-0.1599467694759369,
-0.060954634100198746,
0.023003332316875458,
0.12561582028865814,
-0.04595169052481651,
0.11822736263275146,
-0.0559251643717289,
-0.03041674755513668,
0.05940253660082817,
-0.05404095724225044,
-0.12832577526569366,
-0.0743756964802742,
0.0473247729241848,
0.08985735476016998,
0.10533754527568817,
-0.1045038029551506,
-0.10336292535066605,
-0.021836090832948685,
0.1008857861161232,
-0.10985355079174042,
-0.03386209160089493,
-0.1267385482788086,
0.04163143038749695,
0.14882951974868774,
-0.0665111169219017,
0.05179956927895546,
-0.0029171970672905445,
0.18304350972175598,
0.031212102621793747,
-0.03160056099295616,
0.07880990952253342,
-0.09883623570203781,
-0.22133539617061615,
-0.02799869142472744,
0.17602041363716125,
-0.009253987111151218,
0.05862575024366379,
-0.009133515879511833,
0.03799271211028099,
-0.03992224857211113,
-0.07078490406274796,
0.04157208278775215,
0.016648566350340843,
0.005829975008964539,
0.03306523710489273,
0.0025356959085911512,
0.021989144384860992,
-0.07545002549886703,
-0.0531829334795475,
0.11200685799121857,
0.2506886422634125,
-0.05496538430452347,
-0.02752469666302204,
0.06634368002414703,
-0.044100306928157806,
-0.13893604278564453,
0.0016485612140968442,
0.10628888756036758,
0.026427417993545532,
-0.05933848023414612,
-0.20719096064567566,
0.038013000041246414,
0.058503031730651855,
-0.04673299565911293,
0.10140705853700638,
-0.26625970005989075,
-0.1488686501979828,
0.09463487565517426,
0.09182922542095184,
0.02896372228860855,
-0.15614841878414154,
-0.07800526171922684,
-0.07868087291717529,
-0.12643977999687195,
0.08932171016931534,
-0.04524274170398712,
0.11874101310968399,
0.003610949032008648,
0.07579763978719711,
0.007466702722012997,
-0.042352113872766495,
0.16861358284950256,
-0.01199780311435461,
0.02925441414117813,
-0.0050536757335066795,
0.045742712914943695,
0.07026010006666183,
-0.04622872918844223,
0.020859260112047195,
-0.05140898376703262,
0.04464289918541908,
-0.11031400412321091,
-0.030156554654240608,
-0.09484533965587616,
0.03213983029127121,
-0.026792360469698906,
-0.02464972995221615,
-0.015269177965819836,
0.010999870486557484,
0.024701260030269623,
-0.002303801942616701,
0.18897296488285065,
-0.02202635258436203,
0.16704191267490387,
0.12939484417438507,
0.1219918429851532,
-0.048100247979164124,
-0.06770507991313934,
-0.002442800672724843,
-0.0416230782866478,
0.07902422547340393,
-0.11144684255123138,
0.026935353875160217,
0.10108565539121628,
0.06080593913793564,
0.11693728715181351,
0.06514190882444382,
-0.08639447391033173,
0.03595990315079689,
0.06762652099132538,
-0.10872960090637207,
-0.14770913124084473,
-0.06326956301927567,
0.05946989730000496,
-0.13221102952957153,
0.05173289775848389,
0.1347375512123108,
-0.07268063724040985,
-0.004633292555809021,
0.014147192239761353,
0.002387113170698285,
-0.054304979741573334,
0.22002418339252472,
0.06917150318622589,
0.08220964670181274,
-0.10747802257537842,
0.09315326809883118,
0.03121342323720455,
-0.09851735830307007,
0.018632806837558746,
0.05804150179028511,
-0.05155766382813454,
-0.015972401946783066,
-0.04138481244444847,
0.05024176463484764,
-0.05074261873960495,
-0.0906432643532753,
-0.1599389910697937,
-0.14477597177028656,
0.07883882522583008,
0.12739214301109314,
0.03793839365243912,
0.025174930691719055,
-0.04402445629239082,
0.059196583926677704,
-0.11680293083190918,
0.09491229057312012,
0.05266037583351135,
0.08392274379730225,
-0.16491027176380157,
0.12284685671329498,
0.013102952390909195,
0.03586433827877045,
-0.0062032425776124,
-0.009943423792719841,
-0.07390999048948288,
0.028882306069135666,
-0.16884055733680725,
-0.03315211087465286,
-0.03443385660648346,
0.001720466068945825,
0.002970050787553191,
-0.08262521028518677,
-0.0943712368607521,
0.05291485786437988,
-0.11207480728626251,
-0.055801764130592346,
0.0010938576888293028,
0.03743194788694382,
-0.1114102452993393,
0.006724537815898657,
0.053574420511722565,
-0.13384193181991577,
0.07761941850185394,
0.0771164745092392,
0.01940322294831276,
0.05704449862241745,
-0.025368621572852135,
-0.025516770780086517,
0.0376666896045208,
0.016287632286548615,
0.030394800007343292,
-0.15433113276958466,
-0.00720871752128005,
0.009373115375638008,
0.048706695437431335,
-0.016046596691012383,
0.07904775440692902,
-0.11151369661092758,
-0.04594668373465538,
0.0002965392777696252,
-0.01950896345078945,
-0.06301222741603851,
0.04014421999454498,
0.11002722382545471,
0.038224752992391586,
0.17050446569919586,
-0.06804956495761871,
0.010893546976149082,
-0.20467087626457214,
0.03396041318774223,
-0.041153162717819214,
-0.12568803131580353,
-0.09452248364686966,
-0.009397026151418686,
0.08406782895326614,
-0.047783009707927704,
0.0625406950712204,
-0.06917137652635574,
0.12514503300189972,
0.05123074725270271,
-0.020942717790603638,
0.0031001640018075705,
0.04326944798231125,
0.2363070249557495,
0.06230638548731804,
-0.006264698691666126,
0.0974469780921936,
-0.0018677444895729423,
0.0842076987028122,
0.11630311608314514,
0.11415031552314758,
0.11859948188066483,
0.03911333903670311,
0.12470690906047821,
0.10638934373855591,
-0.06602159142494202,
-0.16024793684482574,
0.03209976479411125,
-0.020443718880414963,
0.14566290378570557,
-0.0009706660639494658,
0.19018036127090454,
0.12198138236999512,
-0.1560467928647995,
0.03646819293498993,
-0.036813560873270035,
-0.05865022912621498,
-0.10695961117744446,
-0.025337660685181618,
-0.07757073640823364,
-0.18546774983406067,
0.012226752936840057,
-0.1259460747241974,
0.0438610278069973,
0.05043075978755951,
0.01774834468960762,
0.009692582301795483,
0.1314122974872589,
0.03747842460870743,
-0.025994649156928062,
0.1034729927778244,
-0.01506231352686882,
-0.042700715363025665,
-0.04479613155126572,
-0.12530893087387085,
0.062334828078746796,
-0.015311755239963531,
0.06823082268238068,
-0.026095787063241005,
-0.1285925805568695,
0.06131909787654877,
0.009317848831415176,
-0.12453026324510574,
0.03292761743068695,
0.008457259275019169,
0.07090214639902115,
0.06981637328863144,
0.006961405277252197,
0.00174599455203861,
0.010280783288180828,
0.23770494759082794,
-0.11538925021886826,
-0.05574250593781471,
-0.134104385972023,
0.19254164397716522,
0.0016025813529267907,
-0.02007421851158142,
0.030100733041763306,
-0.07830290496349335,
-0.05660254508256912,
0.1688218116760254,
0.11315694451332092,
-0.009138093329966068,
-0.03772657364606857,
0.007287457585334778,
-0.015968890860676765,
-0.07452119141817093,
0.08924766629934311,
0.11827065050601959,
0.03812924772500992,
-0.029749542474746704,
-0.015085671097040176,
-0.048815611749887466,
-0.04143870994448662,
-0.008382000029087067,
0.08387937396764755,
-0.025657257065176964,
-0.043854497373104095,
-0.014200765639543533,
0.09748215973377228,
-0.05718688666820526,
-0.13233017921447754,
-0.000705945654772222,
-0.15846088528633118,
-0.16644640266895294,
-0.03166554123163223,
0.06627737730741501,
0.0346902534365654,
0.05775948986411095,
0.0017162497388198972,
-0.005621765740215778,
0.12806567549705505,
-0.003574459347873926,
-0.04667269438505173,
-0.13790756464004517,
0.11419017612934113,
-0.09660355746746063,
0.19227741658687592,
-0.04963860660791397,
0.05270991101861,
0.11965072900056839,
0.05590265989303589,
-0.10580642521381378,
0.04881083220243454,
0.07889648526906967,
-0.15176907181739807,
0.048246219754219055,
0.2014714628458023,
-0.039029914885759354,
0.15206746757030487,
0.005622704513370991,
-0.12342319637537003,
-0.008315456099808216,
-0.04452240467071533,
-0.03777681291103363,
-0.057428404688835144,
-0.028004519641399384,
-0.07008805125951767,
0.12378256022930145,
0.1686963587999344,
-0.08169179409742355,
-0.041002873331308365,
-0.048472702503204346,
0.033443063497543335,
0.07535579800605774,
0.08219971507787704,
-0.03204331919550896,
-0.30682340264320374,
-0.014780952595174313,
0.005271106958389282,
-0.008084312081336975,
-0.250659316778183,
-0.06742215156555176,
0.02674231491982937,
-0.05911828577518463,
-0.03397563472390175,
0.07878153026103973,
0.06449443846940994,
0.013010498136281967,
-0.06335625052452087,
-0.0698302686214447,
-0.04893628507852554,
0.17608626186847687,
-0.18431387841701508,
-0.07321549952030182
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# XLS-R-300M - Maltese
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - MT dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1895
- Wer: 0.1984
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7.5e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 60.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 3.4219 | 3.6 | 400 | 3.3127 | 1.0 |
| 3.0399 | 7.21 | 800 | 3.0330 | 1.0 |
| 1.5756 | 10.81 | 1200 | 0.6108 | 0.5724 |
| 1.0995 | 14.41 | 1600 | 0.3091 | 0.3154 |
| 0.9639 | 18.02 | 2000 | 0.2596 | 0.2841 |
| 0.9032 | 21.62 | 2400 | 0.2270 | 0.2514 |
| 0.8145 | 25.23 | 2800 | 0.2172 | 0.2483 |
| 0.7845 | 28.83 | 3200 | 0.2084 | 0.2333 |
| 0.7694 | 32.43 | 3600 | 0.1974 | 0.2234 |
| 0.7333 | 36.04 | 4000 | 0.2020 | 0.2185 |
| 0.693 | 39.64 | 4400 | 0.1947 | 0.2148 |
| 0.6802 | 43.24 | 4800 | 0.1960 | 0.2102 |
| 0.667 | 46.85 | 5200 | 0.1904 | 0.2072 |
| 0.6486 | 50.45 | 5600 | 0.1881 | 0.2009 |
| 0.6339 | 54.05 | 6000 | 0.1877 | 0.1989 |
| 0.6254 | 57.66 | 6400 | 0.1893 | 0.2003 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2.dev0
- Tokenizers 0.11.0
#### Evaluation Commands
1. To evaluate on `mozilla-foundation/common_voice_8_0` with split `test`
```bash
python eval.py --model_id anuragshas/wav2vec2-xls-r-300m-mt-cv8-with-lm --dataset mozilla-foundation/common_voice_8_0 --config mt --split test
```
### Inference With LM
```python
import torch
from datasets import load_dataset
from transformers import AutoModelForCTC, AutoProcessor
import torchaudio.functional as F
model_id = "anuragshas/wav2vec2-xls-r-300m-mt-cv8-with-lm"
sample_iter = iter(load_dataset("mozilla-foundation/common_voice_8_0", "mt", split="test", streaming=True, use_auth_token=True))
sample = next(sample_iter)
resampled_audio = F.resample(torch.tensor(sample["audio"]["array"]), 48_000, 16_000).numpy()
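# The processor bundles the feature extractor, tokenizer and the n-gram LM decoder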
model = AutoModelForCTC.from_pretrained(model_id)
processor = AutoProcessor.from_pretrained(model_id)
input_values = processor(resampled_audio, return_tensors="pt").input_values
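# Forward pass, then beam-search decode the logits with the language model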
with torch.no_grad():
logits = model(input_values).logits
transcription = processor.batch_decode(logits.numpy()).text
# => "għadu jilagħbu ċirku tant bilfondi"
```
### Eval results on Common Voice 8 "test" (WER):
| Without LM | With LM (run `./eval.py`) |
|---|---|
| 19.853 | 15.967 |
|
{"language": ["mt"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_8_0", "generated_from_trainer", "robust-speech-event", "hf-asr-leaderboard"], "datasets": ["mozilla-foundation/common_voice_8_0"], "metrics": ["wer"], "model-index": [{"name": "XLS-R-300M - Maltese", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice 8", "type": "mozilla-foundation/common_voice_8_0", "args": "mt"}, "metrics": [{"type": "wer", "value": 15.967, "name": "Test WER"}, {"type": "cer", "value": 3.657, "name": "Test CER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-xls-r-300m-mt-cv8-with-lm
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_8_0",
"generated_from_trainer",
"robust-speech-event",
"hf-asr-leaderboard",
"mt",
"dataset:mozilla-foundation/common_voice_8_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"mt"
] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #mt #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
XLS-R-300M - Maltese
====================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the MOZILLA-FOUNDATION/COMMON\_VOICE\_8\_0 - MT dataset.
It achieves the following results on the evaluation set:
* Loss: 0.1895
* Wer: 0.1984
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7.5e-05
* train\_batch\_size: 32
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1000
* num\_epochs: 60.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2+cu102
* Datasets 1.18.2.dev0
* Tokenizers 0.11.0
#### Evaluation Commands
1. To evaluate on 'mozilla-foundation/common\_voice\_8\_0' with split 'test'
### Inference With LM
### Eval results on Common Voice 8 "test" (WER):
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 60.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'",
"### Inference With LM",
"### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #mt #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 60.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'",
"### Inference With LM",
"### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
112,
132,
4,
39,
36,
8,
15
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #mt #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 60.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'### Inference With LM### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
-0.11140481382608414,
0.15088018774986267,
-0.005492306314408779,
0.005817790050059557,
0.11444887518882751,
0.0268270131200552,
0.10468676686286926,
0.16268710792064667,
-0.03993390500545502,
0.14059849083423615,
0.057842206209897995,
0.09728330373764038,
0.08932921290397644,
0.09629189968109131,
-0.025106102228164673,
-0.21539297699928284,
0.04015640541911125,
-0.07608690112829208,
-0.07120906561613083,
0.10687907785177231,
0.08945783972740173,
-0.07931998372077942,
0.0271969772875309,
0.00022622969117946923,
-0.05981539189815521,
-0.009023871272802353,
-0.03927797079086304,
-0.04743518680334091,
0.061275746673345566,
0.022115493193268776,
0.014486763626337051,
0.024416647851467133,
0.045202527195215225,
-0.29961442947387695,
-0.0007118710782378912,
0.0812050849199295,
0.035883188247680664,
0.04000787064433098,
0.10713709890842438,
-0.01680816523730755,
0.10069181025028229,
-0.08538556098937988,
0.029363613575696945,
0.08251479268074036,
-0.07448174059391022,
-0.21020913124084473,
-0.11510570347309113,
0.024578023701906204,
0.14108194410800934,
0.08750210702419281,
-0.052418436855077744,
0.04958343133330345,
-0.09519820660352707,
0.08968134224414825,
0.20289212465286255,
-0.2003454715013504,
-0.046888720244169235,
-0.0038886945694684982,
0.03233608230948448,
0.036705225706100464,
-0.08664771169424057,
-0.011180282570421696,
0.022752005606889725,
-0.0044044870883226395,
0.02666579745709896,
0.0042321034707129,
0.04251345619559288,
-0.007724987342953682,
-0.14743340015411377,
-0.07440491765737534,
0.1617491990327835,
0.07718785107135773,
-0.037020351737737656,
-0.11165598034858704,
-0.005633344408124685,
-0.17305892705917358,
-0.031261418014764786,
0.017759134992957115,
0.011122962459921837,
-0.02121773175895214,
-0.025143010541796684,
0.04362872987985611,
-0.046253617852926254,
-0.07386207580566406,
0.06513410061597824,
0.11478560417890549,
0.047136809676885605,
-0.047274164855480194,
0.0042315078899264336,
0.08109884709119797,
0.016831200569868088,
-0.16110672056674957,
-0.0519627146422863,
0.038093451410532,
-0.12873685359954834,
-0.008018005639314651,
-0.02524104155600071,
0.009758289903402328,
0.10294728726148605,
0.17626060545444489,
0.031004514545202255,
0.10239267349243164,
-0.00042118760757148266,
0.008013085462152958,
-0.0419514998793602,
0.16148759424686432,
-0.03946077823638916,
-0.10710613429546356,
-0.03634294122457504,
0.1198238953948021,
-0.0034907290246337652,
-0.010542025789618492,
-0.026366310194134712,
0.034784846007823944,
0.11566207557916641,
0.09926997870206833,
0.02748449705541134,
0.005523372441530228,
-0.07973459362983704,
-0.018105898052453995,
-0.021442966535687447,
-0.1519620567560196,
0.06156044825911522,
0.07924707978963852,
-0.052832912653684616,
-0.0015491839731112123,
-0.02796018496155739,
0.013333331793546677,
-0.05960948392748833,
0.08958669006824493,
-0.048614077270030975,
-0.006725444924086332,
-0.07829505205154419,
-0.09773537516593933,
0.06638935208320618,
-0.021789425984025,
-0.028047336265444756,
-0.04604726657271385,
-0.06937551498413086,
-0.09893845021724701,
0.029033752158284187,
-0.05198053643107414,
-0.05460681766271591,
-0.08166830986738205,
-0.10352682322263718,
0.047371361404657364,
-0.015161843970417976,
0.14455826580524445,
-0.061796724796295166,
0.0836581364274025,
0.024136807769536972,
0.03617590665817261,
0.13621076941490173,
0.07244966179132462,
-0.018061567097902298,
0.06085054203867912,
-0.12059634923934937,
0.12046527117490768,
-0.14450758695602417,
0.05344449356198311,
-0.15600037574768066,
-0.08513811975717545,
-0.005212522577494383,
-0.006582863628864288,
0.09591059386730194,
0.1504056304693222,
-0.18157905340194702,
-0.07698922604322433,
0.15364645421504974,
-0.05505870655179024,
-0.08586610853672028,
0.14081627130508423,
0.011515497229993343,
-0.05181027948856354,
0.028013383969664574,
0.1686936318874359,
0.1375257968902588,
-0.09924126416444778,
-0.03365033119916916,
-0.07205234467983246,
0.08097482472658157,
0.05973151698708534,
0.10872837901115417,
-0.08372187614440918,
0.021076971665024757,
-0.0055707222782075405,
-0.04844408482313156,
0.014289012178778648,
-0.06986928731203079,
-0.07583031803369522,
-0.007739703170955181,
-0.04584015905857086,
-0.02502426505088806,
0.02804933860898018,
-0.039762407541275024,
-0.10393636673688889,
-0.12238448113203049,
-0.0422709584236145,
0.1119188517332077,
-0.08768488466739655,
0.019740933552384377,
-0.09298522770404816,
0.06408374756574631,
-0.0010087772971019149,
0.02392912469804287,
-0.14108212292194366,
-0.03285881504416466,
0.05322008207440376,
-0.08047997951507568,
-0.000955620314925909,
-0.047349732369184494,
0.03225065395236015,
0.01220018696039915,
0.001979000400751829,
-0.0468762032687664,
-0.04590963199734688,
-0.010214662179350853,
-0.04716567322611809,
-0.21904373168945312,
-0.06467235088348389,
-0.024327388033270836,
0.2063726782798767,
-0.17892394959926605,
0.01633499190211296,
0.10327215492725372,
0.0905270129442215,
-0.0006793091306462884,
-0.0611121766269207,
0.025399839505553246,
0.056727442890405655,
-0.01952923834323883,
-0.06452508270740509,
0.0022521757055073977,
-0.0014417655766010284,
-0.09128279983997345,
-0.008928588591516018,
-0.14828090369701385,
0.0110542681068182,
0.08300243318080902,
0.04710732027888298,
-0.049126140773296356,
-0.02001781389117241,
-0.060661301016807556,
-0.05174684897065163,
-0.05997515469789505,
-0.05136727914214134,
0.09773989021778107,
0.04319708049297333,
0.07572417706251144,
-0.07121862471103668,
-0.057782337069511414,
0.02894418314099312,
0.011546292342245579,
-0.016525378450751305,
0.15694069862365723,
0.06939847022294998,
-0.04884583130478859,
0.08962737768888474,
0.027241777628660202,
-0.04487604275345802,
0.11360540986061096,
-0.06758089363574982,
-0.09915641695261002,
-0.043134208768606186,
0.06355199962854385,
0.036477379500865936,
0.07289061695337296,
-0.20882411301136017,
-0.014260774478316307,
0.03455767780542374,
0.040496181696653366,
0.030220771208405495,
-0.1725790798664093,
0.022851819172501564,
0.025898117572069168,
-0.09370916336774826,
-0.0462542399764061,
0.020327625796198845,
-0.00533160800114274,
0.06994877010583878,
0.0064466861076653,
-0.0461762435734272,
-0.03900780528783798,
-0.06418296694755554,
-0.13487155735492706,
0.14981360733509064,
-0.10114672780036926,
-0.14204896986484528,
-0.11777859181165695,
-0.015789039433002472,
-0.036901332437992096,
-0.031123636290431023,
0.07017219066619873,
-0.10552071034908295,
-0.061783310025930405,
-0.07631466537714005,
-0.020094873383641243,
-0.030098536983132362,
0.010897479020059109,
0.05622478947043419,
0.013103648088872433,
0.05400354042649269,
-0.1147531047463417,
-0.007466660812497139,
-0.0019244947470724583,
-0.021814435720443726,
-0.007894274778664112,
0.03420845791697502,
0.09734980762004852,
0.16938768327236176,
0.0636766254901886,
0.06217914819717407,
-0.01195567473769188,
0.21864911913871765,
-0.12921155989170074,
-0.012918191030621529,
0.0999268889427185,
0.009074652567505836,
0.060031671077013016,
0.14593638479709625,
0.02382637932896614,
-0.08574643731117249,
0.026142820715904236,
0.05834748595952988,
-0.012838369235396385,
-0.27235502004623413,
-0.03342109173536301,
-0.08186142146587372,
-0.021764853969216347,
0.06791152060031891,
0.044517483562231064,
0.02128966711461544,
0.0024211483541876078,
-0.027990242466330528,
-0.036633316427469254,
0.051849767565727234,
0.05962030589580536,
0.07492367923259735,
0.03999967500567436,
0.08746429532766342,
-0.028087828308343887,
0.0014714800054207444,
0.018812574446201324,
-0.009525435045361519,
0.22875669598579407,
0.0368259958922863,
0.1967921108007431,
0.07619643211364746,
0.1389254480600357,
-0.021232381463050842,
0.034126438200473785,
0.008696884848177433,
0.02205183357000351,
0.03994593024253845,
-0.06990594416856766,
-0.049523092806339264,
0.03289075568318367,
0.1322944015264511,
-0.0032117159571498632,
-0.08628159016370773,
0.03569180518388748,
0.050229646265506744,
0.31766462326049805,
0.08385089039802551,
-0.2388414591550827,
-0.04547802731394768,
0.0199990663677454,
-0.0754709243774414,
-0.013826326467096806,
0.013857622630894184,
0.10966756194829941,
-0.0884479433298111,
0.07897510379552841,
-0.03838604688644409,
0.10119730234146118,
-0.06141297519207001,
0.001621035858988762,
0.07186222821474075,
0.10936353355646133,
0.014346172101795673,
0.06829971820116043,
-0.24989847838878632,
0.2169586420059204,
-0.004740098956972361,
0.08450040221214294,
-0.06820689141750336,
0.05567804351449013,
0.03813289850950241,
-0.06289985775947571,
0.08405814319849014,
0.005245990119874477,
-0.11070403456687927,
-0.13893790543079376,
-0.10972785204648972,
0.006717415060847998,
0.139048233628273,
-0.059340886771678925,
0.12875288724899292,
-0.039879050105810165,
-0.05601193383336067,
0.014314808882772923,
-0.013485128059983253,
-0.14340724050998688,
-0.0831136703491211,
0.07410003989934921,
0.038247041404247284,
0.08086352050304413,
-0.07509128749370575,
-0.0778864249587059,
-0.07315270602703094,
0.1491919904947281,
-0.1451890468597412,
-0.02486347407102585,
-0.1268608719110489,
0.039395321160554886,
0.1634335219860077,
-0.06624989211559296,
0.02139985002577305,
0.017379026859998703,
0.12861847877502441,
0.04005659371614456,
0.004295865073800087,
0.08048581331968307,
-0.087620809674263,
-0.19739174842834473,
-0.037884876132011414,
0.20015451312065125,
0.013898602686822414,
0.06560704112052917,
-0.012389189563691616,
0.008889790624380112,
0.00455221813172102,
-0.09245447814464569,
0.08625564724206924,
0.07972496747970581,
-0.011087290942668915,
0.0878162607550621,
-0.03642670810222626,
-0.027871133759617805,
-0.1223563477396965,
-0.0630408525466919,
0.11365782469511032,
0.24541538953781128,
-0.050409168004989624,
0.03087189793586731,
0.03024688921868801,
-0.07030756771564484,
-0.13585945963859558,
-0.006569866556674242,
0.11292161047458649,
0.030550358816981316,
-0.023662490770220757,
-0.1746913641691208,
0.0009808916365727782,
0.09125164151191711,
-0.018115274608135223,
0.10430720448493958,
-0.3441424071788788,
-0.12930269539356232,
0.05056008696556091,
0.042548153549432755,
-0.0278435368090868,
-0.17782865464687347,
-0.09477496147155762,
-0.025332210585474968,
-0.0926356241106987,
0.03797256574034691,
-0.010328901931643486,
0.125417098402977,
0.025038637220859528,
0.02168797142803669,
0.021137569099664688,
-0.05821581557393074,
0.14978653192520142,
0.06666652113199234,
0.00761050172150135,
-0.011821994557976723,
0.021207790821790695,
0.01733788102865219,
-0.06619139015674591,
0.04925459995865822,
-0.0722821056842804,
0.015938322991132736,
-0.14478471875190735,
-0.017345435917377472,
-0.07368368655443192,
-0.0005257194861769676,
-0.05674326419830322,
0.011364322155714035,
-0.021022796630859375,
0.03643117845058441,
0.11056302487850189,
0.017511950805783272,
0.07314345985651016,
-0.05192239582538605,
0.08199214190244675,
0.11828571557998657,
0.11420028656721115,
0.010807931423187256,
-0.12835422158241272,
0.004454882349818945,
0.010317507199943066,
0.023156551644206047,
-0.09866666793823242,
0.053378716111183167,
0.12591499090194702,
0.048538703471422195,
0.16094709932804108,
0.03700048103928566,
-0.11002574861049652,
-0.005195626523345709,
0.06672796607017517,
-0.054364364594221115,
-0.17483052611351013,
-0.010037689469754696,
0.010390111245214939,
-0.13238345086574554,
-0.011128092184662819,
0.10583283007144928,
-0.01608595997095108,
0.006056283134967089,
0.011310422793030739,
0.07209612429141998,
-0.0393822081387043,
0.23569947481155396,
0.015108645893633366,
0.11255912482738495,
-0.09598832577466965,
0.08744461089372635,
0.043216221034526825,
-0.09961491078138351,
0.052307598292827606,
0.11182785034179688,
-0.05116865038871765,
-0.02425849810242653,
-0.02855890803039074,
0.06478552520275116,
0.07662289589643478,
-0.059533145278692245,
-0.10252460092306137,
-0.12593135237693787,
0.09685377776622772,
0.03557800501585007,
0.02439643256366253,
0.041724007576704025,
0.0018662314396351576,
0.024113522842526436,
-0.09794506430625916,
0.10689172148704529,
0.1056620329618454,
0.0453127957880497,
-0.1163736879825592,
0.06211400032043457,
0.007035901304334402,
-0.0007680854178033769,
0.01493143942207098,
-0.02170577645301819,
-0.09829892963171005,
0.035198189318180084,
-0.07301638275384903,
-0.01115954015403986,
-0.08306308090686798,
-0.01422171201556921,
0.034061580896377563,
-0.06404468417167664,
-0.06342138350009918,
0.03262358158826828,
-0.11165159940719604,
-0.08545897901058197,
-0.04728447273373604,
0.09598828852176666,
-0.11349675059318542,
0.005336688365787268,
0.04482654482126236,
-0.15146327018737793,
0.09627661108970642,
0.055531445890665054,
-0.0017569948686286807,
-0.005451646167784929,
-0.07921446114778519,
-0.03271477669477463,
0.034391336143016815,
0.021737074479460716,
0.025784015655517578,
-0.23205924034118652,
0.0028980302158743143,
-0.010693546384572983,
0.012473690323531628,
-0.005766689777374268,
0.016358494758605957,
-0.12656746804714203,
-0.013757536187767982,
-0.047024499624967575,
-0.052513573318719864,
-0.04400916025042534,
0.03564664348959923,
0.09115377813577652,
0.010217760689556599,
0.17294828593730927,
-0.054998546838760376,
0.07795330882072449,
-0.20212239027023315,
0.0019630773458629847,
-0.0013255761004984379,
-0.03617189824581146,
-0.02339160442352295,
-0.009237729012966156,
0.10927131026983261,
-0.05582752078771591,
0.057686448097229004,
-0.040516555309295654,
0.04148152843117714,
0.026622695848345757,
-0.07370463013648987,
0.022051973268389702,
0.04124879091978073,
0.15467065572738647,
0.06005113944411278,
-0.022206831723451614,
0.060836173593997955,
-0.055028628557920456,
0.05096713826060295,
0.04151691123843193,
0.16220496594905853,
0.1588181108236313,
0.11671381443738937,
0.08348134905099869,
0.08792752772569656,
-0.1387200504541397,
-0.12699384987354279,
0.14789824187755585,
-0.08211401849985123,
0.1448119729757309,
-0.03226142376661301,
0.1677766740322113,
0.1063687726855278,
-0.1910133808851242,
0.08084079623222351,
-0.03633710741996765,
-0.0786510482430458,
-0.1058942973613739,
-0.09880632907152176,
-0.0824991911649704,
-0.14969998598098755,
0.019139980897307396,
-0.10053777694702148,
0.08574412018060684,
0.035852521657943726,
0.04441479593515396,
0.030590493232011795,
0.09031906723976135,
0.008650165982544422,
-0.01990595832467079,
0.113880954682827,
-0.014872054569423199,
-0.021057581529021263,
-0.0042512258514761925,
-0.07907672226428986,
0.06939097493886948,
-0.01964482292532921,
0.11598610132932663,
0.011352802626788616,
-0.09099441766738892,
0.06578454375267029,
-0.006768448278307915,
-0.10387292504310608,
0.026135645806789398,
-0.030351413413882256,
0.03852183371782303,
0.10285354405641556,
0.052931223064661026,
-0.014354540035128593,
0.008738341741263866,
0.16237115859985352,
-0.07550489902496338,
-0.06799923628568649,
-0.14204612374305725,
0.14226627349853516,
0.023805392906069756,
0.007769363932311535,
0.015201840549707413,
-0.09462489187717438,
-0.02521958015859127,
0.16887950897216797,
0.11876239627599716,
0.00908670574426651,
-0.029603775590658188,
0.02647557482123375,
-0.006066079717129469,
-0.02920684963464737,
0.0696289986371994,
0.10800124704837799,
0.08645041286945343,
-0.01056372094899416,
0.017105471342802048,
-0.026909152045845985,
-0.08639092743396759,
-0.04038460925221443,
0.06909220665693283,
0.008144284598529339,
-0.022257087752223015,
-0.008372892625629902,
0.14098316431045532,
-0.05245949700474739,
-0.14975294470787048,
0.012423363514244556,
-0.13974790275096893,
-0.18542185425758362,
-0.021483633667230606,
0.0885767862200737,
0.051860615611076355,
0.06124157831072807,
0.000006746005510649411,
-0.043475646525621414,
0.14276711642742157,
-0.006033157929778099,
-0.03334978595376015,
-0.10182072967290878,
0.050786107778549194,
-0.08470398932695389,
0.16600263118743896,
-0.019615445286035538,
0.06615342199802399,
0.14182493090629578,
0.033690113574266434,
-0.11362612992525101,
0.05288456007838249,
0.0952351987361908,
-0.1365385204553604,
0.05832631140947342,
0.18844743072986603,
-0.03519010916352272,
0.11916875094175339,
0.0516684465110302,
-0.07609492540359497,
-0.0008023905684240162,
-0.03219393268227577,
0.005798512138426304,
-0.0914454385638237,
-0.0017072748159989715,
-0.06691183149814606,
0.12252546101808548,
0.19260737299919128,
-0.07997464388608932,
0.013357598334550858,
-0.03988146409392357,
0.01321222074329853,
-0.001316038891673088,
0.12641765177249908,
-0.048540256917476654,
-0.2802991569042206,
0.038914795964956284,
-0.023314964026212692,
0.029950251802802086,
-0.1725258231163025,
-0.07946409285068512,
0.03444299474358559,
-0.045057810842990875,
-0.06355202943086624,
0.11062520742416382,
0.05786670744419098,
0.029981188476085663,
-0.04959830641746521,
-0.20278236269950867,
-0.009030099026858807,
0.19861386716365814,
-0.17532847821712494,
-0.06187820807099342
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-xls-r-300m-pa-IN-cv8-with-lm
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - PA-IN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6864
- Wer: 0.6707
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 7.5e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 200.0
- mixed_precision_training: Native AMP
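For reproduction purposes, a minimal `TrainingArguments` equivalent of the values above might look like the following sketch; the output directory is a placeholder, since the original training script is not part of this card.
```python
# Hedged sketch: TrainingArguments mirroring the hyperparameters listed above.
# output_dir is a placeholder; the actual run script is not shown in this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./wav2vec2-xls-r-300m-pa-IN",  # placeholder path
    learning_rate=7.5e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=200.0,
    fp16=True,  # "Native AMP" mixed-precision training
)
```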
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:------:|:----:|:---------------:|:------:|
| 4.3322 | 14.81 | 400 | 3.7450 | 1.0 |
| 3.2662 | 29.63 | 800 | 3.2571 | 0.9996 |
| 1.6408 | 44.44 | 1200 | 0.9098 | 0.8162 |
| 1.2289 | 59.26 | 1600 | 0.6757 | 0.7099 |
| 1.0551 | 74.07 | 2000 | 0.6417 | 0.7044 |
| 0.966 | 88.89 | 2400 | 0.6365 | 0.6789 |
| 0.8713 | 103.7 | 2800 | 0.6617 | 0.6954 |
| 0.8055 | 118.52 | 3200 | 0.6371 | 0.6762 |
| 0.7489 | 133.33 | 3600 | 0.6798 | 0.6911 |
| 0.7073 | 148.15 | 4000 | 0.6567 | 0.6731 |
| 0.6609 | 162.96 | 4400 | 0.6742 | 0.6840 |
| 0.6435 | 177.78 | 4800 | 0.6862 | 0.6633 |
| 0.6282 | 192.59 | 5200 | 0.6865 | 0.6731 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.4.dev0
- Tokenizers 0.11.0
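No usage example is included above; a minimal inference sketch for this checkpoint, mirroring the inference pattern used by the author's Slovak checkpoint further down, is given below. The audio file path is a placeholder and 16 kHz mono input is assumed.
```python
# Hedged sketch: transcription with this checkpoint's LM-boosted processor.
# "sample.wav" is a placeholder; any 16 kHz mono Punjabi recording should work.
import torch
import torchaudio
from transformers import AutoModelForCTC, AutoProcessor

model_id = "anuragshas/wav2vec2-xls-r-300m-pa-IN-cv8-with-lm"
model = AutoModelForCTC.from_pretrained(model_id)
processor = AutoProcessor.from_pretrained(model_id)

speech, sample_rate = torchaudio.load("sample.wav")  # placeholder file
speech = torchaudio.functional.resample(speech, sample_rate, 16_000).squeeze().numpy()

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits
print(processor.batch_decode(logits.numpy()).text)
```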
|
{"language": ["pa-IN"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_8_0", "generated_from_trainer"], "datasets": ["common_voice"], "model-index": [{"name": "", "results": []}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-xls-r-300m-pa-IN-cv8-with-lm
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_8_0",
"generated_from_trainer",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"pa-IN"
] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us
|
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the MOZILLA-FOUNDATION/COMMON\_VOICE\_8\_0 - PA-IN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6864
* Wer: 0.6707
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7.5e-05
* train\_batch\_size: 32
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1000
* num\_epochs: 200.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2+cu102
* Datasets 1.18.4.dev0
* Tokenizers 0.11.0
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 200.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 200.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0"
] |
[
77,
132,
4,
39
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 200.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0"
] |
[
-0.1335745006799698,
0.10796721279621124,
-0.0030623185448348522,
0.03545884042978287,
0.13365134596824646,
0.004435485228896141,
0.13398398458957672,
0.12598882615566254,
-0.09671462327241898,
0.0747636929154396,
0.08902042359113693,
0.08861565589904785,
0.06498333066701889,
0.08687300235033035,
-0.03067796118557453,
-0.2816574275493622,
0.027376608923077583,
0.028234845027327538,
-0.08136443048715591,
0.10815354436635971,
0.10784285515546799,
-0.1034279614686966,
0.044221699237823486,
0.04978732392191887,
-0.14742141962051392,
-0.004323868080973625,
0.011378699913620949,
-0.10226870328187943,
0.11189893633127213,
0.04863549768924713,
0.07084333896636963,
0.011061795987188816,
0.0699269026517868,
-0.1972302496433258,
0.0106540871784091,
0.06424489617347717,
0.04291870817542076,
0.08246980607509613,
0.08177702128887177,
0.015641750767827034,
0.0855894386768341,
-0.04196351766586304,
0.05253959074616432,
0.06697476655244827,
-0.08412321656942368,
-0.3217664062976837,
-0.0913664698600769,
0.050505757331848145,
0.09647254645824432,
0.10471885651350021,
-0.013401351869106293,
0.09474150836467743,
-0.015609586611390114,
0.09427913278341293,
0.22856177389621735,
-0.22625572979450226,
-0.0674121230840683,
-0.08235952258110046,
0.06363128870725632,
0.024847256019711494,
-0.08787462115287781,
-0.0076136281713843346,
0.02923240140080452,
0.042403802275657654,
0.09001397341489792,
-0.0002756953763309866,
-0.027483565732836723,
-0.023802535608410835,
-0.13172222673892975,
-0.05545930936932564,
0.19221541285514832,
0.052027806639671326,
-0.05143420025706291,
-0.06895698606967926,
-0.05790906399488449,
-0.1537460833787918,
-0.04241080582141876,
0.019850268959999084,
0.014384588226675987,
-0.04857519641518593,
-0.08614300191402435,
-0.02890878915786743,
-0.06922730058431625,
-0.09959831088781357,
0.0015899193240329623,
0.24197058379650116,
0.03872526064515114,
-0.004425980616360903,
-0.03179611265659332,
0.07525452971458435,
0.04047015309333801,
-0.16995978355407715,
-0.019078580662608147,
0.046272311359643936,
-0.016586419194936752,
0.008729463443160057,
-0.049750134348869324,
-0.03330487757921219,
0.029669519513845444,
0.14983350038528442,
-0.04127504676580429,
0.06678510457277298,
-0.015386316925287247,
0.020031653344631195,
-0.09744452685117722,
0.20680922269821167,
-0.04827548563480377,
-0.026448504999279976,
0.025103090330958366,
0.09498883038759232,
0.058705128729343414,
-0.03051641210913658,
-0.09334976971149445,
0.007636439520865679,
0.10906697809696198,
0.04830489307641983,
0.009584801271557808,
0.02082420140504837,
-0.03920383378863335,
-0.029409848153591156,
0.03918232396245003,
-0.11222377419471741,
0.019798407331109047,
0.02727239951491356,
-0.06632985174655914,
0.023820258677005768,
0.008892802521586418,
0.014671257697045803,
-0.04218660667538643,
0.08033491671085358,
-0.07201582193374634,
-0.013038765639066696,
-0.07762573659420013,
-0.1079406812787056,
0.04031304642558098,
-0.023681845515966415,
0.009267685934901237,
-0.10489007830619812,
-0.11288216710090637,
-0.019564254209399223,
0.01998981274664402,
-0.024921318516135216,
-0.07595144212245941,
-0.03574571758508682,
-0.11175577342510223,
0.06066899374127388,
-0.03246685117483139,
0.11871296912431717,
-0.053961317986249924,
0.10588907450437546,
0.08704586327075958,
0.03427701070904732,
-0.014703826047480106,
0.05928527191281319,
-0.043378185480833054,
0.03493395075201988,
-0.14576782286167145,
0.05898534879088402,
-0.09252326935529709,
0.03925875946879387,
-0.10354214906692505,
-0.1129261925816536,
0.035674192011356354,
-0.00980329792946577,
0.11117593944072723,
0.10600952804088593,
-0.11385331302881241,
-0.11006560176610947,
0.11954513192176819,
-0.08915519714355469,
-0.12638956308364868,
0.13372154533863068,
0.017431700602173805,
-0.049599360674619675,
0.05162001773715019,
0.16442586481571198,
0.1254298835992813,
-0.11461400240659714,
-0.024042190983891487,
-0.04280905798077583,
0.12097335606813431,
-0.02357618510723114,
0.1065865308046341,
-0.034447379410266876,
0.005885795224457979,
0.01409881841391325,
-0.04178153723478317,
0.07204916328191757,
-0.08760350942611694,
-0.07239360362291336,
-0.04463639855384827,
-0.08572256565093994,
0.014326839707791805,
0.04774687439203262,
0.015205045230686665,
-0.09741416573524475,
-0.12035279721021652,
0.04952065646648407,
0.12965044379234314,
-0.0899435356259346,
0.03986981138586998,
-0.10215731710195541,
0.09016898274421692,
-0.050130002200603485,
-0.007435384672135115,
-0.16675812005996704,
-0.021763358265161514,
0.03591843694448471,
-0.05431874468922615,
0.01473027840256691,
-0.1005791649222374,
0.05379912629723549,
0.06784633547067642,
-0.03695155680179596,
-0.07483533769845963,
-0.08591065555810928,
-0.0077935135923326015,
-0.05457768961787224,
-0.19883762300014496,
-0.09030596911907196,
-0.014853551983833313,
0.13985000550746918,
-0.12347602099180222,
0.017222678288817406,
0.03656340762972832,
0.12897588312625885,
0.020223986357450485,
-0.03433990105986595,
-0.0022990526631474495,
0.08180727064609528,
-0.03233757242560387,
-0.05819442868232727,
0.019136052578687668,
0.029767535626888275,
-0.09390062838792801,
0.023915475234389305,
-0.14257073402404785,
0.1335005760192871,
0.12785151600837708,
-0.027054104954004288,
-0.016171269118785858,
0.03237713873386383,
-0.05579492449760437,
-0.04623553529381752,
-0.018297864124178886,
-0.04349803179502487,
0.1329522728919983,
0.01491062343120575,
0.13377591967582703,
-0.08954848349094391,
-0.03197059780359268,
0.048007093369960785,
0.0014466444263234735,
-0.018591873347759247,
0.08857881277799606,
0.05831049382686615,
-0.025219637900590897,
0.10789936780929565,
0.06655491143465042,
-0.10267467796802521,
0.1591702103614807,
-0.07816119492053986,
-0.07690102607011795,
-0.0306655615568161,
-0.025251712650060654,
0.03031351789832115,
0.12327047437429428,
-0.17833665013313293,
-0.02437560446560383,
0.035383038222789764,
0.006806207820773125,
0.021098271012306213,
-0.20728573203086853,
0.017137689515948296,
0.025603704154491425,
-0.07307703793048859,
-0.06126776337623596,
0.018849171698093414,
-0.008609580807387829,
0.07882575690746307,
0.011439156718552113,
-0.0654059574007988,
0.0033774180337786674,
-0.015139526687562466,
-0.07690981030464172,
0.16667497158050537,
-0.1202651709318161,
-0.15925338864326477,
-0.1802888661623001,
-0.040051065385341644,
-0.07387405633926392,
0.007416946347802877,
0.06273387372493744,
-0.08714406192302704,
-0.04780382290482521,
-0.03918968886137009,
0.052523087710142136,
-0.04930104315280914,
0.04090742766857147,
0.038060083985328674,
-0.011992629617452621,
0.07545816898345947,
-0.11965248733758926,
0.004381199833005667,
-0.0004428733664099127,
-0.01162682007998228,
-0.0285855270922184,
0.045556459575891495,
0.11270657926797867,
0.15138594806194305,
0.035998180508613586,
0.01857915334403515,
-0.02654011733829975,
0.21228449046611786,
-0.11819636821746826,
-0.027063071727752686,
0.18649430572986603,
0.010981596074998379,
0.04417708143591881,
0.09264271706342697,
0.056302644312381744,
-0.07086681574583054,
0.0015053643146529794,
0.00912229623645544,
-0.025065049529075623,
-0.23541481792926788,
-0.04151366651058197,
-0.0716521292924881,
-0.013088349252939224,
0.06874574720859528,
0.02756459452211857,
0.04437169432640076,
0.036301903426647186,
-0.04250568151473999,
0.039291657507419586,
-0.0012181057827547193,
0.08732608705759048,
0.1716306507587433,
0.04927496239542961,
0.12338811159133911,
-0.028744133189320564,
-0.018476486206054688,
0.0283191055059433,
-0.013619551435112953,
0.19269825518131256,
0.020229820162057877,
0.20411467552185059,
0.035088710486888885,
0.14811810851097107,
0.012198980897665024,
0.07356711477041245,
0.025969108566641808,
0.009025268256664276,
0.0274828989058733,
-0.06222657486796379,
-0.056119173765182495,
0.00723256915807724,
0.04466521739959717,
0.07112006843090057,
-0.11856362968683243,
-0.004474347457289696,
0.0071374401450157166,
0.3605150282382965,
0.04332846403121948,
-0.3183424174785614,
-0.1295240819454193,
0.004939128644764423,
-0.07071900367736816,
-0.06147652119398117,
0.031157881021499634,
0.09584265947341919,
-0.07176688313484192,
0.07873979955911636,
-0.05220310762524605,
0.09559410065412521,
-0.052680954337120056,
0.01623116433620453,
0.07023829221725464,
0.1207171082496643,
0.014464532025158405,
0.05011361464858055,
-0.24986539781093597,
0.24896669387817383,
0.01294083520770073,
0.10251561552286148,
-0.04831275716423988,
0.04413191229104996,
0.028739240020513535,
0.019934363663196564,
0.025029871612787247,
-0.017456848174333572,
-0.05594240501523018,
-0.16286063194274902,
-0.07389958947896957,
0.020397573709487915,
0.1205051988363266,
-0.02480465918779373,
0.11698299646377563,
-0.05136614292860031,
-0.032018210738897324,
0.05334758386015892,
-0.07050812989473343,
-0.10920627415180206,
-0.0799020305275917,
0.045985087752342224,
0.10014388710260391,
0.08242034912109375,
-0.09679925441741943,
-0.10779471695423126,
-0.027475841343402863,
0.09695813059806824,
-0.1093132272362709,
-0.049948547035455704,
-0.1202823594212532,
0.028105437755584717,
0.15053528547286987,
-0.07323823869228363,
0.06104325130581856,
-0.0017120350385084748,
0.15842896699905396,
0.024486815556883812,
-0.038477398455142975,
0.08889001607894897,
-0.09922461211681366,
-0.21356773376464844,
-0.030375780537724495,
0.18827345967292786,
-0.007838675752282143,
0.05813094601035118,
-0.02010090835392475,
0.03191184997558594,
-0.036483388394117355,
-0.06579741090536118,
0.04186629876494408,
0.023616690188646317,
0.004894969519227743,
0.03168367221951485,
0.0018184661166742444,
0.00785356666892767,
-0.08177302032709122,
-0.05170243978500366,
0.1179157942533493,
0.23233731091022491,
-0.05549762770533562,
0.0011843364918604493,
0.07517348974943161,
-0.049624159932136536,
-0.14941789209842682,
-0.012722685001790524,
0.1085098460316658,
0.021997099742293358,
-0.06004998832941055,
-0.2107112854719162,
0.03472311794757843,
0.05859653279185295,
-0.04089272767305374,
0.11001818627119064,
-0.29033613204956055,
-0.14463457465171814,
0.10269539058208466,
0.08283348381519318,
0.03314382582902908,
-0.16000407934188843,
-0.07393541187047958,
-0.06230895593762398,
-0.11921848356723785,
0.08879929035902023,
-0.059529490768909454,
0.12353145331144333,
0.0003319708921480924,
0.07727126032114029,
0.009112018160521984,
-0.046123258769512177,
0.15394611656665802,
-0.014614315703511238,
0.02864680252969265,
-0.015258989296853542,
0.04395056143403053,
0.06709963828325272,
-0.0422099307179451,
0.02755190245807171,
-0.053826410323381424,
0.040560707449913025,
-0.10774024575948715,
-0.02989114262163639,
-0.09986361116170883,
0.030538499355316162,
-0.026232384145259857,
-0.020304419100284576,
-0.011027871631085873,
-0.0046763732098042965,
0.02726687118411064,
-0.002981573808938265,
0.18084432184696198,
-0.011899354867637157,
0.15007524192333221,
0.13903342187404633,
0.11553605645895004,
-0.055221110582351685,
-0.08941406011581421,
-0.007307001855224371,
-0.042367495596408844,
0.08112972229719162,
-0.10854335874319077,
0.030162764713168144,
0.10531460493803024,
0.05856364220380783,
0.10822561383247375,
0.0641244426369667,
-0.08894473314285278,
0.02598373405635357,
0.061371952295303345,
-0.11635925620794296,
-0.14201928675174713,
-0.05033520981669426,
0.050987593829631805,
-0.11916123330593109,
0.055465102195739746,
0.13782387971878052,
-0.06637641042470932,
-0.010319037362933159,
0.015305249951779842,
0.007652108557522297,
-0.05062423273921013,
0.2274724841117859,
0.05805191397666931,
0.08273427188396454,
-0.11382143199443817,
0.0916675254702568,
0.04028002917766571,
-0.10814615339040756,
0.032383132725954056,
0.08540799468755722,
-0.057889264076948166,
-0.01883787102997303,
-0.035455286502838135,
0.050995469093322754,
-0.04601661115884781,
-0.09365084767341614,
-0.15266217291355133,
-0.14551794528961182,
0.08391566574573517,
0.14420302212238312,
0.033397335559129715,
0.01788719743490219,
-0.04737946018576622,
0.053264062851667404,
-0.12248274683952332,
0.09829185158014297,
0.052462682127952576,
0.0825849249958992,
-0.15946367383003235,
0.12898874282836914,
0.021051418036222458,
0.039952442049980164,
-0.005078259855508804,
-0.005644295830279589,
-0.0776180550456047,
0.03181328624486923,
-0.16332250833511353,
-0.023293672129511833,
-0.02916102483868599,
-0.00036968549829907715,
0.001675007282756269,
-0.07757959514856339,
-0.08030113577842712,
0.05723750963807106,
-0.10426478832960129,
-0.053395699709653854,
-0.0034728737082332373,
0.03356039896607399,
-0.1179966852068901,
-0.000882621097844094,
0.047299761325120926,
-0.12673984467983246,
0.08583451807498932,
0.08536797016859055,
0.013134168460965157,
0.05553000047802925,
-0.0303491298109293,
-0.02865116111934185,
0.04905838146805763,
0.017505358904600143,
0.03361927717924118,
-0.15576587617397308,
-0.007484036032110453,
0.0027228102553635836,
0.03831152245402336,
-0.01482666376978159,
0.08242785185575485,
-0.11837751418352127,
-0.026725776493549347,
0.0010272301733493805,
-0.026455769315361977,
-0.05997028201818466,
0.03932857885956764,
0.1025678813457489,
0.03249423950910568,
0.16934078931808472,
-0.07700394093990326,
0.02776254527270794,
-0.1996377855539322,
0.03161737322807312,
-0.04446026310324669,
-0.12064898014068604,
-0.08291736990213394,
-0.011879943311214447,
0.08739736676216125,
-0.05315670371055603,
0.061540231108665466,
-0.052463021129369736,
0.11876492202281952,
0.04433399811387062,
-0.030095450580120087,
-0.010163344442844391,
0.03966652601957321,
0.22090712189674377,
0.05882486328482628,
-0.016041245311498642,
0.08936472982168198,
-0.007006077561527491,
0.08652524650096893,
0.12487170100212097,
0.11490127444267273,
0.11106054484844208,
0.0636286586523056,
0.11721888184547424,
0.10870427638292313,
-0.07212647795677185,
-0.17944137752056122,
0.020345915108919144,
-0.028371509164571762,
0.14602532982826233,
-0.004299139138311148,
0.2047538161277771,
0.12081461399793625,
-0.1409768909215927,
0.04940374568104744,
-0.031083157286047935,
-0.06400123983621597,
-0.11286429315805435,
-0.03523979336023331,
-0.07096275687217712,
-0.17070350050926208,
0.0049325549043715,
-0.12037155777215958,
0.054424870759248734,
0.05805767700076103,
0.016639305278658867,
0.009893118403851986,
0.13723787665367126,
0.06159709393978119,
-0.027695972472429276,
0.1030396819114685,
-0.0019225069554522634,
-0.03838130831718445,
-0.0422048345208168,
-0.11418324708938599,
0.054322514683008194,
-0.013350519351661205,
0.062153223901987076,
-0.017559748142957687,
-0.12163086235523224,
0.067941814661026,
0.00808835867792368,
-0.11909335851669312,
0.035939041525125504,
0.004630837589502335,
0.06956008076667786,
0.08147738128900528,
0.014561864547431469,
0.0009530486422590911,
0.0060663167387247086,
0.22935518622398376,
-0.11155696958303452,
-0.07122159004211426,
-0.11771188676357269,
0.21016308665275574,
0.0017201079754158854,
-0.02642299421131611,
0.031475670635700226,
-0.07641920447349548,
-0.058867115527391434,
0.17225974798202515,
0.12307064980268478,
-0.0015971306711435318,
-0.028929468244314194,
0.004636226687580347,
-0.01317250169813633,
-0.07088100910186768,
0.09150364249944687,
0.13680759072303772,
0.06539875268936157,
-0.033361922949552536,
-0.016274215653538704,
-0.05232592299580574,
-0.03275831788778305,
-0.03326934948563576,
0.07945148646831512,
-0.022251222282648087,
-0.04198124259710312,
-0.013462298549711704,
0.08515613526105881,
-0.05930950120091438,
-0.13507460057735443,
-0.005262914579361677,
-0.16639646887779236,
-0.16692772507667542,
-0.02886279858648777,
0.07979067414999008,
0.0414390005171299,
0.05833986774086952,
0.002616708166897297,
-0.00874865148216486,
0.13225670158863068,
-0.0013467700919136405,
-0.05723949894309044,
-0.12322011590003967,
0.09974747896194458,
-0.11747557669878006,
0.18354317545890808,
-0.04447920247912407,
0.06337424367666245,
0.11519928276538849,
0.06876189261674881,
-0.10444498807191849,
0.04948458448052406,
0.07909107953310013,
-0.1504998356103897,
0.03924231231212616,
0.19447386264801025,
-0.036491554230451584,
0.1317342221736908,
0.009121960960328579,
-0.11690068989992142,
-0.014954814687371254,
-0.04106120020151138,
-0.03145353123545647,
-0.051828473806381226,
-0.03608289361000061,
-0.06808080524206161,
0.11987577378749847,
0.1638822853565216,
-0.07498133182525635,
-0.041838593780994415,
-0.05203299596905708,
0.01643221639096737,
0.07244564592838287,
0.06868356466293335,
-0.03634560853242874,
-0.2946893572807312,
-0.0017831885488703847,
0.006678336299955845,
0.004062369000166655,
-0.23970194160938263,
-0.06410940736532211,
0.025697115808725357,
-0.05965402349829674,
-0.062226034700870514,
0.07161751389503479,
0.04993699491024017,
0.017364930361509323,
-0.059271518141031265,
-0.062381960451602936,
-0.04748386889696121,
0.18128502368927002,
-0.1930537223815918,
-0.07300852239131927
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# XLS-R-300M - Slovak
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - SK dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3067
- Wer: 0.2678
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7.5e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1500
- num_epochs: 60.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 5.175 | 2.41 | 400 | 4.6909 | 1.0 |
| 3.3785 | 4.82 | 800 | 3.3080 | 1.0 |
| 2.6964 | 7.23 | 1200 | 2.0651 | 1.1055 |
| 1.3008 | 9.64 | 1600 | 0.5845 | 0.6207 |
| 1.1185 | 12.05 | 2000 | 0.4195 | 0.4193 |
| 1.0252 | 14.46 | 2400 | 0.3824 | 0.3570 |
| 0.935 | 16.87 | 2800 | 0.3693 | 0.3462 |
| 0.8818 | 19.28 | 3200 | 0.3587 | 0.3318 |
| 0.8534 | 21.69 | 3600 | 0.3420 | 0.3180 |
| 0.8137 | 24.1 | 4000 | 0.3426 | 0.3130 |
| 0.7968 | 26.51 | 4400 | 0.3349 | 0.3102 |
| 0.7558 | 28.92 | 4800 | 0.3216 | 0.3019 |
| 0.7313 | 31.33 | 5200 | 0.3451 | 0.3060 |
| 0.7358 | 33.73 | 5600 | 0.3272 | 0.2967 |
| 0.718 | 36.14 | 6000 | 0.3315 | 0.2882 |
| 0.6991 | 38.55 | 6400 | 0.3299 | 0.2830 |
| 0.6529 | 40.96 | 6800 | 0.3140 | 0.2836 |
| 0.6225 | 43.37 | 7200 | 0.3128 | 0.2751 |
| 0.633 | 45.78 | 7600 | 0.3211 | 0.2774 |
| 0.5876 | 48.19 | 8000 | 0.3162 | 0.2764 |
| 0.588 | 50.6 | 8400 | 0.3082 | 0.2722 |
| 0.5915 | 53.01 | 8800 | 0.3120 | 0.2681 |
| 0.5798 | 55.42 | 9200 | 0.3133 | 0.2709 |
| 0.5736 | 57.83 | 9600 | 0.3086 | 0.2676 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.4.dev0
- Tokenizers 0.11.0
#### Evaluation Commands
1. To evaluate on `mozilla-foundation/common_voice_8_0` with split `test`
```bash
python eval.py --model_id anuragshas/wav2vec2-xls-r-300m-sk-cv8-with-lm --dataset mozilla-foundation/common_voice_8_0 --config sk --split test
```
2. To evaluate on `speech-recognition-community-v2/dev_data`
```bash
python eval.py --model_id anuragshas/wav2vec2-xls-r-300m-sk-cv8-with-lm --dataset speech-recognition-community-v2/dev_data --config sk --split validation --chunk_length_s 5.0 --stride_length_s 1.0
```
### Inference With LM
```python
import torch
from datasets import load_dataset
from transformers import AutoModelForCTC, AutoProcessor
import torchaudio.functional as F
model_id = "anuragshas/wav2vec2-xls-r-300m-sk-cv8-with-lm"
sample_iter = iter(load_dataset("mozilla-foundation/common_voice_8_0", "sk", split="test", streaming=True, use_auth_token=True))
sample = next(sample_iter)
resampled_audio = F.resample(torch.tensor(sample["audio"]["array"]), 48_000, 16_000).numpy()
model = AutoModelForCTC.from_pretrained(model_id)
processor = AutoProcessor.from_pretrained(model_id)
input_values = processor(resampled_audio, sampling_rate=16_000, return_tensors="pt").input_values

with torch.no_grad():
    logits = model(input_values).logits

# batch_decode applies the bundled n-gram LM during decoding
transcription = processor.batch_decode(logits.numpy()).text
# => ""
```
### Eval results on Common Voice 8 "test" (WER):
| Without LM | With LM (run `./eval.py`) |
|---|---|
| 26.707 | 18.609 |
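As a point of reference, the word error rates above compare hypothesis transcripts against the Common Voice references; a minimal sketch of that computation is shown below. The `jiwer` dependency and the example strings are assumptions — the official numbers come from the repository's `eval.py`.
```python
# Hedged sketch: word error rate over reference/hypothesis pairs.
# jiwer is an assumed dependency (pip install jiwer); the card's numbers come from eval.py.
from jiwer import wer

references = ["toto je referenčný prepis"]  # ground-truth transcripts (placeholders)
hypotheses = ["toto je referencny prepis"]  # model outputs (placeholders)

print(f"WER: {100 * wer(references, hypotheses):.3f}%")
```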
|
{"language": ["sk"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "generated_from_trainer", "hf-asr-leaderboard", "mozilla-foundation/common_voice_8_0", "robust-speech-event"], "datasets": ["mozilla-foundation/common_voice_8_0"], "model-index": [{"name": "XLS-R-300M - Slovak", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 8", "type": "mozilla-foundation/common_voice_8_0", "args": "sk"}, "metrics": [{"type": "wer", "value": 18.609, "name": "Test WER"}, {"type": "cer", "value": 5.488, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "sk"}, "metrics": [{"type": "wer", "value": 40.548, "name": "Test WER"}, {"type": "cer", "value": 17.733, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Test Data", "type": "speech-recognition-community-v2/eval_data", "args": "sk"}, "metrics": [{"type": "wer", "value": 44.1, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-xls-r-300m-sk-cv8-with-lm
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"hf-asr-leaderboard",
"mozilla-foundation/common_voice_8_0",
"robust-speech-event",
"sk",
"dataset:mozilla-foundation/common_voice_8_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"sk"
] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #sk #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
XLS-R-300M - Slovak
===================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the MOZILLA-FOUNDATION/COMMON\_VOICE\_8\_0 - SK dataset.
It achieves the following results on the evaluation set:
* Loss: 0.3067
* Wer: 0.2678
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7.5e-05
* train\_batch\_size: 32
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1500
* num\_epochs: 60.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2+cu102
* Datasets 1.18.4.dev0
* Tokenizers 0.11.0
#### Evaluation Commands
1. To evaluate on 'mozilla-foundation/common\_voice\_8\_0' with split 'test'
2. To evaluate on 'speech-recognition-community-v2/dev\_data'
### Inference With LM
### Eval results on Common Voice 8 "test" (WER):
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1500\n* num\\_epochs: 60.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'",
"### Inference With LM",
"### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #sk #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1500\n* num\\_epochs: 60.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'",
"### Inference With LM",
"### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
111,
132,
4,
39,
60,
8,
15
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #sk #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1500\n* num\\_epochs: 60.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'### Inference With LM### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
-0.10816436260938644,
0.14327478408813477,
-0.006550380494445562,
0.021264344453811646,
0.10753609985113144,
0.033748965710401535,
0.10501787066459656,
0.16457761824131012,
-0.03809376806020737,
0.14146360754966736,
0.05372905358672142,
0.1040072962641716,
0.08695638924837112,
0.09018045663833618,
-0.035453490912914276,
-0.20592732727527618,
0.037694986909627914,
-0.07138730585575104,
-0.08748101443052292,
0.10155938565731049,
0.09052729606628418,
-0.08268935978412628,
0.029947755858302116,
-0.0071204109117388725,
-0.06036657467484474,
-0.0129569536074996,
-0.03128877282142639,
-0.04126257821917534,
0.06070004031062126,
0.038380324840545654,
0.024432050064206123,
0.030596911907196045,
0.04879746213555336,
-0.30875521898269653,
0.002407362684607506,
0.07618803530931473,
0.02372489683330059,
0.04262789338827133,
0.0999661460518837,
-0.027194540947675705,
0.09961491823196411,
-0.09208155423402786,
0.04370609670877457,
0.08230654895305634,
-0.08162329345941544,
-0.22675003111362457,
-0.10839734226465225,
0.03410034254193306,
0.15300138294696808,
0.07891538739204407,
-0.046323154121637344,
0.03510568290948868,
-0.09189128875732422,
0.08985871821641922,
0.19411879777908325,
-0.2079850733280182,
-0.04827577993273735,
-0.0003654724278021604,
0.014361345209181309,
0.0329640693962574,
-0.0917782187461853,
-0.011177482083439827,
0.005616811104118824,
0.008345787413418293,
0.04320913925766945,
-0.00426848791539669,
0.05082527548074722,
0.007256554439663887,
-0.14690318703651428,
-0.07987884432077408,
0.1398775279521942,
0.06429778784513474,
-0.03875588998198509,
-0.1181548610329628,
-0.010603467933833599,
-0.16874264180660248,
-0.04580536112189293,
0.009772700257599354,
0.01272146962583065,
-0.023879803717136383,
-0.004481592681258917,
0.027567824348807335,
-0.05785878375172615,
-0.07128416001796722,
0.06210998073220253,
0.11944152414798737,
0.047177258878946304,
-0.04293948784470558,
0.01575058326125145,
0.08253896236419678,
0.0357646718621254,
-0.15523967146873474,
-0.052039217203855515,
0.03648992255330086,
-0.1421133279800415,
-0.011224043555557728,
-0.026916945353150368,
-0.003796699456870556,
0.09099997580051422,
0.1922444850206375,
0.004273276310414076,
0.10271056741476059,
-0.015690598636865616,
0.008182784542441368,
-0.04769013449549675,
0.1477481722831726,
-0.0328923724591732,
-0.07937254756689072,
-0.03750275820493698,
0.12157992273569107,
-0.00416428642347455,
-0.013518569059669971,
-0.030926862731575966,
0.03462269902229309,
0.11110994964838028,
0.09818670153617859,
0.026343513280153275,
0.006740346550941467,
-0.08464431762695312,
-0.024821976199746132,
-0.003301840741187334,
-0.14937949180603027,
0.05896289646625519,
0.08043275028467178,
-0.043272923678159714,
-0.026654241606593132,
-0.013125159777700901,
0.016913918778300285,
-0.06197226792573929,
0.10004189610481262,
-0.04803766682744026,
0.0005628169164992869,
-0.06298015266656876,
-0.09850164502859116,
0.05633768066763878,
-0.008471948094666004,
-0.03549797087907791,
-0.05404195934534073,
-0.0714525356888771,
-0.08991280943155289,
0.04384739324450493,
-0.05993727967143059,
-0.05128102749586105,
-0.077788345515728,
-0.09876342117786407,
0.05178862810134888,
-0.009632746689021587,
0.13240478932857513,
-0.05809900164604187,
0.08615018427371979,
0.021486185491085052,
0.03651779890060425,
0.13136205077171326,
0.06615882366895676,
-0.014798221178352833,
0.05945996195077896,
-0.1439119428396225,
0.12663845717906952,
-0.14205357432365417,
0.04647337645292282,
-0.1616877168416977,
-0.08379259705543518,
0.010610071010887623,
0.0007814185810275376,
0.0896858498454094,
0.15059223771095276,
-0.18375034630298615,
-0.07106097787618637,
0.15234586596488953,
-0.06314761936664581,
-0.09050066024065018,
0.14485478401184082,
0.0009181774221360683,
-0.05019361153244972,
0.016723796725273132,
0.16690179705619812,
0.13568270206451416,
-0.11434166133403778,
-0.02921033836901188,
-0.06833788007497787,
0.07372939586639404,
0.061571624130010605,
0.10824616253376007,
-0.07849162817001343,
0.02436196617782116,
-0.0072389086708426476,
-0.06087744981050491,
0.013130285777151585,
-0.0619468055665493,
-0.08233392238616943,
-0.0027487012557685375,
-0.044860560446977615,
-0.01461096853017807,
0.020227007567882538,
-0.0315081812441349,
-0.08491268008947372,
-0.1281798779964447,
-0.03477273881435394,
0.10435285419225693,
-0.08359707146883011,
0.028776023536920547,
-0.09638708084821701,
0.07143282145261765,
0.001706954906694591,
0.02755332551896572,
-0.14827625453472137,
-0.032846275717020035,
0.043634865432977676,
-0.07301952689886093,
0.005743789952248335,
-0.04691905900835991,
0.03680165484547615,
0.02120579406619072,
-0.0096511859446764,
-0.0504433810710907,
-0.03931586816906929,
-0.010035013779997826,
-0.04718510061502457,
-0.2206137776374817,
-0.0632183700799942,
-0.018734579905867577,
0.20476050674915314,
-0.1884951889514923,
0.025540851056575775,
0.10875961184501648,
0.10128825902938843,
0.004889394622296095,
-0.050680097192525864,
0.019203737378120422,
0.055589400231838226,
-0.019109230488538742,
-0.05334543436765671,
0.013769385404884815,
-0.005584393627941608,
-0.08685839176177979,
-0.014959841966629028,
-0.14196133613586426,
0.011277157813310623,
0.08086689561605453,
0.04284222796559334,
-0.058227844536304474,
-0.04529241845011711,
-0.06004621833562851,
-0.04529089853167534,
-0.05077647045254707,
-0.05053858831524849,
0.09673311561346054,
0.05054429545998573,
0.08453728258609772,
-0.06578255444765091,
-0.05473250523209572,
0.029321908950805664,
0.007658248767256737,
-0.019226737320423126,
0.15665464103221893,
0.06207229942083359,
-0.034486860036849976,
0.08363959193229675,
0.02177361771464348,
-0.035625241696834564,
0.09870035201311111,
-0.06528646498918533,
-0.0889657512307167,
-0.05750727280974388,
0.07406267523765564,
0.04612693563103676,
0.07342776656150818,
-0.18157371878623962,
-0.006638627033680677,
0.03739531710743904,
0.03590543940663338,
0.021397367119789124,
-0.16651083528995514,
0.01977030374109745,
0.026757633313536644,
-0.0893736407160759,
-0.024848952889442444,
0.024871056899428368,
-0.001334444503299892,
0.07124034315347672,
0.010537118650972843,
-0.03874645009636879,
-0.032457590103149414,
-0.0638633444905281,
-0.1262868046760559,
0.1525879055261612,
-0.11059807986021042,
-0.149014413356781,
-0.12845420837402344,
-0.03304881602525711,
-0.041166502982378006,
-0.02962169237434864,
0.07211969792842865,
-0.10172028094530106,
-0.06138105317950249,
-0.07701926678419113,
-0.007342191878706217,
-0.02058780938386917,
0.013247580267488956,
0.041953589767217636,
0.01243077777326107,
0.05156407505273819,
-0.10851060599088669,
-0.009506098926067352,
-0.001032804837450385,
-0.014631201513111591,
0.0024404539726674557,
0.05494895577430725,
0.09633778780698776,
0.1609879732131958,
0.06407611072063446,
0.06155324727296829,
-0.017850419506430626,
0.23349536955356598,
-0.11767903715372086,
-0.007549519184976816,
0.10439223051071167,
-0.003061117837205529,
0.0657535046339035,
0.16136518120765686,
0.031545694917440414,
-0.08594508469104767,
0.020625095814466476,
0.0575895793735981,
-0.006644223816692829,
-0.26059800386428833,
-0.03649888187646866,
-0.07769251614809036,
-0.008826623670756817,
0.07771729677915573,
0.04463962838053703,
0.021507548168301582,
-0.005066693760454655,
-0.021060163155198097,
-0.02595360577106476,
0.05352550745010376,
0.0664062425494194,
0.10351042449474335,
0.03862554207444191,
0.08732975274324417,
-0.022236844524741173,
0.0014448725851252675,
0.023969324305653572,
-0.0031589537393301725,
0.21753862500190735,
0.013040355406701565,
0.1990502029657364,
0.07902435213327408,
0.13334976136684418,
-0.019236022606492043,
0.02899564616382122,
-0.005133918486535549,
0.022634893655776978,
0.042574137449264526,
-0.07077261060476303,
-0.05174574628472328,
0.04110604152083397,
0.12413066625595093,
-0.012260224670171738,
-0.08275388926267624,
0.03288546949625015,
0.05613727122545242,
0.3131828010082245,
0.07207076251506805,
-0.2218226194381714,
-0.05345100164413452,
0.026433059945702553,
-0.06390084326267242,
-0.016941452398896217,
0.005232612136751413,
0.10449367761611938,
-0.08244550228118896,
0.07046663016080856,
-0.04830936715006828,
0.09122247993946075,
-0.060802049934864044,
0.004666280001401901,
0.06830503046512604,
0.10642354935407639,
0.014100651256740093,
0.05944746732711792,
-0.2569601535797119,
0.21149951219558716,
0.0013232867931947112,
0.07286082208156586,
-0.06453953683376312,
0.05929094925522804,
0.02828568033874035,
-0.05436704307794571,
0.10094287246465683,
0.0024163208436220884,
-0.106058768928051,
-0.1571177989244461,
-0.10575320571660995,
0.004114694893360138,
0.13244162499904633,
-0.07236441224813461,
0.13089841604232788,
-0.03514096140861511,
-0.06059068813920021,
0.011616511270403862,
-0.015124783851206303,
-0.12843693792819977,
-0.10203547775745392,
0.07773193717002869,
0.014208239503204823,
0.06962697952985764,
-0.07847968488931656,
-0.06983061879873276,
-0.07098998129367828,
0.14797040820121765,
-0.15246354043483734,
-0.029291031882166862,
-0.13369613885879517,
0.047165557742118835,
0.15124405920505524,
-0.0729864165186882,
0.022827399894595146,
0.013799956999719143,
0.12387953698635101,
0.029985032975673676,
0.0012838520342484117,
0.08120469003915787,
-0.08271746337413788,
-0.19283808767795563,
-0.03673076629638672,
0.19592833518981934,
0.014716166071593761,
0.06307054311037064,
-0.016038108617067337,
0.012393888086080551,
0.007141581270843744,
-0.09529489278793335,
0.08479674905538559,
0.06277093291282654,
-0.003525978419929743,
0.08062152564525604,
-0.045123763382434845,
-0.04975654557347298,
-0.11440484970808029,
-0.054194752126932144,
0.11592364311218262,
0.26200589537620544,
-0.06257636845111847,
0.04802659526467323,
0.02909206412732601,
-0.06474192440509796,
-0.137605682015419,
-0.02197316288948059,
0.10841626673936844,
0.02243478037416935,
-0.009625466540455818,
-0.1750539094209671,
0.007246070541441441,
0.08150516450405121,
-0.01278693601489067,
0.09310393035411835,
-0.3320632874965668,
-0.13152970373630524,
0.05496344342827797,
0.0595041923224926,
-0.03587374836206436,
-0.18135838210582733,
-0.0985599160194397,
-0.01698523387312889,
-0.10319369286298752,
0.058545731008052826,
-0.009209003299474716,
0.11454447358846664,
0.014709352515637875,
0.012130853720009327,
0.026070835068821907,
-0.05525944009423256,
0.15290160477161407,
0.06058332324028015,
0.015740487724542618,
-0.01192319393157959,
0.015034565702080727,
0.01419105101376772,
-0.07306040823459625,
0.04944799095392227,
-0.0732470452785492,
0.015983622521162033,
-0.1485210508108139,
-0.018647339195013046,
-0.07162842899560928,
-0.005523011088371277,
-0.06555898487567902,
0.002226337557658553,
-0.018535839393734932,
0.045428093522787094,
0.1142764687538147,
0.010568839497864246,
0.07688872516155243,
-0.054673708975315094,
0.08970732241868973,
0.14455963671207428,
0.10378355532884598,
0.016782676801085472,
-0.12681865692138672,
0.0038943334948271513,
0.005540103651583195,
0.01407761313021183,
-0.11763733625411987,
0.05738762766122818,
0.131135493516922,
0.04005621373653412,
0.17050954699516296,
0.033973000943660736,
-0.10294412076473236,
-0.008692803792655468,
0.06789946556091309,
-0.06310620903968811,
-0.18545885384082794,
-0.009927625767886639,
0.01002264954149723,
-0.13870196044445038,
-0.02443672902882099,
0.11847161501646042,
-0.010928497649729252,
0.004652355331927538,
0.021368995308876038,
0.07172197103500366,
-0.02984839864075184,
0.22151808440685272,
0.01360998023301363,
0.11620011180639267,
-0.09128235280513763,
0.07537858188152313,
0.05284871160984039,
-0.09826038777828217,
0.04457027092576027,
0.11567378044128418,
-0.058044709265232086,
-0.03536754101514816,
-0.03997305780649185,
0.07220520824193954,
0.06326816231012344,
-0.045785386115312576,
-0.10389959812164307,
-0.12735405564308167,
0.09708499908447266,
0.041866764426231384,
0.025284942239522934,
0.0367511622607708,
-0.0040650032460689545,
0.027506127953529358,
-0.08803647011518478,
0.11868035048246384,
0.09331773966550827,
0.04231937974691391,
-0.10541820526123047,
0.06918070465326309,
0.0012736094649881124,
0.009326508268713951,
0.016501672565937042,
-0.017220035195350647,
-0.10349385440349579,
0.029718317091464996,
-0.09264562278985977,
-0.009793462231755257,
-0.07750190049409866,
-0.007210155017673969,
0.03297162428498268,
-0.057922229170799255,
-0.050031647086143494,
0.027666324749588966,
-0.10916905105113983,
-0.08139587193727493,
-0.04442775249481201,
0.09518006443977356,
-0.12410333007574081,
-0.005819373764097691,
0.038663387298583984,
-0.15043756365776062,
0.10506666451692581,
0.04707428067922592,
0.00016773975221440196,
0.003339825663715601,
-0.06957631558179855,
-0.021149441599845886,
0.02646719664335251,
0.020490199327468872,
0.03370840102434158,
-0.22181910276412964,
-0.0022512830328196287,
-0.02119392342865467,
-0.0007120250375010073,
0.0016746275359764695,
0.02938251942396164,
-0.12224320322275162,
0.006320587825030088,
-0.041635096073150635,
-0.04710322991013527,
-0.05057869851589203,
0.041883692145347595,
0.08470463752746582,
0.01585778407752514,
0.16744065284729004,
-0.05414751172065735,
0.08223110437393188,
-0.2017214149236679,
-0.0010455991141498089,
-0.0007701926515437663,
-0.03281401842832565,
-0.02023538015782833,
-0.022842885926365852,
0.10117791593074799,
-0.05899190902709961,
0.04860169440507889,
-0.042666275054216385,
0.052337512373924255,
0.03216153755784035,
-0.06628577411174774,
0.007617639843374491,
0.04174492880702019,
0.12908339500427246,
0.05709667131304741,
-0.0217500701546669,
0.0641937181353569,
-0.05355435237288475,
0.04339519515633583,
0.04762847349047661,
0.15689565241336823,
0.1563979834318161,
0.11502739042043686,
0.06737256795167923,
0.08423880487680435,
-0.1360078752040863,
-0.1336330622434616,
0.15986000001430511,
-0.08729101717472076,
0.13545894622802734,
-0.04257786646485329,
0.17170438170433044,
0.09973152726888657,
-0.1995689570903778,
0.08314643055200577,
-0.027655430138111115,
-0.08402961492538452,
-0.09867813438177109,
-0.09774906188249588,
-0.07344091683626175,
-0.14321839809417725,
0.01039990782737732,
-0.099359892308712,
0.07842270284891129,
0.03028123825788498,
0.04532089829444885,
0.03867927938699722,
0.08727312833070755,
0.01051965169608593,
-0.008965798653662205,
0.10389368236064911,
-0.011455697007477283,
-0.023121438920497894,
0.000536169798579067,
-0.06827841699123383,
0.05831511691212654,
-0.016338353976607323,
0.10309188067913055,
0.012321466580033302,
-0.07847832143306732,
0.061947427690029144,
-0.014256919734179974,
-0.1043781116604805,
0.03199608251452446,
-0.023046059533953667,
0.02786368317902088,
0.11659480631351471,
0.05584218353033066,
-0.012984583154320717,
0.005316238384693861,
0.17668268084526062,
-0.07131258398294449,
-0.07235424220561981,
-0.14766095578670502,
0.1387258619070053,
0.02730604261159897,
0.014720150269567966,
0.015946120023727417,
-0.09581752121448517,
-0.02938438579440117,
0.16587696969509125,
0.11201494932174683,
-0.006373344920575619,
-0.015406789258122444,
0.04058556631207466,
-0.0018294805195182562,
-0.024939222261309624,
0.061008721590042114,
0.11126931011676788,
0.09720081090927124,
-0.01750904694199562,
0.010278945788741112,
-0.024146005511283875,
-0.08265513926744461,
-0.032447218894958496,
0.06908907741308212,
0.009265080094337463,
-0.009301096200942993,
-0.0026308775413781404,
0.12788806855678558,
-0.05888160318136215,
-0.164398193359375,
0.018615281209349632,
-0.14700646698474884,
-0.1775747537612915,
-0.021583670750260353,
0.07998920977115631,
0.03807784989476204,
0.05571558326482773,
0.003172246739268303,
-0.050455242395401,
0.15749762952327728,
-0.00744128692895174,
-0.026474684476852417,
-0.094009630382061,
0.06474854052066803,
-0.07511577010154724,
0.1641688346862793,
-0.01957014389336109,
0.0582951083779335,
0.14437608420848846,
0.037935417145490646,
-0.11176495254039764,
0.0452752560377121,
0.09406735748052597,
-0.13029424846172333,
0.05256699398159981,
0.18788690865039825,
-0.03318806737661362,
0.12678217887878418,
0.05006913095712662,
-0.08795417100191116,
0.003052614163607359,
-0.02574169635772705,
-0.00849533174186945,
-0.07885801047086716,
-0.003241542959585786,
-0.06616959720849991,
0.11149086803197861,
0.19350150227546692,
-0.06923193484544754,
0.013010171242058277,
-0.03944816067814827,
0.01570386439561844,
-0.0001942736707860604,
0.1370452493429184,
-0.04827200621366501,
-0.269437700510025,
0.052666399627923965,
-0.010455157607793808,
0.03410736843943596,
-0.1860520839691162,
-0.09021252393722534,
0.0419577993452549,
-0.03060043603181839,
-0.0645819902420044,
0.12126751244068146,
0.06364982575178146,
0.03490952402353287,
-0.04762066900730133,
-0.18192724883556366,
-0.011323394253849983,
0.19586871564388275,
-0.1587725728750229,
-0.056833695620298386
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# XLS-R-300M - Slovenian
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - SL dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2578
- Wer: 0.2273
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7.5e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 60.0
- mixed_precision_training: Native AMP
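For reference, the hyperparameters listed above map approximately onto `transformers.TrainingArguments`. The sketch below is a reconstruction under stated assumptions rather than the author's actual training script; the output directory is a placeholder.
```python
from transformers import TrainingArguments
# Hypothetical reconstruction of the run configuration above; "./xls-r-300m-sl" is a placeholder path.
training_args = TrainingArguments(
    output_dir="./xls-r-300m-sl",
    learning_rate=7.5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=60.0,
    fp16=True,  # "Native AMP" mixed-precision training
)
```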
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 3.1829 | 4.88 | 400 | 3.1228 | 1.0 |
| 2.8675 | 9.76 | 800 | 2.8616 | 0.9993 |
| 1.583 | 14.63 | 1200 | 0.6392 | 0.6239 |
| 1.1959 | 19.51 | 1600 | 0.3602 | 0.3651 |
| 1.0276 | 24.39 | 2000 | 0.3021 | 0.2981 |
| 0.9671 | 29.27 | 2400 | 0.2872 | 0.2739 |
| 0.873 | 34.15 | 2800 | 0.2593 | 0.2459 |
| 0.8513 | 39.02 | 3200 | 0.2617 | 0.2473 |
| 0.8132 | 43.9 | 3600 | 0.2548 | 0.2426 |
| 0.7935 | 48.78 | 4000 | 0.2637 | 0.2353 |
| 0.7565 | 53.66 | 4400 | 0.2629 | 0.2322 |
| 0.7359 | 58.54 | 4800 | 0.2579 | 0.2253 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2.dev0
- Tokenizers 0.11.0
#### Evaluation Commands
1. To evaluate on `mozilla-foundation/common_voice_8_0` with split `test`
```bash
python eval.py --model_id anuragshas/wav2vec2-xls-r-300m-sl-cv8-with-lm --dataset mozilla-foundation/common_voice_8_0 --config sl --split test
```
2. To evaluate on `speech-recognition-community-v2/dev_data`
```bash
python eval.py --model_id anuragshas/wav2vec2-xls-r-300m-sl-cv8-with-lm --dataset speech-recognition-community-v2/dev_data --config sl --split validation --chunk_length_s 5.0 --stride_length_s 1.0
```
### Inference With LM
```python
import torch
from datasets import load_dataset
from transformers import AutoModelForCTC, AutoProcessor
import torchaudio.functional as F
model_id = "anuragshas/wav2vec2-xls-r-300m-sl-cv8-with-lm"
sample_iter = iter(load_dataset("mozilla-foundation/common_voice_8_0", "sl", split="test", streaming=True, use_auth_token=True))
sample = next(sample_iter)
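# Resample the 48 kHz Common Voice audio to the 16 kHz rate the model expects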
resampled_audio = F.resample(torch.tensor(sample["audio"]["array"]), 48_000, 16_000).numpy()
model = AutoModelForCTC.from_pretrained(model_id)
processor = AutoProcessor.from_pretrained(model_id)
input_values = processor(resampled_audio, return_tensors="pt").input_values
with torch.no_grad():
logits = model(input_values).logits
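# batch_decode runs beam-search decoding with the bundled n-gram language model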
transcription = processor.batch_decode(logits.numpy()).text
# => "zmago je divje od letel s helikopterjem visoko vzrak"
```
### Eval results on Common Voice 8 "test" (WER):
| Without LM | With LM (run `./eval.py`) |
|---|---|
| 19.938 | 12.736 |
|
{"language": ["sl"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "generated_from_trainer", "hf-asr-leaderboard", "mozilla-foundation/common_voice_8_0", "robust-speech-event"], "datasets": ["mozilla-foundation/common_voice_8_0"], "model-index": [{"name": "XLS-R-300M - Slovenian", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 8", "type": "mozilla-foundation/common_voice_8_0", "args": "sl"}, "metrics": [{"type": "wer", "value": 12.736, "name": "Test WER"}, {"type": "cer", "value": 3.605, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "sl"}, "metrics": [{"type": "wer", "value": 45.587, "name": "Test WER"}, {"type": "cer", "value": 20.886, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Test Data", "type": "speech-recognition-community-v2/eval_data", "args": "sl"}, "metrics": [{"type": "wer", "value": 45.42, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-xls-r-300m-sl-cv8-with-lm
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"hf-asr-leaderboard",
"mozilla-foundation/common_voice_8_0",
"robust-speech-event",
"sl",
"dataset:mozilla-foundation/common_voice_8_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"sl"
] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #sl #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
XLS-R-300M - Slovenian
======================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the MOZILLA-FOUNDATION/COMMON\_VOICE\_8\_0 - SL dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2578
* Wer: 0.2273
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7.5e-05
* train\_batch\_size: 32
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1000
* num\_epochs: 60.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2+cu102
* Datasets 1.18.2.dev0
* Tokenizers 0.11.0
#### Evaluation Commands
1. To evaluate on 'mozilla-foundation/common\_voice\_8\_0' with split 'test'
2. To evaluate on 'speech-recognition-community-v2/dev\_data'
### Inference With LM
### Eval results on Common Voice 8 "test" (WER):
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 60.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'",
"### Inference With LM",
"### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #sl #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 60.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0",
"#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'",
"### Inference With LM",
"### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
111,
132,
4,
39,
60,
8,
15
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #sl #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 60.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0#### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8\\_0' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'### Inference With LM### Eval results on Common Voice 8 \"test\" (WER):"
] |
[
-0.10429129749536514,
0.14857910573482513,
-0.0063051762990653515,
0.02102547511458397,
0.10683080554008484,
0.030841441825032234,
0.10758494585752487,
0.16390392184257507,
-0.036446504294872284,
0.14295056462287903,
0.056946150958538055,
0.10918714106082916,
0.08740437030792236,
0.09282730519771576,
-0.03130931779742241,
-0.21205128729343414,
0.03941049054265022,
-0.0695280134677887,
-0.07229981571435928,
0.10237176716327667,
0.08948121964931488,
-0.08400921523571014,
0.028834648430347443,
-0.011142304167151451,
-0.05768518149852753,
-0.011506600305438042,
-0.03146427869796753,
-0.043801844120025635,
0.06333949416875839,
0.029082929715514183,
0.0226435624063015,
0.02764551341533661,
0.05839928239583969,
-0.30659785866737366,
-0.0013251530472189188,
0.07829150557518005,
0.02682526223361492,
0.04130570590496063,
0.09629282355308533,
-0.027261069044470787,
0.10387521237134933,
-0.09604962915182114,
0.04560982435941696,
0.07859121263027191,
-0.08272922784090042,
-0.23475492000579834,
-0.10921657830476761,
0.031639739871025085,
0.15367259085178375,
0.08067169785499573,
-0.04874849691987038,
0.04102169722318649,
-0.09239660203456879,
0.08999123424291611,
0.19304949045181274,
-0.21015861630439758,
-0.042453646659851074,
-0.0008110199123620987,
0.023306600749492645,
0.03233073279261589,
-0.09062023460865021,
-0.013679440133273602,
0.006719143595546484,
0.005012013018131256,
0.04344206303358078,
-0.003764516906812787,
0.045020632445812225,
0.0025107511319220066,
-0.14501622319221497,
-0.07951024919748306,
0.1404840350151062,
0.06524043530225754,
-0.03934873268008232,
-0.12126807123422623,
-0.01126482430845499,
-0.1777336299419403,
-0.042441386729478836,
0.009573247283697128,
0.0153468893840909,
-0.024289213120937347,
-0.01168715301901102,
0.03073745220899582,
-0.05626250430941582,
-0.07197536528110504,
0.06021725386381149,
0.11208751797676086,
0.04651838168501854,
-0.04261460527777672,
0.014323854818940163,
0.08746377378702164,
0.02990584634244442,
-0.15541641414165497,
-0.051331404596567154,
0.0392395444214344,
-0.14134222269058228,
-0.008117906749248505,
-0.02564724162220955,
0.007522217463701963,
0.08953472971916199,
0.1918727606534958,
0.017253397032618523,
0.10375645011663437,
-0.011929702945053577,
0.008667855523526669,
-0.047266583889722824,
0.15356230735778809,
-0.02916746959090233,
-0.09203165024518967,
-0.03874152898788452,
0.12043420970439911,
-0.00047722013550810516,
-0.012700691819190979,
-0.03247345983982086,
0.03337806090712547,
0.11243660002946854,
0.09764082729816437,
0.026815008372068405,
0.009385796263813972,
-0.08549288660287857,
-0.017572544515132904,
-0.019482165575027466,
-0.15050923824310303,
0.06260369718074799,
0.07932382076978683,
-0.047248631715774536,
-0.018903831019997597,
-0.01758691295981407,
0.01565510593354702,
-0.06371215730905533,
0.09587932378053665,
-0.0474855937063694,
-0.0012296142522245646,
-0.06880491971969604,
-0.09956303983926773,
0.05721307173371315,
-0.02090407907962799,
-0.03598856180906296,
-0.04956590756773949,
-0.07323917746543884,
-0.0918993353843689,
0.04062280058860779,
-0.05517937242984772,
-0.0475638210773468,
-0.0855892151594162,
-0.09871082007884979,
0.04909048229455948,
-0.011380873620510101,
0.1347873955965042,
-0.05898825451731682,
0.08548971265554428,
0.02058268152177334,
0.04035584256052971,
0.12636925280094147,
0.06496783345937729,
-0.018574871122837067,
0.06138220429420471,
-0.13072341680526733,
0.12306823581457138,
-0.13829734921455383,
0.0461643822491169,
-0.16206537187099457,
-0.08343177288770676,
0.011916408315300941,
0.0003772734198719263,
0.09213080257177353,
0.1473417580127716,
-0.19152720272541046,
-0.06865799427032471,
0.1585642248392105,
-0.05744457617402077,
-0.08317197114229202,
0.1424427628517151,
0.0009529680828563869,
-0.04914671927690506,
0.025592315942049026,
0.16534428298473358,
0.1317819058895111,
-0.10564735531806946,
-0.03237728402018547,
-0.06791027635335922,
0.07310830056667328,
0.05949140340089798,
0.10454278439283371,
-0.07877711206674576,
0.03243603929877281,
-0.005542123690247536,
-0.05421256646513939,
0.00257835048250854,
-0.0605902336537838,
-0.07918746024370193,
-0.003511768067255616,
-0.039400018751621246,
-0.016184944659471512,
0.020278947427868843,
-0.03658482804894447,
-0.0877716988325119,
-0.12876659631729126,
-0.04095309227705002,
0.10315688699483871,
-0.08588523417711258,
0.029156018048524857,
-0.09734099358320236,
0.07217586040496826,
0.008071843534708023,
0.02688576839864254,
-0.1476711481809616,
-0.033012229949235916,
0.04631395637989044,
-0.07261212915182114,
0.005066387355327606,
-0.04277893155813217,
0.03480183333158493,
0.020798038691282272,
-0.0046487851068377495,
-0.0505993627011776,
-0.04033692553639412,
-0.01076615322381258,
-0.04980788007378578,
-0.21595552563667297,
-0.06469141691923141,
-0.02348119392991066,
0.20540586113929749,
-0.19263510406017303,
0.0231939647346735,
0.10925095528364182,
0.09798552095890045,
0.004672004841268063,
-0.05331943929195404,
0.01408584974706173,
0.055167026817798615,
-0.020588787272572517,
-0.05720064043998718,
0.012051442638039589,
-0.008515968918800354,
-0.09335964173078537,
-0.018449805676937103,
-0.15109749138355255,
0.0030311206355690956,
0.08317702263593674,
0.04340154677629471,
-0.05882711708545685,
-0.04268420487642288,
-0.060565799474716187,
-0.043273668736219406,
-0.05340481922030449,
-0.05313059687614441,
0.10263504087924957,
0.05124886706471443,
0.08228778839111328,
-0.0653812363743782,
-0.05919835716485977,
0.02732820436358452,
0.006022376473993063,
-0.017207982018589973,
0.1562020480632782,
0.06121309846639633,
-0.03921987861394882,
0.08513253927230835,
0.027297858148813248,
-0.03867622837424278,
0.09754586219787598,
-0.06441433727741241,
-0.09140681475400925,
-0.05909351259469986,
0.0712788850069046,
0.04219185560941696,
0.07195557653903961,
-0.18474340438842773,
-0.011190366931259632,
0.03713184967637062,
0.036780983209609985,
0.01930556260049343,
-0.17198747396469116,
0.02366054430603981,
0.024757463485002518,
-0.0886794850230217,
-0.03055562637746334,
0.019766880199313164,
-0.0003957872686441988,
0.07295819371938705,
0.011450580321252346,
-0.043908100575208664,
-0.033461298793554306,
-0.06428631395101547,
-0.1258462369441986,
0.1545393019914627,
-0.10555421561002731,
-0.14237649738788605,
-0.12111245840787888,
-0.03491585701704025,
-0.0351390577852726,
-0.027173887938261032,
0.06881709396839142,
-0.09828893095254898,
-0.0625256896018982,
-0.08115997910499573,
-0.01062022801488638,
-0.02406829036772251,
0.011935041286051273,
0.042385756969451904,
0.015864796936511993,
0.047295067459344864,
-0.10913338512182236,
-0.013468149118125439,
-0.0024399796966463327,
-0.020231788977980614,
0.0007661849376745522,
0.05413364619016647,
0.09072214365005493,
0.15983161330223083,
0.0625406876206398,
0.06267113238573074,
-0.017586296424269676,
0.22145476937294006,
-0.11804406344890594,
-0.0052515314891934395,
0.1025959849357605,
-0.001795428921468556,
0.06670839339494705,
0.15899188816547394,
0.0280145313590765,
-0.08870507031679153,
0.021370045840740204,
0.05583812668919563,
-0.009085210971534252,
-0.25794288516044617,
-0.03365662321448326,
-0.07616128772497177,
-0.008166322484612465,
0.08109628409147263,
0.04187212884426117,
0.013108506798744202,
-0.0012404834851622581,
-0.019696727395057678,
-0.021316103637218475,
0.05511164292693138,
0.0676111951470375,
0.09464136511087418,
0.035094909369945526,
0.08843401074409485,
-0.02332509122788906,
-0.0030861159320920706,
0.02170545607805252,
-0.005610177293419838,
0.22085556387901306,
0.018784476444125175,
0.19824329018592834,
0.07789793610572815,
0.13383744657039642,
-0.022017039358615875,
0.03435712680220604,
0.0005355622852221131,
0.02275739796459675,
0.039444614201784134,
-0.06980791687965393,
-0.051450952887535095,
0.0400916151702404,
0.1280640810728073,
-0.011874104849994183,
-0.08090876787900925,
0.02773372270166874,
0.051970645785331726,
0.30764058232307434,
0.08359412848949432,
-0.2376822978258133,
-0.04640978202223778,
0.03012857213616371,
-0.06599748134613037,
-0.01941000111401081,
0.006641177460551262,
0.10140924155712128,
-0.08524250984191895,
0.07098058611154556,
-0.044618766754865646,
0.08953149616718292,
-0.071226567029953,
0.004259323701262474,
0.0646761953830719,
0.10290440171957016,
0.01632804609835148,
0.060702335089445114,
-0.252523809671402,
0.2127349078655243,
-0.0013755043037235737,
0.07500682026147842,
-0.069740891456604,
0.05938192829489708,
0.024284007027745247,
-0.0631137415766716,
0.1026214137673378,
0.00004374502532300539,
-0.11234039813280106,
-0.1599346101284027,
-0.10455041378736496,
0.0031022406183183193,
0.13959184288978577,
-0.072114959359169,
0.13751806318759918,
-0.03735814243555069,
-0.05734533071517944,
0.011041907593607903,
-0.015766369178891182,
-0.12588943541049957,
-0.10084360092878342,
0.07545692473649979,
0.0232180617749691,
0.07611540704965591,
-0.07626654952764511,
-0.06784316152334213,
-0.06264309585094452,
0.14751888811588287,
-0.15705455839633942,
-0.02696244791150093,
-0.13128797709941864,
0.04926592484116554,
0.15543238818645477,
-0.07004068791866302,
0.022099975496530533,
0.01591564156115055,
0.12088997662067413,
0.03292805701494217,
0.0026931348256766796,
0.08152779191732407,
-0.07935583591461182,
-0.19768795371055603,
-0.03845161572098732,
0.20010073482990265,
0.016318481415510178,
0.0632096603512764,
-0.011526728980243206,
0.014400461688637733,
0.0067327553406357765,
-0.09425251930952072,
0.08448279649019241,
0.07494183629751205,
0.0023620042484253645,
0.08057244122028351,
-0.047037139534950256,
-0.046890683472156525,
-0.11225142329931259,
-0.050047989934682846,
0.1212184950709343,
0.2601091265678406,
-0.06082411855459213,
0.045477643609046936,
0.03149885684251785,
-0.06688907742500305,
-0.13696949183940887,
-0.02126213349401951,
0.10699723660945892,
0.025701817125082016,
-0.014170807786285877,
-0.17049802839756012,
0.007630861829966307,
0.082243412733078,
-0.011624343693256378,
0.10042472928762436,
-0.3380735516548157,
-0.1305507868528366,
0.05643325299024582,
0.05566902086138725,
-0.03441784903407097,
-0.1748083233833313,
-0.09675872325897217,
-0.027046332135796547,
-0.09126903116703033,
0.06360379606485367,
-0.013764475472271442,
0.12006320059299469,
0.016431637108325958,
0.019591744989156723,
0.02689298428595066,
-0.056502293795347214,
0.1488185077905655,
0.06245231255888939,
0.016509689390659332,
-0.017558759078383446,
0.01001790165901184,
0.022980231791734695,
-0.07212022691965103,
0.05297354236245155,
-0.0748441070318222,
0.013103517703711987,
-0.14901509881019592,
-0.021159393712878227,
-0.07028692215681076,
-0.004348617047071457,
-0.06598110496997833,
0.00010320015280740336,
-0.020443560555577278,
0.04263864830136299,
0.115170918405056,
0.007335060741752386,
0.07084253430366516,
-0.05108487606048584,
0.0852155089378357,
0.13834281265735626,
0.10757777839899063,
0.0260535329580307,
-0.12352078408002853,
0.006268983706831932,
0.006569421850144863,
0.013127229176461697,
-0.11119237542152405,
0.05242108553647995,
0.13315708935260773,
0.04159124568104744,
0.1683841347694397,
0.03849947452545166,
-0.10533389449119568,
-0.00715535506606102,
0.06675992161035538,
-0.06052550673484802,
-0.1834857165813446,
-0.002920177299529314,
0.008101237006485462,
-0.1376754641532898,
-0.017421701923012733,
0.11785197257995605,
-0.013495951890945435,
0.005560860503464937,
0.020091982558369637,
0.07194197922945023,
-0.03436797857284546,
0.22239594161510468,
0.011556736193597317,
0.11443406343460083,
-0.09091104567050934,
0.07745809853076935,
0.050067540258169174,
-0.1003558561205864,
0.045405689626932144,
0.11582186818122864,
-0.05734613165259361,
-0.03317731246352196,
-0.037881214171648026,
0.06794513761997223,
0.07196113467216492,
-0.04615253955125809,
-0.10217740386724472,
-0.12794898450374603,
0.09866642951965332,
0.038615066558122635,
0.02387979067862034,
0.0396692268550396,
-0.007445765659213066,
0.026117319241166115,
-0.09102857857942581,
0.11350978165864944,
0.09756641089916229,
0.042175330221652985,
-0.10353944450616837,
0.0756344273686409,
0.00099472445435822,
0.004011479672044516,
0.017165588214993477,
-0.017781058326363564,
-0.10433676838874817,
0.03293865919113159,
-0.08112660050392151,
-0.0111826341599226,
-0.07683613896369934,
-0.011427353136241436,
0.032942600548267365,
-0.05557640269398689,
-0.048384714871644974,
0.02527177706360817,
-0.10870774835348129,
-0.08434157818555832,
-0.04520639404654503,
0.09296756237745285,
-0.12255945056676865,
-0.005071296822279692,
0.03739431872963905,
-0.1515577733516693,
0.10540050268173218,
0.04720894247293472,
-0.0014387447154149413,
0.0029209048952907324,
-0.06932516396045685,
-0.020493939518928528,
0.02891373634338379,
0.02155129425227642,
0.03318643942475319,
-0.22245149314403534,
-0.004150026477873325,
-0.021144554018974304,
0.004975985735654831,
0.0012676625046879053,
0.027640454471111298,
-0.12397237122058868,
-0.0010904341470450163,
-0.04258965700864792,
-0.05263889953494072,
-0.05218777060508728,
0.040452808141708374,
0.08183790743350983,
0.01622755639255047,
0.1743580400943756,
-0.052059151232242584,
0.0838434174656868,
-0.19925685226917267,
-0.001515882438980043,
0.002112935297191143,
-0.03284914791584015,
-0.01673860475420952,
-0.023681985214352608,
0.10481173545122147,
-0.05962108075618744,
0.050144366919994354,
-0.042839985340833664,
0.04911070689558983,
0.02819185145199299,
-0.06654169410467148,
0.00808958150446415,
0.042012158781290054,
0.1375758945941925,
0.05948888510465622,
-0.024861736223101616,
0.05869013071060181,
-0.05489484965801239,
0.04517757520079613,
0.04744337871670723,
0.1574297845363617,
0.15613602101802826,
0.11848297715187073,
0.07139106839895248,
0.08180369436740875,
-0.13352271914482117,
-0.1351545751094818,
0.15962353348731995,
-0.08797409385442734,
0.13660144805908203,
-0.045824967324733734,
0.17997658252716064,
0.09503283351659775,
-0.1957043707370758,
0.08223598450422287,
-0.02783525176346302,
-0.08563238382339478,
-0.09947473555803299,
-0.11006861180067062,
-0.07680804282426834,
-0.1342494934797287,
0.011692930944263935,
-0.09702637791633606,
0.0828934758901596,
0.03161045163869858,
0.04482657462358475,
0.03762475401163101,
0.08577261120080948,
0.004756670445203781,
-0.01353373657912016,
0.10736224800348282,
-0.007755878381431103,
-0.022056441754102707,
0.001706167939119041,
-0.07030981034040451,
0.05471855401992798,
-0.020095087587833405,
0.10597237199544907,
0.014916623011231422,
-0.08056856691837311,
0.05814697593450546,
-0.013305909931659698,
-0.10115420073270798,
0.029617641121149063,
-0.02343195304274559,
0.031134584918618202,
0.10891272872686386,
0.05548064783215523,
-0.01819278672337532,
0.004211283288896084,
0.1732279509305954,
-0.07665588706731796,
-0.07681949436664581,
-0.14035718142986298,
0.14444401860237122,
0.028913438320159912,
0.012644129805266857,
0.014894294552505016,
-0.09681714326143265,
-0.026864664629101753,
0.16150538623332977,
0.11395753920078278,
-0.002652339171618223,
-0.017225494608283043,
0.03750298172235489,
-0.0027955984696745872,
-0.025860948488116264,
0.06571502238512039,
0.1092100664973259,
0.099685437977314,
-0.01712265983223915,
0.013176634907722473,
-0.02262900210916996,
-0.08232980966567993,
-0.036697592586278915,
0.07670193910598755,
0.006158898584544659,
-0.009343872778117657,
-0.007848436012864113,
0.1295013576745987,
-0.057237785309553146,
-0.15860582888126373,
0.020758718252182007,
-0.1460404247045517,
-0.17733973264694214,
-0.01793219894170761,
0.08420738577842712,
0.04009260609745979,
0.05882451310753822,
0.004136965610086918,
-0.046797242015600204,
0.15169520676136017,
-0.007263448555022478,
-0.03221540525555611,
-0.08906012773513794,
0.05868200585246086,
-0.07467444241046906,
0.16822326183319092,
-0.02096613124012947,
0.06461440026760101,
0.14522124826908112,
0.03479413688182831,
-0.11157319694757462,
0.052584029734134674,
0.0932168960571289,
-0.12779821455478668,
0.056149814277887344,
0.18480651080608368,
-0.03374595195055008,
0.12237044423818588,
0.049140579998493195,
-0.08514156192541122,
0.006958144251257181,
-0.020490769296884537,
-0.007763300556689501,
-0.085255466401577,
-0.005284252110868692,
-0.06590769439935684,
0.11578258872032166,
0.1923215687274933,
-0.0718257874250412,
0.013961474411189556,
-0.04183464124798775,
0.013192116282880306,
-0.0017750466940924525,
0.12499840557575226,
-0.04721459746360779,
-0.2692827582359314,
0.0471382699906826,
-0.004513251595199108,
0.03358667343854904,
-0.18185119330883026,
-0.08740387111902237,
0.04235456883907318,
-0.03602873533964157,
-0.06776857376098633,
0.11812150478363037,
0.05931228771805763,
0.03432207554578781,
-0.048220571130514145,
-0.19276490807533264,
-0.009274511598050594,
0.1980644166469574,
-0.16472530364990234,
-0.06047298014163971
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Punjabi
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Punjabi using the [Common Voice](https://huggingface.co/datasets/common_voice).
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "pa-IN", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-xlsr-53-pa-in")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-xlsr-53-pa-in")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Punjabi test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "pa-IN", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-xlsr-53-pa-in")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-xlsr-53-pa-in")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“\%\‘\”\।\’\'\…]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Run inference over the preprocessed speech arrays and collect the greedy CTC predictions
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 58.05 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
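A hedged sketch of assembling that training split is shown below; the preprocessing (resampling to 16 kHz, punctuation stripping) is assumed to mirror the evaluation code above rather than being taken from the original training script.
```python
from datasets import load_dataset
# "train+validation" concatenates the two Common Voice Punjabi splits mentioned above.
train_dataset = load_dataset("common_voice", "pa-IN", split="train+validation")
# Audio is then read from train_dataset["path"] and resampled to 16 kHz, as in the snippets above.
print(train_dataset)
```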
|
{"language": "pa-IN", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Anurag Singh XLSR Wav2Vec2 Large 53 Punjabi", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice pa-IN", "type": "common_voice", "args": "pa-IN"}, "metrics": [{"type": "wer", "value": 58.05, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-xlsr-53-pa-in
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"pa-IN"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-Large-XLSR-53-Punjabi
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Punjabi using the Common Voice.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Punjabi test data of Common Voice.
Test Result: 58.05 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Punjabi\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Punjabi using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Punjabi test data of Common Voice.\n\nTest Result: 58.05 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Punjabi\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Punjabi using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Punjabi test data of Common Voice.\n\nTest Result: 58.05 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
82,
62,
20,
28,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n# Wav2Vec2-Large-XLSR-53-Punjabi\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Punjabi using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\nThe model can be used directly (without a language model) as follows:## Evaluation\nThe model can be evaluated as follows on the Punjabi test data of Common Voice.\n\nTest Result: 58.05 %## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.14348715543746948,
0.032597169280052185,
-0.0012996835866943002,
-0.018268929794430733,
0.06371895223855972,
-0.06871356815099716,
0.15803135931491852,
0.09872307628393173,
-0.003570184577256441,
0.001289453823119402,
0.02561793103814125,
0.05266820639371872,
0.055067479610443115,
0.09576582163572311,
-0.0031315514352172613,
-0.23100867867469788,
0.01082946453243494,
0.03128526359796524,
0.09213034808635712,
0.1310037523508072,
0.11224423348903656,
-0.07579927146434784,
-0.023420751094818115,
0.10658352822065353,
-0.17365969717502594,
0.03587770462036133,
0.01908642239868641,
-0.11059118062257767,
0.1228962168097496,
0.05070509389042854,
0.06941352784633636,
0.05752990022301674,
0.08607275038957596,
-0.19666628539562225,
0.030514657497406006,
0.020075051113963127,
0.04300302267074585,
0.02399340271949768,
0.07683337479829788,
0.03472163900732994,
0.08389709889888763,
0.11636638641357422,
-0.029636181890964508,
0.0631304532289505,
-0.044108156114816666,
-0.2077936977148056,
-0.004218338523060083,
0.06639803946018219,
0.08683360368013382,
0.11573009938001633,
-0.06914237886667252,
0.10330207645893097,
-0.12546111643314362,
0.09748297929763794,
0.07233957201242447,
-0.19544529914855957,
0.01029939390718937,
0.09214453399181366,
0.08442573994398117,
0.1088918074965477,
-0.07532685995101929,
0.007009814027696848,
0.0458793081343174,
0.021457655355334282,
0.006753746885806322,
-0.04286328703165054,
-0.19520162045955658,
-0.014023437164723873,
-0.12128166109323502,
-0.020162906497716904,
0.20083357393741608,
-0.026589790359139442,
-0.06759972870349884,
-0.12761861085891724,
-0.032348375767469406,
0.009430747479200363,
-0.014541168697178364,
-0.07493226230144501,
-0.006371071096509695,
0.04893213510513306,
0.016164759173989296,
-0.006502415519207716,
-0.10977654159069061,
-0.12219534814357758,
-0.002647342160344124,
0.09013603627681732,
0.04023924097418785,
0.011883583851158619,
-0.1376417875289917,
0.03966910019516945,
-0.09452622383832932,
-0.060835856944322586,
-0.01525146421045065,
0.03671140968799591,
-0.046734519302845,
0.022749392315745354,
-0.08597937226295471,
-0.1555587500333786,
0.013604387640953064,
-0.06462401896715164,
0.01443217322230339,
0.041011445224285126,
-0.03923213481903076,
0.0621979646384716,
0.06045800447463989,
0.08950908482074738,
-0.06983378529548645,
0.009872366674244404,
0.0249323733150959,
0.01315996889024973,
-0.027590395882725716,
-0.021206151694059372,
-0.04777337983250618,
-0.054709628224372864,
0.019731463864445686,
0.033791687339544296,
-0.061962250620126724,
0.017403583973646164,
-0.022413523867726326,
-0.029711144044995308,
0.046986937522888184,
-0.10137911140918732,
-0.04688400775194168,
0.0597391352057457,
0.02868558093905449,
0.10668159276247025,
0.03931337594985962,
0.054029062390327454,
-0.06351693719625473,
-0.08810533583164215,
-0.005370737053453922,
0.05905533954501152,
-0.03146098926663399,
-0.08708707988262177,
-0.01133604533970356,
0.014890152029693127,
-0.02498440258204937,
-0.09890224784612656,
-0.12106876820325851,
-0.05367996543645859,
-0.011201686225831509,
0.021278033033013344,
-0.056908730417490005,
-0.08219444006681442,
-0.00755461398512125,
-0.020762097090482712,
-0.08581894636154175,
0.0737437754869461,
-0.04021752253174782,
0.08958954364061356,
0.026682782918214798,
0.06706792116165161,
0.0529031865298748,
0.10124187916517258,
-0.060990866273641586,
-0.026162199676036835,
0.012777113355696201,
0.12799464166164398,
-0.039180006831884384,
-0.0788356214761734,
-0.09109363704919815,
-0.08200880885124207,
-0.021673621609807014,
0.0758894681930542,
0.03003341145813465,
0.0705118253827095,
-0.2485702633857727,
-0.07708514481782913,
0.1714557558298111,
-0.1327388733625412,
-0.012120205909013748,
0.18783621490001678,
-0.00968489795923233,
0.12965184450149536,
0.0993376150727272,
0.2650530934333801,
0.1243874803185463,
-0.16348512470722198,
0.047770533710718155,
0.03556006774306297,
0.0025352572556585073,
-0.056172315031290054,
0.04958876967430115,
-0.051550958305597305,
-0.03879197686910629,
0.03762548789381981,
-0.05310651287436485,
0.057213325053453445,
-0.0542605035007,
-0.044256433844566345,
-0.022056961432099342,
-0.10243427753448486,
0.06770633161067963,
0.04732265695929527,
0.0318598710000515,
-0.004542505368590355,
-0.02704709582030773,
0.08446741849184036,
0.13666406273841858,
-0.12747631967067719,
0.058794159442186356,
-0.11698269844055176,
0.0824868232011795,
-0.1345033347606659,
-0.03291066363453865,
-0.151510551571846,
0.20294089615345,
0.006211681291460991,
0.043911632150411606,
0.030507676303386688,
0.20748892426490784,
0.005205946043133736,
0.0034230072051286697,
-0.04052921384572983,
-0.01332249864935875,
0.039949774742126465,
-0.00476405443623662,
-0.07319483906030655,
-0.08002275973558426,
-0.036420147866010666,
-0.07106941193342209,
0.06270620971918106,
-0.1598445028066635,
-0.016744140535593033,
-0.0209759883582592,
0.009469615295529366,
-0.005222288891673088,
-0.009312677197158337,
0.08950013667345047,
0.09187910705804825,
0.0186730045825243,
0.027765363454818726,
0.04830078035593033,
0.012773574329912663,
-0.028697801753878593,
0.1775376945734024,
-0.143255814909935,
0.0030376531649380922,
0.07354975491762161,
-0.10815446823835373,
0.006456329021602869,
0.03783487528562546,
-0.021950548514723778,
-0.027173491194844246,
-0.05949879810214043,
-0.004173582419753075,
0.2842361330986023,
-0.03104187361896038,
0.13125678896903992,
-0.09894341230392456,
-0.007862798869609833,
0.008585735224187374,
-0.08488026261329651,
0.03488331288099289,
0.05570501461625099,
0.0064293392933905125,
0.05511121451854706,
0.04669579118490219,
-0.03875363990664482,
-0.0671577975153923,
0.21442170441150665,
-0.015937425196170807,
-0.08747701346874237,
0.026139188557863235,
-0.020751794800162315,
-0.06860748678445816,
0.04051189124584198,
-0.1588265746831894,
-0.05835069715976715,
0.04783734679222107,
0.05170207470655441,
0.06453549116849899,
-0.1338997334241867,
0.01932390406727791,
0.01016935147345066,
-0.11074048280715942,
-0.1638239324092865,
0.08928994089365005,
-0.03800733759999275,
0.04914965108036995,
-0.11542041599750519,
-0.02391582913696766,
0.026250451803207397,
-0.03415898233652115,
-0.18043255805969238,
0.12180225551128387,
-0.07053636759519577,
-0.21627341210842133,
-0.11422127485275269,
-0.017781106755137444,
0.03836023062467575,
0.010572940111160278,
0.08501338958740234,
-0.12786023318767548,
-0.03194202482700348,
-0.05044148489832878,
0.08778572827577591,
-0.0005400269292294979,
-0.03402696177363396,
-0.08957646042108536,
-0.0354112945497036,
0.0757339671254158,
-0.1605217456817627,
0.021398909389972687,
-0.036435194313526154,
-0.0338299386203289,
0.019688712432980537,
-0.0035095911007374525,
-0.020515145733952522,
0.16565531492233276,
0.015667902305722237,
0.0048436508513987064,
-0.013691440224647522,
0.17512807250022888,
-0.06810592114925385,
-0.015400970354676247,
0.21385470032691956,
-0.0029977718368172646,
-0.017011096701025963,
0.1149207130074501,
0.009984544478356838,
-0.060174740850925446,
-0.013125533238053322,
-0.015104173682630062,
-0.06139819324016571,
-0.2203853726387024,
-0.1159021183848381,
-0.07331152260303497,
-0.08969201892614365,
-0.037054263055324554,
0.010380858555436134,
0.04094201698899269,
0.029626203700900078,
-0.0036072498187422752,
-0.06571869552135468,
-0.0015283600660040975,
-0.021389907225966454,
0.10916195809841156,
-0.0215601809322834,
0.11203130334615707,
-0.05712497606873512,
-0.005418787710368633,
-0.00405884487554431,
0.05581730604171753,
0.15322302281856537,
0.05930479243397713,
0.051456600427627563,
0.11367160081863403,
0.16260972619056702,
0.1489957571029663,
0.0366312637925148,
-0.07508198916912079,
-0.03337430581450462,
0.015743792057037354,
-0.03346015512943268,
-0.05798790976405144,
0.03856464847922325,
0.15566067397594452,
-0.030312437564134598,
-0.02959461510181427,
-0.08473827689886093,
-0.009976875968277454,
0.18794403970241547,
0.06620129197835922,
-0.17722702026367188,
-0.06829030066728592,
-0.01616445556282997,
-0.1133507490158081,
-0.013590413145720959,
0.07216870784759521,
0.15943244099617004,
-0.10434161871671677,
-0.011751419864594936,
0.003076887922361493,
0.08742876350879669,
-0.012197410687804222,
0.024480927735567093,
-0.10745353251695633,
0.002483492949977517,
0.0057220845483243465,
0.07757353782653809,
-0.2567991018295288,
0.2286316156387329,
0.009695768356323242,
0.0959949865937233,
-0.03490524739027023,
-0.017439143732190132,
-0.004195182118564844,
0.06495501846075058,
0.06914330273866653,
0.014125138521194458,
0.041783884167671204,
-0.10782001167535782,
-0.04544855281710625,
0.05952644720673561,
-0.018845880404114723,
0.07713828980922699,
0.03875213861465454,
0.006230614148080349,
0.01738310046494007,
0.010240338742733002,
-0.022892305627465248,
-0.13697375357151031,
-0.03305305913090706,
0.021616853773593903,
0.14884231984615326,
0.10299708694219589,
-0.030679702758789062,
-0.09712082892656326,
-0.1726420521736145,
0.0297554899007082,
-0.0768786072731018,
-0.08696462959051132,
-0.05215636268258095,
-0.06016596406698227,
0.08917152136564255,
-0.050974566489458084,
0.011250076815485954,
0.08936882764101028,
0.09801585227251053,
-0.024676870554685593,
-0.03242988884449005,
0.025493962690234184,
-0.09493984282016754,
-0.09616348147392273,
-0.010762391611933708,
0.16265906393527985,
0.10982728749513626,
0.06340143829584122,
0.06200609728693962,
0.0026593846268951893,
-0.022832421585917473,
-0.03527344390749931,
-0.013108436018228531,
0.08210069686174393,
-0.10495256632566452,
-0.0019626514986157417,
0.010578968562185764,
-0.14453718066215515,
-0.07134471088647842,
-0.07520217448472977,
0.14056269824504852,
0.06796663999557495,
-0.03828583285212517,
0.195578932762146,
0.2437615990638733,
-0.09708336740732193,
-0.18621964752674103,
-0.048040952533483505,
0.08321470767259598,
0.11368396878242493,
0.016002777963876724,
-0.1468195915222168,
0.0828719511628151,
-0.007042820565402508,
-0.03541379049420357,
-0.08587910979986191,
-0.2674781084060669,
-0.13876065611839294,
0.1701032966375351,
-0.007260170299559832,
0.15229134261608124,
-0.02659609727561474,
-0.04394512251019478,
0.004985107108950615,
-0.01785067282617092,
-0.003571008797734976,
-0.08280547708272934,
0.10293560475111008,
0.000498770852573216,
0.1547836810350418,
0.0468796044588089,
-0.021875469014048576,
0.08471528440713882,
0.09334934502840042,
-0.012527313083410263,
-0.025620386004447937,
0.07911313325166702,
0.01218464132398367,
0.050945062190294266,
0.1266048550605774,
-0.0836053341627121,
0.03969144448637962,
-0.1373668760061264,
-0.08720656484365463,
-0.0927296057343483,
0.03232324495911598,
0.02053534798324108,
-0.04709627851843834,
0.016511183232069016,
-0.03788740187883377,
0.03520650789141655,
0.004291067365556955,
-0.060947664082050323,
-0.13985194265842438,
0.10882215201854706,
0.16779734194278717,
0.1853833645582199,
-0.03970853611826897,
-0.09769053012132645,
-0.052971940487623215,
-0.03228094428777695,
0.13284386694431305,
-0.1164364293217659,
0.018653981387615204,
0.05063627287745476,
0.0494094155728817,
0.135138601064682,
0.0018371808109804988,
-0.054160501807928085,
0.10701758414506912,
0.018367089331150055,
-0.010232750326395035,
-0.1388937085866928,
0.01448269747197628,
0.004530324600636959,
-0.029479246586561203,
0.013480530120432377,
0.09704108536243439,
-0.10779929906129837,
-0.030697021633386612,
-0.024993248283863068,
0.026423271745443344,
-0.12276513874530792,
0.20491814613342285,
0.0504424087703228,
0.07013718038797379,
-0.10639522224664688,
0.0014749558176845312,
0.0007985991542227566,
-0.03764783591032028,
0.021990269422531128,
-0.03169015422463417,
-0.08047231286764145,
-0.07572358846664429,
-0.031760551035404205,
0.055468108505010605,
0.04121597856283188,
-0.1122724786400795,
-0.03442716225981712,
-0.10719899088144302,
0.015443786978721619,
0.06919276714324951,
0.029373882338404655,
0.0035849171690642834,
-0.1300908327102661,
-0.05602050945162773,
-0.10003664344549179,
0.05249706655740738,
0.05853041261434555,
-0.009375115856528282,
-0.09647458046674728,
0.19787026941776276,
0.05535686016082764,
0.06618405878543854,
-0.05916617810726166,
-0.07082032412290573,
-0.006676316261291504,
0.08542165905237198,
-0.12652327120304108,
0.003131301375105977,
-0.06847456097602844,
0.0006655383622273803,
-0.010767257772386074,
-0.07362400740385056,
-0.006421965546905994,
0.07590299844741821,
-0.09461496025323868,
0.0751270204782486,
0.029844112694263458,
0.08586236089468002,
-0.052333008497953415,
0.026867952197790146,
0.039436545222997665,
-0.025467684492468834,
0.08033139258623123,
0.10942518711090088,
-0.1363057792186737,
0.10997767001390457,
-0.17675766348838806,
-0.04028872400522232,
0.03724163770675659,
0.07251982390880585,
-0.018469057977199554,
-0.09115048497915268,
0.029523100703954697,
0.09809761494398117,
0.040831249207258224,
-0.0038306841161102057,
0.12238410860300064,
-0.06220075860619545,
0.013044832274317741,
-0.11004715412855148,
0.017433449625968933,
-0.060786761343479156,
0.03178033232688904,
0.036832742393016815,
0.138312429189682,
0.13564826548099518,
-0.11599297821521759,
0.06439360976219177,
-0.11871954053640366,
0.010499781928956509,
-0.03910960257053375,
-0.01733025163412094,
-0.11106511950492859,
-0.08168329298496246,
0.05732966214418411,
-0.05444670096039772,
0.15231697261333466,
-0.0004003295034635812,
-0.0044321403838694096,
-0.02312603034079075,
-0.08372872322797775,
0.032745327800512314,
-0.03237023577094078,
0.2865004539489746,
0.04065795987844467,
0.04700227081775665,
-0.004842322785407305,
-0.004786364268511534,
-0.01553365308791399,
0.1261482834815979,
-0.052531931549310684,
0.18947532773017883,
-0.016940895467996597,
0.061579134315252304,
0.06474842131137848,
-0.08454154431819916,
-0.04897318407893181,
-0.026759356260299683,
-0.14764542877674103,
0.035903725773096085,
-0.05770062282681465,
0.17214460670948029,
0.16993184387683868,
-0.08016093075275421,
0.10017556697130203,
0.010326545685529709,
-0.08111926913261414,
-0.11779163777828217,
-0.10788972675800323,
-0.06211542710661888,
-0.15477415919303894,
0.030229980126023293,
-0.08583580702543259,
0.03764968737959862,
0.10630416125059128,
0.03311163932085037,
-0.019597092643380165,
0.1513165384531021,
0.01765788160264492,
-0.10486822575330734,
0.04416865110397339,
-0.09231311082839966,
0.020351385697722435,
-0.13463981449604034,
0.019574066624045372,
0.16325005888938904,
0.002189534017816186,
0.05202839896082878,
0.0058891670778393745,
-0.05363988131284714,
0.013008798472583294,
-0.09150455892086029,
-0.05929781496524811,
-0.009986605495214462,
-0.017244910821318626,
0.0769922211766243,
0.15954288840293884,
0.10613580793142319,
-0.06137235835194588,
-0.01657533086836338,
0.0712442472577095,
-0.02284800261259079,
-0.12931261956691742,
-0.12411972135305405,
0.1698453426361084,
0.031021570786833763,
0.006575748324394226,
0.02449677884578705,
-0.018214449286460876,
-0.02484181337058544,
0.22777961194515228,
0.19467084109783173,
0.018692193552851677,
0.03691445663571358,
-0.0227435864508152,
-0.013659022748470306,
-0.05080892890691757,
0.08230852335691452,
0.08523260802030563,
0.214151531457901,
-0.025650832802057266,
0.004945642314851284,
-0.1046433076262474,
-0.0599609799683094,
-0.022867893800139427,
0.06589411199092865,
-0.07520269602537155,
-0.1044173389673233,
0.003682684851810336,
0.1253838688135147,
-0.06545492261648178,
-0.07531992346048355,
-0.10760807991027832,
-0.057914867997169495,
-0.09046377241611481,
-0.013431437313556671,
-0.010487663559615612,
0.10271617025136948,
-0.021597612649202347,
-0.07346070557832718,
0.055459775030612946,
0.1582893282175064,
-0.002849745564162731,
-0.0260392464697361,
-0.043487101793289185,
0.08067238330841064,
-0.07808816432952881,
-0.0028071205597370863,
0.004335709847509861,
0.18772810697555542,
0.010444859974086285,
0.11009722948074341,
-0.008973183110356331,
0.18251089751720428,
0.002533776219934225,
-0.08922341465950012,
0.011488205753266811,
0.15059198439121246,
0.0032203691080212593,
0.12045937031507492,
0.010360954329371452,
-0.10585398972034454,
0.0370626337826252,
-0.13965648412704468,
-0.009137840941548347,
-0.0840679407119751,
0.04629591479897499,
-0.019404398277401924,
0.09449398517608643,
0.049077510833740234,
-0.06003270298242569,
-0.07209460437297821,
-0.059381600469350815,
0.051249273121356964,
-0.0022130545694381,
-0.07153857499361038,
-0.05673452466726303,
-0.2630535960197449,
0.0004216382803861052,
-0.09808336198329926,
-0.02303890697658062,
-0.21049554646015167,
-0.02930804342031479,
-0.015154431574046612,
-0.07954688370227814,
0.01352872047573328,
0.036162424832582474,
0.0881948247551918,
0.021821895614266396,
0.008362905122339725,
-0.022356942296028137,
0.03800266981124878,
0.1283496618270874,
-0.17182055115699768,
-0.0991690531373024
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-xlsr-53-rm-vallader-with-lm
This model is a fine-tuned version of [anuragshas/wav2vec2-large-xlsr-53-rm-vallader](https://huggingface.co/anuragshas/wav2vec2-large-xlsr-53-rm-vallader) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4552
- Wer: 0.3206
## Model description
More information needed
## Intended uses & limitations
More information needed
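Pending fuller documentation, a minimal inference sketch follows. It is an assumption-labelled example rather than the author's documented usage: the audio path is a placeholder, and decoding with the bundled language model (as the `-with-lm` name suggests) additionally requires `pyctcdecode` and `kenlm` to be installed.
```python
from transformers import pipeline
# "sample.wav" is a placeholder for any Romansh Vallader recording; the pipeline resamples file input internally.
asr = pipeline("automatic-speech-recognition", model="anuragshas/wav2vec2-xlsr-53-rm-vallader-with-lm")
print(asr("sample.wav")["text"])
```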
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7.5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.112
- num_epochs: 30
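These settings likewise correspond roughly to `transformers.TrainingArguments`; the sketch below is an assumption-labelled reconstruction (the output path is a placeholder), mainly showing how the effective batch size of 32 comes from 16 per-device samples times 2 gradient-accumulation steps.
```python
from transformers import TrainingArguments
# Hypothetical reconstruction of the run configuration above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="./wav2vec2-xlsr-53-rm-vallader-with-lm",
    learning_rate=7.5e-5,
    per_device_train_batch_size=16,  # 16 * 2 accumulation steps = effective batch size 32
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.112,  # warmup expressed as a fraction of total training steps
    num_train_epochs=30,
)
```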
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.2379 | 3.12 | 100 | 0.4041 | 0.3396 |
| 0.103 | 6.25 | 200 | 0.4400 | 0.3337 |
| 0.0664 | 9.38 | 300 | 0.4239 | 0.3315 |
| 0.0578 | 12.5 | 400 | 0.4303 | 0.3267 |
| 0.0446 | 15.62 | 500 | 0.4575 | 0.3274 |
| 0.041 | 18.75 | 600 | 0.4451 | 0.3223 |
| 0.0402 | 21.88 | 700 | 0.4507 | 0.3206 |
| 0.0374 | 25.0 | 800 | 0.4649 | 0.3208 |
| 0.0371 | 28.12 | 900 | 0.4552 | 0.3206 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.1
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["common_voice"], "model-index": [{"name": "wav2vec2-xlsr-53-rm-vallader-with-lm", "results": []}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-xlsr-53-rm-vallader-with-lm
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us
|
wav2vec2-xlsr-53-rm-vallader-with-lm
====================================
This model is a fine-tuned version of anuragshas/wav2vec2-large-xlsr-53-rm-vallader on the common\_voice dataset.
It achieves the following results on the evaluation set:
* Loss: 0.4552
* Wer: 0.3206
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7.5e-05
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.112
* num\_epochs: 30
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.18.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.112\n* num\\_epochs: 30",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.112\n* num\\_epochs: 30",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
65,
145,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.112\n* num\\_epochs: 30### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
-0.1425461620092392,
0.10209082067012787,
-0.0021139297168701887,
0.06789661943912506,
0.13499626517295837,
0.01641153357923031,
0.12181458622217178,
0.12922941148281097,
-0.11033757030963898,
0.08026584982872009,
0.10184770822525024,
0.09544564038515091,
0.03812205046415329,
0.10113634169101715,
-0.030435891821980476,
-0.28985798358917236,
-0.007294026203453541,
0.03396369516849518,
-0.12629255652427673,
0.12609177827835083,
0.09097573906183243,
-0.12954792380332947,
0.04946264624595642,
0.039182260632514954,
-0.17665618658065796,
-0.0005755992024205625,
-0.0012399291153997183,
-0.09367380291223526,
0.13177677989006042,
0.025699757039546967,
0.10169562697410583,
0.022808928042650223,
0.10101406276226044,
-0.18079335987567902,
0.00716287549585104,
0.05758645012974739,
0.035312578082084656,
0.10203724354505539,
0.08619845658540726,
-0.004172263666987419,
0.13113808631896973,
-0.06550730019807816,
0.06502778083086014,
0.04001684486865997,
-0.09285108745098114,
-0.2824729084968567,
-0.08680913597345352,
0.0819351002573967,
0.08892440795898438,
0.1014404296875,
-0.013483239337801933,
0.10644277930259705,
-0.07647814601659775,
0.09118065237998962,
0.26675283908843994,
-0.27937281131744385,
-0.07106150686740875,
-0.0210383590310812,
0.037314511835575104,
0.014814935624599457,
-0.12352457642555237,
-0.02137742005288601,
0.04212958738207817,
0.03883785381913185,
0.10073137283325195,
0.001269679982215166,
-0.04760028421878815,
0.011664153076708317,
-0.1409321427345276,
-0.04636412113904953,
0.12776251137256622,
0.04451853036880493,
-0.039729367941617966,
-0.0749029740691185,
-0.053031690418720245,
-0.22489191591739655,
-0.035866156220436096,
0.0009233571472577751,
0.03042607381939888,
-0.07704547792673111,
-0.12851941585540771,
0.005774674471467733,
-0.08675272017717361,
-0.10246725380420685,
-0.013456516899168491,
0.2038225680589676,
0.05088334530591965,
0.004688617307692766,
-0.02079569362103939,
0.11896748840808868,
0.040534213185310364,
-0.15569236874580383,
0.018948664888739586,
0.05424141138792038,
-0.04733693227171898,
-0.0007779102888889611,
-0.05144490301609039,
-0.03294842317700386,
0.00786240678280592,
0.1423882246017456,
-0.06867542117834091,
0.03674599900841713,
0.024315379559993744,
0.030623963102698326,
-0.0949840247631073,
0.21335899829864502,
-0.08552659302949905,
-0.03279823809862137,
-0.016809778288006783,
0.09755702316761017,
0.026966769248247147,
-0.012903195805847645,
-0.0918956845998764,
0.0030672522261738777,
0.1100609302520752,
0.03483567014336586,
-0.02262263558804989,
0.03150958567857742,
-0.03214055299758911,
-0.03163047134876251,
0.027076423168182373,
-0.10290909558534622,
0.025965582579374313,
0.021916121244430542,
-0.09754128754138947,
0.018005792051553726,
0.010357017628848553,
0.007222272921353579,
-0.025434842333197594,
0.1575477570295334,
-0.08707519620656967,
0.0028590557631105185,
-0.0786978155374527,
-0.09521510452032089,
0.023940233513712883,
-0.07280623912811279,
0.0009545851498842239,
-0.07453653216362,
-0.11588646471500397,
-0.021936530247330666,
0.024434268474578857,
-0.037994228303432465,
-0.08103464543819427,
-0.04317009076476097,
-0.11104876548051834,
0.04138266295194626,
-0.022076841443777084,
0.15807729959487915,
-0.05086855962872505,
0.12562036514282227,
0.06309574842453003,
0.05310046672821045,
0.005774590652436018,
0.05892647057771683,
-0.06211244687438011,
0.01552885863929987,
-0.1628558188676834,
0.03892676532268524,
-0.06726814806461334,
0.029993992298841476,
-0.10123099386692047,
-0.1337135136127472,
0.013924985192716122,
-0.006570213008671999,
0.09249109774827957,
0.10028762370347977,
-0.15741044282913208,
-0.11849329620599747,
0.14908894896507263,
-0.08620920032262802,
-0.09932682663202286,
0.1306876838207245,
-0.006739520467817783,
-0.040868017822504044,
0.0422053225338459,
0.15330320596694946,
0.06146205589175224,
-0.11887858808040619,
-0.03086935356259346,
-0.03568890690803528,
0.10377376526594162,
-0.020619381219148636,
0.09774316847324371,
-0.03125930204987526,
0.051856447011232376,
0.016169780865311623,
-0.04654698818922043,
0.04265645891427994,
-0.1091674417257309,
-0.08916604518890381,
-0.03977474197745323,
-0.09933565557003021,
0.05116807296872139,
0.06687762588262558,
0.054545145481824875,
-0.0886678472161293,
-0.1259784996509552,
0.04527202621102333,
0.12587636709213257,
-0.07909443974494934,
0.032651446759700775,
-0.08332044631242752,
0.08576321601867676,
-0.04237111657857895,
-0.02442268654704094,
-0.19087950885295868,
-0.0006575818406417966,
0.02033984661102295,
-0.031643711030483246,
0.025102820247411728,
-0.02392820455133915,
0.07137062400579453,
0.0665663629770279,
-0.06450917571783066,
-0.06498310714960098,
-0.08049925416707993,
-0.021856103092432022,
-0.08386591821908951,
-0.2327786237001419,
-0.08503670990467072,
-0.009434509091079235,
0.13534489274024963,
-0.1777954399585724,
0.01575196534395218,
0.016968250274658203,
0.13190993666648865,
0.03187211975455284,
-0.032930511981248856,
-0.019177492707967758,
0.09409180283546448,
-0.0270091500133276,
-0.05208880826830864,
0.02785428613424301,
0.0018833965295925736,
-0.09294459968805313,
-0.018278131261467934,
-0.11426613479852676,
0.1511766016483307,
0.13775750994682312,
-0.03004208393394947,
-0.07288099080324173,
0.021847128868103027,
-0.08673349022865295,
-0.0511440634727478,
-0.037155453115701675,
-0.0013967478880658746,
0.1679374724626541,
0.024239571765065193,
0.1369398683309555,
-0.08284557610750198,
-0.05567321926355362,
0.04222375899553299,
0.005155615974217653,
0.004128645174205303,
0.11222542822360992,
0.08771402388811111,
-0.0029936188366264105,
0.12176170200109482,
0.0810345783829689,
-0.12047196179628372,
0.14731140434741974,
-0.07126898318529129,
-0.10465185344219208,
-0.026401907205581665,
-0.017974188551306725,
0.021084778010845184,
0.143166184425354,
-0.1371237188577652,
-0.0218557920306921,
0.028329839929938316,
-0.001805784646421671,
0.024924401193857193,
-0.22499336302280426,
-0.004899582825601101,
0.01724744401872158,
-0.06459569185972214,
-0.029286595061421394,
-0.0008671906543895602,
0.013306322507560253,
0.10676375776529312,
0.0005647827056236565,
-0.07331303507089615,
-0.003991627134382725,
-0.0004269756027497351,
-0.0609283372759819,
0.19121865928173065,
-0.07734087854623795,
-0.15183870494365692,
-0.14884519577026367,
-0.0051346877589821815,
-0.061542823910713196,
-0.008711098693311214,
0.0474795326590538,
-0.1025402843952179,
-0.02698965184390545,
-0.03643769025802612,
0.05857178196310997,
-0.02767910249531269,
0.05372651666402817,
0.014936057850718498,
0.013044648803770542,
0.07920191437005997,
-0.12136224657297134,
0.02097952924668789,
-0.05467197671532631,
-0.052663758397102356,
0.004988310392946005,
0.08568300306797028,
0.11186566948890686,
0.16462770104408264,
0.007429806981235743,
0.016489848494529724,
-0.031041894108057022,
0.16758905351161957,
-0.10448574274778366,
-0.044803865253925323,
0.13978877663612366,
0.00020874662732239813,
0.03855287283658981,
0.10683023929595947,
0.06955414265394211,
-0.06844811141490936,
-0.012196452356874943,
0.039850782603025436,
-0.020283637568354607,
-0.24214355647563934,
-0.048145841807127,
-0.043881479650735855,
-0.0009141602786257863,
0.10002298653125763,
0.03453338146209717,
0.03725586086511612,
0.03544628247618675,
-0.015162699855864048,
0.04096556082367897,
-0.04132866486907005,
0.06139008700847626,
0.09621429443359375,
0.051088377833366394,
0.1334472894668579,
-0.028075354173779488,
-0.053519200533628464,
0.026030296459794044,
-0.019056934863328934,
0.20889273285865784,
-0.02971438504755497,
0.15458665788173676,
0.04318097606301308,
0.1767600029706955,
0.015252356417477131,
0.08698002249002457,
0.010290011763572693,
-0.025095336139202118,
0.026657691225409508,
-0.06110718473792076,
-0.03114878572523594,
0.011473181657493114,
0.05369500443339348,
0.08579575270414352,
-0.1279795914888382,
-0.009680927731096745,
0.03433946520090103,
0.34376683831214905,
0.05363425239920616,
-0.31406134366989136,
-0.1191687136888504,
-0.026530735194683075,
-0.06391344964504242,
-0.028918370604515076,
0.03141196444630623,
0.11334654688835144,
-0.0912969633936882,
0.052101802080869675,
-0.06936030834913254,
0.082575224339962,
-0.058508168905973434,
0.022240985184907913,
0.09300278127193451,
0.09920769929885864,
0.007557015400379896,
0.0519079752266407,
-0.25635138154029846,
0.2916725277900696,
0.0044969613663852215,
0.08758603036403656,
-0.0521327368915081,
0.02632923424243927,
0.022349081933498383,
-0.004366385284811258,
0.05502723902463913,
-0.026309048756957054,
-0.014921761117875576,
-0.1808752864599228,
-0.08608655631542206,
0.018556656315922737,
0.13051412999629974,
-0.06119133159518242,
0.11956887692213058,
-0.026236727833747864,
-0.03088712878525257,
0.05358066409826279,
-0.08167622238397598,
-0.06571865826845169,
-0.08259912580251694,
0.023197023198008537,
0.03971206396818161,
0.030579019337892532,
-0.09225235879421234,
-0.13696862757205963,
-0.08355537801980972,
0.13229115307331085,
-0.10114038735628128,
-0.035316839814186096,
-0.12039358168840408,
0.09585283696651459,
0.16975587606430054,
-0.07468528300523758,
0.053497690707445145,
0.017863212153315544,
0.11848820000886917,
0.021457882598042488,
-0.03874494880437851,
0.08942252397537231,
-0.07953005284070969,
-0.24028462171554565,
-0.04177755489945412,
0.17625543475151062,
0.015638327226042747,
0.06868395954370499,
-0.03709237650036812,
0.03595985099673271,
-0.027898559346795082,
-0.07621607929468155,
0.03950439766049385,
0.0012828354956582189,
0.036300722509622574,
0.035358864814043045,
-0.01953004114329815,
-0.019879702478647232,
-0.06803151965141296,
-0.037930168211460114,
0.14306895434856415,
0.25812020897865295,
-0.09312139451503754,
0.022027991712093353,
0.06732096523046494,
-0.037340763956308365,
-0.16340985894203186,
0.013804771937429905,
0.11885426938533783,
0.029849277809262276,
-0.010743802413344383,
-0.19491325318813324,
0.08782053738832474,
0.08641016483306885,
-0.03272394463419914,
0.10206393897533417,
-0.3090014159679413,
-0.1451958417892456,
0.11171474307775497,
0.09607718139886856,
0.009317656978964806,
-0.15930381417274475,
-0.05689389631152153,
-0.012735524214804173,
-0.10565976053476334,
0.0893925130367279,
-0.041751787066459656,
0.12435450404882431,
-0.02247565984725952,
0.07800491899251938,
0.01829013042151928,
-0.05869164690375328,
0.1241101548075676,
0.0025815237313508987,
0.05233710631728172,
-0.004909811541438103,
-0.006835710257291794,
0.05201689526438713,
-0.03257473185658455,
0.018411967903375626,
-0.0597783662378788,
0.03658498451113701,
-0.08060725033283234,
-0.022774094715714455,
-0.1097259521484375,
0.0413680300116539,
-0.04384254291653633,
-0.046164192259311676,
-0.01660379208624363,
0.016046056523919106,
0.012490132823586464,
-0.017917491495609283,
0.14344938099384308,
0.007355326786637306,
0.17072045803070068,
0.11362279206514359,
0.08093864470720291,
-0.025546234101057053,
-0.11175508052110672,
-0.00956160482019186,
-0.02430896647274494,
0.07393277436494827,
-0.12073679268360138,
0.012719598598778248,
0.1395503133535385,
0.079571932554245,
0.1110369935631752,
0.07213345170021057,
-0.07219117879867554,
0.013640710152685642,
0.07094970345497131,
-0.14236198365688324,
-0.09497054666280746,
-0.02530207484960556,
-0.013586965389549732,
-0.13235099613666534,
0.07196362316608429,
0.10821680724620819,
-0.07030227780342102,
-0.01910383813083172,
0.011291099712252617,
0.0010674508521333337,
-0.04603377357125282,
0.24256472289562225,
0.05892918258905411,
0.0816178098320961,
-0.11892950534820557,
0.07662089914083481,
0.04393935948610306,
-0.1423557996749878,
0.01992996782064438,
0.07333649694919586,
-0.05759292468428612,
-0.009406052529811859,
0.014743577688932419,
0.08155027776956558,
-0.04295109957456589,
-0.064984530210495,
-0.157034233212471,
-0.13050688803195953,
0.08500274270772934,
0.1520197093486786,
0.06579859554767609,
0.03108217939734459,
-0.05276580527424812,
0.04840182512998581,
-0.1388225257396698,
0.10775185376405716,
0.069329634308815,
0.07933036983013153,
-0.15859703719615936,
0.17374646663665771,
0.022950896993279457,
0.036869462579488754,
-0.005359143018722534,
0.01686915010213852,
-0.09500173479318619,
0.018054837360978127,
-0.11217498034238815,
-0.035852354019880295,
-0.03095352277159691,
-0.005096007604151964,
-0.0005774250021204352,
-0.06443526595830917,
-0.06815038621425629,
0.03666723892092705,
-0.11647938936948776,
-0.03688203915953636,
0.008329958654940128,
0.029918482527136803,
-0.1320343315601349,
0.006072492804378271,
0.0354929193854332,
-0.10407929122447968,
0.09871348738670349,
0.08732568472623825,
0.029232200235128403,
0.0683663934469223,
-0.07360448688268661,
-0.016779063269495964,
0.05044853687286377,
0.0016764722531661391,
0.055217523127794266,
-0.11666596680879593,
-0.004032369703054428,
-0.02062881551682949,
0.05567106232047081,
0.0017160526476800442,
0.062311168760061264,
-0.14348076283931732,
-0.007720551453530788,
-0.012193245813250542,
-0.0457177571952343,
-0.0712449699640274,
0.033887382596731186,
0.09361325204372406,
0.030752327293157578,
0.17309878766536713,
-0.07756826281547546,
0.03788592666387558,
-0.2191273421049118,
0.01373359002172947,
-0.04808182269334793,
-0.09205757826566696,
-0.10944484174251556,
-0.015052342787384987,
0.0879940614104271,
-0.059756506234407425,
0.091079942882061,
-0.030158238485455513,
0.09057281166315079,
0.03370252996683121,
-0.04483247548341751,
-0.018166739493608475,
0.04723631218075752,
0.21186532080173492,
0.04460055008530617,
-0.021381987258791924,
0.06312786787748337,
0.01758543774485588,
0.07443218678236008,
0.13224419951438904,
0.16699577867984772,
0.12771224975585938,
0.030377265065908432,
0.09642142057418823,
0.0899091511964798,
-0.09630095213651657,
-0.1767856627702713,
0.05985574051737785,
-0.06244153156876564,
0.12386918067932129,
-0.010747452266514301,
0.22441014647483826,
0.10160726308822632,
-0.16640140116214752,
0.050541508942842484,
-0.03680618479847908,
-0.07564573734998703,
-0.10217209905385971,
-0.008942069485783577,
-0.06636641919612885,
-0.17081066966056824,
0.020030608400702477,
-0.12044123560190201,
0.039532218128442764,
0.08036026358604431,
0.025858713313937187,
0.010490712709724903,
0.15467151999473572,
0.03870734944939613,
0.00871891900897026,
0.09340779483318329,
0.03537846729159355,
-0.031235825270414352,
-0.06577131897211075,
-0.07643653452396393,
0.018924526870250702,
-0.01685560867190361,
0.05082152411341667,
-0.061170466244220734,
-0.1228109821677208,
0.06081869453191757,
0.012344326823949814,
-0.10032900422811508,
0.039087727665901184,
-0.011865147389471531,
0.09532689303159714,
0.06450267136096954,
0.01057931873947382,
0.00783687736839056,
-0.025233466178178787,
0.2515604496002197,
-0.11712466925382614,
-0.07787478715181351,
-0.12346234172582626,
0.2554333806037903,
0.018161796033382416,
-0.03022359125316143,
0.03943725675344467,
-0.08393720537424088,
-0.04197491332888603,
0.18756872415542603,
0.17910511791706085,
-0.017658820375800133,
-0.019756373018026352,
0.027638768777251244,
-0.015920307487249374,
-0.06632756441831589,
0.07034572213888168,
0.13910946249961853,
0.12276731431484222,
-0.0667039230465889,
-0.03553761541843414,
-0.05138201266527176,
-0.041001904755830765,
-0.013861318118870258,
0.08229070156812668,
0.004398362245410681,
-0.03064046800136566,
-0.03481864556670189,
0.07468212395906448,
-0.06070888787508011,
-0.162210151553154,
0.05205436795949936,
-0.22632557153701782,
-0.18789462745189667,
-0.015230557881295681,
0.1070193275809288,
0.024926960468292236,
0.06343477964401245,
0.004817890003323555,
-0.024470927193760872,
0.094630666077137,
-0.00327615556307137,
-0.0722217857837677,
-0.1092250645160675,
0.09408427029848099,
-0.09247012436389923,
0.18384307622909546,
-0.05085491016507149,
0.06962370127439499,
0.11419111490249634,
0.07448815554380417,
-0.08710632473230362,
0.031077614054083824,
0.06898244470357895,
-0.16435804963111877,
0.03070784918963909,
0.19549578428268433,
-0.028511155396699905,
0.104544997215271,
0.019855357706546783,
-0.1334671825170517,
0.0027621902991086245,
-0.07539721578359604,
-0.03660504147410393,
-0.05824713408946991,
-0.022952528670430183,
-0.04601331800222397,
0.12476713955402374,
0.2103336602449417,
-0.053114376962184906,
-0.012660945765674114,
-0.06091906875371933,
0.018461573868989944,
0.06711041927337646,
0.0813993513584137,
-0.03344989940524101,
-0.29351913928985596,
0.022807760164141655,
0.0024019493721425533,
-0.009756791405379772,
-0.2549464702606201,
-0.07659605145454407,
0.0514465793967247,
-0.07105281949043274,
-0.08294085413217545,
0.07540753483772278,
0.05879165232181549,
0.04448840394616127,
-0.04297179728746414,
-0.03026481345295906,
-0.05471920967102051,
0.17181792855262756,
-0.20653711259365082,
-0.07548081874847412
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Tamil
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Tamil using the [Common Voice](https://huggingface.co/datasets/common_voice).
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "ta", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-xlsr-53-tamil")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-xlsr-53-tamil")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Tamil test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "ta", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("anuragshas/wav2vec2-xlsr-53-tamil")
model = Wav2Vec2ForCTC.from_pretrained("anuragshas/wav2vec2-xlsr-53-tamil")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“\%\‘\”\।\’\']'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Run batched inference on the preprocessed audio arrays
# and decode the predicted token ids back to strings
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 71.87 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
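As an illustrative sketch (the exact preprocessing and training script are not shown in this card), the corresponding split selection with the `datasets` library would look like:
```python
from datasets import load_dataset

# Concatenate the Common Voice Tamil train and validation splits described above.
train_dataset = load_dataset("common_voice", "ta", split="train+validation")
```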
|
{"language": "ta", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Anurag Singh XLSR Wav2Vec2 Large 53 Tamil", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice ta", "type": "common_voice", "args": "ta"}, "metrics": [{"type": "wer", "value": 71.87, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
anuragshas/wav2vec2-xlsr-53-tamil
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"ta",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"ta"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ta #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Tamil
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Tamil using the Common Voice.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Tamil test data of Common Voice.
Test Result: 71.87 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training.
|
[
"# Wav2Vec2-Large-XLSR-53-Tamil\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Tamil using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Tamil test data of Common Voice.\n\nTest Result: 71.87 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ta #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Tamil\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Tamil using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Tamil test data of Common Voice.\n\nTest Result: 71.87 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
80,
61,
20,
28,
23
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ta #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Tamil\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Tamil using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\nThe model can be used directly (without a language model) as follows:## Evaluation\nThe model can be evaluated as follows on the Tamil test data of Common Voice.\n\nTest Result: 71.87 %## Training\nThe Common Voice 'train' and 'validation' datasets were used for training."
] |
[
-0.13683108985424042,
-0.0014030971797183156,
-0.0022794066462665796,
-0.018839672207832336,
0.11203114688396454,
-0.05953948199748993,
0.13181789219379425,
0.09365323930978775,
0.004586796276271343,
-0.0064373137429356575,
-0.005860212258994579,
0.03541452810168266,
0.06532036513090134,
0.09760569036006927,
-0.0013513392768800259,
-0.25343769788742065,
0.0016396832652390003,
0.03379352018237114,
0.08206304907798767,
0.1235998198390007,
0.09226307272911072,
-0.0692664310336113,
-0.016293207183480263,
0.09293853491544724,
-0.0965287983417511,
0.03183084353804588,
0.03596360608935356,
-0.11250181496143341,
0.13413609564304352,
0.026121526956558228,
0.05961143970489502,
0.05040543153882027,
0.09958965331315994,
-0.1644856035709381,
0.030130768194794655,
0.017324596643447876,
0.0686030164361,
0.0031736702658236027,
0.06298849731683731,
0.04359852895140648,
0.14502374827861786,
0.12062502652406693,
-0.004599190782755613,
0.06111779436469078,
-0.03396366909146309,
-0.1672152727842331,
-0.012137779034674168,
-0.01202766690403223,
0.11845237016677856,
0.1564628779888153,
-0.07974499464035034,
0.16258881986141205,
-0.09115580469369888,
0.10742346197366714,
0.061431050300598145,
-0.21030643582344055,
-0.004326554015278816,
0.10581942647695541,
0.08600956946611404,
0.11098930239677429,
-0.05055819824337959,
0.023235609754920006,
0.023050552234053612,
0.06159290298819542,
0.022731591016054153,
-0.02855013683438301,
-0.1647547334432602,
-0.02185613103210926,
-0.1336946338415146,
0.029562506824731827,
0.1860332041978836,
-0.04845026507973671,
-0.05871817469596863,
-0.09930547326803207,
-0.04670459404587746,
-0.04410635679960251,
-0.008663319982588291,
-0.0939851626753807,
-0.007941355928778648,
0.05073465406894684,
0.04872165620326996,
-0.007504342123866081,
-0.10728265345096588,
-0.11168906092643738,
0.011306782253086567,
0.04560432210564613,
0.038102198392152786,
0.01323096826672554,
-0.147343710064888,
0.036708153784275055,
-0.07542934268712997,
-0.05365820974111557,
-0.022558122873306274,
0.04326242953538895,
-0.07396894693374634,
0.023776886984705925,
-0.0696081668138504,
-0.21437163650989532,
-0.005009644664824009,
-0.09212592989206314,
0.0139533169567585,
0.019131362438201904,
-0.034199681133031845,
0.03971180319786072,
0.02045554295182228,
0.14917390048503876,
-0.12022900581359863,
0.021634541451931,
0.06025693938136101,
0.009813862852752209,
0.005264671519398689,
-0.031848564743995667,
-0.04424306005239487,
-0.074107825756073,
0.048756085336208344,
0.05451245605945587,
-0.09244870394468307,
0.023201117292046547,
-0.03135491907596588,
-0.062347978353500366,
0.009761953726410866,
-0.11088692396879196,
-0.05255439877510071,
0.025566518306732178,
0.023860333487391472,
0.14055104553699493,
0.06389623880386353,
0.04159769415855408,
-0.07202479988336563,
-0.06418013572692871,
-0.01952475495636463,
0.061728913336992264,
-0.04730299487709999,
-0.09792063385248184,
-0.007425509858876467,
-0.07719751447439194,
-0.045178163796663284,
-0.08691719174385071,
-0.11025464534759521,
-0.045547980815172195,
-0.04542486369609833,
0.01080416887998581,
-0.061475805938243866,
-0.08551828563213348,
-0.01122299674898386,
-0.05229257419705391,
-0.0619572289288044,
0.003499247133731842,
-0.01657850854098797,
0.06563619524240494,
0.05042481794953346,
0.07700373977422714,
0.06119101122021675,
0.08940116316080093,
-0.07706980407238007,
-0.02028944157063961,
0.0419907346367836,
0.1173333004117012,
-0.048687729984521866,
-0.040855467319488525,
-0.06804010272026062,
-0.07599293440580368,
-0.0312635563313961,
0.04563865438103676,
0.05770878121256828,
0.07903420180082321,
-0.2609935700893402,
-0.07846125215291977,
0.17273004353046417,
-0.15047994256019592,
-0.06915107369422913,
0.18702296912670135,
0.019466234371066093,
0.12514044344425201,
0.11140063405036926,
0.18848565220832825,
0.09101487696170807,
-0.1500273197889328,
0.03264422342181206,
-0.0055735413916409016,
-0.012625996954739094,
-0.06340209394693375,
0.06412875652313232,
-0.01411503180861473,
0.0009703113464638591,
0.033390242606401443,
-0.04778125137090683,
0.06654791533946991,
-0.0562482587993145,
-0.052216675132513046,
-0.022634224966168404,
-0.10186148434877396,
0.039739225059747696,
0.02987046167254448,
0.043815840035676956,
0.009670971892774105,
-0.03265023231506348,
0.032980818301439285,
0.13778044283390045,
-0.1335466355085373,
0.06319133192300797,
-0.18355382978916168,
0.10041740536689758,
-0.11312341690063477,
-0.008709256537258625,
-0.15510015189647675,
0.1567341536283493,
0.027474334463477135,
0.048495713621377945,
0.07113269716501236,
0.1229533776640892,
0.005661144852638245,
0.006682342384010553,
-0.030872540548443794,
-0.007441870868206024,
0.04883310943841934,
-0.011516260914504528,
-0.05481262877583504,
-0.056562718003988266,
-0.015658259391784668,
-0.05012836307287216,
0.09239320456981659,
-0.15746857225894928,
0.002408981788903475,
0.0379861518740654,
0.013880467042326927,
0.008603652007877827,
-0.01745561696588993,
0.10761580616235733,
0.1092633605003357,
0.027936195954680443,
-0.008017068728804588,
0.043207526206970215,
0.005421589128673077,
-0.05577680841088295,
0.1309550255537033,
-0.13912363350391388,
0.02479199878871441,
0.058169953525066376,
-0.06324388086795807,
0.015195813030004501,
0.13710837066173553,
0.0013149076839908957,
-0.0271807461977005,
-0.07723718881607056,
0.04467124119400978,
0.2853994369506836,
0.03458894044160843,
0.1288330852985382,
-0.06479047983884811,
0.03564830124378204,
0.029799165204167366,
-0.08698859810829163,
0.060325752943754196,
0.025373315438628197,
0.0825042873620987,
-0.0003547036030795425,
0.007127450313419104,
-0.0948771983385086,
-0.1145441085100174,
0.1884090155363083,
-0.016970030963420868,
-0.08313722163438797,
0.019571324810385704,
-0.06961407512426376,
-0.03495657444000244,
0.011524020694196224,
-0.20739367604255676,
-0.03932122513651848,
0.025624867528676987,
0.03363616764545441,
0.07296542078256607,
-0.13173629343509674,
0.011819738894701004,
0.01762435957789421,
-0.0864248052239418,
-0.17561961710453033,
0.1263493150472641,
-0.057508401572704315,
0.028836717829108238,
-0.1185443103313446,
-0.029820019379258156,
-0.010007491335272789,
-0.02780057303607464,
-0.18632692098617554,
0.12178774923086166,
-0.060750000178813934,
-0.2893664836883545,
-0.14767125248908997,
-0.005119198467582464,
0.03218608722090721,
-0.016213124617934227,
0.06458975374698639,
-0.13525541126728058,
-0.04934512451291084,
-0.03670278191566467,
0.09417178481817245,
0.016161249950528145,
-0.031941015273332596,
-0.07209982722997665,
-0.024670425802469254,
0.05602904409170151,
-0.1532149314880371,
0.019746985286474228,
-0.027000470086932182,
-0.11470814794301987,
0.025806279852986336,
-0.028230955824255943,
-0.02316362038254738,
0.20675261318683624,
-0.0034994904417544603,
0.031617987900972366,
0.004690973088145256,
0.17159414291381836,
-0.08230514824390411,
-0.005803709849715233,
0.23404954373836517,
0.025808190926909447,
-0.019157137721776962,
0.11409742385149002,
0.009272300638258457,
-0.06414733082056046,
-0.0004583722329698503,
-0.01314836647361517,
-0.054443005472421646,
-0.2660786211490631,
-0.11295751482248306,
-0.04979412630200386,
-0.046704307198524475,
-0.07663249224424362,
-0.0050147227011621,
0.06774124503135681,
0.02881319262087345,
0.013594428077340126,
-0.06787806004285812,
0.027029410004615784,
-0.012840262614190578,
0.15030086040496826,
-0.009465482085943222,
0.11415495723485947,
-0.053701527416706085,
0.0072649745270609856,
-0.004404204431921244,
0.005348522216081619,
0.173641636967659,
0.06764663010835648,
0.06665457040071487,
0.09063002467155457,
0.1310199797153473,
0.14716720581054688,
0.06554314494132996,
-0.0944184958934784,
-0.022910218685865402,
0.02598363347351551,
-0.02396318130195141,
-0.04085927456617355,
0.02490311488509178,
0.12416787445545197,
-0.047642406076192856,
-0.0007025547092780471,
0.01679091714322567,
-0.012651802971959114,
0.18980184197425842,
0.04865100607275963,
-0.1855185627937317,
-0.08607765287160873,
-0.023568926379084587,
-0.08382979780435562,
-0.010025404393672943,
0.055534202605485916,
0.11671288311481476,
-0.11377031356096268,
0.022625088691711426,
-0.02548392117023468,
0.09243910014629364,
-0.008687026798725128,
0.022354653105139732,
-0.09872637689113617,
0.012128658592700958,
0.01746375486254692,
0.07663380354642868,
-0.19528953731060028,
0.22319111227989197,
0.001500260317698121,
0.09006913006305695,
0.005651824176311493,
-0.015192379243671894,
0.006023071706295013,
0.1837533712387085,
0.06032523885369301,
0.015367774292826653,
0.11620836704969406,
-0.1175750344991684,
-0.04173756018280983,
0.06373513489961624,
-0.024801580235362053,
0.06762432307004929,
0.07434501498937607,
-0.002695006551221013,
0.009401122108101845,
0.003024621633812785,
-0.0312237236648798,
-0.11470299959182739,
-0.011629151180386543,
0.05478320270776749,
0.1478794366121292,
0.11906364560127258,
-0.03882673755288124,
-0.13872183859348297,
-0.13473758101463318,
0.05132179334759712,
-0.07042280584573746,
-0.06639166921377182,
-0.025652777403593063,
-0.03553717955946922,
0.10743765532970428,
-0.07148418575525284,
-0.0026651760563254356,
0.08625751733779907,
0.11709053814411163,
-0.02145061455667019,
0.014517088420689106,
0.044742703437805176,
-0.05998053774237633,
-0.0767943486571312,
0.016681276261806488,
0.14300808310508728,
0.07486965507268906,
0.0475071482360363,
0.044157758355140686,
0.010198107920587063,
0.016675982624292374,
-0.02673620916903019,
-0.024942981079220772,
0.15837924182415009,
-0.19016166031360626,
-0.005315739195793867,
0.00644331332296133,
-0.1601622998714447,
-0.13392291963100433,
-0.0876941978931427,
0.14705152809619904,
0.04274974390864372,
-0.04001741483807564,
0.1673707664012909,
0.19268274307250977,
-0.10120856761932373,
-0.16723592579364777,
-0.04419723153114319,
0.07528351992368698,
0.14198310673236847,
0.02207370661199093,
-0.16756397485733032,
0.08938587456941605,
-0.02332185208797455,
-0.03745908662676811,
-0.10752199590206146,
-0.21349787712097168,
-0.15068629384040833,
0.14040294289588928,
0.013295101001858711,
0.1253080517053604,
-0.0064650168642401695,
-0.017542921006679535,
-0.004380916245281696,
-0.004017574246972799,
-0.039080534130334854,
-0.05220092833042145,
0.10614525526762009,
0.006693424191325903,
0.12245859205722809,
0.045413825660943985,
-0.00994962640106678,
0.06854739040136337,
0.05403020977973938,
-0.03030463308095932,
0.005031053442507982,
0.04388713464140892,
0.04872221499681473,
0.0661206841468811,
0.16861790418624878,
-0.023009277880191803,
0.021867208182811737,
-0.06935832649469376,
-0.11365693807601929,
-0.07027743756771088,
0.02590242773294449,
0.0627126470208168,
-0.032750122249126434,
0.03449484705924988,
-0.01729380525648594,
-0.0015417429385706782,
0.0007080947398208082,
-0.0822802260518074,
-0.1351974457502365,
0.05925166606903076,
0.13094119727611542,
0.22736306488513947,
-0.11011975258588791,
-0.054314397275447845,
-0.05428393557667732,
-0.021012399345636368,
0.09492971003055573,
-0.05050636827945709,
0.006745901890099049,
0.05499011650681496,
0.041026074439287186,
0.13478730618953705,
0.0039052350912243128,
-0.08833831548690796,
0.09948316961526871,
0.03033612109720707,
0.033263884484767914,
-0.18480800092220306,
-0.019420791417360306,
0.008257091045379639,
-0.025400618091225624,
0.06619628518819809,
0.1016104444861412,
-0.1218661442399025,
-0.039735130965709686,
-0.025634316727519035,
0.030122937634587288,
-0.11874998360872269,
0.2040126770734787,
0.0036880020052194595,
0.07066990435123444,
-0.09598848968744278,
-0.00589530635625124,
0.01601092331111431,
-0.05514775589108467,
-0.0004850503464695066,
0.02612113207578659,
-0.1112636998295784,
-0.07146397233009338,
-0.057421986013650894,
0.05042143166065216,
0.05480167269706726,
-0.09191560745239258,
-0.06347645819187164,
-0.09105661511421204,
-0.00796461571007967,
0.13380520045757294,
0.0460897721350193,
0.026491643860936165,
-0.18876567482948303,
-0.07285386323928833,
-0.09943308681249619,
0.05242007225751877,
0.10969807952642441,
-0.021905843168497086,
-0.16268855333328247,
0.19585293531417847,
0.107972152531147,
0.05167469382286072,
-0.055392391979694366,
-0.09419811517000198,
0.06800075620412827,
0.083897665143013,
-0.08449050039052963,
-0.003887460334226489,
-0.05162707343697548,
-0.00529870530590415,
-0.007172783371061087,
-0.08342602849006653,
-0.020577706396579742,
0.07540042698383331,
-0.08101619780063629,
0.07024974375963211,
-0.020299874246120453,
0.07683058828115463,
-0.04134539142251015,
0.013898571953177452,
0.05580466240644455,
-0.03836066275835037,
0.0839836448431015,
0.09839232265949249,
-0.13102276623249054,
0.14036819338798523,
-0.13844335079193115,
-0.034648552536964417,
0.05371176823973656,
0.0876953974366188,
-0.014601261354982853,
-0.14325396716594696,
0.03642963245511055,
0.09047503769397736,
0.05205822363495827,
0.005986224394291639,
0.08800279349088669,
-0.048815712332725525,
-0.029466887935996056,
-0.09610196202993393,
0.010979048907756805,
-0.04363046586513519,
0.013579042628407478,
0.01423650048673153,
0.1632242351770401,
0.1649336963891983,
-0.10494094341993332,
0.09685233235359192,
-0.11839855462312698,
0.025323908776044846,
-0.08092803508043289,
0.0005949651822447777,
-0.1381264328956604,
-0.08650536835193634,
0.06841418147087097,
-0.060292474925518036,
0.10296724736690521,
0.019331185147166252,
0.04348950833082199,
-0.030633410438895226,
-0.05539290979504585,
0.007949214428663254,
-0.033613305538892746,
0.24786487221717834,
0.08763235807418823,
0.052068326622247696,
0.006371634546667337,
-0.028636327013373375,
-0.03965594992041588,
0.1381172239780426,
-0.06431202590465546,
0.09974998235702515,
-0.01520769577473402,
0.055842019617557526,
0.1026586964726448,
-0.02560505084693432,
-0.07434624433517456,
-0.06554821878671646,
-0.16804878413677216,
0.005942033138126135,
-0.051696259528398514,
0.2140641212463379,
0.19282211363315582,
-0.05732208490371704,
0.09316500276327133,
-0.0017765137599781156,
-0.10020814836025238,
-0.14929510653018951,
-0.13314343988895416,
-0.08744124323129654,
-0.16123118996620178,
0.03968190774321556,
-0.0913890153169632,
0.01840740628540516,
0.06124081835150719,
0.0590798482298851,
-0.042984627187252045,
0.1911187767982483,
-0.0018377790693193674,
-0.08765114098787308,
0.047454867511987686,
-0.08071426302194595,
-0.01991564966738224,
-0.0897335335612297,
0.07066577672958374,
0.1173480674624443,
-0.0014087761519476771,
0.060164857655763626,
-0.005892920307815075,
-0.08184915781021118,
0.013802849687635899,
-0.07247965037822723,
-0.07317192852497101,
-0.017617572098970413,
-0.03585582226514816,
0.03612138330936432,
0.13963459432125092,
0.0978754386305809,
-0.06085708364844322,
-0.009071853943169117,
0.09332114458084106,
-0.02979878894984722,
-0.1696639358997345,
-0.14468792080879211,
0.21121692657470703,
0.05048613250255585,
-0.0009650111896917224,
0.02367248199880123,
-0.0082174651324749,
-0.024713370949029922,
0.2292081117630005,
0.1784186065196991,
0.041402943432331085,
0.020325804129242897,
-0.01219349168241024,
-0.0073009985499084,
-0.09613929688930511,
0.05939911678433418,
0.07154576480388641,
0.20700007677078247,
-0.03622926026582718,
0.035999178886413574,
-0.14611542224884033,
-0.06869539618492126,
-0.013159545138478279,
-0.018932009115815163,
-0.03533526137471199,
-0.05682438239455223,
-0.008819087408483028,
0.1532035619020462,
-0.0780554711818695,
-0.037360209971666336,
-0.15601208806037903,
-0.07871013134717941,
-0.08791247010231018,
-0.04439915716648102,
-0.016490228474140167,
0.0749407634139061,
0.014688435941934586,
-0.06663665920495987,
0.038184940814971924,
0.17406705021858215,
0.022400878369808197,
-0.04515616223216057,
-0.031909309327602386,
0.1095864549279213,
-0.09016063064336777,
-0.059865787625312805,
-0.006308102514594793,
0.17277956008911133,
0.0026516770012676716,
0.12253082543611526,
0.013892574235796928,
0.2159270942211151,
-0.03894372284412384,
-0.026456810534000397,
0.0033521754667162895,
0.1618874967098236,
-0.018385689705610275,
0.11799405515193939,
-0.005974540952593088,
-0.12729699909687042,
0.050198063254356384,
-0.16082382202148438,
-0.009085720404982567,
-0.06515182554721832,
0.06446874886751175,
-0.024092959240078926,
0.0759284719824791,
0.07837095856666565,
-0.06641337275505066,
-0.06784628331661224,
-0.05852567031979561,
0.03491251543164253,
0.004095715936273336,
-0.0712934285402298,
-0.0458805188536644,
-0.2601090669631958,
-0.012688851915299892,
-0.13082681596279144,
-0.012714201584458351,
-0.19341036677360535,
-0.031100792810320854,
-0.0017426650738343596,
-0.08202963322401047,
-0.003957531880587339,
0.0498199425637722,
0.07377403229475021,
0.022245384752750397,
0.008505858480930328,
-0.013124825432896614,
0.06665685027837753,
0.12788249552249908,
-0.20681491494178772,
-0.12563861906528473
] |
null | null |
transformers
|
# Chandler DialoGPT Model
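The card gives no usage instructions; a minimal, hypothetical chat sketch following the usual DialoGPT pattern (untested, with an arbitrary prompt) might look like:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("anweasha/DialoGPT-small-Chandler")
model = AutoModelForCausalLM.from_pretrained("anweasha/DialoGPT-small-Chandler")

# Encode one user turn, generate a reply, and decode only the new tokens.
input_ids = tokenizer.encode("Hi, how are you?" + tokenizer.eos_token, return_tensors="pt")
reply_ids = model.generate(input_ids, max_length=100, pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(reply_ids[:, input_ids.shape[-1]:][0], skip_special_tokens=True))
```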
|
{"tags": ["conversational"]}
|
text-generation
|
anweasha/DialoGPT-small-Chandler
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Chandler DialoGPT Model
|
[
"# Chandler DialoGPT Model"
] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Chandler DialoGPT Model"
] |
[
51,
8
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Chandler DialoGPT Model"
] |
[
-0.044611480087041855,
0.0315445177257061,
-0.005560820922255516,
0.04228443279862404,
0.12489386647939682,
0.00493934890255332,
0.13434600830078125,
0.1338980793952942,
-0.02929154597222805,
-0.05327439308166504,
0.14371798932552338,
0.18761952221393585,
-0.01019383780658245,
0.0793662965297699,
-0.10539109259843826,
-0.29441753029823303,
0.018846698105335236,
0.0468362495303154,
0.07384052872657776,
0.12702129781246185,
0.07680533826351166,
-0.049259983003139496,
0.09140196442604065,
0.025264369323849678,
-0.14971868693828583,
-0.020758705213665962,
0.03576388955116272,
-0.12877678871154785,
0.10609127581119537,
0.06340096145868301,
0.03559238091111183,
0.02441822923719883,
-0.047318778932094574,
-0.15745189785957336,
0.03196026757359505,
-0.006451561581343412,
-0.02037309668958187,
0.05627863109111786,
0.04568229988217354,
-0.0922778844833374,
0.09218551963567734,
0.09571465104818344,
0.015516651794314384,
0.04508548974990845,
-0.160503089427948,
-0.05096049979329109,
-0.01312166266143322,
0.07191362977027893,
0.06908290088176727,
0.11428551375865936,
-0.04206296429038048,
0.09154810756444931,
-0.07859750837087631,
0.11826197803020477,
0.12923605740070343,
-0.3523136377334595,
-0.026157565414905548,
0.10238205641508102,
0.009264817461371422,
0.05047241970896721,
-0.0488852933049202,
0.0765325054526329,
0.005785932298749685,
-0.00428801542147994,
-0.033610980957746506,
-0.08051761984825134,
-0.11491749435663223,
0.009757678024470806,
-0.09534357488155365,
-0.027092615142464638,
0.27930402755737305,
-0.046633146703243256,
0.054819803684949875,
-0.08465726673603058,
-0.09756878763437271,
-0.02016308531165123,
-0.04534795507788658,
-0.02034779265522957,
-0.07117890566587448,
0.06532187759876251,
-0.0033597052097320557,
-0.08562711626291275,
-0.12049531191587448,
-0.02846037968993187,
-0.183100625872612,
0.18328815698623657,
0.03912739455699921,
0.054469507187604904,
-0.2354099452495575,
0.11332835257053375,
0.02211632952094078,
-0.09558473527431488,
0.002810858655720949,
-0.11014363169670105,
0.012991856783628464,
0.025595787912607193,
-0.03230883553624153,
0.01376393623650074,
0.04787176102399826,
0.12540724873542786,
0.015433629974722862,
0.038148894906044006,
-0.03160388767719269,
0.050533369183540344,
0.02588603086769581,
0.09583059698343277,
0.03338782861828804,
-0.10680670291185379,
0.0059699974954128265,
-0.10323040187358856,
-0.007243498228490353,
-0.06517516076564789,
-0.17545896768569946,
-0.03875653073191643,
0.0532958097755909,
0.04888966679573059,
0.062111251056194305,
0.12273696064949036,
-0.017127444967627525,
-0.06775972992181778,
0.03319169208407402,
-0.010403350926935673,
-0.02056162804365158,
0.008737103082239628,
-0.019183320924639702,
0.16922956705093384,
0.032695669680833817,
0.03323302045464516,
-0.1131381019949913,
0.01812756061553955,
-0.03520306199789047,
0.004695463925600052,
-0.008013002574443817,
-0.038754161447286606,
-0.013410542160272598,
-0.01447074580937624,
0.008215186186134815,
-0.1534334123134613,
-0.17147643864154816,
0.0033550236839801073,
-0.005128337536007166,
-0.05216587334871292,
-0.08310703188180923,
-0.10234798491001129,
-0.02175336889922619,
0.04204048961400986,
-0.06932944059371948,
0.011479453183710575,
-0.05803582817316055,
0.06917276978492737,
-0.03256935998797417,
0.09940222650766373,
-0.0848747119307518,
0.08098890632390976,
-0.07494769245386124,
-0.022285645827651024,
-0.10211723297834396,
0.13528595864772797,
0.021724138408899307,
0.0662761852145195,
-0.028380136936903,
-0.033989641815423965,
-0.10954859107732773,
0.040269311517477036,
-0.04869022220373154,
0.25374266505241394,
-0.08330520242452621,
-0.08909041434526443,
0.24897409975528717,
-0.04955193027853966,
-0.12115153670310974,
0.13589468598365784,
-0.021974658593535423,
0.08434942364692688,
0.12698224186897278,
0.19792398810386658,
0.021974846720695496,
0.0036410056054592133,
0.08756887167692184,
0.10722936689853668,
-0.10194451361894608,
-0.014422794803977013,
0.01533852331340313,
-0.03338419646024704,
-0.08254923671483994,
0.03769031912088394,
0.0643942579627037,
0.04265284165740013,
-0.057624995708465576,
-0.025920571759343147,
-0.005952124949544668,
-0.0074908840470016,
0.1203804612159729,
-0.040005072951316833,
0.1376066356897354,
-0.034105073660612106,
-0.05474892631173134,
-0.04524580016732216,
0.006195399444550276,
-0.032368797808885574,
0.049741555005311966,
-0.0702342838048935,
0.07747253775596619,
0.002022132044658065,
0.05984021723270416,
-0.13942348957061768,
-0.06451957672834396,
-0.040202271193265915,
0.19491137564182281,
0.053494423627853394,
0.0755859911441803,
0.0619724877178669,
-0.03287003934383392,
-0.01820644922554493,
0.021185822784900665,
0.18779590725898743,
-0.01426000241190195,
-0.09912917762994766,
-0.10162957012653351,
0.06582698971033096,
-0.04756476357579231,
0.1517457216978073,
-0.08715691417455673,
0.011004955507814884,
-0.008733727969229221,
0.08700469881296158,
-0.003958455286920071,
0.027883809059858322,
-0.008115815930068493,
-0.024761684238910675,
-0.049306243658065796,
0.007202398497611284,
0.10100432485342026,
-0.015586268156766891,
-0.09965263307094574,
0.20961566269397736,
-0.15640957653522491,
0.13544824719429016,
0.19292491674423218,
-0.2811583876609802,
-0.00031768163898959756,
-0.15844494104385376,
-0.028797848150134087,
0.008116680197417736,
0.05240119621157646,
-0.035995058715343475,
0.2573416233062744,
-0.020202742889523506,
0.16311879456043243,
-0.035251978784799576,
-0.0404314324259758,
-0.0381833054125309,
-0.06104670837521553,
0.021205706521868706,
0.08837462961673737,
0.05742167308926582,
-0.17101414501667023,
0.14822545647621155,
0.1164931207895279,
0.0805538222193718,
0.16094693541526794,
0.03994326665997505,
-0.00003210940121789463,
0.05313803628087044,
0.008873874321579933,
-0.03646830841898918,
-0.05761640891432762,
-0.2704146206378937,
-0.05290856957435608,
0.08310755342245102,
0.0363050252199173,
0.1003812775015831,
-0.1252482384443283,
-0.029439350590109825,
0.006730454508215189,
-0.017115630209445953,
-0.008278305642306805,
0.09207688271999359,
0.036479540169239044,
0.10850713402032852,
-0.014431359246373177,
-0.05513739958405495,
0.07706630975008011,
0.009020415134727955,
-0.07790632545948029,
0.16846665740013123,
-0.1182202696800232,
-0.31556594371795654,
-0.12196700274944305,
-0.19279833137989044,
-0.07274352014064789,
0.03769182041287422,
0.10042703151702881,
-0.11265192180871964,
-0.006832989398390055,
-0.008100583218038082,
0.10278134047985077,
-0.10690686106681824,
0.02011384256184101,
-0.06254259496927261,
-0.01152465958148241,
-0.13403725624084473,
-0.09206237643957138,
-0.05641068145632744,
-0.07317224889993668,
-0.04284730926156044,
0.11868790537118912,
-0.1569552719593048,
0.02409280464053154,
0.2190261334180832,
0.06253133714199066,
0.06461415439844131,
-0.04358236491680145,
0.1851961612701416,
-0.1076943501830101,
-0.007826779969036579,
0.20348581671714783,
-0.04429755359888077,
0.05622606351971626,
0.09994857758283615,
-0.00897175446152687,
-0.08196565508842468,
0.028696401044726372,
-0.0382319912314415,
-0.08107434958219528,
-0.22552567720413208,
-0.12762558460235596,
-0.12754274904727936,
0.10638629645109177,
0.046774741262197495,
0.03528175503015518,
0.1606324464082718,
0.06326503306627274,
-0.026072977110743523,
-0.005901490803807974,
0.07847917824983597,
0.08405118435621262,
0.303406298160553,
-0.07001807540655136,
0.1664048135280609,
-0.00932023860514164,
-0.1340690553188324,
0.058715566992759705,
0.07030719518661499,
0.1510382443666458,
0.052679963409900665,
0.04339192435145378,
0.01134695578366518,
0.0909958928823471,
0.1346767693758011,
0.06370080262422562,
0.046675506979227066,
-0.028594909235835075,
-0.029491383582353592,
-0.0311463363468647,
-0.043132223188877106,
0.05555414780974388,
0.08958519995212555,
-0.1605996936559677,
-0.02584683708846569,
-0.0269499309360981,
0.05564658343791962,
0.04765012860298157,
0.08731298893690109,
-0.19182005524635315,
-0.011031215079128742,
0.06286640465259552,
-0.042528942227363586,
-0.10383396595716476,
0.08845986425876617,
-0.003212686162441969,
-0.15143680572509766,
0.04774240404367447,
-0.01445833221077919,
0.11859295517206192,
-0.10980881750583649,
0.07269527018070221,
-0.10590273141860962,
-0.06606465578079224,
0.007279694080352783,
0.08078320324420929,
-0.2616187334060669,
0.1984798163175583,
-0.004450215958058834,
-0.06063593178987503,
-0.12612178921699524,
-0.009497003629803658,
0.04355406016111374,
0.0845070332288742,
0.11920888721942902,
-0.019637489691376686,
0.0460757277905941,
0.002372296527028084,
-0.05416495352983475,
0.014498633332550526,
0.12449967861175537,
-0.03240517899394035,
-0.008027579635381699,
-0.03579777851700783,
-0.00930437259376049,
0.001862571807578206,
-0.029450079426169395,
0.015358386561274529,
-0.18453431129455566,
0.056158144026994705,
0.06637227535247803,
0.0437210276722908,
0.030345305800437927,
-0.03634258732199669,
-0.09173119813203812,
0.18710990250110626,
0.0494980625808239,
-0.09402863681316376,
-0.0901416763663292,
-0.05234992876648903,
0.056546565145254135,
-0.06378169357776642,
0.033980172127485275,
-0.042306799441576004,
0.0001941994996741414,
-0.04916508123278618,
-0.1961570382118225,
0.12928855419158936,
-0.09581227600574493,
-0.05223754793405533,
-0.026610692963004112,
0.2242550402879715,
-0.020361395552754402,
0.022509267553687096,
0.04203411191701889,
0.016004087403416634,
-0.12018481642007828,
-0.10155246406793594,
-0.012000172398984432,
0.030777113512158394,
-0.03462472930550575,
0.07417196035385132,
-0.003241183701902628,
-0.05924694240093231,
-0.07763101905584335,
0.01348524633795023,
0.32158100605010986,
0.13365311920642853,
-0.0591418482363224,
0.1638583242893219,
0.05693988874554634,
-0.05493360385298729,
-0.2669195830821991,
-0.0708758607506752,
-0.0799207091331482,
-0.05543573945760727,
-0.06615447998046875,
-0.14378178119659424,
0.10937413573265076,
-0.009857891127467155,
-0.017948942258954048,
0.12052147090435028,
-0.28885525465011597,
-0.11492717266082764,
0.19389860332012177,
-0.01558854803442955,
0.3999457359313965,
-0.09522289782762527,
-0.09088258445262909,
-0.03366932272911072,
-0.1651950627565384,
0.14081460237503052,
-0.04367240145802498,
0.1301562786102295,
-0.013303065672516823,
0.1287260800600052,
0.05159417912364006,
-0.0003438885323703289,
0.08449380844831467,
0.06636849045753479,
-0.05567344278097153,
-0.08971109986305237,
-0.04504431411623955,
0.01718764379620552,
0.026899509131908417,
0.026476731523871422,
-0.03595877066254616,
0.021528474986553192,
-0.13449092209339142,
-0.059636227786540985,
-0.08769020438194275,
0.02024969831109047,
0.017231756821274757,
-0.06862757354974747,
0.002774191787466407,
-0.059770915657281876,
0.012105441652238369,
0.018200673162937164,
0.12667253613471985,
-0.09409339725971222,
0.13496261835098267,
0.16400444507598877,
0.13608130812644958,
-0.13370312750339508,
-0.052003178745508194,
-0.058955635875463486,
-0.054508011788129807,
0.0554894283413887,
-0.09160999953746796,
0.006243130192160606,
0.10636885464191437,
-0.016866065561771393,
0.07510410249233246,
0.11031540483236313,
0.017621271312236786,
0.00868665985763073,
0.07702646404504776,
-0.2188398241996765,
-0.1174205020070076,
-0.07860799878835678,
-0.06412634998559952,
0.06859022378921509,
0.10496335476636887,
0.20258629322052002,
-0.012743251398205757,
-0.012743153609335423,
0.002598022110760212,
0.012502240017056465,
-0.04723316431045532,
0.08894800394773483,
0.00466972915455699,
0.014052937738597393,
-0.15358248353004456,
0.051575351506471634,
-0.0011544539593160152,
-0.04956921190023422,
-0.0005615788977593184,
0.14326389133930206,
-0.1012938916683197,
-0.1160774752497673,
-0.12009799480438232,
0.10033223032951355,
-0.13429564237594604,
-0.008974811062216759,
-0.03277602046728134,
-0.1691877543926239,
0.06553567945957184,
0.05062214285135269,
0.06185992434620857,
0.06775997579097748,
-0.11132416874170303,
-0.0181229617446661,
-0.022584842517971992,
-0.002656108234077692,
0.05910104140639305,
0.0005184608162380755,
-0.056554216891527176,
0.07047811895608902,
-0.023784292861819267,
0.11139021813869476,
-0.08770079910755157,
-0.11063696444034576,
-0.12926334142684937,
0.05249744653701782,
-0.09572409838438034,
-0.08283285796642303,
-0.1116529032588005,
-0.023893125355243683,
-0.009792041033506393,
-0.01396164670586586,
-0.03579282388091087,
-0.03295738995075226,
-0.09940018504858017,
0.03490971773862839,
-0.026487130671739578,
0.01901520974934101,
-0.07139107584953308,
0.01997843198478222,
0.04767464101314545,
-0.03971276804804802,
0.15123727917671204,
0.15079237520694733,
-0.12583628296852112,
0.09245774149894714,
-0.09738273173570633,
-0.09092724323272705,
0.0971439927816391,
0.03067500703036785,
0.0638575553894043,
0.05795152857899666,
-0.00032441920484416187,
0.05534588173031807,
0.047813817858695984,
0.06039471551775932,
0.06882839649915695,
-0.08676242083311081,
0.01480532344430685,
-0.0540461540222168,
-0.12818533182144165,
-0.04614298790693283,
-0.03413337469100952,
0.02511375956237316,
0.04254793003201485,
0.09089814126491547,
-0.04587898775935173,
0.09289617091417313,
-0.07087661325931549,
0.029757708311080933,
0.020734550431370735,
-0.17742618918418884,
0.0732092335820198,
-0.09779876470565796,
0.0622728057205677,
0.0190610121935606,
0.21492142975330353,
0.005269190762192011,
0.004683709237724543,
0.021344734355807304,
0.09942890703678131,
0.012581584975123405,
-0.009869284927845001,
0.16815268993377686,
0.11447896808385849,
-0.06242408975958824,
-0.06979253888130188,
0.10934986174106598,
0.05783287435770035,
0.08360836654901505,
0.16267715394496918,
0.012538542971014977,
-0.01613072119653225,
0.09784863889217377,
0.018187368288636208,
0.028353719040751457,
-0.14646485447883606,
-0.13578684628009796,
-0.03923896700143814,
0.07514055073261261,
-0.06215899437665939,
0.07973640412092209,
0.12291044741868973,
-0.03746381029486656,
0.024005182087421417,
-0.0048560816794633865,
-0.07901354134082794,
-0.1763230413198471,
-0.16644328832626343,
-0.06785402446985245,
-0.15568985044956207,
-0.007718607317656279,
-0.11617154628038406,
0.030993683263659477,
0.05343346670269966,
0.0926101878285408,
-0.05637567117810249,
0.10951068252325058,
0.008539235219359398,
-0.10561323910951614,
0.10526464879512787,
-0.042120449244976044,
0.1025790423154831,
-0.06441912055015564,
-0.019157998263835907,
-0.092731773853302,
0.023332273587584496,
0.0077583398669958115,
0.029159214347600937,
-0.06540998816490173,
0.017564401030540466,
-0.10133769363164902,
-0.08657243847846985,
-0.05907424911856651,
0.07037096470594406,
0.017433352768421173,
0.18092939257621765,
0.02407417632639408,
-0.0194387249648571,
0.0178558100014925,
0.24188029766082764,
-0.09222125262022018,
-0.10204783082008362,
-0.06521913409233093,
0.19523414969444275,
0.018387172371149063,
0.07553433626890182,
-0.010158064775168896,
0.004680496174842119,
-0.08493339270353317,
0.33828359842300415,
0.3367950916290283,
-0.1136968582868576,
0.018909847363829613,
0.027877477928996086,
0.03654034063220024,
0.12092599272727966,
0.10351867973804474,
0.11359760910272598,
0.3200874328613281,
-0.04729609563946724,
-0.0156509168446064,
0.004282835870981216,
-0.03336801752448082,
-0.06214703619480133,
0.06873303651809692,
0.059900954365730286,
-0.07270722836256027,
-0.024241827428340912,
0.12416277080774307,
-0.2837981581687927,
0.10216391831636429,
-0.10516249388456345,
-0.2107369303703308,
-0.09147375077009201,
-0.01112385094165802,
0.12979647517204285,
0.04399798810482025,
0.11365175247192383,
-0.010368969291448593,
-0.06290490925312042,
0.04411688074469566,
0.02763933688402176,
-0.1964484602212906,
-0.012060758657753468,
0.09311288595199585,
-0.093468576669693,
-0.0325593426823616,
-0.019545340910553932,
0.06533273309469223,
0.07886842638254166,
0.07800756394863129,
-0.027054885402321815,
0.032749924808740616,
-0.009513240307569504,
-0.07609941810369492,
0.04101184383034706,
0.03461175411939621,
0.016977816820144653,
-0.05129995942115784,
0.07697918266057968,
-0.1704806238412857,
0.04773106053471565,
0.04289218410849571,
-0.012256895191967487,
-0.032709673047065735,
-0.0060662077739834785,
-0.07677378505468369,
0.07602009922266006,
0.0970601737499237,
-0.01117134653031826,
-0.016566166654229164,
-0.008560718037188053,
-0.047214165329933167,
-0.029667895287275314,
-0.10194850713014603,
-0.10045070946216583,
-0.14634522795677185,
-0.12050911784172058,
0.053291454911231995,
-0.0038993400521576405,
-0.14648106694221497,
-0.004286912735551596,
-0.12329182028770447,
0.042498670518398285,
-0.1233072578907013,
0.08571377396583557,
0.11258093267679214,
0.006503931246697903,
0.0033957541454583406,
-0.02402549609541893,
0.036028873175382614,
0.09566349536180496,
-0.1392456293106079,
-0.07957390695810318
] |
null | null |
transformers
|
# Jake Peralta DialoGPT Model
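A minimal chat sketch (not part of the original card): it follows the standard 🤗 Transformers pattern for DialoGPT-style conversational models, and only the model id is taken from this repository.

```python
# Hypothetical usage sketch for a DialoGPT-style chat model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("anweasha/DialoGPT-small-Jake")
model = AutoModelForCausalLM.from_pretrained("anweasha/DialoGPT-small-Jake")

chat_history_ids = None
for step in range(3):
    # Encode the user input and append the end-of-sequence token.
    new_input_ids = tokenizer.encode(input(">> You: ") + tokenizer.eos_token, return_tensors="pt")
    # Append the new input to the running conversation history, if any.
    bot_input_ids = (
        torch.cat([chat_history_ids, new_input_ids], dim=-1)
        if chat_history_ids is not None
        else new_input_ids
    )
    # Generate a reply, capping the total conversation length.
    chat_history_ids = model.generate(bot_input_ids, max_length=1000, pad_token_id=tokenizer.eos_token_id)
    reply = tokenizer.decode(chat_history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True)
    print("Jake:", reply)
```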
|
{"tags": ["conversational"]}
|
text-generation
|
anweasha/DialoGPT-small-Jake
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Jake Peralta DialoGPT Model
|
[
"# Jake Peralta DialoGPT Model"
] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Jake Peralta DialoGPT Model"
] |
[
51,
10
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Jake Peralta DialoGPT Model"
] |
[
-0.04435775801539421,
0.10074503719806671,
-0.006036684848368168,
0.022257352247834206,
0.152704656124115,
-0.012472658418118954,
0.13179905712604523,
0.11126446723937988,
-0.029823295772075653,
-0.01581176370382309,
0.11510863900184631,
0.15242095291614532,
-0.009319973178207874,
0.11822344362735748,
-0.03599002584815025,
-0.3162708282470703,
0.04646337777376175,
0.039816055446863174,
0.06181862950325012,
0.1348995715379715,
0.1058356985449791,
-0.04752204567193985,
0.08078497648239136,
0.006181383039802313,
-0.14632008969783783,
0.0030598577577620745,
0.0008718766621313989,
-0.12453705817461014,
0.11685765534639359,
0.03486902639269829,
0.03979117423295975,
0.015947509557008743,
-0.04904618114233017,
-0.09449630230665207,
0.041690800338983536,
0.015343447215855122,
-0.017792945727705956,
0.04376312345266342,
0.0466567687690258,
-0.1131218820810318,
0.12208832800388336,
0.13404840230941772,
0.012477995827794075,
0.03979599475860596,
-0.16477926075458527,
-0.015537216328084469,
0.0357736311852932,
0.08187495917081833,
0.0355033352971077,
0.11586049199104309,
-0.050720151513814926,
0.08849368989467621,
-0.05685622990131378,
0.10338741540908813,
0.07408708333969116,
-0.32645171880722046,
-0.025875627994537354,
0.06626874208450317,
0.048293568193912506,
0.07410163432359695,
-0.04600326344370842,
0.04853307083249092,
0.04116653650999069,
-0.0066285585053265095,
-0.035219624638557434,
-0.060268864035606384,
-0.03754863515496254,
-0.014996259473264217,
-0.08788279443979263,
-0.005091026425361633,
0.2676098048686981,
-0.042135871946811676,
0.05745282024145126,
-0.06492020189762115,
-0.10344988107681274,
-0.051290363073349,
-0.02722117491066456,
-0.034676991403102875,
-0.07665758579969406,
0.06904321163892746,
0.01045945007354021,
-0.09267357736825943,
-0.11852282285690308,
-0.035712774842977524,
-0.20658181607723236,
0.12255024909973145,
0.038720592856407166,
0.01458317507058382,
-0.20594581961631775,
0.10302793234586716,
-0.014462782070040703,
-0.10662362724542618,
0.03382233530282974,
-0.10836269706487656,
0.0013543644454330206,
0.013682367280125618,
-0.03111894242465496,
-0.03922111541032791,
0.05745963379740715,
0.11771456897258759,
0.055028293281793594,
0.013149979524314404,
-0.026365799829363823,
0.0730072408914566,
0.06033029779791832,
0.08142753690481186,
-0.022031748667359352,
-0.02369893342256546,
0.036389388144016266,
-0.08183564245700836,
-0.01655171439051628,
-0.06978954374790192,
-0.15472647547721863,
-0.000268933130428195,
0.10332430154085159,
0.051279257982969284,
0.05119691044092178,
0.10581370443105698,
-0.0026446001138538122,
-0.05870293080806732,
-0.015601491555571556,
-0.040060870349407196,
-0.02455832250416279,
0.0025612851604819298,
-0.0032685832120478153,
0.2040477991104126,
0.0019481825875118375,
0.046732816845178604,
-0.11415590345859528,
0.024069078266620636,
-0.0598028302192688,
-0.009615613147616386,
-0.033134981989860535,
-0.05579976364970207,
0.002378244651481509,
-0.02907923236489296,
0.016886208206415176,
-0.16324862837791443,
-0.15973368287086487,
-0.02189335785806179,
-0.006146594882011414,
-0.05376993492245674,
-0.13130809366703033,
-0.10499627143144608,
0.018845096230506897,
0.02590101771056652,
-0.08790682256221771,
-0.05419108644127846,
-0.06087346747517586,
0.08515758812427521,
-0.044037144631147385,
0.08406329154968262,
-0.058955054730176926,
0.083060622215271,
-0.1030513122677803,
-0.017880626022815704,
-0.06975290179252625,
0.10765687376260757,
0.020579390227794647,
0.08596176654100418,
-0.040214262902736664,
-0.01600634679198265,
-0.11463800817728043,
0.07214268296957016,
-0.05989469215273857,
0.2259237915277481,
-0.10684523731470108,
-0.12085330486297607,
0.29210859537124634,
-0.08583198487758636,
-0.10378976911306381,
0.11583149433135986,
0.0017582396976649761,
0.13578495383262634,
0.14233285188674927,
0.224404975771904,
0.0156236682087183,
0.019813524559140205,
0.0899195447564125,
0.08866415917873383,
-0.061770953238010406,
0.01281268335878849,
0.023814715445041656,
-0.04128511995077133,
-0.06972901523113251,
0.025097010657191277,
0.08278599381446838,
0.0606156587600708,
-0.058969203382730484,
-0.016299141570925713,
0.013859054073691368,
-0.01359628140926361,
0.06576985120773315,
-0.032050855457782745,
0.12952721118927002,
-0.043053559958934784,
-0.06379003822803497,
-0.029565664008259773,
0.011785217560827732,
-0.065589040517807,
0.044534116983413696,
-0.0876682698726654,
0.06803250312805176,
0.0052107516676187515,
0.06074824556708336,
-0.1437876671552658,
-0.04229278489947319,
-0.04602107033133507,
0.1573033481836319,
0.04159506410360336,
0.13337597250938416,
0.05817347764968872,
-0.031995583325624466,
0.006912254728376865,
0.024969028308987617,
0.14889003336429596,
-0.0082295136526227,
-0.053172480314970016,
-0.1085173711180687,
0.10184405744075775,
-0.06634853780269623,
0.11558708548545837,
-0.019166359677910805,
0.006157773546874523,
0.005186096765100956,
0.10681483149528503,
-0.023560496047139168,
0.008778240531682968,
0.00015186500968411565,
-0.007473255041986704,
-0.04743967577815056,
-0.0042275963351130486,
0.10716518759727478,
-0.007447513286024332,
-0.04843037575483322,
0.23343433439731598,
-0.19185799360275269,
0.13577476143836975,
0.20748375356197357,
-0.18784312903881073,
-0.0008705251384526491,
-0.0975300520658493,
-0.017128445208072662,
0.011303894221782684,
0.09397817403078079,
-0.03490038216114044,
0.29388633370399475,
0.010693949647247791,
0.1774347722530365,
-0.034938205033540726,
-0.012782501056790352,
-0.031145883724093437,
-0.054140813648700714,
0.01629927009344101,
0.11624221503734589,
0.09110414981842041,
-0.121970996260643,
0.17918488383293152,
0.09975297749042511,
0.05542416498064995,
0.16810379922389984,
0.02251567132771015,
0.00009086355566978455,
0.08280372619628906,
0.014793410897254944,
-0.03429514169692993,
-0.08323831856250763,
-0.29936471581459045,
-0.047687433660030365,
0.07160880416631699,
0.0515553280711174,
0.11348779499530792,
-0.08770250529050827,
-0.028143908828496933,
0.011669803410768509,
0.002018128288909793,
0.04676385596394539,
0.11795173585414886,
0.01486015971750021,
0.113967165350914,
-0.014225331135094166,
-0.08501633256673813,
0.052552882581949234,
0.027409793809056282,
-0.08249939233064651,
0.17540498077869415,
-0.11488254368305206,
-0.345975399017334,
-0.0938151478767395,
-0.18306660652160645,
-0.06088755279779434,
0.05689682438969612,
0.09538368880748749,
-0.10973715782165527,
-0.017185306176543236,
-0.001720996806398034,
0.11723792552947998,
-0.11609867960214615,
-0.01098270807415247,
0.0059149363078176975,
0.0002753314620349556,
-0.10942386835813522,
-0.08879172056913376,
-0.040721140801906586,
-0.01652318239212036,
-0.10640072822570801,
0.1045001894235611,
-0.1429070085287094,
0.036958832293748856,
0.22653722763061523,
0.06309238821268082,
0.05693550407886505,
-0.029450545087456703,
0.21379126608371735,
-0.09955635666847229,
-0.028974827378988266,
0.21217858791351318,
-0.027205444872379303,
0.05100888013839722,
0.12670494616031647,
-0.027058131992816925,
-0.05495714768767357,
0.01757294498383999,
-0.009670952335000038,
-0.06370310485363007,
-0.19603155553340912,
-0.14385323226451874,
-0.0913911908864975,
0.10178691893815994,
0.06977051496505737,
0.06485529989004135,
0.09779969602823257,
0.0650988221168518,
-0.04285437613725662,
-0.015773095190525055,
0.053497377783060074,
0.0908626914024353,
0.2201681286096573,
-0.0947350412607193,
0.14695589244365692,
0.0021268019918352365,
-0.14215099811553955,
0.0778208076953888,
0.06305930763483047,
0.05641854554414749,
0.08067656308412552,
0.05282357335090637,
0.00981181114912033,
0.03833291307091713,
0.13427942991256714,
0.08942823112010956,
0.00024836111697368324,
-0.032725490629673004,
-0.030369220301508904,
-0.04163840413093567,
-0.04474955052137375,
0.04481154680252075,
0.017796680331230164,
-0.1594836413860321,
-0.041543181985616684,
-0.033235013484954834,
0.06580016762018204,
0.036092884838581085,
0.07380115240812302,
-0.17160046100616455,
-0.010333886370062828,
0.06728747487068176,
-0.040840666741132736,
-0.13689105212688446,
0.09837529063224792,
-0.007227983325719833,
-0.13668452203273773,
0.04072868824005127,
-0.014367565512657166,
0.12917226552963257,
-0.14026428759098053,
0.08443387597799301,
-0.14240983128547668,
-0.0634114146232605,
0.0048116096295416355,
0.13076524436473846,
-0.2685244083404541,
0.1872185617685318,
-0.02515050768852234,
-0.06705530732870102,
-0.09522940218448639,
-0.01835273765027523,
-0.010479886084794998,
0.12391293048858643,
0.14442826807498932,
-0.015247977338731289,
0.05118775740265846,
-0.023902829736471176,
-0.09049826115369797,
0.02740493230521679,
0.09641704708337784,
-0.06252101063728333,
-0.013078836724162102,
-0.02616073004901409,
0.014347352087497711,
-0.040214333683252335,
-0.04330388456583023,
0.003946044947952032,
-0.19523502886295319,
0.07869336009025574,
0.029092222452163696,
0.1595369130373001,
0.02799542061984539,
-0.05262013152241707,
-0.13210727274417877,
0.24038664996623993,
-0.053836919367313385,
-0.09778984636068344,
-0.0705677792429924,
-0.06022337079048157,
0.0548638217151165,
-0.055388059467077255,
0.02329353429377079,
-0.06960736960172653,
0.0484386645257473,
-0.0974574014544487,
-0.16906125843524933,
0.10740157961845398,
-0.08564798533916473,
-0.024330798536539078,
-0.018550578504800797,
0.21395094692707062,
-0.01727689430117607,
0.027568303048610687,
0.045977212488651276,
-0.004488824401050806,
-0.15288808941841125,
-0.09814723581075668,
-0.026566870510578156,
0.042267270386219025,
-0.003953354898840189,
-0.011776473373174667,
-0.03358545899391174,
-0.01486437488347292,
-0.044631507247686386,
-0.008327830582857132,
0.34172841906547546,
0.12592612206935883,
-0.019162321463227272,
0.17810994386672974,
0.10614761710166931,
-0.0707237720489502,
-0.2693096101284027,
-0.11195169389247894,
-0.07045042514801025,
-0.021859239786863327,
-0.09030841290950775,
-0.18859823048114777,
0.047705914825201035,
0.01760043203830719,
-0.0010497223120182753,
0.08577580749988556,
-0.28958961367607117,
-0.10902799665927887,
0.16351191699504852,
-0.04784826934337616,
0.3802168369293213,
-0.08665582537651062,
-0.0709085687994957,
-0.07869534939527512,
-0.11941523104906082,
0.1085163801908493,
-0.018970755860209465,
0.11875146627426147,
0.011721772141754627,
0.1524769365787506,
0.054616779088974,
0.0022678892128169537,
0.09599840641021729,
0.01210940070450306,
-0.07491724193096161,
-0.09767215698957443,
-0.038123372942209244,
-0.0017138933762907982,
0.03440311551094055,
0.0278956089168787,
-0.06270837038755417,
0.009695997461676598,
-0.15289407968521118,
-0.07606969773769379,
-0.08377140015363693,
0.02264917455613613,
0.033116210252046585,
-0.06329464167356491,
0.013391544111073017,
-0.03655398264527321,
0.0023743046913295984,
0.009410320781171322,
0.12276606261730194,
-0.12761911749839783,
0.16610068082809448,
0.07325664907693863,
0.11821018904447556,
-0.16223031282424927,
-0.0944085493683815,
-0.06072155758738518,
-0.0453018918633461,
0.06770866364240646,
-0.08948961645364761,
0.019747966900467873,
0.11647000163793564,
-0.04705950617790222,
0.07052718102931976,
0.07897138595581055,
-0.035853344947099686,
0.03313736245036125,
0.09700436145067215,
-0.2641875147819519,
-0.03794306889176369,
-0.07210258394479752,
0.0412587895989418,
0.09637228399515152,
0.09417504072189331,
0.2118683159351349,
-0.01631404645740986,
-0.030234603211283684,
0.007086018566042185,
0.012311252765357494,
-0.020170526579022408,
0.07535520195960999,
-0.005250764545053244,
0.018136218190193176,
-0.15085692703723907,
0.0553247332572937,
0.03075684793293476,
-0.0897415429353714,
0.028247755020856857,
0.13432730734348297,
-0.09132591634988785,
-0.11628293246030807,
-0.059124719351530075,
0.13242438435554504,
-0.08130361884832382,
-0.012456790544092655,
-0.04924453794956207,
-0.1295788586139679,
0.0530255064368248,
0.089618980884552,
0.043586354702711105,
0.057373493909835815,
-0.0985780879855156,
-0.050731807947158813,
-0.025807587429881096,
-0.019214801490306854,
0.06564631313085556,
-0.027476368471980095,
-0.06968723982572556,
0.08708492666482925,
-0.021673299372196198,
0.10439968854188919,
-0.08966639637947083,
-0.11833132803440094,
-0.13985612988471985,
0.0525471568107605,
-0.0591479130089283,
-0.09285679459571838,
-0.1113705039024353,
-0.04570828750729561,
-0.0067962948232889175,
-0.0084922444075346,
-0.03404887393116951,
-0.04850262403488159,
-0.11221480369567871,
0.025591887533664703,
-0.058364588767290115,
0.008552604354918003,
-0.06758470833301544,
0.04661157727241516,
0.05324074625968933,
-0.023141833022236824,
0.15469634532928467,
0.098708875477314,
-0.10987141728401184,
0.08003338426351547,
-0.15046723186969757,
-0.07099180668592453,
0.10484108328819275,
0.01942344941198826,
0.040305569767951965,
0.057905323803424835,
0.002526715165004134,
0.05196322128176689,
0.028248706832528114,
0.0536360964179039,
0.050582773983478546,
-0.09196445345878601,
0.024083269760012627,
0.030461376532912254,
-0.12109792232513428,
-0.025966349989175797,
-0.030293798074126244,
0.029786383733153343,
0.03884236887097359,
0.06981091946363449,
-0.059767186641693115,
0.11514803767204285,
-0.0466596744954586,
0.03030444122850895,
0.022631265223026276,
-0.15896794199943542,
0.04764001816511154,
-0.09934429079294205,
0.053224168717861176,
0.026210231706500053,
0.20985902845859528,
0.035814397037029266,
0.043033305555582047,
0.026537250727415085,
0.05103953555226326,
0.05230884253978729,
0.014140933752059937,
0.20825955271720886,
0.12921370565891266,
-0.020948177203536034,
-0.07880320399999619,
0.10131645947694778,
0.03760868310928345,
0.05353381484746933,
0.0935433954000473,
-0.03347703441977501,
-0.02682548761367798,
0.0744442418217659,
0.003896318841725588,
0.028872201219201088,
-0.10888093709945679,
-0.14390943944454193,
0.0007487190305255353,
0.04847091808915138,
-0.06332357972860336,
0.09611501544713974,
0.14178191125392914,
-0.01426622737199068,
0.013938750140368938,
-0.012648975476622581,
-0.07487248629331589,
-0.17520922422409058,
-0.2213379591703415,
-0.07973896712064743,
-0.15721319615840912,
0.011609392240643501,
-0.12790384888648987,
0.02997223474085331,
0.025989677757024765,
0.09414137154817581,
-0.0761977955698967,
0.07993821054697037,
0.05158160999417305,
-0.14856508374214172,
0.08676789700984955,
-0.025484878569841385,
0.11023082584142685,
-0.07653568685054779,
-0.012832175940275192,
-0.07757210731506348,
0.04797428101301193,
0.011050413362681866,
0.056855328381061554,
-0.02579871565103531,
0.003984866198152304,
-0.11280887573957443,
-0.08672761172056198,
-0.05125120282173157,
0.056222137063741684,
-0.0011640969896689057,
0.1610822081565857,
0.0001560803793836385,
-0.0409267321228981,
0.004587939474731684,
0.2307887077331543,
-0.05111171677708626,
-0.07428339868783951,
-0.05420301854610443,
0.2000967562198639,
0.016246680170297623,
0.11203265190124512,
-0.0331123061478138,
-0.004949966445565224,
-0.10029498487710953,
0.34274935722351074,
0.27672505378723145,
-0.11189538985490799,
0.011819014325737953,
-0.007257799152284861,
0.04413798451423645,
0.11138887703418732,
0.08715317398309708,
0.10265423357486725,
0.2620438039302826,
-0.07053885608911514,
-0.0060815163888037205,
-0.009722430258989334,
-0.03512663394212723,
-0.09391244500875473,
0.06088734045624733,
0.06003719940781593,
-0.047286443412303925,
-0.03993447497487068,
0.12221977859735489,
-0.23651184141635895,
0.08740990608930588,
-0.13008594512939453,
-0.1826431304216385,
-0.10786063969135284,
-0.007145125884562731,
0.15604186058044434,
0.0313565619289875,
0.09146793186664581,
0.004786222707480192,
-0.06810595840215683,
0.041653767228126526,
0.03551708906888962,
-0.17261344194412231,
-0.05906739458441734,
0.06735648959875107,
-0.03678015619516373,
-0.053451139479875565,
-0.024488922208547592,
0.053187910467386246,
0.05532997474074364,
0.045163873583078384,
-0.015624198131263256,
0.04697529226541519,
-0.008769490756094456,
-0.08461930602788925,
0.02942221611738205,
0.041279278695583344,
0.01006544753909111,
-0.05849310755729675,
0.08388513326644897,
-0.11971810460090637,
0.029145020991563797,
-0.01429548766463995,
-0.016409652307629585,
-0.01828876882791519,
0.04340159147977829,
-0.07660356909036636,
0.07415907829999924,
0.07990466803312302,
-0.01717621646821499,
-0.03590443730354309,
-0.03251148760318756,
-0.0003779857652261853,
-0.03784489259123802,
-0.08668532967567444,
-0.08606117963790894,
-0.18709315359592438,
-0.1384880542755127,
0.059746842831373215,
0.019599365070462227,
-0.1732022911310196,
0.01883520744740963,
-0.11442305147647858,
0.06191252917051315,
-0.12914656102657318,
0.10274934023618698,
0.04020173102617264,
0.014300674200057983,
-0.0036855419166386127,
0.03970247134566307,
0.05538804829120636,
0.07224631309509277,
-0.1507878452539444,
-0.08111724257469177
] |
null | null |
transformers
|
## [google/t5-v1_1-small](https://huggingface.co/google/t5-v1_1-small) model
### pretrained on [anzorq/kbd-ru-1.67M-temp](https://huggingface.co/datasets/anzorq/kbd-ru-1.67M-temp)
### fine-tuned on **17753** Russian-Kabardian word/sentence pairs
The kbd text uses a custom Latin script for optimization reasons.
Translation input should start with '**ru->kbd:** '.
**Tokenizer**: T5 sentencepiece, char, cased.
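A minimal translation sketch (not part of the original card): it assumes the standard 🤗 Transformers seq2seq API, uses the 'ru->kbd: ' prefix described above, and takes its example sentence from this repository's widget examples.

```python
# Hypothetical usage sketch: T5 seq2seq translation with the 'ru->kbd: ' prefix.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "anzorq/t5-v1_1-small-ru_kbd-cased"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

text = "ru->kbd: Я иду домой."  # the input must carry the 'ru->kbd: ' prefix
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```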
|
{"language": ["ru", "kbd"], "tags": ["translation"], "datasets": ["anzorq/kbd-ru-1.67M-temp", "17753 Russian-Kabardian pairs of text"], "widget": [{"text": "ru->kbd: \u042f \u0438\u0434\u0443 \u0434\u043e\u043c\u043e\u0439.", "example_title": "\u042f \u0438\u0434\u0443 \u0434\u043e\u043c\u043e\u0439."}, {"text": "ru->kbd: \u0414\u0435\u0442\u0438 \u0438\u0433\u0440\u0430\u044e\u0442 \u0432\u043e \u0434\u0432\u043e\u0440\u0435.", "example_title": "\u0414\u0435\u0442\u0438 \u0438\u0433\u0440\u0430\u044e\u0442 \u0432\u043e \u0434\u0432\u043e\u0440\u0435."}, {"text": "ru->kbd: \u0421\u043a\u043e\u043b\u044c\u043a\u043e \u0442\u0435\u0431\u0435 \u043b\u0435\u0442?", "example_title": "\u0421\u043a\u043e\u043b\u044c\u043a\u043e \u0442\u0435\u0431\u0435 \u043b\u0435\u0442?"}]}
|
translation
|
anzorq/t5-v1_1-small-ru_kbd-cased
|
[
"transformers",
"pytorch",
"t5",
"text2text-generation",
"translation",
"ru",
"kbd",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"ru",
"kbd"
] |
TAGS
#transformers #pytorch #t5 #text2text-generation #translation #ru #kbd #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
## google/t5-v1_1-small model
### pretrained on anzorq/kbd-ru-1.67M-temp
### fine-tuned on 17753 Russian-Kabardian word/sentence pairs
kbd text uses custom latin script for optimization reasons.
Translation input should start with 'ru->kbd: '.
Tokenizer: T5 sentencepiece, char, cased.
|
[
"## google/t5-v1_1-small model",
"### pretrained on anzorq/kbd-ru-1.67M-temp",
"### fine-tuned on 17753 Russian-Kabardian word/sentence pairs\n\nkbd text uses custom latin script for optimization reasons.\n\nTranslation input should start with 'ru->kbd: '.\n\nTokenizer: T5 sentencepiece, char, cased."
] |
[
"TAGS\n#transformers #pytorch #t5 #text2text-generation #translation #ru #kbd #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"## google/t5-v1_1-small model",
"### pretrained on anzorq/kbd-ru-1.67M-temp",
"### fine-tuned on 17753 Russian-Kabardian word/sentence pairs\n\nkbd text uses custom latin script for optimization reasons.\n\nTranslation input should start with 'ru->kbd: '.\n\nTokenizer: T5 sentencepiece, char, cased."
] |
[
56,
12,
20,
60
] |
[
"passage: TAGS\n#transformers #pytorch #t5 #text2text-generation #translation #ru #kbd #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## google/t5-v1_1-small model### pretrained on anzorq/kbd-ru-1.67M-temp### fine-tuned on 17753 Russian-Kabardian word/sentence pairs\n\nkbd text uses custom latin script for optimization reasons.\n\nTranslation input should start with 'ru->kbd: '.\n\nTokenizer: T5 sentencepiece, char, cased."
] |
[
0.020580818876624107,
-0.19382397830486298,
-0.005650755949318409,
0.04392087459564209,
0.13688334822654724,
-0.0043128253892064095,
0.10679453611373901,
0.0825551226735115,
0.00048636834253557026,
0.006226568948477507,
0.13136571645736694,
0.03532499447464943,
0.02958713471889496,
0.11793962121009827,
-0.014639523811638355,
-0.28023895621299744,
0.019654031842947006,
0.008002961054444313,
0.1104743480682373,
0.11812248080968857,
0.07909541577100754,
-0.021983889862895012,
0.07969624549150467,
-0.0123136593028903,
-0.10858447849750519,
0.0856982171535492,
0.06296654045581818,
-0.07054515182971954,
0.07416142523288727,
0.09568998962640762,
0.05187186226248741,
0.04276371747255325,
-0.03328445553779602,
-0.1412700116634369,
0.02680410072207451,
0.0025149518623948097,
-0.07256900519132614,
-0.011051023378968239,
0.03534417971968651,
-0.05756702274084091,
0.18426337838172913,
-0.13136373460292816,
-0.09002309292554855,
0.008286419324576855,
-0.10143022239208221,
-0.03560778126120567,
0.00558296637609601,
0.09941606223583221,
0.005386168137192726,
0.06371554732322693,
-0.053214624524116516,
0.09924263507127762,
-0.1283290684223175,
0.1345890760421753,
0.1549927294254303,
-0.41163596510887146,
-0.029248058795928955,
0.046696629375219345,
0.03073502518236637,
0.13156980276107788,
-0.014253098517656326,
0.05018266662955284,
-0.015495074912905693,
-0.03510406240820885,
-0.0204957015812397,
-0.10280215740203857,
-0.12271416932344437,
-0.029805170372128487,
-0.10895019769668579,
-0.008243936114013195,
0.21924223005771637,
-0.052915021777153015,
0.04340438172221184,
-0.015029232949018478,
-0.07022362947463989,
-0.08975791931152344,
-0.013020604848861694,
-0.04614692181348801,
-0.07075510174036026,
0.02330974116921425,
0.06912478059530258,
-0.05069199576973915,
-0.13915032148361206,
-0.02696552686393261,
-0.1815701723098755,
0.2143745869398117,
0.06400229781866074,
0.01735016517341137,
-0.12779203057289124,
0.08498687297105789,
-0.11915609985589981,
-0.0986289381980896,
0.04227830097079277,
-0.09654713422060013,
0.03587382286787033,
0.04908346012234688,
-0.045767903327941895,
-0.0781310498714447,
0.06796545535326004,
0.014061948284506798,
-0.03697074204683304,
0.06432394683361053,
0.015336934477090836,
0.06849642097949982,
-0.0420963279902935,
0.15693993866443634,
0.0007119402871467173,
-0.008413378149271011,
0.09841541945934296,
-0.06388547271490097,
0.014749052003026009,
0.0013161158422008157,
-0.1813642829656601,
-0.18935516476631165,
0.08443041145801544,
0.08868619054555893,
-0.029243798926472664,
0.13729581236839294,
0.015403522178530693,
-0.014324809424579144,
0.02340097352862358,
-0.08780945092439651,
-0.018466858193278313,
0.0054352120496332645,
0.015903951600193977,
0.07099599391222,
-0.0069526792503893375,
-0.005735252518206835,
-0.14997395873069763,
0.0365896038711071,
-0.019051382318139076,
0.01700141839683056,
0.0105242058634758,
-0.10291392356157303,
-0.0182481836527586,
-0.13480158150196075,
0.0003767683228943497,
-0.21886184811592102,
-0.02882678434252739,
-0.014710325747728348,
-0.05360459163784981,
-0.032853156328201294,
0.013762028887867928,
-0.04343234375119209,
-0.016455557197332382,
0.035002339631319046,
-0.050010234117507935,
-0.02790265716612339,
-0.05107298493385315,
0.05711939558386803,
0.0382489413022995,
0.09651697427034378,
-0.11745236814022064,
0.03973526507616043,
-0.14165747165679932,
-0.03184877708554268,
-0.1563214510679245,
0.142724871635437,
-0.02031487226486206,
-0.03786144778132439,
-0.09784804284572601,
-0.014566819183528423,
-0.16248314082622528,
0.034199826419353485,
0.022358832880854607,
0.13825172185897827,
-0.1595577746629715,
-0.06906922906637192,
0.23543314635753632,
-0.019908519461750984,
-0.05357527360320091,
0.12951689958572388,
0.0150505555793643,
0.13549543917179108,
0.11100945621728897,
0.220611572265625,
0.1100539118051529,
-0.036343369632959366,
0.009861809201538563,
0.03947822004556656,
-0.07861953973770142,
-0.040932413190603256,
0.010535057634115219,
0.01604536361992359,
-0.07898867875337601,
0.055215023458004,
-0.07971451431512833,
0.052607081830501556,
0.005935085937380791,
0.003066631266847253,
-0.03652270510792732,
0.051371462643146515,
0.060038674622774124,
-0.023142963647842407,
0.07834993302822113,
-0.04965668544173241,
-0.09012315422296524,
-0.09109413623809814,
-0.03607344627380371,
0.011399748735129833,
0.10922644287347794,
-0.06498434394598007,
0.06768858432769775,
0.025856604799628258,
0.09191589057445526,
-0.13541512191295624,
-0.018547363579273224,
0.03285709023475647,
0.12209329009056091,
0.09844881296157837,
0.021693384274840355,
0.02835698053240776,
-0.05870939418673515,
-0.07593369483947754,
0.00738221500068903,
0.10836916416883469,
-0.006189670413732529,
-0.07972261309623718,
-0.1146964356303215,
0.094658263027668,
-0.0013166473945602775,
-0.09406574815511703,
0.04378078132867813,
-0.006251566577702761,
0.1034553125500679,
0.07212165743112564,
-0.038287241011857986,
0.09667907655239105,
-0.042096126824617386,
0.0654311329126358,
-0.00015764609270263463,
0.044878989458084106,
0.08539460599422455,
-0.029193274676799774,
-0.10254476964473724,
0.2400142401456833,
-0.12779219448566437,
-0.04741958528757095,
0.12049504369497299,
-0.012139393948018551,
-0.036841001361608505,
-0.0864751935005188,
0.004475365858525038,
-0.014745119027793407,
0.10957332700490952,
-0.02891446463763714,
0.22686737775802612,
0.010986988432705402,
0.11828195303678513,
-0.10275514423847198,
-0.014940574765205383,
0.0010921717621386051,
-0.06416568160057068,
0.005613979883491993,
0.13173039257526398,
-0.013356323353946209,
-0.26604774594306946,
0.07395613193511963,
0.1297476440668106,
-0.025727100670337677,
0.26769229769706726,
-0.0004880797932855785,
-0.04022564738988876,
0.012190615758299828,
0.06344147026538849,
-0.017975300550460815,
-0.04583100974559784,
-0.0766119733452797,
-0.05666027218103409,
0.026706866919994354,
0.037537023425102234,
0.07087016105651855,
-0.07246395945549011,
0.02018115669488907,
-0.033038780093193054,
-0.03876283019781113,
-0.020648563280701637,
0.14199298620224,
0.03189241886138916,
0.12549139559268951,
0.034521281719207764,
0.00763337779790163,
0.023722408339381218,
0.017855897545814514,
-0.09263564646244049,
0.16742821037769318,
-0.110709048807621,
-0.28369805216789246,
-0.14323008060455322,
-0.024051018059253693,
-0.06583401560783386,
-0.004564653616398573,
0.1269015222787857,
-0.14416497945785522,
0.0011459111701697111,
-0.018284814432263374,
0.14174340665340424,
-0.0045937965624034405,
-0.018568692728877068,
-0.14927320182323456,
0.013505017384886742,
-0.09219812601804733,
-0.055639754980802536,
-0.052522968500852585,
-0.03170207515358925,
-0.036468204110860825,
0.10510284453630447,
-0.1522698849439621,
0.05576997250318527,
0.12783345580101013,
-0.03834176063537598,
0.027089141309261322,
-0.09414868801832199,
0.13599443435668945,
-0.04297977313399315,
0.05569864809513092,
0.09674638509750366,
-0.005477121099829674,
0.061613209545612335,
0.18012824654579163,
-0.042476050555706024,
0.008081055246293545,
0.05871104076504707,
-0.016979435458779335,
-0.042000435292720795,
-0.201902836561203,
-0.055059146136045456,
-0.11574001610279083,
0.04701675847172737,
-0.0159138273447752,
0.02401944249868393,
-0.012175919488072395,
0.04431367665529251,
-0.05832719802856445,
0.07844606786966324,
0.027720555663108826,
0.12062691897153854,
0.27361011505126953,
0.0231810100376606,
0.12013201415538788,
-0.0474807471036911,
-0.07954701781272888,
0.07719554007053375,
0.015199413523077965,
0.08269388228654861,
-0.02660216949880123,
0.14524969458580017,
0.03841273486614227,
0.1452166587114334,
0.11121539026498795,
0.014704041182994843,
0.03767411410808563,
0.014256324619054794,
-0.002990767592564225,
-0.055813923478126526,
-0.032906100153923035,
0.03321390226483345,
0.01749461516737938,
-0.10461801290512085,
-0.018373560160398483,
0.09756407141685486,
0.0939958244562149,
0.13318264484405518,
0.012685870751738548,
-0.12453597784042358,
-0.07054786384105682,
0.015839336439967155,
-0.010474232956767082,
-0.10555430501699448,
0.06156450882554054,
0.022676944732666016,
-0.09463280439376831,
0.026030581444501877,
-0.005215485114604235,
0.0873880535364151,
0.03083350509405136,
0.07180239260196686,
-0.09963725507259369,
0.016275392845273018,
-0.0016935316380113363,
0.09897971153259277,
-0.32139426469802856,
0.18475164473056793,
0.0331573523581028,
-0.015723761171102524,
-0.07856588065624237,
-0.028552420437335968,
0.011946395970880985,
0.08453228324651718,
0.085330069065094,
0.015232225880026817,
-0.040787409991025925,
-0.08341943472623825,
-0.03377258777618408,
0.03500797972083092,
0.16012486815452576,
-0.04624927416443825,
0.060210391879081726,
-0.07302739471197128,
-0.005820733029395342,
-0.031087884679436684,
0.04920603707432747,
-0.13159000873565674,
-0.10102362185716629,
0.042022455483675,
0.12109386175870895,
-0.0024547369685024023,
0.031927429139614105,
-0.038795631378889084,
0.015522144734859467,
0.05784055218100548,
-0.05918131396174431,
-0.09705287963151932,
-0.05630570650100708,
-0.01101790089160204,
0.07214683294296265,
-0.15520283579826355,
0.008030393160879612,
-0.10319477319717407,
-0.04640314355492592,
-0.042072590440511703,
-0.138922318816185,
0.1078571155667305,
-0.05404503643512726,
-0.06757130473852158,
0.007233664859086275,
0.1944812685251236,
-0.04039497673511505,
0.02397039905190468,
0.011429539881646633,
0.009730442427098751,
-0.06128702312707901,
-0.046494416892528534,
-0.010598619468510151,
-0.07889558374881744,
0.013093521818518639,
0.05940067395567894,
-0.16487370431423187,
-0.08869805186986923,
-0.13093580305576324,
-0.04731620475649834,
0.20686836540699005,
0.1473015546798706,
-0.07019250839948654,
0.12444078177213669,
0.09203268587589264,
-0.05649971589446068,
-0.2537078559398651,
-0.12665016949176788,
0.058010172098875046,
0.03511263057589531,
-0.02407519519329071,
-0.11148751527070999,
-0.012688877992331982,
-0.0655435100197792,
0.024928946048021317,
0.04767443239688873,
-0.1970728188753128,
-0.1458750218153,
0.12492559105157852,
-0.026277365162968636,
0.30361077189445496,
-0.10959842801094055,
-0.07582391053438187,
-0.036105938255786896,
-0.060109302401542664,
0.0414695106446743,
-0.02195843495428562,
0.11001493781805038,
-0.039522744715213776,
0.07634858042001724,
0.025282621383666992,
0.048043932765722275,
0.036601435393095016,
-0.00901146698743105,
-0.043632280081510544,
-0.0911015048623085,
-0.13334503769874573,
0.0726238265633583,
-0.03668690472841263,
0.048856377601623535,
-0.07344836741685867,
0.042637571692466736,
-0.06829187273979187,
-0.057205379009246826,
-0.1034383550286293,
0.02803010866045952,
0.0029297724831849337,
-0.029189005494117737,
-0.062215011566877365,
-0.03703439235687256,
0.02064533531665802,
-0.021916376426815987,
0.10265122354030609,
-0.09521365165710449,
0.001378569402731955,
0.1497931033372879,
0.10739120841026306,
0.09628228098154068,
0.0772755965590477,
-0.018630273640155792,
-0.036976467818021774,
0.08442527800798416,
-0.06954243034124374,
0.047721441835165024,
0.10877528041601181,
-0.06132727861404419,
0.022539397701621056,
0.010964203625917435,
-0.057921942323446274,
-0.018253907561302185,
0.10220491141080856,
-0.16434267163276672,
-0.08125181496143341,
-0.08023614436388016,
-0.08946878463029861,
0.09061632305383682,
0.008631418459117413,
0.1449585258960724,
-0.07725462317466736,
-0.06728985160589218,
0.012134878896176815,
0.001833285903558135,
-0.009922951459884644,
0.0714995339512825,
0.029665237292647362,
0.012973261997103691,
-0.07519283890724182,
0.05929911136627197,
0.030995499342679977,
-0.024206766858696938,
0.06370355188846588,
0.15539616346359253,
-0.16944053769111633,
-0.08536331355571747,
0.05711919069290161,
0.0908469557762146,
-0.09445106983184814,
-0.05347966402769089,
-0.0581686906516552,
-0.09297146648168564,
0.029796410351991653,
0.20283235609531403,
0.028426341712474823,
0.0005250964313745499,
-0.045325182378292084,
-0.004611944314092398,
0.023567818105220795,
0.08936028182506561,
0.06378774344921112,
0.0021783343981951475,
-0.055065613240003586,
0.026160378009080887,
-0.038993556052446365,
0.1388150006532669,
-0.06522610783576965,
0.012199925258755684,
-0.10271322727203369,
-0.00979213509708643,
-0.18229776620864868,
0.04636189714074135,
-0.016897043213248253,
-0.033009547740221024,
-0.027324901893734932,
-0.05235149711370468,
-0.05452137812972069,
0.008009695447981358,
-0.08301344513893127,
0.0227284524589777,
-0.07823885232210159,
0.08584484457969666,
-0.017184805124998093,
-0.015185055322945118,
0.04515450447797775,
0.01223088800907135,
0.0904059112071991,
0.11072497069835663,
-0.06392668187618256,
0.1554211974143982,
-0.04184374958276749,
0.0377022959291935,
0.05569486692547798,
0.02761908248066902,
-0.010236856527626514,
0.007376363500952721,
0.012808757834136486,
0.04095317795872688,
0.06522028893232346,
0.00710807740688324,
0.1273949295282364,
-0.08604852110147476,
-0.0019192659528926015,
-0.07497604936361313,
-0.0670345351099968,
-0.04124156013131142,
0.047499943524599075,
0.00536373583599925,
0.09313569962978363,
0.09530720859766006,
-0.14588139951229095,
0.03667520359158516,
-0.015950245782732964,
0.015233492478728294,
0.0042472295463085175,
-0.1317048966884613,
-0.09966711699962616,
-0.12145745754241943,
0.040915507823228836,
0.016354257240891457,
0.10940337926149368,
0.06774042546749115,
-0.0014299523318186402,
-0.005925753619521856,
-0.0038048196583986282,
-0.019641350954771042,
-0.025621211156249046,
0.13883942365646362,
0.02699963189661503,
-0.05938654765486717,
-0.09365227818489075,
0.045052431523799896,
-0.006709341891109943,
0.0661952942609787,
0.20054879784584045,
0.05049974098801613,
0.0878400132060051,
0.140827476978302,
0.014721420593559742,
-0.001765857683494687,
0.06096869334578514,
-0.11084804683923721,
-0.11064660549163818,
0.004659187979996204,
-0.06666082888841629,
0.12357848137617111,
0.17622646689414978,
-0.04286467656493187,
0.0459580160677433,
-0.04087885841727257,
-0.06867731362581253,
-0.14073362946510315,
-0.20910510420799255,
-0.08094365149736404,
-0.11121808737516403,
-0.028325507417321205,
-0.0929783433675766,
-0.010175487026572227,
-0.08036542683839798,
0.044809870421886444,
-0.05639227479696274,
0.20382235944271088,
0.012557797133922577,
-0.21504369378089905,
0.16881228983402252,
-0.024567995220422745,
0.06271790713071823,
0.08452356606721878,
-0.0076378812082111835,
0.008614708669483662,
-0.09040716290473938,
0.0017319758189842105,
0.07074470818042755,
-0.03303726390004158,
-0.03687753528356552,
-0.11679640412330627,
-0.07093261927366257,
-0.04724506661295891,
0.10182448476552963,
0.029361797496676445,
0.1393391638994217,
0.055931128561496735,
-0.04628777131438255,
0.00341225229203701,
0.15974107384681702,
0.037126898765563965,
-0.0817800834774971,
-0.0024521909654140472,
0.15697623789310455,
0.04392571002244949,
0.05931590870022774,
-0.05012648180127144,
-0.067751444876194,
-0.028113769367337227,
0.25331780314445496,
0.18174080550670624,
-0.0122688552364707,
0.033125560730695724,
0.009418105706572533,
0.05742916092276573,
0.14647667109966278,
0.15339010953903198,
0.06687198579311371,
0.18156296014785767,
-0.031480398029088974,
0.0013204709393903613,
-0.045648686587810516,
0.022065017372369766,
-0.08156386017799377,
0.07149173319339752,
0.05565224960446358,
-0.1111438125371933,
-0.06397222727537155,
0.08144412189722061,
-0.15427826344966888,
-0.02002202346920967,
-0.027813119813799858,
-0.11011095345020294,
-0.053603749722242355,
-0.04872329160571098,
0.08342161029577255,
0.08293100446462631,
0.09276911616325378,
-0.022890115156769753,
-0.022542398422956467,
0.01148850005120039,
0.04244987666606903,
-0.1384844034910202,
0.017558248713612556,
0.03460662439465523,
-0.0637807846069336,
-0.08215046674013138,
-0.043350789695978165,
0.1861320286989212,
0.06704612821340561,
0.0569303072988987,
-0.012581270188093185,
0.12979769706726074,
-0.004174250178039074,
0.022325588390231133,
0.05524645000696182,
0.025497490540146828,
0.02463800460100174,
-0.11349236965179443,
0.09179103374481201,
-0.04580190032720566,
0.08188943564891815,
0.07486332207918167,
0.0075872596353292465,
-0.09967966377735138,
0.11776312440633774,
-0.13494327664375305,
0.06919125467538834,
0.15414641797542572,
-0.0019373091636225581,
-0.04036552459001541,
-0.08506408333778381,
-0.05974713712930679,
0.03209272027015686,
0.018083710223436356,
-0.035082101821899414,
-0.05685034394264221,
-0.034391749650239944,
0.12356173247098923,
0.06345319002866745,
-0.17382663488388062,
0.02072233334183693,
-0.11136265099048615,
0.05677969753742218,
-0.14262987673282623,
0.16453389823436737,
0.012351587414741516,
0.009897802025079727,
0.026682624593377113,
-0.2567093074321747,
0.03998394310474396,
0.08532904833555222,
-0.14212359488010406,
-0.08166433870792389
] |
null | null |
transformers
|
# BERT L-10 H-512 fine-tuned on MLM (CORD-19 2020/06/16)
BERT model with [10 Transformer layers and hidden embedding of size 512](https://huggingface.co/google/bert_uncased_L-10_H-512_A-8), referenced in [Well-Read Students Learn Better: On the Importance of Pre-training Compact Models](https://arxiv.org/abs/1908.08962), fine-tuned for MLM on CORD-19 dataset (as released on 2020/06/16).
## Training the model
```bash
python run_language_modeling.py \
    --model_type bert \
    --model_name_or_path google/bert_uncased_L-10_H-512_A-8 \
    --do_train \
    --train_data_file {cord19-200616-dataset} \
    --mlm \
    --mlm_probability 0.2 \
    --line_by_line \
    --block_size 512 \
    --per_device_train_batch_size 10 \
    --learning_rate 3e-5 \
    --num_train_epochs 2 \
    --output_dir bert_uncased_L-10_H-512_A-8_cord19-200616
```
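## Using the model

A minimal inference sketch (not part of the original card), assuming the standard 🤗 Transformers fill-mask pipeline; the example sentence is illustrative only.

```python
# Hypothetical usage sketch: fill-mask inference with the fine-tuned checkpoint.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="aodiniz/bert_uncased_L-10_H-512_A-8_cord19-200616",
)
# [MASK] is BERT's mask placeholder; the sentence below is illustrative only.
for prediction in fill_mask("Coronaviruses are transmitted through respiratory [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 4))
```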
|
{}
|
fill-mask
|
aodiniz/bert_uncased_L-10_H-512_A-8_cord19-200616
|
[
"transformers",
"pytorch",
"jax",
"bert",
"fill-mask",
"arxiv:1908.08962",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1908.08962"
] |
[] |
TAGS
#transformers #pytorch #jax #bert #fill-mask #arxiv-1908.08962 #autotrain_compatible #endpoints_compatible #region-us
|
# BERT L-10 H-512 fine-tuned on MLM (CORD-19 2020/06/16)
BERT model with 10 Transformer layers and hidden embedding of size 512, referenced in Well-Read Students Learn Better: On the Importance of Pre-training Compact Models, fine-tuned for MLM on CORD-19 dataset (as released on 2020/06/16).
## Training the model
'''bash
python run_language_modeling.py
--model_type bert
--model_name_or_path google/bert_uncased_L-10_H-512_A-8
--do_train
--train_data_file {cord19-200616-dataset}
--mlm
--mlm_probability 0.2
--line_by_line
--block_size 512
--per_device_train_batch_size 10
--learning_rate 3e-5
--num_train_epochs 2
--output_dir bert_uncased_L-10_H-512_A-8_cord19-200616
|
[
"# BERT L-10 H-512 fine-tuned on MLM (CORD-19 2020/06/16)\n\nBERT model with 10 Transformer layers and hidden embedding of size 512, referenced in Well-Read Students Learn Better: On the Importance of Pre-training Compact Models, fine-tuned for MLM on CORD-19 dataset (as released on 2020/06/16).",
"## Training the model\n\n'''bash\npython run_language_modeling.py\n --model_type bert\n --model_name_or_path google/bert_uncased_L-10_H-512_A-8\n --do_train\n --train_data_file {cord19-200616-dataset}\n --mlm\n --mlm_probability 0.2\n --line_by_line\n --block_size 512\n --per_device_train_batch_size 10\n --learning_rate 3e-5\n --num_train_epochs 2\n --output_dir bert_uncased_L-10_H-512_A-8_cord19-200616"
] |
[
"TAGS\n#transformers #pytorch #jax #bert #fill-mask #arxiv-1908.08962 #autotrain_compatible #endpoints_compatible #region-us \n",
"# BERT L-10 H-512 fine-tuned on MLM (CORD-19 2020/06/16)\n\nBERT model with 10 Transformer layers and hidden embedding of size 512, referenced in Well-Read Students Learn Better: On the Importance of Pre-training Compact Models, fine-tuned for MLM on CORD-19 dataset (as released on 2020/06/16).",
"## Training the model\n\n'''bash\npython run_language_modeling.py\n --model_type bert\n --model_name_or_path google/bert_uncased_L-10_H-512_A-8\n --do_train\n --train_data_file {cord19-200616-dataset}\n --mlm\n --mlm_probability 0.2\n --line_by_line\n --block_size 512\n --per_device_train_batch_size 10\n --learning_rate 3e-5\n --num_train_epochs 2\n --output_dir bert_uncased_L-10_H-512_A-8_cord19-200616"
] |
[
48,
85,
149
] |
[
"passage: TAGS\n#transformers #pytorch #jax #bert #fill-mask #arxiv-1908.08962 #autotrain_compatible #endpoints_compatible #region-us \n# BERT L-10 H-512 fine-tuned on MLM (CORD-19 2020/06/16)\n\nBERT model with 10 Transformer layers and hidden embedding of size 512, referenced in Well-Read Students Learn Better: On the Importance of Pre-training Compact Models, fine-tuned for MLM on CORD-19 dataset (as released on 2020/06/16).## Training the model\n\n'''bash\npython run_language_modeling.py\n --model_type bert\n --model_name_or_path google/bert_uncased_L-10_H-512_A-8\n --do_train\n --train_data_file {cord19-200616-dataset}\n --mlm\n --mlm_probability 0.2\n --line_by_line\n --block_size 512\n --per_device_train_batch_size 10\n --learning_rate 3e-5\n --num_train_epochs 2\n --output_dir bert_uncased_L-10_H-512_A-8_cord19-200616"
] |
[
-0.14001089334487915,
0.11857099831104279,
-0.002892516553401947,
0.10868430882692337,
0.10566886514425278,
0.024618390947580338,
0.09136108309030533,
0.1261557936668396,
0.003371910657733679,
0.10068278014659882,
0.11190195381641388,
0.057829469442367554,
0.04026234522461891,
0.07623106241226196,
-0.029260365292429924,
-0.21276773512363434,
0.0005424523842521012,
-0.008845311589539051,
-0.06833469122648239,
0.07558410614728928,
0.060702674090862274,
-0.08649804443120956,
0.10305684804916382,
-0.018117163330316544,
-0.056534018367528915,
0.00763167068362236,
-0.0051231300458312035,
-0.05589877441525459,
0.04850902408361435,
0.007491139229387045,
0.10807866603136063,
-0.019068308174610138,
0.06180848926305771,
-0.1137113943696022,
0.013625022955238819,
0.06473226100206375,
0.018269484862685204,
0.1359601616859436,
0.05404338240623474,
0.015389075502753258,
-0.006397376302629709,
-0.09959922730922699,
0.06397685408592224,
0.07820265740156174,
-0.06508266925811768,
-0.08930031210184097,
-0.07187549024820328,
0.023146124556660652,
0.07373613119125366,
0.07445744425058365,
-0.02042256109416485,
0.050607189536094666,
-0.07517994195222855,
0.03406786173582077,
0.18130259215831757,
-0.23813921213150024,
-0.025745414197444916,
0.09943390637636185,
0.03635089471936226,
0.017306126654148102,
-0.07616813480854034,
-0.02841019071638584,
0.04143794998526573,
0.03813788294792175,
0.10155384242534637,
-0.01252987701445818,
0.03764830157160759,
0.036774128675460815,
-0.11629265546798706,
0.020927878096699715,
0.1197182908654213,
0.05551862716674805,
-0.03472265973687172,
-0.12595398724079132,
-0.07177980244159698,
-0.08749289065599442,
-0.04095035418868065,
0.032227929681539536,
0.027362117543816566,
-0.017589742317795753,
-0.03893565759062767,
-0.04284549131989479,
-0.04237275943160057,
-0.11932100355625153,
-0.009826302528381348,
0.1153445914387703,
0.05418761447072029,
-0.00591496005654335,
-0.04085572063922882,
0.09988948702812195,
-0.07185181230306625,
-0.10862386226654053,
0.007215621881186962,
-0.029035424813628197,
-0.006614685524255037,
0.029512612149119377,
-0.008576562628149986,
-0.05896824970841408,
0.0056822937913239,
0.0639655590057373,
0.006978818215429783,
0.0658966675400734,
0.028079714626073837,
0.024548888206481934,
-0.037282563745975494,
0.11549750715494156,
-0.12304437905550003,
0.0018074645195156336,
0.023962490260601044,
0.1436508148908615,
0.013752510771155357,
-0.008283576928079128,
-0.06552670150995255,
-0.018287282437086105,
0.01265624538064003,
0.03886904567480087,
-0.01810809038579464,
0.10045308619737625,
-0.08156134188175201,
-0.023434238508343697,
0.013657392002642155,
-0.1715717613697052,
0.01717931404709816,
0.03181766718626022,
-0.08472543209791183,
-0.02434631437063217,
0.06602824479341507,
-0.029145879670977592,
-0.08385859429836273,
0.07425691187381744,
-0.09744046628475189,
-0.050776928663253784,
-0.11029022186994553,
-0.14006121456623077,
0.010955119505524635,
-0.05601365119218826,
-0.007156172767281532,
-0.0800279974937439,
-0.1743408590555191,
0.009877791628241539,
0.05166056379675865,
-0.04724655672907829,
0.028383009135723114,
-0.04948928952217102,
-0.10555689036846161,
0.02407531812787056,
0.014296644367277622,
0.08255105465650558,
-0.05134987086057663,
0.0776076540350914,
0.047269295901060104,
0.04560093581676483,
0.02743145450949669,
0.03093992918729782,
0.011534194462001324,
0.058473434299230576,
-0.11922582238912582,
0.10107449442148209,
-0.12271542102098465,
0.029517754912376404,
-0.13551968336105347,
-0.09695389121770859,
-0.01862892135977745,
0.047523126006126404,
0.10918663442134857,
0.141045480966568,
-0.21435193717479706,
-0.01735527813434601,
0.16021355986595154,
0.012705431319773197,
-0.0682399719953537,
0.10165457427501678,
-0.0530242919921875,
0.015217236243188381,
0.06311642378568649,
0.07915406674146652,
0.10833165049552917,
-0.14402537047863007,
-0.07378825545310974,
0.06721995025873184,
0.03692743554711342,
0.007952382788062096,
0.10026956349611282,
-0.0356055349111557,
-0.05506978556513786,
0.038405921310186386,
-0.05293590947985649,
-0.011794572696089745,
-0.0330086275935173,
-0.0510544553399086,
-0.024072401225566864,
-0.07543688267469406,
0.10118623077869415,
-0.03344030678272247,
0.02488727681338787,
-0.0740661472082138,
-0.1574336737394333,
0.06512656807899475,
0.13724076747894287,
-0.06474803388118744,
-0.013992915861308575,
-0.11706758290529251,
0.09091629832983017,
-0.06901787221431732,
-0.0009003916056826711,
-0.14157827198505402,
-0.0002675292198546231,
0.012427371926605701,
-0.05518442764878273,
0.03057100623846054,
-0.0071973600424826145,
0.0623076893389225,
0.07689952105283737,
-0.029390526935458183,
-0.029714884236454964,
-0.0631582960486412,
-0.008986302651464939,
-0.07266771048307419,
-0.08968625217676163,
-0.058748356997966766,
-0.005317301489412785,
0.05554307997226715,
-0.1670788824558258,
0.06849956512451172,
0.055312830954790115,
0.07702624797821045,
0.027554873377084732,
-0.01871403120458126,
0.022513819858431816,
-0.01806754618883133,
0.030962690711021423,
-0.05985834822058678,
0.043473053723573685,
-0.0005929216858930886,
-0.038995079696178436,
-0.07301121950149536,
-0.18614019453525543,
-0.033457834273576736,
0.12660151720046997,
0.036521684378385544,
-0.057521335780620575,
0.023423485457897186,
-0.030081184580922127,
-0.0008617140119895339,
-0.03313121199607849,
-0.085279680788517,
0.23731888830661774,
0.04116779938340187,
0.17151424288749695,
-0.1153472438454628,
-0.05362313240766525,
0.015118240378797054,
0.00836386438459158,
0.011891253292560577,
0.11721554398536682,
0.05886642634868622,
-0.022974302992224693,
0.03584964945912361,
0.10651969164609909,
-0.09164375066757202,
0.16480176150798798,
-0.036165058612823486,
-0.07441704720258713,
-0.04902884364128113,
0.014004334807395935,
0.0025793458335101604,
0.10750045627355576,
-0.061107560992240906,
-0.004982348065823317,
0.0378456711769104,
0.007443008478730917,
0.041119519621133804,
-0.15630003809928894,
0.029756231233477592,
0.030060136690735817,
-0.02887413464486599,
-0.061110448092222214,
0.00926111824810505,
0.0006174533045850694,
0.07277218997478485,
0.025358255952596664,
-0.024276427924633026,
0.018558776006102562,
-0.03423250839114189,
-0.08506070077419281,
0.18961435556411743,
-0.1319625973701477,
-0.1584189087152481,
-0.15938186645507812,
-0.04895269498229027,
-0.04029671475291252,
-0.004761185962706804,
-0.017151039093732834,
-0.11883348226547241,
-0.08142565935850143,
-0.1035730168223381,
0.10981098562479019,
-0.0033819505479186773,
-0.04848266392946243,
0.024380873888731003,
0.04898316040635109,
0.09915493428707123,
-0.16801728308200836,
0.0076020220294594765,
0.010018753819167614,
-0.04930127039551735,
-0.02697901241481304,
0.04583415761590004,
0.020465251058340073,
0.07824116945266724,
-0.014000683091580868,
0.022435704246163368,
0.008269119076430798,
0.17453289031982422,
-0.009369904175400734,
0.008183284662663937,
0.1377689093351364,
-0.02146904543042183,
0.05380303040146828,
0.1195613220334053,
0.06859444081783295,
-0.06265484541654587,
0.014661570079624653,
0.053620174527168274,
0.014496889896690845,
-0.24765455722808838,
-0.05130863934755325,
-0.0448182038962841,
0.09251807630062103,
0.13887104392051697,
0.01923241652548313,
-0.12154742330312729,
0.06867197155952454,
-0.0004215673543512821,
0.07186956703662872,
0.0049248551949858665,
0.07146838307380676,
0.03223951533436775,
-0.029839741066098213,
0.06311720609664917,
-0.01026204414665699,
0.0025173595640808344,
0.07332726567983627,
0.021609550341963768,
0.10603346675634384,
-0.08052351325750351,
0.18126316368579865,
0.03673660755157471,
0.13156303763389587,
0.03577899932861328,
0.0821736752986908,
-0.06198528781533241,
0.010706610046327114,
0.014470113441348076,
-0.08711257576942444,
-0.03753340616822243,
0.010157196782529354,
-0.022349011152982712,
0.06138485297560692,
-0.11318786442279816,
0.07791676372289658,
0.017950013279914856,
0.12028481066226959,
0.09712057560682297,
-0.29162168502807617,
-0.0584648959338665,
0.02689153142273426,
-0.013872724026441574,
-0.1289001703262329,
0.0653950646519661,
0.12005528807640076,
-0.06374168395996094,
-0.026225175708532333,
-0.05897483602166176,
0.08697272837162018,
-0.04846565052866936,
0.028731176629662514,
0.04274876043200493,
0.2070780247449875,
-0.00028841712628491223,
0.0993645191192627,
-0.12118810415267944,
0.15565796196460724,
0.043350305408239365,
0.06641330569982529,
-0.028455179184675217,
0.05206330120563507,
0.041350141167640686,
-0.06311479210853577,
0.09577008336782455,
-0.0075489794835448265,
0.09399746358394623,
-0.14612215757369995,
-0.14209489524364471,
0.03329502046108246,
0.08499284088611603,
-0.0858692079782486,
0.046841755509376526,
-0.04377219080924988,
-0.02765601873397827,
0.052956949919462204,
-0.06939957290887833,
-0.1211719736456871,
-0.13241292536258698,
0.01771395094692707,
-0.007315517868846655,
0.014692419208586216,
-0.08560539036989212,
-0.07088136672973633,
-0.09030979126691818,
0.16802802681922913,
-0.07508402317762375,
-0.05127383768558502,
-0.10275596380233765,
0.05773762986063957,
0.12180547416210175,
-0.0678587332367897,
0.04151856154203415,
-0.011136275716125965,
0.1266944408416748,
0.006721006240695715,
-0.08191710710525513,
0.10837450623512268,
-0.046320635825395584,
-0.08865365386009216,
-0.050215791910886765,
0.16770698130130768,
-0.0016565413679927588,
0.05932936072349548,
-0.03216418996453285,
-0.0013746493496000767,
-0.008394028060138226,
-0.07487569004297256,
-0.018041269853711128,
0.11849017441272736,
0.2084624320268631,
0.0743512436747551,
-0.15797285735607147,
-0.002835739403963089,
0.009027298539876938,
0.05554618313908577,
0.15146581828594208,
0.2065494954586029,
-0.06816403567790985,
0.03378244861960411,
0.1326921433210373,
-0.007039870135486126,
-0.236494779586792,
0.016463784500956535,
0.07793132215738297,
0.06020043045282364,
-0.03186752274632454,
-0.2093483954668045,
0.07758738100528717,
0.05107114091515541,
0.006218003109097481,
0.00016120211512316018,
-0.2420935034751892,
-0.12567590177059174,
0.1486232876777649,
0.09156443178653717,
0.023654872551560402,
-0.06616324931383133,
-0.04611518979072571,
-0.0964609906077385,
-0.14506573975086212,
0.12106651812791824,
-0.11616094410419464,
0.09803414344787598,
0.0006010427605360746,
0.05594027414917946,
0.04381623864173889,
-0.06770621240139008,
0.1473584920167923,
-0.04629329964518547,
0.04921724274754524,
-0.051103778183460236,
-0.10874856263399124,
-0.011635939590632915,
-0.09817299246788025,
0.12312217056751251,
-0.1234053373336792,
0.13337965309619904,
-0.13185420632362366,
-0.014776836149394512,
-0.011899879202246666,
-0.0031297688838094473,
-0.03937307000160217,
-0.04088278114795685,
-0.04593097046017647,
0.0381455272436142,
0.03863954544067383,
-0.022241003811359406,
-0.013125075958669186,
0.0032862452790141106,
-0.034298911690711975,
0.05965080484747887,
0.13079272210597992,
-0.030141377821564674,
-0.12591111660003662,
0.016999101266264915,
0.0006193603621795774,
0.08632133156061172,
-0.2064932882785797,
0.037897203117609024,
0.10067027062177658,
0.027289848774671555,
0.14954054355621338,
0.0417976938188076,
-0.06172230839729309,
-0.021715182811021805,
0.03612056002020836,
-0.13943180441856384,
-0.0698789730668068,
-0.08064746856689453,
0.06190258637070656,
-0.20208071172237396,
-0.02565465122461319,
0.10930812358856201,
-0.05324724316596985,
-0.012285607866942883,
0.03780612722039223,
0.03200560063123703,
-0.05008287355303764,
0.13395218551158905,
0.08199143409729004,
0.07587816566228867,
-0.08987946063280106,
0.09738857299089432,
0.032055992633104324,
0.010775728151202202,
0.047505028545856476,
0.07855179905891418,
-0.08680079132318497,
-0.036973997950553894,
0.05441061779856682,
0.21756133437156677,
0.09742502868175507,
-0.010032465681433678,
-0.06433998048305511,
-0.07591473311185837,
0.05019215866923332,
0.08885227143764496,
0.04603860154747963,
0.0005222744075581431,
-0.0732923224568367,
-0.007159275468438864,
-0.08078933507204056,
0.12205293774604797,
0.04869922250509262,
0.015193521976470947,
-0.13496147096157074,
0.10496655106544495,
-0.009337018243968487,
0.020676668733358383,
-0.0020326077938079834,
0.028477542102336884,
-0.10931381583213806,
-0.0391663983464241,
-0.17677021026611328,
0.05953993275761604,
0.023210886865854263,
0.008534127846360207,
-0.01507766917347908,
0.02692955546081066,
-0.013191386125981808,
0.010441668331623077,
-0.07330822199583054,
-0.06128312647342682,
0.011301667429506779,
0.04986223950982094,
-0.09408104419708252,
0.028117556124925613,
0.04874451085925102,
-0.08617796003818512,
0.06846334785223007,
0.00961842481046915,
-0.0045920973643660545,
0.044662393629550934,
-0.04147570952773094,
-0.07261766493320465,
0.0015717254718765616,
0.042283959686756134,
0.0493742972612381,
-0.09628544747829437,
0.008218487724661827,
-0.06469156593084335,
0.009828243404626846,
-0.03374464809894562,
0.05307045206427574,
-0.10111396759748459,
-0.003941346425563097,
-0.010199974291026592,
0.019069669768214226,
-0.09919606149196625,
0.06135870888829231,
0.0720733255147934,
0.06744404137134552,
0.13542822003364563,
-0.03596944361925125,
0.027631431818008423,
-0.17894242703914642,
-0.0033050074707716703,
0.010068240575492382,
-0.07438193261623383,
-0.045126207172870636,
-0.020152149721980095,
0.08424753695726395,
-0.08947553485631943,
0.0606798380613327,
0.018201541155576706,
-0.08274462819099426,
0.007961850613355637,
0.020419755950570107,
0.018362320959568024,
0.03909463435411453,
0.21065795421600342,
0.007352935150265694,
0.01542474515736103,
0.004468679428100586,
0.014985796995460987,
0.08486399054527283,
0.05377505347132683,
0.0824885368347168,
0.15479062497615814,
-0.01025872491300106,
0.058721497654914856,
0.07502423226833344,
-0.07113265246152878,
-0.08728282153606415,
0.14055049419403076,
-0.06845570355653763,
0.07933129370212555,
-0.07764118909835815,
0.18311765789985657,
0.04826967790722847,
-0.13416022062301636,
0.0692950114607811,
-0.03412959724664688,
-0.13824288547039032,
-0.11636989563703537,
-0.14340487122535706,
-0.07309658825397491,
-0.0790405347943306,
0.004784325137734413,
-0.12473885715007782,
-0.015464508906006813,
0.11251090466976166,
0.025410544127225876,
0.02215077355504036,
0.18071183562278748,
-0.139509916305542,
-0.04011279717087746,
0.067000612616539,
0.026337282732129097,
-0.018073394894599915,
-0.05443166568875313,
-0.05592993646860123,
0.06577527523040771,
0.023914974182844162,
0.08045431971549988,
-0.02085542120039463,
0.0006862395093776286,
0.023635979741811752,
-0.00385438185185194,
-0.08302242308855057,
-0.020730838179588318,
-0.012991714291274548,
0.03533465415239334,
0.13504044711589813,
0.04057461768388748,
-0.030456479638814926,
-0.018635427579283714,
0.2111814320087433,
-0.07331995666027069,
-0.05687613785266876,
-0.15734900534152985,
0.08531235158443451,
-0.012327631935477257,
-0.002734578913077712,
0.037810999900102615,
-0.12711884081363678,
-0.03462345525622368,
0.22892865538597107,
0.21862265467643738,
-0.07880708575248718,
-0.0028341261204332113,
0.0358508862555027,
-0.007784618996083736,
-0.06226632744073868,
0.08765730261802673,
0.06617577373981476,
0.06140873208642006,
-0.02460457943379879,
-0.016263745725154877,
0.015118037350475788,
-0.061405327171087265,
-0.034983571618795395,
0.1055787205696106,
0.006104874890297651,
-0.03257982060313225,
-0.033772505819797516,
0.023445485159754753,
0.021682457998394966,
-0.19878233969211578,
-0.055209871381521225,
-0.12011270970106125,
-0.13495659828186035,
-0.04801183193922043,
0.06491991877555847,
0.019052496179938316,
0.06616399437189102,
-0.041189372539520264,
0.0027067067567259073,
0.11242445558309555,
-0.023591361939907074,
-0.06963551789522171,
-0.07467281818389893,
0.05395805835723877,
-0.022768428549170494,
0.18569441139698029,
0.025706414133310318,
0.06233032047748566,
0.1205620989203453,
-0.020364264026284218,
-0.09117787331342697,
0.051179684698581696,
0.06429502367973328,
-0.07831469178199768,
0.004458037670701742,
0.15347862243652344,
-0.05176929011940956,
0.05179743468761444,
0.00968649797141552,
-0.08018361777067184,
-0.03684202954173088,
-0.0612463653087616,
-0.09196025878190994,
-0.10598782449960709,
0.014534842222929,
-0.08285506814718246,
0.1253119558095932,
0.1973535567522049,
-0.04144396260380745,
-0.052354246377944946,
-0.05418217182159424,
0.06274367868900299,
0.00222736201249063,
-0.03397485613822937,
0.025491563603281975,
-0.1680046170949936,
-0.006000901572406292,
0.02599833533167839,
0.02067619375884533,
-0.2988927364349365,
-0.04715210571885109,
0.033796049654483795,
-0.01592988893389702,
-0.08060166239738464,
0.09976471215486526,
0.10236954689025879,
0.05438646674156189,
-0.03271719813346863,
-0.035107627511024475,
-0.0365075059235096,
0.1005454882979393,
-0.133823424577713,
-0.0927138477563858
] |
null | null |
transformers
|
# BERT L-10 H-512 CORD-19 (2020/06/16) fine-tuned on SQuAD v2.0
BERT model with [10 Transformer layers and hidden embedding of size 512](https://huggingface.co/google/bert_uncased_L-10_H-512_A-8), referenced in [Well-Read Students Learn Better: On the Importance of Pre-training Compact Models](https://arxiv.org/abs/1908.08962), [fine-tuned for MLM](https://huggingface.co/aodiniz/bert_uncased_L-10_H-512_A-8_cord19-200616) on the CORD-19 dataset (as released on 2020/06/16) and then fine-tuned for QA on SQuAD v2.0.
## Training the model
```bash
python run_squad.py \
    --model_type bert \
    --model_name_or_path aodiniz/bert_uncased_L-10_H-512_A-8_cord19-200616 \
    --train_file 'train-v2.0.json' \
    --predict_file 'dev-v2.0.json' \
    --do_train \
    --do_eval \
    --do_lower_case \
    --version_2_with_negative \
    --max_seq_length 384 \
    --per_gpu_train_batch_size 10 \
    --learning_rate 3e-5 \
    --num_train_epochs 2 \
    --output_dir bert_uncased_L-10_H-512_A-8_cord19-200616_squad2
```
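## Usage
The fine-tuned checkpoint can be queried with the `transformers` `pipeline` API. Below is a minimal usage sketch; the question and context strings are illustrative placeholders, not taken from CORD-19. Since training used `--version_2_with_negative`, passing `handle_impossible_answer=True` lets the pipeline return an empty answer when the context holds none.
```python
from transformers import pipeline

# Load the fine-tuned checkpoint for extractive question answering.
qa = pipeline(
    "question-answering",
    model="aodiniz/bert_uncased_L-10_H-512_A-8_cord19-200616_squad2",
    tokenizer="aodiniz/bert_uncased_L-10_H-512_A-8_cord19-200616_squad2",
)

# Illustrative question/context pair.
result = qa(
    question="What virus causes COVID-19?",
    context="COVID-19 is caused by the SARS-CoV-2 coronavirus, first reported in late 2019.",
    handle_impossible_answer=True,  # model was trained with SQuAD v2.0 unanswerable examples
)
print(result)  # {'score': ..., 'start': ..., 'end': ..., 'answer': ...}
```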
|
{"datasets": ["squad_v2"]}
|
question-answering
|
aodiniz/bert_uncased_L-10_H-512_A-8_cord19-200616_squad2
|
[
"transformers",
"pytorch",
"jax",
"bert",
"question-answering",
"dataset:squad_v2",
"arxiv:1908.08962",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1908.08962"
] |
[] |
TAGS
#transformers #pytorch #jax #bert #question-answering #dataset-squad_v2 #arxiv-1908.08962 #endpoints_compatible #region-us
|
# BERT L-10 H-512 CORD-19 (2020/06/16) fine-tuned on SQuAD v2.0
BERT model with 10 Transformer layers and hidden embedding of size 512, referenced in Well-Read Students Learn Better: On the Importance of Pre-training Compact Models, fine-tuned for MLM on CORD-19 dataset (as released on 2020/06/16) and fine-tuned for QA on SQuAD v2.0.
## Training the model
'''bash
python run_squad.py
--model_type bert
--model_name_or_path aodiniz/bert_uncased_L-10_H-512_A-8_cord19-200616
--train_file 'train-v2.0.json'
--predict_file 'dev-v2.0.json'
--do_train
--do_eval
--do_lower_case
--version_2_with_negative
--max_seq_length 384
--per_gpu_train_batch_size 10
--learning_rate 3e-5
--num_train_epochs 2
--output_dir bert_uncased_L-10_H-512_A-8_cord19-200616_squad2
|
[
"# BERT L-10 H-512 CORD-19 (2020/06/16) fine-tuned on SQuAD v2.0\n\nBERT model with 10 Transformer layers and hidden embedding of size 512, referenced in Well-Read Students Learn Better: On the Importance of Pre-training Compact Models, fine-tuned for MLM on CORD-19 dataset (as released on 2020/06/16) and fine-tuned for QA on SQuAD v2.0.",
"## Training the model\n\n'''bash\npython run_squad.py\n --model_type bert\n --model_name_or_path aodiniz/bert_uncased_L-10_H-512_A-8_cord19-200616\n --train_file 'train-v2.0.json'\n --predict_file 'dev-v2.0.json'\n --do_train\n --do_eval\n --do_lower_case\n --version_2_with_negative\n --max_seq_length 384\n --per_gpu_train_batch_size 10\n --learning_rate 3e-5\n --num_train_epochs 2\n --output_dir bert_uncased_L-10_H-512_A-8_cord19-200616_squad2"
] |
[
"TAGS\n#transformers #pytorch #jax #bert #question-answering #dataset-squad_v2 #arxiv-1908.08962 #endpoints_compatible #region-us \n",
"# BERT L-10 H-512 CORD-19 (2020/06/16) fine-tuned on SQuAD v2.0\n\nBERT model with 10 Transformer layers and hidden embedding of size 512, referenced in Well-Read Students Learn Better: On the Importance of Pre-training Compact Models, fine-tuned for MLM on CORD-19 dataset (as released on 2020/06/16) and fine-tuned for QA on SQuAD v2.0.",
"## Training the model\n\n'''bash\npython run_squad.py\n --model_type bert\n --model_name_or_path aodiniz/bert_uncased_L-10_H-512_A-8_cord19-200616\n --train_file 'train-v2.0.json'\n --predict_file 'dev-v2.0.json'\n --do_train\n --do_eval\n --do_lower_case\n --version_2_with_negative\n --max_seq_length 384\n --per_gpu_train_batch_size 10\n --learning_rate 3e-5\n --num_train_epochs 2\n --output_dir bert_uncased_L-10_H-512_A-8_cord19-200616_squad2"
] |
[
50,
103,
180
] |
[
"passage: TAGS\n#transformers #pytorch #jax #bert #question-answering #dataset-squad_v2 #arxiv-1908.08962 #endpoints_compatible #region-us \n# BERT L-10 H-512 CORD-19 (2020/06/16) fine-tuned on SQuAD v2.0\n\nBERT model with 10 Transformer layers and hidden embedding of size 512, referenced in Well-Read Students Learn Better: On the Importance of Pre-training Compact Models, fine-tuned for MLM on CORD-19 dataset (as released on 2020/06/16) and fine-tuned for QA on SQuAD v2.0.## Training the model\n\n'''bash\npython run_squad.py\n --model_type bert\n --model_name_or_path aodiniz/bert_uncased_L-10_H-512_A-8_cord19-200616\n --train_file 'train-v2.0.json'\n --predict_file 'dev-v2.0.json'\n --do_train\n --do_eval\n --do_lower_case\n --version_2_with_negative\n --max_seq_length 384\n --per_gpu_train_batch_size 10\n --learning_rate 3e-5\n --num_train_epochs 2\n --output_dir bert_uncased_L-10_H-512_A-8_cord19-200616_squad2"
] |
[
-0.11253336817026138,
0.11946509778499603,
-0.005505832843482494,
0.09330683201551437,
0.07550659775733948,
0.04791435971856117,
0.050228532403707504,
0.152336984872818,
-0.04327462986111641,
0.12243083864450455,
0.08253675699234009,
0.06403162330389023,
0.06471733003854752,
0.0725138932466507,
-0.03624800220131874,
-0.17683923244476318,
0.007780756335705519,
-0.005072448868304491,
-0.0619918555021286,
0.05844641104340553,
0.07691135257482529,
-0.10078421235084534,
0.07437069714069366,
-0.00191836419980973,
-0.025624066591262817,
0.00408436544239521,
-0.0070548877120018005,
-0.05221337452530861,
0.035921793431043625,
0.04561350867152214,
0.07429639250040054,
0.023551542311906815,
0.0684397742152214,
-0.17985643446445465,
0.013346010819077492,
0.0702727735042572,
0.011077217757701874,
0.10069750994443893,
0.09362027794122696,
0.04352143406867981,
0.003852688707411289,
-0.06501629948616028,
0.05687227100133896,
0.05539967492222786,
-0.07247202098369598,
-0.1356019526720047,
-0.09270311146974564,
0.04304775223135948,
0.10487135499715805,
0.05593934282660484,
-0.025036433711647987,
0.03755481168627739,
-0.10094831883907318,
0.036582544445991516,
0.17425961792469025,
-0.2698690891265869,
-0.02766599878668785,
0.04570626839995384,
0.003979029133915901,
0.01983245648443699,
-0.11533921957015991,
-0.07838896661996841,
0.04645020142197609,
0.016857875511050224,
0.053969450294971466,
-0.012513764202594757,
0.009032835252583027,
0.02461269125342369,
-0.1383982002735138,
-0.010474247857928276,
0.10940201580524445,
0.06309588253498077,
-0.053388532251119614,
-0.11535980552434921,
-0.07508539408445358,
-0.13648812472820282,
-0.016053849831223488,
-0.03682585433125496,
0.01998472400009632,
-0.0010487567633390427,
0.004692539107054472,
-0.02300235442817211,
-0.07761215418577194,
-0.09630712121725082,
-0.007148020900785923,
0.04250854253768921,
0.07566235959529877,
0.0023167915642261505,
-0.031341418623924255,
0.10942711681127548,
-0.06661258637905121,
-0.1059752032160759,
-0.031510382890701294,
-0.03851248323917389,
-0.06972668319940567,
-0.00045197500730864704,
0.0076215448789298534,
-0.02904844656586647,
0.014292997308075428,
0.1659569889307022,
0.0015649943379685283,
0.0756228044629097,
0.006029827520251274,
-0.009573974646627903,
-0.038530077785253525,
0.14902527630329132,
-0.07117098569869995,
-0.005280744284391403,
0.01728016883134842,
0.13220641016960144,
0.004164249170571566,
-0.0011646909406408668,
-0.03354824334383011,
-0.03918665647506714,
0.02932690642774105,
0.04137507081031799,
-0.02038954198360443,
0.05243019387125969,
-0.07439975440502167,
-0.03989102318882942,
0.08368818461894989,
-0.13569007813930511,
0.013666482642292976,
0.06358150392770767,
-0.07814336568117142,
0.010786755941808224,
0.009099587798118591,
0.0019997144117951393,
-0.08580705523490906,
0.08346465229988098,
-0.07093074172735214,
-0.05732264369726181,
-0.05459652468562126,
-0.1046890839934349,
0.02464327961206436,
-0.05107320845127106,
0.011737647466361523,
-0.10012004524469376,
-0.12068355828523636,
-0.002116855001077056,
0.01762886717915535,
-0.051590997725725174,
0.0018476282712072134,
-0.037952546030282974,
-0.10904373228549957,
0.04709958657622337,
-0.012654812075197697,
0.06900566071271896,
-0.06343096494674683,
0.07019809633493423,
0.04885498061776161,
0.030412521213293076,
0.08841601759195328,
0.019168658182024956,
-0.01699409820139408,
0.044026363641023636,
-0.06578488647937775,
0.11390591412782669,
-0.10877519845962524,
-0.039783310145139694,
-0.16472598910331726,
-0.08770990371704102,
0.010211859829723835,
-0.002058971207588911,
0.10191091150045395,
0.13344250619411469,
-0.16405104100704193,
-0.008653589524328709,
0.12808530032634735,
-0.016948694363236427,
-0.06873452663421631,
0.12234392017126083,
-0.028566788882017136,
-0.04488995671272278,
0.04466767981648445,
0.12004775553941727,
0.12795709073543549,
-0.18177823722362518,
-0.10169973969459534,
0.08058493584394455,
0.03707220032811165,
0.06040206179022789,
0.12962955236434937,
-0.0268890131264925,
0.03661319613456726,
0.007997212000191212,
-0.07376696914434433,
-0.010147696360945702,
-0.02928776480257511,
-0.07538820058107376,
-0.019288374111056328,
-0.0649195984005928,
0.11637935042381287,
-0.02273235097527504,
0.020123062655329704,
-0.04853348433971405,
-0.15032510459423065,
-0.012858336791396141,
0.15911760926246643,
-0.06086230278015137,
0.010808352380990982,
-0.1329205483198166,
0.07158791273832321,
-0.07681863754987717,
0.008799262344837189,
-0.10506878048181534,
-0.04015898331999779,
0.044081393629312515,
-0.05546814575791359,
-0.022908665239810944,
0.058874424546957016,
0.056251898407936096,
0.0689106434583664,
-0.026589035987854004,
-0.07651537656784058,
-0.09206031262874603,
-0.03034602478146553,
-0.08787161111831665,
-0.065988689661026,
-0.05395112931728363,
-0.01644287072122097,
0.06574860215187073,
-0.1854064017534256,
0.04886477813124657,
0.08083165436983109,
0.07774016261100769,
0.0389438197016716,
-0.012068545445799828,
0.01888766512274742,
-0.028438890352845192,
0.019282547757029533,
-0.05652419850230217,
0.01993885636329651,
-0.008934836834669113,
-0.059918854385614395,
-0.07306603342294693,
-0.15739066898822784,
0.01415274664759636,
0.09277426451444626,
0.057755131274461746,
-0.035914719104766846,
-0.006139660254120827,
-0.047502219676971436,
-0.006774399895220995,
-0.035931091755628586,
-0.0488588809967041,
0.19302482903003693,
0.040867555886507034,
0.13484638929367065,
-0.0940486490726471,
-0.0514073520898819,
0.01456223614513874,
0.008756269700825214,
0.013736976310610771,
0.1317053586244583,
0.03150225058197975,
-0.03333146870136261,
0.01655399240553379,
0.13399219512939453,
-0.05735800415277481,
0.0938076600432396,
-0.05876375362277031,
-0.11043530702590942,
-0.039792902767658234,
0.04809064790606499,
0.012843435630202293,
0.05481279641389847,
-0.04204675182700157,
0.047447558492422104,
0.03606543317437172,
0.047421302646398544,
0.008768677711486816,
-0.15163777768611908,
0.02222185581922531,
0.031251296401023865,
-0.05843646824359894,
-0.02343636006116867,
0.021921003237366676,
-0.010778825730085373,
0.06620083749294281,
0.0282980315387249,
0.015530578792095184,
0.015973977744579315,
-0.030498072504997253,
-0.08826308697462082,
0.1792241930961609,
-0.11068794131278992,
-0.11687075346708298,
-0.1377045065164566,
-0.012422885745763779,
-0.03688662499189377,
-0.00668104225769639,
0.02686387486755848,
-0.14362289011478424,
-0.08400087803602219,
-0.07464290410280228,
0.1227266788482666,
-0.005862843245267868,
-0.015919379889965057,
0.0351237878203392,
0.028267526999115944,
0.10432900488376617,
-0.15001867711544037,
0.027831340208649635,
-0.018328117206692696,
-0.10806739330291748,
-0.04050406441092491,
0.07325929403305054,
0.0399446077644825,
0.06885524094104767,
0.00875406339764595,
0.008870854042470455,
-0.0115915946662426,
0.24655097723007202,
-0.04203405603766441,
0.003084372030571103,
0.19194038212299347,
-0.034732598811388016,
0.042643871158361435,
0.10400724411010742,
0.035704392939805984,
-0.05672244727611542,
0.007194987498223782,
0.08187374472618103,
0.013443327508866787,
-0.2717972993850708,
-0.0009869234636425972,
-0.04110061749815941,
0.07551346719264984,
0.12613274157047272,
0.029963281005620956,
-0.10894152522087097,
0.06724830716848373,
-0.027828486636281013,
0.05777641013264656,
0.02143469639122486,
0.05986465886235237,
0.06338111311197281,
0.030196018517017365,
0.07170295715332031,
-0.02546006627380848,
0.020209407433867455,
0.07224491983652115,
0.09337525814771652,
0.10205525159835815,
-0.10637912154197693,
0.17656688392162323,
0.04338068142533302,
0.16776327788829803,
0.019649898633360863,
0.021778644993901253,
-0.06936801224946976,
0.009563090279698372,
0.009034311398863792,
-0.061530519276857376,
-0.06508974730968475,
0.025307774543762207,
0.009196160361170769,
0.04028591886162758,
-0.11333051323890686,
0.08608769625425339,
0.04455813020467758,
0.14577101171016693,
0.08599448204040527,
-0.20757077634334564,
-0.09142786264419556,
0.05296273157000542,
-0.024553395807743073,
-0.05723017454147339,
0.07660319656133652,
0.11495224386453629,
-0.0768580511212349,
-0.015220241621136665,
-0.04357282817363739,
0.08485519886016846,
-0.09631817042827606,
0.0311040710657835,
0.05380593240261078,
0.1484401822090149,
0.03401152417063713,
0.07702112942934036,
-0.158003568649292,
0.08625482022762299,
0.037664130330085754,
0.09742605686187744,
-0.005168580450117588,
0.06656336784362793,
0.01758531667292118,
-0.044556453824043274,
0.12265576422214508,
-0.02966902032494545,
0.05045277997851372,
-0.16759978234767914,
-0.1226646825671196,
0.03276420757174492,
0.06376218795776367,
-0.059304919093847275,
0.05533221736550331,
-0.05736175552010536,
-0.009874008595943451,
0.03387818485498428,
-0.011919759213924408,
-0.10209477692842484,
-0.12209576368331909,
0.03897183761000633,
-0.038241833448410034,
0.020876813679933548,
-0.061532072722911835,
-0.048156749457120895,
-0.11855748295783997,
0.1052330955862999,
-0.1859142780303955,
-0.04866348206996918,
-0.09993468225002289,
0.06494170427322388,
0.11491920799016953,
-0.09571424126625061,
0.010998532176017761,
0.02194538712501526,
0.12443559616804123,
0.02572791464626789,
-0.07336852699518204,
0.07286376506090164,
-0.06639425456523895,
-0.17468447983264923,
-0.02272474206984043,
0.1284351944923401,
-0.008354737423360348,
0.05776692181825638,
-0.015231378376483917,
-0.012712289579212666,
-0.0010093929013237357,
-0.09377128630876541,
0.04581465944647789,
0.09797293692827225,
0.12317381799221039,
0.011450673453509808,
-0.09283090382814407,
-0.006584757007658482,
-0.016602136194705963,
0.047660041600465775,
0.12873272597789764,
0.2489716112613678,
-0.07397842407226562,
0.05879223719239235,
0.1305023729801178,
-0.035883817821741104,
-0.19814512133598328,
0.014361194334924221,
0.09855443984270096,
0.027183646336197853,
-0.05190989375114441,
-0.18518927693367004,
0.08402422815561295,
0.11567430943250656,
0.001512658316642046,
-0.03644491359591484,
-0.2891072928905487,
-0.1212892159819603,
0.10800063610076904,
0.06560884416103363,
-0.02960710972547531,
-0.09599753469228745,
-0.04651656001806259,
-0.037405457347631454,
-0.18200527131557465,
0.07956109941005707,
-0.10832767188549042,
0.08296585828065872,
-0.0012627359246835113,
-0.01975332200527191,
0.03843468427658081,
-0.043083276599645615,
0.14305870234966278,
0.0063123274594545364,
0.03127061948180199,
-0.026083579286932945,
-0.04264260455965996,
-0.015679584816098213,
-0.08042742311954498,
0.08710408955812454,
-0.03361048921942711,
0.10266187787055969,
-0.15838977694511414,
-0.016912464052438736,
-0.009836451150476933,
-0.02474585734307766,
-0.048238884657621384,
-0.05550820007920265,
-0.04803105816245079,
0.028116751462221146,
0.09432089328765869,
-0.02017790637910366,
0.043595097959041595,
0.005879094358533621,
-0.02493215538561344,
0.07352837175130844,
0.1189507246017456,
0.03217139095067978,
-0.18802128732204437,
0.002218300709500909,
0.005194257479161024,
0.08859141170978546,
-0.17609816789627075,
0.06355021893978119,
0.11454720050096512,
-0.009471002966165543,
0.1366482973098755,
0.034379251301288605,
-0.03870486468076706,
0.039277493953704834,
0.009455898776650429,
-0.10081322491168976,
-0.10479672253131866,
-0.043569013476371765,
0.019213587045669556,
-0.171766996383667,
0.018015369772911072,
0.13739603757858276,
-0.017232565209269524,
-0.022979630157351494,
0.029341813176870346,
0.06328234076499939,
-0.03715752810239792,
0.14782299101352692,
0.057691674679517746,
0.07181059569120407,
-0.08219391107559204,
0.10661949217319489,
0.059757012873888016,
-0.06720282137393951,
0.056185536086559296,
0.08147720247507095,
-0.08127490431070328,
-0.04406880587339401,
-0.00024512922391295433,
0.12258228659629822,
0.051941193640232086,
-0.006633206736296415,
-0.039851900190114975,
-0.07024470716714859,
0.06863925606012344,
0.007466228678822517,
0.019792962819337845,
0.0038838612381368876,
-0.05655520036816597,
0.01236420962959528,
-0.0917920470237732,
0.1132797971367836,
0.030164681375026703,
0.027331018820405006,
-0.13916762173175812,
0.06757785379886627,
-0.030709678307175636,
0.04484539106488228,
0.012068395502865314,
0.025779927149415016,
-0.10596124082803726,
-0.023395569995045662,
-0.13873817026615143,
0.0802418515086174,
0.0019277348183095455,
0.03800838068127632,
-0.022569231688976288,
0.010796356946229935,
-0.021893572062253952,
0.030827147886157036,
-0.06824307143688202,
-0.05522449314594269,
-0.007891770452260971,
0.07238835841417313,
-0.12772274017333984,
0.022452864795923233,
0.02490655891597271,
-0.10604115575551987,
0.10281569510698318,
0.003288081381469965,
0.012932145036756992,
0.04888333007693291,
-0.01832025870680809,
-0.06199631467461586,
0.0029754575807601213,
0.05917605385184288,
0.02545660361647606,
-0.06850198656320572,
0.027638260275125504,
-0.06438913941383362,
-0.05120279639959335,
-0.037407830357551575,
0.05012008175253868,
-0.09605133533477783,
-0.020478377118706703,
-0.0416695736348629,
0.003322785487398505,
-0.1127193346619606,
0.056938041001558304,
0.07704924046993256,
0.040383730083703995,
0.13552838563919067,
-0.03208518028259277,
0.033089037984609604,
-0.16920338571071625,
-0.02172607183456421,
0.03920319676399231,
-0.055112093687057495,
-0.0732346624135971,
-0.04068097099661827,
0.08974658697843552,
-0.08689039200544357,
0.00258373050019145,
-0.028420068323612213,
-0.03211474418640137,
0.028264906257390976,
-0.039183784276247025,
0.026911301538348198,
0.04948103800415993,
0.17969989776611328,
0.0353897325694561,
-0.005131411831825972,
0.0018054343527182937,
-0.012787398882210255,
0.06979887932538986,
0.08085804432630539,
0.09987886250019073,
0.15700863301753998,
0.02085297182202339,
0.06965366750955582,
0.014964616857469082,
-0.11819084733724594,
-0.11594677716493607,
0.17315255105495453,
-0.11181960999965668,
0.0701352208852768,
-0.04970289394259453,
0.10530608147382736,
0.0846748799085617,
-0.14690695703029633,
0.03330977261066437,
-0.048371534794569016,
-0.11479748785495758,
-0.13238029181957245,
-0.08412376791238785,
-0.08490270376205444,
-0.08175285905599594,
0.02063598670065403,
-0.13647979497909546,
0.017899546772241592,
0.07920718193054199,
0.03669474273920059,
0.018627338111400604,
0.13721789419651031,
-0.0473848320543766,
-0.03005627915263176,
0.05111628770828247,
0.037132419645786285,
-0.006933166645467281,
-0.009137646295130253,
-0.03743884339928627,
0.10648901015520096,
0.0025405411142855883,
0.09524356573820114,
-0.03079647198319435,
0.020210057497024536,
0.017054978758096695,
-0.015699192881584167,
-0.06599362939596176,
0.0025593002792447805,
0.009803916327655315,
0.03217250108718872,
0.13893115520477295,
0.06241126358509064,
-0.0025583102833479643,
-0.02274443209171295,
0.19861941039562225,
-0.049012329429388046,
-0.02319221943616867,
-0.1783655881881714,
0.042217954993247986,
0.02487010508775711,
0.04592800885438919,
0.055790290236473083,
-0.12844008207321167,
-0.039416804909706116,
0.17861247062683105,
0.1423274129629135,
-0.0837501809000969,
-0.015803800895810127,
0.05719617009162903,
-0.005784393288195133,
-0.049806758761405945,
0.10005486011505127,
0.07738982141017914,
0.08737031370401382,
-0.04589048773050308,
0.0018945675110444427,
0.03139805793762207,
-0.051043491810560226,
0.001985386945307255,
0.1639283150434494,
-0.026349524036049843,
0.008879534900188446,
-0.03337990492582321,
0.008167707361280918,
0.014557518064975739,
-0.18321393430233002,
-0.022733744233846664,
-0.10327970236539841,
-0.1446961909532547,
-0.02366097830235958,
0.07201047986745834,
0.02108984999358654,
0.05048412084579468,
-0.02811087667942047,
-0.025576820597052574,
0.1296606808900833,
-0.01668720506131649,
-0.027360500767827034,
-0.04641355946660042,
0.05957827344536781,
0.012554577551782131,
0.18675097823143005,
0.038398150354623795,
0.05745553597807884,
0.12202230095863342,
-0.011064649559557438,
-0.12188253551721573,
0.03994031623005867,
0.08916781842708588,
-0.10945912450551987,
-0.005513296462595463,
0.14284946024417877,
-0.044842109084129333,
0.10941843688488007,
0.05188991501927376,
-0.13172078132629395,
-0.03470000624656677,
-0.011600303463637829,
-0.07897903025150299,
-0.08786018192768097,
0.015898382291197777,
-0.08172307163476944,
0.1270570307970047,
0.17990067601203918,
-0.038258373737335205,
-0.03490625321865082,
-0.035730645060539246,
0.07417692989110947,
-0.011930649168789387,
0.051358114928007126,
0.045410335063934326,
-0.14612437784671783,
0.026478294283151627,
0.023696202784776688,
0.06306996941566467,
-0.1950761079788208,
-0.0590609535574913,
0.056235432624816895,
-0.01208247896283865,
-0.08943961560726166,
0.11383242905139923,
0.057790230959653854,
0.02391011081635952,
-0.03638874739408493,
-0.0747215673327446,
-0.04217865318059921,
0.08217302709817886,
-0.10093358159065247,
-0.09191698580980301
] |
null | null |
transformers
|
# BERT L-2 H-512 fine-tuned on MLM (CORD-19 2020/06/16)
BERT model with [2 Transformer layers and hidden embedding of size 512](https://huggingface.co/google/bert_uncased_L-2_H-512_A-8), referenced in [Well-Read Students Learn Better: On the Importance of Pre-training Compact Models](https://arxiv.org/abs/1908.08962), fine-tuned for MLM on the CORD-19 dataset (as released on 2020/06/16).
## Training the model
```bash
python run_language_modeling.py \
    --model_type bert \
    --model_name_or_path google/bert_uncased_L-2_H-512_A-8 \
    --do_train \
    --train_data_file {cord19-200616-dataset} \
    --mlm \
    --mlm_probability 0.2 \
    --line_by_line \
    --block_size 512 \
    --per_device_train_batch_size 20 \
    --learning_rate 3e-5 \
    --num_train_epochs 2 \
    --output_dir bert_uncased_L-2_H-512_A-8_cord19-200616
```
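## Usage
A minimal sketch of querying the resulting checkpoint for masked-token predictions with the `transformers` `pipeline` API; the example sentence is illustrative only, not taken from CORD-19.
```python
from transformers import pipeline

# Load the MLM-fine-tuned checkpoint for masked-token prediction.
fill_mask = pipeline(
    "fill-mask",
    model="aodiniz/bert_uncased_L-2_H-512_A-8_cord19-200616",
)

# BERT uncased models use the [MASK] token.
predictions = fill_mask("Coronaviruses are transmitted mainly through respiratory [MASK].")
for p in predictions:
    print(f"{p['token_str']}\t{p['score']:.4f}")
```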
|
{}
|
fill-mask
|
aodiniz/bert_uncased_L-2_H-512_A-8_cord19-200616
|
[
"transformers",
"pytorch",
"jax",
"bert",
"fill-mask",
"arxiv:1908.08962",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1908.08962"
] |
[] |
TAGS
#transformers #pytorch #jax #bert #fill-mask #arxiv-1908.08962 #autotrain_compatible #endpoints_compatible #region-us
|
# BERT L-2 H-512 fine-tuned on MLM (CORD-19 2020/06/16)
BERT model with 2 Transformer layers and hidden embedding of size 512, referenced in Well-Read Students Learn Better: On the Importance of Pre-training Compact Models, fine-tuned for MLM on CORD-19 dataset (as released on 2020/06/16).
## Training the model
'''bash
python run_language_modeling.py
--model_type bert
--model_name_or_path google/bert_uncased_L-2_H-512_A-8
--do_train
--train_data_file {cord19-200616-dataset}
--mlm
--mlm_probability 0.2
--line_by_line
--block_size 512
--per_device_train_batch_size 20
--learning_rate 3e-5
--num_train_epochs 2
--output_dir bert_uncased_L-2_H-512_A-8_cord19-200616
|
[
"# BERT L-2 H-512 fine-tuned on MLM (CORD-19 2020/06/16)\n\nBERT model with 2 Transformer layers and hidden embedding of size 512, referenced in Well-Read Students Learn Better: On the Importance of Pre-training Compact Models, fine-tuned for MLM on CORD-19 dataset (as released on 2020/06/16).",
"## Training the model\n\n'''bash\npython run_language_modeling.py\n --model_type bert\n --model_name_or_path google/bert_uncased_L-2_H-512_A-8\n --do_train\n --train_data_file {cord19-200616-dataset}\n --mlm\n --mlm_probability 0.2\n --line_by_line\n --block_size 512\n --per_device_train_batch_size 20\n --learning_rate 3e-5\n --num_train_epochs 2\n --output_dir bert_uncased_L-2_H-512_A-8_cord19-200616"
] |
[
"TAGS\n#transformers #pytorch #jax #bert #fill-mask #arxiv-1908.08962 #autotrain_compatible #endpoints_compatible #region-us \n",
"# BERT L-2 H-512 fine-tuned on MLM (CORD-19 2020/06/16)\n\nBERT model with 2 Transformer layers and hidden embedding of size 512, referenced in Well-Read Students Learn Better: On the Importance of Pre-training Compact Models, fine-tuned for MLM on CORD-19 dataset (as released on 2020/06/16).",
"## Training the model\n\n'''bash\npython run_language_modeling.py\n --model_type bert\n --model_name_or_path google/bert_uncased_L-2_H-512_A-8\n --do_train\n --train_data_file {cord19-200616-dataset}\n --mlm\n --mlm_probability 0.2\n --line_by_line\n --block_size 512\n --per_device_train_batch_size 20\n --learning_rate 3e-5\n --num_train_epochs 2\n --output_dir bert_uncased_L-2_H-512_A-8_cord19-200616"
] |
[
48,
85,
149
] |
[
"passage: TAGS\n#transformers #pytorch #jax #bert #fill-mask #arxiv-1908.08962 #autotrain_compatible #endpoints_compatible #region-us \n# BERT L-2 H-512 fine-tuned on MLM (CORD-19 2020/06/16)\n\nBERT model with 2 Transformer layers and hidden embedding of size 512, referenced in Well-Read Students Learn Better: On the Importance of Pre-training Compact Models, fine-tuned for MLM on CORD-19 dataset (as released on 2020/06/16).## Training the model\n\n'''bash\npython run_language_modeling.py\n --model_type bert\n --model_name_or_path google/bert_uncased_L-2_H-512_A-8\n --do_train\n --train_data_file {cord19-200616-dataset}\n --mlm\n --mlm_probability 0.2\n --line_by_line\n --block_size 512\n --per_device_train_batch_size 20\n --learning_rate 3e-5\n --num_train_epochs 2\n --output_dir bert_uncased_L-2_H-512_A-8_cord19-200616"
] |
[
-0.13583415746688843,
0.12481004744768143,
-0.003204968059435487,
0.1101379245519638,
0.10248786211013794,
0.025878550484776497,
0.09186132252216339,
0.12901149690151215,
0.00333899213001132,
0.09914079308509827,
0.11231445521116257,
0.049642447382211685,
0.03918243572115898,
0.07716801762580872,
-0.028164537623524666,
-0.21557371318340302,
-0.002088623121380806,
-0.014738201163709164,
-0.07563430815935135,
0.07467413693666458,
0.06124060973525047,
-0.08313829451799393,
0.10274384915828705,
-0.01889125257730484,
-0.059155866503715515,
0.009269816800951958,
-0.0025860334280878305,
-0.0537908673286438,
0.04795508459210396,
0.01097195129841566,
0.10868536680936813,
-0.014744678512215614,
0.06389050930738449,
-0.11831074953079224,
0.012821430340409279,
0.06299019604921341,
0.01738748326897621,
0.1354656219482422,
0.05499819666147232,
0.016623476520180702,
-0.0026723055634647608,
-0.0976061150431633,
0.06032037362456322,
0.07772772014141083,
-0.06349018961191177,
-0.09201645106077194,
-0.07063453644514084,
0.026141563430428505,
0.07474364340305328,
0.0703175738453865,
-0.01749364659190178,
0.04560570791363716,
-0.07565603405237198,
0.03131432086229324,
0.1766156405210495,
-0.24122002720832825,
-0.026905708014965057,
0.0934600979089737,
0.02794639579951763,
0.018887247890233994,
-0.07167327404022217,
-0.02591039054095745,
0.038001175969839096,
0.03892359882593155,
0.09788448363542557,
-0.013447617180645466,
0.0331655815243721,
0.037802763283252716,
-0.1143224760890007,
0.017241794615983963,
0.11570120602846146,
0.051821596920490265,
-0.035979095846414566,
-0.1212380975484848,
-0.07265985757112503,
-0.07803983986377716,
-0.037794630974531174,
0.030665792524814606,
0.028249965980648994,
-0.013794568367302418,
-0.04136867821216583,
-0.04971662536263466,
-0.04717892408370972,
-0.12146008014678955,
-0.011882705613970757,
0.11782927066087723,
0.05285046249628067,
-0.0000755309229134582,
-0.03941868990659714,
0.10379825532436371,
-0.06732666492462158,
-0.10645166784524918,
0.005783720873296261,
-0.028194177895784378,
-0.006137872580438852,
0.032015733420848846,
-0.007602642755955458,
-0.05696265399456024,
0.00663049565628171,
0.05538705363869667,
-0.0016290564090013504,
0.06360139697790146,
0.024739934131503105,
0.028243564069271088,
-0.03600025177001953,
0.1093875989317894,
-0.11799543350934982,
0.009716447442770004,
0.023878304287791252,
0.14185689389705658,
0.016889283433556557,
-0.010515916161239147,
-0.06762447953224182,
-0.018487250432372093,
0.011819141916930676,
0.042263247072696686,
-0.017341015860438347,
0.09161190688610077,
-0.0847269743680954,
-0.02345998026430607,
0.005421321373432875,
-0.16985847055912018,
0.01983683742582798,
0.03436571732163429,
-0.08046907186508179,
-0.02800467237830162,
0.06925196945667267,
-0.02707008458673954,
-0.08317945897579193,
0.07038819789886475,
-0.09247378259897232,
-0.04704374074935913,
-0.10661336034536362,
-0.13409437239170074,
0.011863959021866322,
-0.04175983741879463,
-0.009834605269134045,
-0.083884596824646,
-0.1680612862110138,
0.01615998148918152,
0.05539563670754433,
-0.05361992493271828,
0.023417171090841293,
-0.04583087936043739,
-0.10193626582622528,
0.02623879723250866,
0.012051817961037159,
0.08232416212558746,
-0.05091742426156998,
0.07897741347551346,
0.04643796756863594,
0.0448981374502182,
0.027441605925559998,
0.028161238878965378,
0.014371147379279137,
0.06094999611377716,
-0.11930925399065018,
0.10215133428573608,
-0.11923518031835556,
0.018181264400482178,
-0.13760894536972046,
-0.09573844075202942,
-0.015069634653627872,
0.046291790902614594,
0.10734865814447403,
0.14160802960395813,
-0.21132035553455353,
-0.018298296257853508,
0.15978847444057465,
0.0062238909304142,
-0.06654123961925507,
0.1050117239356041,
-0.051057275384664536,
0.018410127609968185,
0.057441357523202896,
0.07700115442276001,
0.10854251682758331,
-0.142869770526886,
-0.06713464111089706,
0.07209688425064087,
0.03934243693947792,
0.008844917640089989,
0.10398661345243454,
-0.03558257967233658,
-0.06139855086803436,
0.03624727949500084,
-0.06003810465335846,
-0.011195163242518902,
-0.03529251739382744,
-0.05403748154640198,
-0.02235497161746025,
-0.07473825663328171,
0.10332378000020981,
-0.036537054926157,
0.027836592867970467,
-0.07184603810310364,
-0.15608778595924377,
0.06472524255514145,
0.13822442293167114,
-0.06362738460302353,
-0.01042461208999157,
-0.11777755618095398,
0.08906809240579605,
-0.0676957368850708,
-0.0014610541984438896,
-0.13834591209888458,
-0.003458339488133788,
0.012497451156377792,
-0.051906611770391464,
0.02908279001712799,
-0.004570350516587496,
0.06089332699775696,
0.07961317896842957,
-0.030997589230537415,
-0.02867177315056324,
-0.0671297162771225,
-0.00957968644797802,
-0.07183752954006195,
-0.09110277891159058,
-0.058057453483343124,
-0.005356397945433855,
0.060004670172929764,
-0.16583262383937836,
0.06546701490879059,
0.059690576046705246,
0.07925695180892944,
0.029320457950234413,
-0.016823865473270416,
0.020236805081367493,
-0.019785253331065178,
0.03125054016709328,
-0.056684281677007675,
0.04477028548717499,
0.0027834195643663406,
-0.04033609479665756,
-0.06667263805866241,
-0.1768808215856552,
-0.02589751034975052,
0.12218472361564636,
0.035067226737737656,
-0.055644672363996506,
0.019304698333144188,
-0.030550643801689148,
-0.0030911292415112257,
-0.03865671902894974,
-0.08376201242208481,
0.2410011738538742,
0.04276520758867264,
0.17111901938915253,
-0.1136729046702385,
-0.05050250142812729,
0.013372940942645073,
0.0035728083457797766,
0.009191341698169708,
0.11556708067655563,
0.062312789261341095,
-0.030023206025362015,
0.03419000282883644,
0.10677054524421692,
-0.09352797269821167,
0.16566647589206696,
-0.0344858318567276,
-0.07363100349903107,
-0.052465543150901794,
0.01855601742863655,
0.0032001431100070477,
0.10588604211807251,
-0.0553712397813797,
-0.0010155788622796535,
0.037334442138671875,
0.007962239906191826,
0.041465505957603455,
-0.1589074283838272,
0.02810727432370186,
0.029463090002536774,
-0.03416343405842781,
-0.05794154480099678,
0.007719813380390406,
0.0022787784691900015,
0.07394011318683624,
0.026639439165592194,
-0.023678889498114586,
0.022393587976694107,
-0.03482532873749733,
-0.08797042071819305,
0.1866481900215149,
-0.1303892880678177,
-0.16424763202667236,
-0.16253796219825745,
-0.044711992144584656,
-0.03928869590163231,
-0.004952197894454002,
-0.016670241951942444,
-0.1150154322385788,
-0.07936256378889084,
-0.10476776212453842,
0.11275112628936768,
-0.005401694215834141,
-0.04603938013315201,
0.025217771530151367,
0.05169220268726349,
0.10307127237319946,
-0.16497285664081573,
0.007517853751778603,
0.010929722338914871,
-0.05362483486533165,
-0.022754240781068802,
0.04779082536697388,
0.016008496284484863,
0.07538613677024841,
-0.01263710018247366,
0.023271532729268074,
0.010685751214623451,
0.17139963805675507,
-0.004442570731043816,
0.00581922335550189,
0.1368807852268219,
-0.024523098021745682,
0.054840851575136185,
0.11945627629756927,
0.06822774559259415,
-0.06109379231929779,
0.015459541231393814,
0.05300833657383919,
0.01300758309662342,
-0.24490149319171906,
-0.05250879377126694,
-0.04361956939101219,
0.09498777985572815,
0.13876976072788239,
0.019710227847099304,
-0.13819673657417297,
0.06462541222572327,
0.0025895743165165186,
0.06834551692008972,
0.005516779609024525,
0.06955339014530182,
0.0340782105922699,
-0.030593587085604668,
0.06197145953774452,
-0.008409608155488968,
0.0023384273517876863,
0.07880639284849167,
0.02334226667881012,
0.10217105597257614,
-0.08440299332141876,
0.17526698112487793,
0.036180898547172546,
0.1337444931268692,
0.036675721406936646,
0.07662943005561829,
-0.060398168861866,
0.013789908960461617,
0.013222658075392246,
-0.08634346723556519,
-0.04209199920296669,
0.01598966307938099,
-0.01644844561815262,
0.05757344514131546,
-0.11591754108667374,
0.086703822016716,
0.017951972782611847,
0.1272590011358261,
0.0996994897723198,
-0.2839791774749756,
-0.05806087702512741,
0.03206359222531319,
-0.014386976137757301,
-0.12590292096138,
0.06543312966823578,
0.13232925534248352,
-0.06527633219957352,
-0.030557313933968544,
-0.05785828456282616,
0.08344433456659317,
-0.050284456461668015,
0.025897931307554245,
0.04103279113769531,
0.2016068398952484,
0.004140408709645271,
0.10080839693546295,
-0.11168216168880463,
0.15707990527153015,
0.042691100388765335,
0.06085757166147232,
-0.02621324732899666,
0.053381774574518204,
0.03965477645397186,
-0.06189904734492302,
0.09610860794782639,
-0.003424538066610694,
0.08679462224245071,
-0.1444515734910965,
-0.14469829201698303,
0.03494855761528015,
0.08379191160202026,
-0.09051089733839035,
0.04418448358774185,
-0.04499855265021324,
-0.02610686793923378,
0.05127428472042084,
-0.06448791921138763,
-0.11768049746751785,
-0.13090655207633972,
0.01958882249891758,
-0.009184070862829685,
0.011796044185757637,
-0.09009812772274017,
-0.07303977012634277,
-0.07268565148115158,
0.17812837660312653,
-0.07525961846113205,
-0.05239833891391754,
-0.10157912224531174,
0.06053740531206131,
0.11868792772293091,
-0.06950701773166656,
0.03609630838036537,
-0.009087073616683483,
0.13122564554214478,
0.005299942102283239,
-0.08150897920131683,
0.10966537147760391,
-0.04729630425572395,
-0.09276214987039566,
-0.05019129812717438,
0.1687544733285904,
-0.0014396858168765903,
0.06031041592359543,
-0.0314476452767849,
-0.0037833000533282757,
-0.004150988534092903,
-0.07499197870492935,
-0.012561536394059658,
0.12194410711526871,
0.20799949765205383,
0.07254455238580704,
-0.157052680850029,
0.007328531704843044,
0.013939887285232544,
0.05231722816824913,
0.15737475454807281,
0.2141282707452774,
-0.0704946294426918,
0.041531000286340714,
0.13244451582431793,
-0.0072144498117268085,
-0.23938032984733582,
0.008865307085216045,
0.08168584853410721,
0.060047879815101624,
-0.028134947642683983,
-0.2126035988330841,
0.07455545663833618,
0.04990413412451744,
0.001785497646778822,
-0.005110558122396469,
-0.24313890933990479,
-0.12403608113527298,
0.14436198770999908,
0.09114381670951843,
0.030671747401356697,
-0.06334774196147919,
-0.047469161450862885,
-0.09315499663352966,
-0.14315833151340485,
0.1213546097278595,
-0.11839569360017776,
0.09635599702596664,
0.0005135839455761015,
0.06087362393736839,
0.04442795366048813,
-0.06495489180088043,
0.15222086012363434,
-0.05320413410663605,
0.04247507080435753,
-0.049729250371456146,
-0.10453172028064728,
-0.02193332277238369,
-0.10087423771619797,
0.11955786496400833,
-0.12485570460557938,
0.13497421145439148,
-0.13465207815170288,
-0.015269091352820396,
-0.011265546083450317,
-0.003980709705501795,
-0.040077488869428635,
-0.04155419021844864,
-0.04363938421010971,
0.03880798816680908,
0.038441091775894165,
-0.02134377881884575,
-0.01585804857313633,
0.0037604342214763165,
-0.03791619837284088,
0.07778146862983704,
0.12918029725551605,
-0.026835834607481956,
-0.12347067892551422,
0.013442662544548512,
-0.0009620294440537691,
0.08743569254875183,
-0.20422209799289703,
0.04307965189218521,
0.09761499613523483,
0.0253487229347229,
0.14835511147975922,
0.041758667677640915,
-0.06304249912500381,
-0.018735403195023537,
0.03757476061582565,
-0.14246205985546112,
-0.0725470781326294,
-0.08315031230449677,
0.07019488513469696,
-0.2032366693019867,
-0.033601608127355576,
0.11797436326742172,
-0.05248211696743965,
-0.0120242265984416,
0.03938122093677521,
0.035994745790958405,
-0.053243037313222885,
0.12886732816696167,
0.07732012122869492,
0.07683775573968887,
-0.08603575825691223,
0.0974133089184761,
0.03024180978536606,
0.015569307841360569,
0.04728574678301811,
0.07990345358848572,
-0.08290755748748779,
-0.03788299486041069,
0.0527067631483078,
0.21490691602230072,
0.09298780560493469,
-0.01053033396601677,
-0.0622258186340332,
-0.07666522264480591,
0.047193340957164764,
0.08016122132539749,
0.04609471186995506,
0.0011849083239212632,
-0.07431158423423767,
-0.008697157725691795,
-0.07589465379714966,
0.1226775273680687,
0.05094790458679199,
0.012362836860120296,
-0.13604523241519928,
0.10538668185472488,
-0.010957401245832443,
0.027034148573875427,
-0.0038596095982939005,
0.027275854721665382,
-0.11101510375738144,
-0.03791644051671028,
-0.18190978467464447,
0.06426016241312027,
0.02673223242163658,
0.00825965404510498,
-0.012304809875786304,
0.024222245439887047,
-0.015552690252661705,
0.012199528515338898,
-0.07463991641998291,
-0.0598825179040432,
0.00671363202854991,
0.05082857236266136,
-0.09400616586208344,
0.024708110839128494,
0.05252740532159805,
-0.08885545283555984,
0.07258903980255127,
0.011568509973585606,
-0.00405165646225214,
0.04549265652894974,
-0.03519533574581146,
-0.07463020831346512,
0.002233775332570076,
0.041860852390527725,
0.04915815591812134,
-0.10179633647203445,
0.006116244941949844,
-0.06399521231651306,
0.007539717480540276,
-0.032819733023643494,
0.06470833718776703,
-0.10014645010232925,
-0.002507091499865055,
-0.0136100547388196,
0.011322096921503544,
-0.09783916175365448,
0.05966578796505928,
0.07115022093057632,
0.0743410587310791,
0.13410404324531555,
-0.040632616728544235,
0.029091406613588333,
-0.17741693556308746,
-0.0015760587994009256,
0.011468160897493362,
-0.06573782116174698,
-0.043547891080379486,
-0.021401094272732735,
0.0808996632695198,
-0.09029173105955124,
0.057752348482608795,
0.01755417510867119,
-0.08651521056890488,
0.009076750837266445,
0.00696561299264431,
0.023224715143442154,
0.03894376754760742,
0.2052902728319168,
0.008527280762791634,
0.017436789348721504,
0.004454934969544411,
0.010058291256427765,
0.08646746724843979,
0.055608030408620834,
0.08911293745040894,
0.15159550309181213,
-0.006501072086393833,
0.05986381694674492,
0.07672487944364548,
-0.07124044001102448,
-0.09352705627679825,
0.1384853571653366,
-0.06803031265735626,
0.07748984545469284,
-0.07904281467199326,
0.17726944386959076,
0.05104786902666092,
-0.1374245285987854,
0.06394226104021072,
-0.03559484705328941,
-0.13851647078990936,
-0.12065858393907547,
-0.13819992542266846,
-0.07610422372817993,
-0.08265171945095062,
0.003735718782991171,
-0.12272664159536362,
-0.015833204612135887,
0.10817842185497284,
0.02399369515478611,
0.025586456060409546,
0.16978347301483154,
-0.13645391166210175,
-0.03863251581788063,
0.06429033726453781,
0.026986973360180855,
-0.013618072494864464,
-0.04923868179321289,
-0.05869192257523537,
0.06353001296520233,
0.029934095218777657,
0.07905922085046768,
-0.016349736601114273,
0.003674807958304882,
0.023022828623652458,
-0.005750520620495081,
-0.08408614993095398,
-0.02382740192115307,
-0.014049038290977478,
0.03558211401104927,
0.13234160840511322,
0.04245983809232712,
-0.030659805983304977,
-0.016789274290204048,
0.21217916905879974,
-0.07549354434013367,
-0.053546372801065445,
-0.15884189307689667,
0.09485426545143127,
-0.01939869485795498,
0.0004799621528945863,
0.040022555738687515,
-0.12586218118667603,
-0.043280620127916336,
0.21992510557174683,
0.2195841521024704,
-0.08748789131641388,
-0.00021503174502868205,
0.034481558948755264,
-0.007194112986326218,
-0.06000484153628349,
0.09040699154138565,
0.06370802223682404,
0.06897459924221039,
-0.028163736686110497,
-0.018615391105413437,
0.013068403117358685,
-0.059404272586107254,
-0.03614024817943573,
0.10984142869710922,
0.0019839510787278414,
-0.028283724561333656,
-0.03830372542142868,
0.0212797150015831,
0.01956695131957531,
-0.19510102272033691,
-0.05145348981022835,
-0.11944617331027985,
-0.13628236949443817,
-0.04343397542834282,
0.06542517244815826,
0.01284435112029314,
0.0685376524925232,
-0.045982375741004944,
0.0020819103810936213,
0.11465136706829071,
-0.024268170818686485,
-0.0712977796792984,
-0.07064495980739594,
0.057343512773513794,
-0.0175523329526186,
0.18689069151878357,
0.027165621519088745,
0.06473042815923691,
0.11780622601509094,
-0.01905667968094349,
-0.0919656828045845,
0.04905524104833603,
0.06516256928443909,
-0.08128238469362259,
-0.0002531473583076149,
0.1564732789993286,
-0.05644961819052696,
0.05827485769987106,
0.014883441850543022,
-0.08032483607530594,
-0.043538615107536316,
-0.06220279261469841,
-0.09214600175619125,
-0.10460419207811356,
0.00866923201829195,
-0.08668237179517746,
0.12154937535524368,
0.19561387598514557,
-0.04112865403294563,
-0.05183195322751999,
-0.051721539348363876,
0.06803656369447708,
-0.0001520299556432292,
-0.028493991121649742,
0.028853297233581543,
-0.16679753363132477,
-0.007155533879995346,
0.03183654323220253,
0.022505007684230804,
-0.30216485261917114,
-0.050831567496061325,
0.03575865551829338,
-0.011519327759742737,
-0.08042073249816895,
0.10526152700185776,
0.09936182200908661,
0.05068723484873772,
-0.03263251483440399,
-0.042700763791799545,
-0.03766782209277153,
0.09616202861070633,
-0.13153259456157684,
-0.09335394203662872
] |
null | null |
transformers
|
# BERT L-4 H-256 fine-tuned on MLM (CORD-19 2020/06/16)
BERT model with [4 Transformer layers and hidden embedding of size 256](https://huggingface.co/google/bert_uncased_L-4_H-256_A-4), referenced in [Well-Read Students Learn Better: On the Importance of Pre-training Compact Models](https://arxiv.org/abs/1908.08962), fine-tuned for MLM on the CORD-19 dataset (as released on 2020/06/16).
## Training the model
```bash
python run_language_modeling.py \
    --model_type bert \
    --model_name_or_path google/bert_uncased_L-4_H-256_A-4 \
    --do_train \
    --train_data_file {cord19-200616-dataset} \
    --mlm \
    --mlm_probability 0.2 \
    --line_by_line \
    --block_size 256 \
    --per_device_train_batch_size 20 \
    --learning_rate 3e-5 \
    --num_train_epochs 2 \
    --output_dir bert_uncased_L-4_H-256_A-4_cord19-200616
```
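## Usage
A minimal sketch of scoring a masked token directly with `AutoTokenizer`/`AutoModelForMaskedLM`; the example sentence and the top-5 cut-off are illustrative choices, not part of the original training setup.
```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_name = "aodiniz/bert_uncased_L-4_H-256_A-4_cord19-200616"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)
model.eval()

# Illustrative sentence with a single [MASK] token.
inputs = tokenizer("Patients presented with fever and [MASK].", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Pick the top predictions for the masked position.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_index].topk(5).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```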
|
{}
|
fill-mask
|
aodiniz/bert_uncased_L-4_H-256_A-4_cord19-200616
|
[
"transformers",
"pytorch",
"jax",
"bert",
"fill-mask",
"arxiv:1908.08962",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1908.08962"
] |
[] |
TAGS
#transformers #pytorch #jax #bert #fill-mask #arxiv-1908.08962 #autotrain_compatible #endpoints_compatible #region-us
|
# BERT L-4 H-256 fine-tuned on MLM (CORD-19 2020/06/16)
BERT model with 4 Transformer layers and hidden embedding of size 256, referenced in Well-Read Students Learn Better: On the Importance of Pre-training Compact Models, fine-tuned for MLM on CORD-19 dataset (as released on 2020/06/16).
## Training the model
'''bash
python run_language_modeling.py
--model_type bert
--model_name_or_path google/bert_uncased_L-4_H-256_A-4
--do_train
--train_data_file {cord19-200616-dataset}
--mlm
--mlm_probability 0.2
--line_by_line
--block_size 256
--per_device_train_batch_size 20
--learning_rate 3e-5
--num_train_epochs 2
--output_dir bert_uncased_L-4_H-256_A-4_cord19-200616
|
[
"# BERT L-4 H-256 fine-tuned on MLM (CORD-19 2020/06/16)\n\nBERT model with 4 Transformer layers and hidden embedding of size 256, referenced in Well-Read Students Learn Better: On the Importance of Pre-training Compact Models, fine-tuned for MLM on CORD-19 dataset (as released on 2020/06/16).",
"## Training the model\n\n'''bash\npython run_language_modeling.py\n --model_type bert\n --model_name_or_path google/bert_uncased_L-4_H-256_A-4\n --do_train\n --train_data_file {cord19-200616-dataset}\n --mlm\n --mlm_probability 0.2\n --line_by_line\n --block_size 256\n --per_device_train_batch_size 20\n --learning_rate 3e-5\n --num_train_epochs 2\n --output_dir bert_uncased_L-4_H-256_A-4_cord19-200616"
] |
[
"TAGS\n#transformers #pytorch #jax #bert #fill-mask #arxiv-1908.08962 #autotrain_compatible #endpoints_compatible #region-us \n",
"# BERT L-4 H-256 fine-tuned on MLM (CORD-19 2020/06/16)\n\nBERT model with 4 Transformer layers and hidden embedding of size 256, referenced in Well-Read Students Learn Better: On the Importance of Pre-training Compact Models, fine-tuned for MLM on CORD-19 dataset (as released on 2020/06/16).",
"## Training the model\n\n'''bash\npython run_language_modeling.py\n --model_type bert\n --model_name_or_path google/bert_uncased_L-4_H-256_A-4\n --do_train\n --train_data_file {cord19-200616-dataset}\n --mlm\n --mlm_probability 0.2\n --line_by_line\n --block_size 256\n --per_device_train_batch_size 20\n --learning_rate 3e-5\n --num_train_epochs 2\n --output_dir bert_uncased_L-4_H-256_A-4_cord19-200616"
] |
[
48,
85,
149
] |
[
"passage: TAGS\n#transformers #pytorch #jax #bert #fill-mask #arxiv-1908.08962 #autotrain_compatible #endpoints_compatible #region-us \n# BERT L-4 H-256 fine-tuned on MLM (CORD-19 2020/06/16)\n\nBERT model with 4 Transformer layers and hidden embedding of size 256, referenced in Well-Read Students Learn Better: On the Importance of Pre-training Compact Models, fine-tuned for MLM on CORD-19 dataset (as released on 2020/06/16).## Training the model\n\n'''bash\npython run_language_modeling.py\n --model_type bert\n --model_name_or_path google/bert_uncased_L-4_H-256_A-4\n --do_train\n --train_data_file {cord19-200616-dataset}\n --mlm\n --mlm_probability 0.2\n --line_by_line\n --block_size 256\n --per_device_train_batch_size 20\n --learning_rate 3e-5\n --num_train_epochs 2\n --output_dir bert_uncased_L-4_H-256_A-4_cord19-200616"
] |
[
-0.1302972435951233,
0.1396423876285553,
-0.0028933510184288025,
0.11229585856199265,
0.10809233784675598,
0.0222039632499218,
0.08523204177618027,
0.11657939106225967,
-0.008649494498968124,
0.10235613584518433,
0.11648563295602798,
0.04351815581321716,
0.02734452486038208,
0.06963896751403809,
-0.015421717427670956,
-0.21351991593837738,
-0.005683594848960638,
-0.009783309884369373,
-0.09127354621887207,
0.07217182219028473,
0.054681867361068726,
-0.08768068253993988,
0.1053682416677475,
-0.019580287858843803,
-0.0654342770576477,
0.010931563563644886,
-0.009152492508292198,
-0.06019464135169983,
0.049570392817258835,
0.011006085202097893,
0.10019239038228989,
-0.015681399032473564,
0.05760733783245087,
-0.10324277728796005,
0.011593395844101906,
0.06293146312236786,
0.020957572385668755,
0.12924610078334808,
0.0447678342461586,
0.01514076441526413,
0.009636073373258114,
-0.10535012185573578,
0.05504971742630005,
0.07002821564674377,
-0.05832460895180702,
-0.09391334652900696,
-0.05854567512869835,
0.010792692191898823,
0.05678435415029526,
0.08268586546182632,
-0.02326551266014576,
0.045244891196489334,
-0.07266353815793991,
0.026215773075819016,
0.15396557748317719,
-0.25307515263557434,
-0.029531558975577354,
0.079549141228199,
0.011997940018773079,
0.013743188232183456,
-0.07095371931791306,
-0.021715566515922546,
0.038314782083034515,
0.05006314814090729,
0.11876081675291061,
-0.015691379085183144,
0.02476983331143856,
0.038410548120737076,
-0.11275868862867355,
0.02578166127204895,
0.12657226622104645,
0.05335775390267372,
-0.03183390200138092,
-0.12040580064058304,
-0.06435304880142212,
-0.05792573094367981,
-0.03712669387459755,
0.03776085749268532,
0.02990981750190258,
-0.024503391236066818,
-0.06234618276357651,
-0.041794177144765854,
-0.0475456640124321,
-0.10466828942298889,
-0.009624951519072056,
0.12870433926582336,
0.05816393718123436,
-0.004594389349222183,
-0.02406131476163864,
0.09514616429805756,
-0.04302956908941269,
-0.10762177407741547,
0.017682237550616264,
-0.03287621587514877,
0.010792912915349007,
0.04073651134967804,
-0.01756499893963337,
-0.04487045481801033,
0.01583063043653965,
0.028570646420121193,
-0.02680070698261261,
0.06751934438943863,
0.029377272352576256,
0.013990537263453007,
-0.04294871166348457,
0.1033184826374054,
-0.11911263316869736,
0.018439197912812233,
0.02824709564447403,
0.14346058666706085,
0.02143074758350849,
-0.008447145111858845,
-0.06628991663455963,
-0.004048953298479319,
0.00262457225471735,
0.03856619819998741,
-0.03282231464982033,
0.09815779328346252,
-0.07735476642847061,
-0.021932272240519524,
0.004122644197195768,
-0.1685703843832016,
0.009146984666585922,
0.03079756535589695,
-0.07902436703443527,
-0.040378160774707794,
0.06273992359638214,
-0.03639928624033928,
-0.09072910249233246,
0.06643373519182205,
-0.09791748970746994,
-0.05466452240943909,
-0.0973711609840393,
-0.1283656507730484,
0.011817383579909801,
-0.040700703859329224,
-0.002381287282332778,
-0.0875861793756485,
-0.1416839212179184,
0.020144034177064896,
0.052265509963035583,
-0.05456419289112091,
0.013596614822745323,
-0.040162861347198486,
-0.09828784316778183,
0.02920195274055004,
0.01195289846509695,
0.07216770201921463,
-0.05446532741189003,
0.07188498228788376,
0.05701199173927307,
0.045467309653759,
0.03914111852645874,
0.027642441913485527,
0.007188544608652592,
0.05668012052774429,
-0.13007168471813202,
0.10237952321767807,
-0.11898257583379745,
0.02523408830165863,
-0.1341732293367386,
-0.10520384460687637,
-0.012243136763572693,
0.04809778556227684,
0.1040865033864975,
0.1417515128850937,
-0.18708327412605286,
-0.02223438210785389,
0.16907593607902527,
0.01681414246559143,
-0.0652618482708931,
0.10656242072582245,
-0.05366279184818268,
-0.003065889235585928,
0.06267023086547852,
0.07742909342050552,
0.09294068813323975,
-0.13418278098106384,
-0.07768002897500992,
0.07917903363704681,
0.04784335568547249,
0.005310480482876301,
0.10332382470369339,
-0.03295736759901047,
-0.08181014657020569,
0.040258537977933884,
-0.05798562988638878,
-0.004237127490341663,
-0.03399036079645157,
-0.051983028650283813,
-0.02151079848408699,
-0.07584010064601898,
0.11227992177009583,
-0.033183347433805466,
0.028790021315217018,
-0.08705995231866837,
-0.16535133123397827,
0.06126099452376366,
0.13839039206504822,
-0.07375703752040863,
-0.01865454576909542,
-0.11903179436922073,
0.08106439560651779,
-0.07083707302808762,
0.004670934751629829,
-0.13361641764640808,
-0.0157818291336298,
0.011254558339715004,
-0.04020492732524872,
0.03896170109510422,
-0.011616348288953304,
0.055528149008750916,
0.08810728043317795,
-0.03824141249060631,
-0.03024846501648426,
-0.06097133085131645,
-0.008636968210339546,
-0.061420854181051254,
-0.10003865510225296,
-0.05372019112110138,
0.00010985556582454592,
0.0738152265548706,
-0.17070640623569489,
0.05437617003917694,
0.0542801134288311,
0.08365222066640854,
0.020095963031053543,
-0.006725201848894358,
0.027516363188624382,
-0.014036648906767368,
0.04274377599358559,
-0.05524316802620888,
0.04609164968132973,
0.012678470462560654,
-0.03870522230863571,
-0.05599120259284973,
-0.17292727530002594,
-0.016629546880722046,
0.13315775990486145,
0.027643829584121704,
-0.04793272539973259,
0.0366368293762207,
-0.027675146237015724,
-0.003949276637285948,
-0.02970016561448574,
-0.07374384254217148,
0.24038651585578918,
0.03317119553685188,
0.17500413954257965,
-0.11620361357927322,
-0.04888834431767464,
0.018665222451090813,
0.0022766096517443657,
0.011302175931632519,
0.12307041138410568,
0.035514529794454575,
-0.04074842855334282,
0.027481190860271454,
0.10257485508918762,
-0.09226594120264053,
0.16715887188911438,
-0.03920726850628853,
-0.0686219185590744,
-0.053505171090364456,
0.01222066767513752,
-0.0010215522488579154,
0.10374781489372253,
-0.05516367405653,
-0.009687221609055996,
0.030011741444468498,
0.00002599432264105417,
0.04132732003927231,
-0.14975398778915405,
0.03059113398194313,
0.027762804180383682,
-0.030389731749892235,
-0.04584568366408348,
0.012552193365991116,
-0.002288538496941328,
0.06953232735395432,
0.032636553049087524,
-0.019084621220827103,
0.0254043061286211,
-0.031224330887198448,
-0.08386464416980743,
0.19481225311756134,
-0.135700523853302,
-0.1785019487142563,
-0.1584475040435791,
-0.0559525229036808,
-0.051172297447919846,
-0.006844999268651009,
-0.02065705507993698,
-0.13119098544120789,
-0.07708260416984558,
-0.10016652941703796,
0.1151527613401413,
-0.014617566019296646,
-0.04770440235733986,
0.027801837772130966,
0.04742632433772087,
0.11388080567121506,
-0.1570023149251938,
0.006533463019877672,
0.013495461083948612,
-0.04450630396604538,
-0.02261323854327202,
0.05584074929356575,
0.022510914131999016,
0.06954406201839447,
-0.012549417093396187,
0.021638214588165283,
0.013299173675477505,
0.18076097965240479,
0.0018442171858623624,
-0.004476927686482668,
0.1338731348514557,
-0.027180587872862816,
0.049439240247011185,
0.10235272347927094,
0.07407251745462418,
-0.05621626228094101,
0.01822376810014248,
0.059197258204221725,
0.011661822907626629,
-0.24308495223522186,
-0.062138475477695465,
-0.04039258137345314,
0.08200421929359436,
0.14052921533584595,
0.01941234990954399,
-0.13651438057422638,
0.06253580749034882,
0.008329717442393303,
0.07755830138921738,
-0.015621096827089787,
0.06136801838874817,
0.06524493545293808,
-0.027271782979369164,
0.06437183916568756,
-0.01785050332546234,
-0.009204220026731491,
0.07684292644262314,
0.01414224598556757,
0.10832707583904266,
-0.09826313704252243,
0.17165426909923553,
0.04440148547291756,
0.14007189869880676,
0.03435754030942917,
0.06706517189741135,
-0.059154901653528214,
0.012220794335007668,
0.016800206154584885,
-0.0928928330540657,
-0.03692588210105896,
0.013101806864142418,
-0.02941400371491909,
0.052447233349084854,
-0.12597084045410156,
0.10008779168128967,
0.01626751944422722,
0.1353018432855606,
0.09779747575521469,
-0.2979651391506195,
-0.06855139881372452,
0.0303950197994709,
-0.007910915650427341,
-0.12253303080797195,
0.07463482767343521,
0.14061978459358215,
-0.05941678583621979,
-0.026940278708934784,
-0.05621170252561569,
0.07369758188724518,
-0.061668168753385544,
0.035160209983587265,
0.05963749438524246,
0.2147417962551117,
0.007213436998426914,
0.10537717491388321,
-0.09416641294956207,
0.15291725099086761,
0.04386991634964943,
0.06328371167182922,
-0.029831886291503906,
0.04706815257668495,
0.04493720084428787,
-0.06514668464660645,
0.10209350287914276,
-0.002567716408520937,
0.07296799123287201,
-0.14713861048221588,
-0.1338174194097519,
0.044130466878414154,
0.08426541835069656,
-0.08021806925535202,
0.04826713353395462,
-0.038109149783849716,
-0.031772274523973465,
0.052218951284885406,
-0.06313609331846237,
-0.11812987923622131,
-0.13854388892650604,
0.011356252245604992,
0.004170258063822985,
-0.0041030882857739925,
-0.08736727386713028,
-0.07402299344539642,
-0.07107357680797577,
0.1643008589744568,
-0.042663417756557465,
-0.05848744511604309,
-0.09371718019247055,
0.06125045567750931,
0.1253720074892044,
-0.06968916207551956,
0.03718386963009834,
-0.011369463987648487,
0.12709395587444305,
0.018441038206219673,
-0.08748248219490051,
0.10311713814735413,
-0.05221214517951012,
-0.07924021780490875,
-0.0397283174097538,
0.1717081218957901,
0.0027863127179443836,
0.05614953115582466,
-0.03584684804081917,
-0.000023297887310036458,
-0.0018581830663606524,
-0.07664819061756134,
-0.005955434404313564,
0.08481956273317337,
0.2295285016298294,
0.0607108473777771,
-0.1625526398420334,
0.03135935217142105,
0.01640993356704712,
0.05363692715764046,
0.1577150821685791,
0.18658097088336945,
-0.07221578806638718,
0.026951853185892105,
0.14413362741470337,
-0.005648968741297722,
-0.23815569281578064,
0.01176236942410469,
0.07073847204446793,
0.06351355463266373,
-0.04376406967639923,
-0.23423638939857483,
0.09458830207586288,
0.03583921492099762,
0.008759540505707264,
-0.009024567902088165,
-0.22724826633930206,
-0.11810385435819626,
0.1519959568977356,
0.08838281780481339,
0.029668685048818588,
-0.058675166219472885,
-0.049113158136606216,
-0.09515558928251266,
-0.12152449041604996,
0.12087900936603546,
-0.1511538177728653,
0.0974670872092247,
0.001994397956877947,
0.05345643684267998,
0.04152170196175575,
-0.05948776379227638,
0.14521625638008118,
-0.054032329469919205,
0.036798033863306046,
-0.04893270507454872,
-0.08710635453462601,
-0.02090405859053135,
-0.09460213780403137,
0.12360924482345581,
-0.13792498409748077,
0.13919243216514587,
-0.12934567034244537,
-0.018608154729008675,
-0.020527416840195656,
-0.010384665802121162,
-0.03831469640135765,
-0.037566300481557846,
-0.036820605397224426,
0.04287803918123245,
0.04171024635434151,
-0.023844702169299126,
-0.040758997201919556,
0.006455230060964823,
-0.03285980224609375,
0.06930230557918549,
0.12761850655078888,
-0.028501175343990326,
-0.12467700988054276,
0.023643512278795242,
-0.0026243587490171194,
0.08461769670248032,
-0.2086152881383896,
0.035486944019794464,
0.09709000587463379,
0.01804836466908455,
0.13492627441883087,
0.03662274032831192,
-0.05903741717338562,
-0.026571154594421387,
0.04305862635374069,
-0.1506722867488861,
-0.07306814938783646,
-0.09446989744901657,
0.0537790022790432,
-0.19836848974227905,
-0.028655441477894783,
0.11935371905565262,
-0.0541514977812767,
-0.010160105302929878,
0.03457995876669884,
0.024893352761864662,
-0.05179595202207565,
0.12390615046024323,
0.08137001842260361,
0.07558882236480713,
-0.09172362089157104,
0.10122919827699661,
0.0349138081073761,
0.004507105797529221,
0.05509263649582863,
0.08599777519702911,
-0.09247497469186783,
-0.0406469851732254,
0.042016539722681046,
0.20776323974132538,
0.10057386755943298,
-0.022030584514141083,
-0.0672692134976387,
-0.08926397562026978,
0.05057763308286667,
0.06551608443260193,
0.04302775859832764,
-0.006191425956785679,
-0.07621089369058609,
-0.012107383459806442,
-0.08426330238580704,
0.1290288120508194,
0.04514288902282715,
0.007948584854602814,
-0.1364925503730774,
0.09610293805599213,
-0.008442679420113564,
0.03638644516468048,
-0.002342682331800461,
0.026291942223906517,
-0.10980095714330673,
-0.03955920785665512,
-0.19645799696445465,
0.060341525822877884,
0.0221426822245121,
0.0076352437026798725,
-0.020139645785093307,
0.037971269339323044,
-0.017313826829195023,
0.01683223620057106,
-0.06896045804023743,
-0.05848144739866257,
0.01535244844853878,
0.061203062534332275,
-0.09025540202856064,
0.02761886455118656,
0.043032873421907425,
-0.08918531239032745,
0.07752085477113724,
0.014047707431018353,
-0.00821760669350624,
0.04687286168336868,
0.017209848389029503,
-0.07353852689266205,
-0.00003094145722570829,
0.0386943519115448,
0.049395572394132614,
-0.10644746571779251,
0.008894459344446659,
-0.06488464027643204,
0.008062554523348808,
-0.03436075896024704,
0.06198035180568695,
-0.09708546847105026,
-0.0038771475665271282,
-0.009499282576143742,
0.019387278705835342,
-0.09883034974336624,
0.06602977961301804,
0.07594995945692062,
0.06291359663009644,
0.12966009974479675,
-0.048527635633945465,
0.019704831764101982,
-0.17737773060798645,
-0.005756096448749304,
0.009244484826922417,
-0.07919715344905853,
-0.061055075377225876,
-0.018131496384739876,
0.08499232679605484,
-0.09719832986593246,
0.06441506743431091,
0.013660200871527195,
-0.06974010169506073,
0.009196392260491848,
0.01534140482544899,
0.01593651808798313,
0.04479027912020683,
0.19507034122943878,
-0.0023711761459708214,
0.005408680997788906,
0.006348638329654932,
0.006639393977820873,
0.08082454651594162,
0.0645662173628807,
0.08078441023826599,
0.1552574336528778,
0.035390011966228485,
0.059458278119564056,
0.08547244220972061,
-0.06828202307224274,
-0.10874683409929276,
0.15576134622097015,
-0.06459725648164749,
0.09342611581087112,
-0.07388793677091599,
0.1589878648519516,
0.041313476860523224,
-0.12876398861408234,
0.06333719938993454,
-0.05165288969874382,
-0.14119310677051544,
-0.11192499846220016,
-0.12980954349040985,
-0.07642146199941635,
-0.08630480617284775,
0.0017812334699556231,
-0.12978769838809967,
-0.02762121707201004,
0.1140219122171402,
0.022862914949655533,
0.016095025464892387,
0.19106702506542206,
-0.1187232956290245,
-0.037989091128110886,
0.07289297133684158,
0.027282124385237694,
-0.012425780296325684,
-0.0341058075428009,
-0.048659030348062515,
0.05863422155380249,
0.031827911734580994,
0.07368110865354538,
-0.02315276488661766,
0.0010038312757387757,
0.03679148480296135,
-0.004833613522350788,
-0.07876120507717133,
-0.017406780272722244,
-0.00644273217767477,
0.04786953330039978,
0.1299738734960556,
0.03985492140054703,
-0.044891465455293655,
-0.019689900800585747,
0.2062777280807495,
-0.07702068984508514,
-0.04640137776732445,
-0.1579224020242691,
0.08912140130996704,
-0.024328341707587242,
-0.004299027379602194,
0.046247564256191254,
-0.11748051643371582,
-0.044613905251026154,
0.22504235804080963,
0.2283668965101242,
-0.08972538262605667,
-0.004198837094008923,
0.03327767550945282,
-0.010513749904930592,
-0.0664747878909111,
0.10174766927957535,
0.06754975020885468,
0.07754907757043839,
-0.03395861014723778,
-0.03551970794796944,
0.004899511579424143,
-0.053775910288095474,
-0.039829324930906296,
0.1109519824385643,
0.012984744273126125,
-0.0300911832600832,
-0.02766820788383484,
0.013437905348837376,
0.007967331446707249,
-0.18473070859909058,
-0.05578426644206047,
-0.1151832565665245,
-0.13827496767044067,
-0.04812701791524887,
0.046004414558410645,
0.01646910235285759,
0.0697513148188591,
-0.04665128141641617,
-0.0005449925665743649,
0.11735973507165909,
-0.021999264135956764,
-0.06513461470603943,
-0.06618811190128326,
0.06222685053944588,
0.011822785250842571,
0.17860561609268188,
0.024353206157684326,
0.07675614953041077,
0.120355986058712,
-0.016668876633048058,
-0.0963427871465683,
0.04351702332496643,
0.06489621102809906,
-0.08359390497207642,
0.0038061037193983793,
0.14162775874137878,
-0.05093594267964363,
0.05639009177684784,
0.015226289629936218,
-0.06246773526072502,
-0.04458274319767952,
-0.049178220331668854,
-0.08582819998264313,
-0.10646963119506836,
0.0014046255964785814,
-0.0968867614865303,
0.12350934743881226,
0.20975930988788605,
-0.04075637832283974,
-0.05857253819704056,
-0.043949659913778305,
0.06589000672101974,
0.006140504963696003,
-0.03520333766937256,
0.031733520328998566,
-0.1651710420846939,
-0.00797074381262064,
0.03591472655534744,
0.02331053838133812,
-0.33054161071777344,
-0.052835818380117416,
0.05007932335138321,
-0.006184388417750597,
-0.0775129571557045,
0.10785344988107681,
0.0983189195394516,
0.056709568947553635,
-0.03723730519413948,
-0.0149815259501338,
-0.042016610503196716,
0.08846406638622284,
-0.13475868105888367,
-0.098039910197258
] |
null | null | null |
# Building a HuggingFace Transformer NLP Model
## Running this Repo
|
{}
| null |
aogara/slai_transformer
|
[
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#region-us
|
# Building a HuggingFace Transformer NLP Model
## Running this Repo
|
[
"# Building a HuggingFace Transformer NLP Model",
"## Running this Repo"
] |
[
"TAGS\n#region-us \n",
"# Building a HuggingFace Transformer NLP Model",
"## Running this Repo"
] |
[
6,
12,
6
] |
[
"passage: TAGS\n#region-us \n# Building a HuggingFace Transformer NLP Model## Running this Repo"
] |
[
0.013473309576511383,
0.007143898401409388,
-0.0070312353782355785,
0.0072982278652489185,
0.09502169489860535,
0.030234765261411667,
0.15399576723575592,
0.00707502756267786,
0.08923710882663727,
-0.021858079358935356,
0.125832661986351,
0.035284075886011124,
0.0003995946899522096,
0.11205364018678665,
0.08965606987476349,
-0.30130746960639954,
0.03540896624326706,
0.11103807389736176,
-0.08661165833473206,
0.0629960373044014,
-0.010237868875265121,
0.003545509884133935,
0.07908032089471817,
-0.05375229939818382,
-0.23715493083000183,
0.03840523213148117,
-0.023968413472175598,
0.039515286684036255,
0.1353388875722885,
0.05072801560163498,
0.14331071078777313,
-0.024374065920710564,
-0.019494084641337395,
-0.1426045298576355,
0.04084983095526695,
-0.03034089505672455,
-0.06554938852787018,
0.06511066854000092,
0.02937389723956585,
0.03820110112428665,
0.17727170884609222,
0.06542357802391052,
-0.02130444347858429,
0.06564270704984665,
-0.18094602227210999,
-0.11552120000123978,
-0.0025493570137768984,
0.0463034063577652,
-0.008033199235796928,
0.049378760159015656,
0.022409196943044662,
0.2161124050617218,
-0.08409614861011505,
0.067006416618824,
0.0771532952785492,
-0.16250313818454742,
-0.0344679057598114,
0.09557097405195236,
0.11218036711215973,
-0.06338707357645035,
0.006658024620264769,
0.07692379504442215,
0.09285050630569458,
0.007837180979549885,
-0.08859159797430038,
-0.03726447746157646,
0.05311322212219238,
0.036942437291145325,
-0.1229608952999115,
-0.07543952763080597,
0.19395127892494202,
-0.0024702241644263268,
0.009084976278245449,
0.06154194474220276,
-0.055664628744125366,
-0.018269576132297516,
0.03142961114645004,
0.03672683238983154,
-0.024672064930200577,
0.042571913450956345,
-0.16345901787281036,
-0.07842167466878891,
-0.08689980953931808,
-0.09025195240974426,
-0.13590727746486664,
0.29054221510887146,
-0.03219959884881973,
0.08200465142726898,
-0.22753804922103882,
0.09204704314470291,
-0.10120527446269989,
-0.039010632783174515,
0.08100885897874832,
-0.11265874654054642,
0.051688820123672485,
0.03581944853067398,
-0.011239370331168175,
0.11872564256191254,
0.09203621745109558,
0.2030462771654129,
0.022272832691669464,
-0.027356015518307686,
-0.0220077782869339,
0.1063024178147316,
0.01884508691728115,
0.11561685055494308,
0.02890179120004177,
0.0217076875269413,
-0.026409544050693512,
-0.053144101053476334,
0.057416435331106186,
-0.06174170598387718,
-0.09033229947090149,
0.025304235517978668,
-0.13119615614414215,
0.061312444508075714,
0.06474192440509796,
0.028399590402841568,
-0.03316935896873474,
0.008427665568888187,
0.03582858294248581,
0.030310198664665222,
-0.006731821224093437,
-0.02386830933392048,
0.029559187591075897,
0.05467979237437248,
0.057557813823223114,
-0.0326700434088707,
0.06063365191221237,
0.0009305600542575121,
-0.1110854297876358,
-0.04189807549118996,
-0.10082167387008667,
0.014087510295212269,
0.02747793309390545,
-0.020882269367575645,
0.10981319844722748,
-0.16535113751888275,
-0.07636234909296036,
0.03232085332274437,
-0.018924765288829803,
-0.02456486225128174,
-0.038640499114990234,
-0.031743649393320084,
-0.005675869062542915,
-0.02820809744298458,
-0.024979904294013977,
-0.0633293092250824,
-0.016203302890062332,
0.06417502462863922,
0.10696940869092941,
0.09019292145967484,
-0.1254958063364029,
-0.003030248684808612,
-0.13479043543338776,
0.06396478414535522,
-0.08677902072668076,
-0.03339865431189537,
-0.033421315252780914,
0.18359194695949554,
0.07326673716306686,
0.034058716148138046,
-0.1297081857919693,
0.0893428772687912,
0.021426156163215637,
0.20906266570091248,
-0.08831083029508591,
-0.1579626500606537,
0.2270640879869461,
-0.1326751559972763,
-0.13597849011421204,
0.0751834586262703,
0.05329834297299385,
0.11970324069261551,
-0.00397910550236702,
0.14125597476959229,
-0.07207030057907104,
-0.06155918538570404,
-0.044787660241127014,
0.10708243399858475,
-0.23038128018379211,
-0.09853583574295044,
-0.0007009357213973999,
0.06615914404392242,
-0.2864525318145752,
-0.012454628013074398,
-0.04790675267577171,
0.18717177212238312,
-0.13401852548122406,
-0.0033613613341003656,
-0.021076727658510208,
-0.0453069768846035,
0.009926609694957733,
0.029179513454437256,
0.0003377553657628596,
-0.01917034201323986,
-0.020488042384386063,
0.018138960003852844,
0.0999973937869072,
0.053500086069107056,
0.013350212946534157,
-0.0645214170217514,
0.010926483199000359,
-0.060967303812503815,
-0.0066178482957184315,
-0.08543386310338974,
-0.14086289703845978,
-0.019183017313480377,
-0.04024418815970421,
0.11485078930854797,
0.13155053555965424,
0.11857737600803375,
0.03909948095679283,
0.05742591992020607,
-0.040775611996650696,
-0.028988845646381378,
-0.0049042124301195145,
0.0672808438539505,
-0.10163360089063644,
0.021697714924812317,
-0.11979732662439346,
0.06987566500902176,
0.020593466237187386,
0.03447599336504936,
-0.09627728909254074,
0.11577089130878448,
0.017738977447152138,
0.01334161777049303,
-0.03627151623368263,
0.03164772689342499,
0.019281499087810516,
-0.02626483514904976,
0.10537385940551758,
-0.006466351915150881,
-0.10420312732458115,
0.13968700170516968,
-0.12834127247333527,
0.050975628197193146,
0.17505469918251038,
-0.17577582597732544,
0.0053305672481656075,
-0.03665734827518463,
-0.08672572672367096,
0.01322330441325903,
0.08341138064861298,
-0.0019420436583459377,
0.041969459503889084,
-0.0010868434328585863,
0.07696942985057831,
0.048537395894527435,
-0.0027136565186083317,
-0.04034492000937462,
-0.07471000403165817,
-0.08688534796237946,
0.07303918898105621,
0.13935932517051697,
-0.05540856719017029,
0.11242863535881042,
0.1705750972032547,
-0.017086759209632874,
0.1958504319190979,
-0.025360552594065666,
-0.012002523988485336,
-0.07415294647216797,
0.033615581691265106,
0.017382023856043816,
0.08061189204454422,
-0.13949920237064362,
-0.05726214498281479,
0.02900039404630661,
-0.0684419646859169,
0.1168651282787323,
-0.10894695669412613,
-0.10825561732053757,
-0.01428770087659359,
0.024751821532845497,
-0.10253864526748657,
0.06811091303825378,
-0.07788866013288498,
0.01720687560737133,
-0.003772030584514141,
-0.10344693809747696,
0.08119629323482513,
0.011478795669972897,
0.015290025621652603,
0.07733520865440369,
-0.044060491025447845,
-0.1993630826473236,
-0.2675066292285919,
-0.11081057786941528,
-0.06132359057664871,
0.018718520179390907,
0.00483891274780035,
-0.14901615679264069,
-0.018397724255919456,
0.0777381882071495,
0.17040526866912842,
-0.030595455318689346,
-0.0070402720011770725,
-0.05666515603661537,
0.08045309782028198,
-0.0371202751994133,
-0.1354885697364807,
-0.009577653370797634,
-0.056415531784296036,
-0.05426979064941406,
0.11788170039653778,
-0.1370246857404709,
0.09234466403722763,
0.14614719152450562,
0.07412878423929214,
0.046647749841213226,
0.0380597785115242,
0.052729591727256775,
-0.09942684322595596,
-0.09045025706291199,
0.15890611708164215,
0.001944348681718111,
0.044488683342933655,
0.10549654066562653,
0.08296360075473785,
-0.09332283586263657,
0.032208237797021866,
-0.062329407781362534,
-0.1676768958568573,
-0.06586223840713501,
-0.07517461478710175,
-0.05936042219400406,
0.2496587485074997,
0.018692992627620697,
0.03779022395610809,
0.11882290244102478,
0.08560524880886078,
0.06821901351213455,
-0.00040087001980282366,
-0.07949194312095642,
0.06593222916126251,
0.0002911954070441425,
-0.12194826453924179,
0.02278583124279976,
-0.08151451498270035,
-0.19696161150932312,
0.05690108239650726,
0.028225688263773918,
0.12897974252700806,
0.11692523211240768,
-0.011409987695515156,
0.05631571635603905,
0.019095011055469513,
0.15319867432117462,
0.20498809218406677,
0.0751958042383194,
-0.09087564051151276,
-0.02432793378829956,
-0.0028157001361250877,
-0.05381076782941818,
0.03895682841539383,
0.09159721434116364,
-0.019122350960969925,
0.00673465384170413,
-0.12346281111240387,
-0.027940120548009872,
0.03594636172056198,
0.10312654078006744,
-0.18441829085350037,
0.05961661785840988,
0.05974762886762619,
0.06898307800292969,
-0.0737520083785057,
0.045268189162015915,
0.013475378043949604,
-0.0016393184196203947,
-0.03147915005683899,
0.014663306064903736,
0.10140920430421829,
-0.007054105866700411,
0.06622330844402313,
-0.09272994101047516,
0.08767961710691452,
-0.011125442571938038,
0.03805183619260788,
-0.11203538626432419,
0.22980614006519318,
-0.016187282279133797,
-0.02173934318125248,
0.02481755241751671,
-0.06146765500307083,
0.09077750891447067,
0.13419276475906372,
0.08355393260717392,
0.03109939768910408,
-0.05656806379556656,
-0.16482320427894592,
-0.11051705479621887,
0.029834631830453873,
0.050711534917354584,
-0.05458657443523407,
-0.013797791674733162,
-0.013936501927673817,
0.0025904024951159954,
-0.0022382447496056557,
0.15780481696128845,
0.014866350218653679,
-0.0466129444539547,
-0.05107938125729561,
-0.011104810051620007,
0.06877302378416061,
-0.004244247917085886,
0.014805356971919537,
0.0927286297082901,
0.06892210245132446,
0.17460885643959045,
0.05757453665137291,
-0.06373805552721024,
-0.09862446784973145,
0.11689739674329758,
-0.01933097653090954,
-0.01612715795636177,
-0.06862962990999222,
-0.05005853623151779,
-0.0771411880850792,
-0.14381827414035797,
0.0874907374382019,
-0.07828934490680695,
0.04300837591290474,
-0.015195846557617188,
0.12223189324140549,
-0.005137080326676369,
0.03868287429213524,
-0.04635540395975113,
-0.010066852904856205,
-0.06829427182674408,
-0.08227851986885071,
0.11854160577058792,
0.05003030598163605,
-0.14958210289478302,
0.10133606940507889,
-0.006758445408195257,
0.1633194237947464,
-0.03213087096810341,
0.029635794460773468,
0.11936286091804504,
0.3388214707374573,
0.017024850472807884,
-0.0035084127448499203,
0.24382098019123077,
-0.029838701710104942,
-0.28441017866134644,
-0.1616794317960739,
-0.13971562683582306,
0.01695583388209343,
0.047087643295526505,
-0.11698203533887863,
0.09852349013090134,
0.03855254873633385,
0.004888833500444889,
0.059255167841911316,
-0.10458972305059433,
-0.014942503534257412,
0.20726566016674042,
0.010074939578771591,
0.5200842618942261,
-0.10847090184688568,
-0.04992446303367615,
-0.04737718403339386,
-0.1614120453596115,
0.19644513726234436,
-0.16454792022705078,
0.08908198773860931,
-0.040152885019779205,
0.08819271624088287,
0.0382501557469368,
-0.034050263464450836,
0.1646961271762848,
-0.07283013314008713,
0.05940718203783035,
-0.10704832524061203,
-0.1093960702419281,
-0.037841927260160446,
-0.004363789688795805,
0.04980170726776123,
0.03536558151245117,
-0.04688612371683121,
-0.1074891984462738,
-0.013908387161791325,
-0.10847966372966766,
0.09804104268550873,
0.002979323733597994,
-0.0676451176404953,
0.022583818063139915,
-0.020443441346287727,
-0.1463133543729782,
0.0436786450445652,
0.14387311041355133,
-0.05009932816028595,
0.14221669733524323,
0.13371160626411438,
0.09042304009199142,
-0.059952519834041595,
-0.008072618395090103,
-0.023577408865094185,
-0.1059599369764328,
0.09640941023826599,
-0.07070057839155197,
-0.006882248446345329,
0.07114110141992569,
0.03229476511478424,
-0.01945672184228897,
0.1296880543231964,
-0.08975772559642792,
-0.03850804269313812,
0.16130302846431732,
-0.11112237721681595,
-0.1309032440185547,
-0.08884904533624649,
0.00890359003096819,
0.08350738883018494,
0.09216322004795074,
0.11461012810468674,
-0.009620812721550465,
0.02062073163688183,
0.017467033118009567,
-0.05322107672691345,
-0.14580920338630676,
-0.033970363438129425,
0.08164535462856293,
0.024509815499186516,
-0.08805205672979355,
0.06591346859931946,
-0.053900398313999176,
-0.07995160669088364,
-0.06973810493946075,
0.16712017357349396,
0.008694560267031193,
-0.0881895124912262,
-0.10874150693416595,
0.2505693733692169,
-0.21396495401859283,
-0.05064479261636734,
-0.06962298601865768,
-0.1262490600347519,
0.01053585670888424,
0.06824438273906708,
0.037752244621515274,
0.06524841487407684,
-0.03159106522798538,
0.07051727920770645,
-0.12449760735034943,
-0.04700344428420067,
-0.030006125569343567,
0.04442339017987251,
-0.07109028846025467,
0.06381718069314957,
0.014775426127016544,
0.134973406791687,
-0.12205109000205994,
-0.08139834553003311,
-0.2096596658229828,
0.05943679064512253,
-0.06545720249414444,
-0.15047654509544373,
-0.03733183443546295,
-0.06274731457233429,
0.051513999700546265,
-0.01252944115549326,
-0.09283950179815292,
-0.05411548539996147,
-0.1155780628323555,
0.04331206530332565,
0.038703206926584244,
-0.05665877088904381,
0.016214800998568535,
0.012328111566603184,
0.15364350378513336,
-0.07288026064634323,
0.04547418653964996,
0.10818369686603546,
0.018338870257139206,
0.08306817710399628,
-0.07582731544971466,
-0.0975026786327362,
0.08664432168006897,
-0.05437614023685455,
0.04828573763370514,
0.07899069786071777,
-0.056447479873895645,
-0.04494690150022507,
-0.010934893041849136,
0.027035314589738846,
0.027013035491108894,
-0.029281044378876686,
0.0946148931980133,
0.058966320008039474,
-0.15586033463478088,
-0.06747250258922577,
-0.08429808169603348,
0.0682833269238472,
0.1164487823843956,
0.016988631337881088,
0.021587172523140907,
0.05213956907391548,
-0.059275493025779724,
0.043638356029987335,
-0.0009921214077621698,
-0.0920550525188446,
0.18140338361263275,
-0.04470067098736763,
0.005089788697659969,
0.04250900819897652,
0.19920296967029572,
-0.08717267215251923,
-0.06752292811870575,
-0.027556298300623894,
0.18577246367931366,
-0.0006164846708998084,
-0.05778563395142555,
0.1423584520816803,
0.12401222437620163,
0.022707361727952957,
-0.12672342360019684,
0.129730224609375,
-0.019684623926877975,
-0.13002389669418335,
0.06359220296144485,
-0.09354375302791595,
0.006958474405109882,
0.10246248543262482,
0.10743165761232376,
0.056675031781196594,
0.0011852486059069633,
-0.09039675444364548,
0.04822813719511032,
-0.03781730309128761,
-0.011422157287597656,
0.13563692569732666,
0.18105289340019226,
0.06460720300674438,
0.04805958271026611,
0.04313831776380539,
-0.013018669560551643,
-0.14447419345378876,
-0.05697193369269371,
-0.012633156962692738,
-0.15023992955684662,
0.04937959462404251,
-0.009169328957796097,
-0.09928125888109207,
0.09641775488853455,
-0.01006836723536253,
-0.006673278287053108,
0.07923700660467148,
0.01333026122301817,
-0.002064507920295,
0.01157980877906084,
-0.06554709374904633,
-0.01934894546866417,
-0.06671055406332016,
-0.012354501523077488,
-0.07975944876670837,
0.0034099838230758905,
-0.04934896156191826,
0.00005735251397709362,
-0.05289667472243309,
0.051146622747182846,
-0.03289005532860756,
-0.0928380936384201,
-0.09357743710279465,
0.026913749054074287,
-0.033559538424015045,
0.04248906672000885,
-0.051326457411050797,
-0.011873052455484867,
0.02716277353465557,
0.06964842230081558,
-0.04511361941695213,
-0.052529290318489075,
-0.033227626234292984,
0.1957830935716629,
-0.1522030234336853,
0.08954480290412903,
-0.09929025918245316,
-0.014090820215642452,
-0.1367579847574234,
0.22140489518642426,
0.22123484313488007,
-0.07246565073728561,
0.020845560356974602,
0.03915974497795105,
0.03463876619935036,
-0.08333735913038254,
0.11699242144823074,
0.08533380925655365,
0.23124109208583832,
-0.11800574511289597,
-0.0057122912257909775,
-0.026888610795140266,
-0.010866009630262852,
-0.009422282688319683,
-0.07842685282230377,
0.08077988028526306,
-0.05261831358075142,
-0.07852934300899506,
0.03991157189011574,
-0.24591998755931854,
0.13040877878665924,
0.05211104080080986,
-0.13160650432109833,
0.008251585066318512,
-0.06004827469587326,
0.14241768419742584,
0.04655798152089119,
0.18763059377670288,
-0.05716920644044876,
-0.05358101427555084,
0.036242175847291946,
-0.0011935881339013577,
-0.29105237126350403,
-0.05894295498728752,
0.054615311324596405,
-0.007832877337932587,
0.06843753904104233,
-0.053504157811403275,
-0.006990525405853987,
0.056181900203228,
-0.038314495235681534,
-0.04183096066117287,
0.07588478922843933,
-0.029905619099736214,
-0.15179653465747833,
-0.13853342831134796,
0.0865853875875473,
-0.027051826938986778,
-0.09998724609613419,
-0.016762780025601387,
-0.1603148877620697,
0.013752024620771408,
0.1123107522726059,
-0.00784268882125616,
-0.03401191532611847,
0.0017532811034470797,
-0.07797713577747345,
0.1093943789601326,
0.10418055206537247,
0.03599613532423973,
-0.027899324893951416,
-0.029421571642160416,
0.049994781613349915,
0.06531666964292526,
-0.03347017616033554,
-0.05891566723585129,
0.02683359384536743,
-0.062301795929670334,
0.07324828952550888,
-0.09533996134996414,
-0.17941531538963318,
-0.03639236092567444,
-0.16037997603416443,
-0.010836980305612087,
-0.06128809228539467,
-0.014206948690116405,
0.11362329870462418,
0.036523446440696716,
0.03213827684521675,
-0.17215906083583832,
0.07396464049816132,
0.043303489685058594,
-0.07331052422523499,
-0.10448521375656128
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my-new-model
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the xsum dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
- mixed_precision_training: Native AMP
### Framework versions
- Transformers 4.12.3
- Pytorch 1.9.1
- Datasets 1.15.1
- Tokenizers 0.10.3
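For reference, the hyperparameters listed above could be expressed with the `transformers` `TrainingArguments` API roughly as follows. This is a minimal sketch only: the output directory name is a hypothetical placeholder, and the exact `Trainer` setup of the original run is not documented in this card.

```python
from transformers import TrainingArguments

# Sketch of the training configuration listed above; the output directory
# name is a hypothetical placeholder, not taken from the original run.
training_args = TrainingArguments(
    output_dir="my-new-model",
    learning_rate=5e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
    fp16=True,  # mixed precision training (Native AMP)
)
```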
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["xsum"], "model-index": [{"name": "my-new-model", "results": []}]}
|
question-answering
|
aozorahime/my-new-model
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:xsum",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-xsum #license-apache-2.0 #endpoints_compatible #region-us
|
# my-new-model
This model is a fine-tuned version of bert-base-uncased on the xsum dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
- mixed_precision_training: Native AMP
### Framework versions
- Transformers 4.12.3
- Pytorch 1.9.1
- Datasets 1.15.1
- Tokenizers 0.10.3
|
[
"# my-new-model\n\nThis model is a fine-tuned version of bert-base-uncased on the xsum dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0\n- mixed_precision_training: Native AMP",
"### Framework versions\n\n- Transformers 4.12.3\n- Pytorch 1.9.1\n- Datasets 1.15.1\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-xsum #license-apache-2.0 #endpoints_compatible #region-us \n",
"# my-new-model\n\nThis model is a fine-tuned version of bert-base-uncased on the xsum dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0\n- mixed_precision_training: Native AMP",
"### Framework versions\n\n- Transformers 4.12.3\n- Pytorch 1.9.1\n- Datasets 1.15.1\n- Tokenizers 0.10.3"
] |
[
50,
31,
6,
12,
8,
3,
103,
31
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-xsum #license-apache-2.0 #endpoints_compatible #region-us \n# my-new-model\n\nThis model is a fine-tuned version of bert-base-uncased on the xsum dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0\n- mixed_precision_training: Native AMP### Framework versions\n\n- Transformers 4.12.3\n- Pytorch 1.9.1\n- Datasets 1.15.1\n- Tokenizers 0.10.3"
] |
[
-0.09891290962696075,
0.10254285484552383,
-0.0018051326042041183,
0.0431886725127697,
0.15423913300037384,
0.010519375093281269,
0.11267352104187012,
0.10221970081329346,
-0.10001213848590851,
0.051547303795814514,
0.0668063536286354,
0.017969004809856415,
0.03957400470972061,
0.10238628834486008,
-0.02563609555363655,
-0.28760382533073425,
0.049761589616537094,
0.05145692825317383,
-0.09075868874788284,
0.08857235312461853,
0.1226598471403122,
-0.10879462957382202,
0.06416782736778259,
0.05640702322125435,
-0.17868368327617645,
0.034831054508686066,
-0.02288104221224785,
-0.10058104991912842,
0.08531512320041656,
0.01825530268251896,
0.09888921678066254,
-0.0011057668598368764,
0.08698811382055283,
-0.17761319875717163,
0.005350941326469183,
0.07235324382781982,
0.044590964913368225,
0.09768624603748322,
0.06166394054889679,
0.028112033382058144,
0.1209796741604805,
-0.07380741089582443,
0.10303620994091034,
0.04472401365637779,
-0.06497729569673538,
-0.2515811622142792,
-0.07949283719062805,
0.10437655448913574,
0.09363597631454468,
0.1044890359044075,
0.014404548332095146,
0.16394227743148804,
-0.10007093846797943,
0.04207579419016838,
0.18808382749557495,
-0.2638053596019745,
-0.06591273099184036,
-0.01018149871379137,
0.06900124996900558,
0.033007122576236725,
-0.08906252682209015,
-0.038403984159231186,
0.04203362017869949,
0.04447680339217186,
0.09914290904998779,
-0.004361680243164301,
-0.04567999765276909,
-0.006351039279252291,
-0.14592967927455902,
-0.02954990603029728,
0.23333777487277985,
0.05610150098800659,
-0.07405943423509598,
-0.06419030576944351,
-0.037916187196969986,
-0.05422542244195938,
-0.033454008400440216,
-0.031300630420446396,
0.026521943509578705,
-0.04585674777626991,
-0.07579949498176575,
-0.06751276552677155,
-0.07458575069904327,
-0.07013369351625443,
-0.017312992364168167,
0.19071471691131592,
0.06931131333112717,
0.027762506157159805,
-0.0634879320859909,
0.08310606330633163,
-0.031342703849077225,
-0.10818573087453842,
-0.025230806320905685,
-0.030455462634563446,
-0.0420977845788002,
-0.044546905905008316,
-0.06869397312402725,
-0.021375682204961777,
0.0005001795361749828,
0.17342865467071533,
-0.06716454774141312,
0.04690618813037872,
0.0035974581260234118,
0.01017021480947733,
0.001831849804148078,
0.15333731472492218,
-0.06337304413318634,
0.04230932518839836,
0.015006124041974545,
0.07049164921045303,
0.012782210484147072,
-0.005517934914678335,
-0.08188552409410477,
-0.013931840658187866,
0.08644571155309677,
0.04970772936940193,
-0.026698710396885872,
0.020545197650790215,
-0.01783636584877968,
-0.06375223398208618,
0.022538671270012856,
-0.10847842693328857,
0.03700968250632286,
-0.006458066403865814,
-0.08662127703428268,
0.05043866112828255,
0.008927247487008572,
0.007022703532129526,
-0.06223614141345024,
0.050571639090776443,
-0.10758291184902191,
-0.01588311232626438,
-0.1095099076628685,
-0.08143515139818192,
0.029150066897273064,
-0.047589611262083054,
0.01694893278181553,
-0.09703216701745987,
-0.16114327311515808,
0.007523477077484131,
0.034529589116573334,
-0.03455810621380806,
-0.08010351657867432,
0.00434564333409071,
-0.05702786147594452,
0.016391772776842117,
-0.031379036605358124,
0.13266195356845856,
-0.03384651243686676,
0.06529580801725388,
0.03762243315577507,
-0.00370168499648571,
-0.026858918368816376,
0.06149215251207352,
-0.09396111220121384,
0.038836922496557236,
-0.12323927134275436,
0.037890221923589706,
-0.10886765271425247,
0.02170157991349697,
-0.11379928141832352,
-0.12274348735809326,
0.04274873435497284,
-0.012028837576508522,
0.08165288716554642,
0.09474760293960571,
-0.06801031529903412,
-0.05388748273253441,
0.09150400757789612,
-0.09158619493246078,
-0.10942619293928146,
0.09811969846487045,
-0.02659623511135578,
0.00349589460529387,
0.04696258530020714,
0.12119050323963165,
0.10105428099632263,
-0.11642878502607346,
-0.018568826839327812,
0.014619443565607071,
0.07690689712762833,
-0.04806183651089668,
0.09355731308460236,
0.0038951695896685123,
-0.06623023003339767,
0.03538160398602486,
-0.07790350914001465,
0.03522571548819542,
-0.08655688911676407,
-0.0763847604393959,
-0.06625593453645706,
-0.07750391215085983,
0.03672937676310539,
0.02945915423333645,
0.05387938395142555,
-0.07430389523506165,
-0.12034602463245392,
0.1583671271800995,
0.14294025301933289,
-0.050568558275699615,
0.0170676838606596,
-0.09594658762216568,
0.073050357401371,
-0.07108574360609055,
-0.012274108827114105,
-0.19951893389225006,
-0.11396446079015732,
0.0447223000228405,
-0.07879357039928436,
0.030015578493475914,
-0.0075803520157933235,
0.04401613026857376,
0.06422333419322968,
-0.04169708862900734,
-0.050202760845422745,
-0.13064374029636383,
-0.006999222561717033,
-0.09141963720321655,
-0.16494645178318024,
-0.0947709009051323,
-0.028371380642056465,
0.12326493859291077,
-0.16981497406959534,
0.022458193823695183,
-0.018554339185357094,
0.1347496062517166,
0.008553088642656803,
-0.020904215052723885,
-0.014511904679238796,
0.06632692366838455,
-0.016473522409796715,
-0.06290389597415924,
0.03684104233980179,
0.03183196485042572,
-0.0889986902475357,
-0.03250150755047798,
-0.10661258548498154,
0.11144666373729706,
0.09762676805257797,
0.010789555497467518,
-0.032161884009838104,
-0.03717772290110588,
-0.08105553686618805,
-0.04306333139538765,
-0.054502200335264206,
-0.003946657758206129,
0.17183652520179749,
0.008246483281254768,
0.13209767639636993,
-0.08070633560419083,
-0.042924948036670685,
0.022685790434479713,
-0.010616089217364788,
-0.03929498419165611,
0.05089815706014633,
0.08297628909349442,
-0.07162746787071228,
0.10126785188913345,
0.12446822971105576,
-0.10158740729093552,
0.10553188621997833,
-0.07626539468765259,
-0.10865115374326706,
-0.016231853514909744,
-0.01900663785636425,
-0.00871462281793356,
0.1306341588497162,
-0.15017828345298767,
0.008956411853432655,
0.05395010486245155,
0.024037117138504982,
0.06261882930994034,
-0.1647036075592041,
0.010330187156796455,
0.010820857249200344,
-0.005314046982675791,
-0.07124648243188858,
-0.007986243814229965,
0.03902418911457062,
0.08074802905321121,
0.03390884026885033,
-0.008137410506606102,
0.04481995478272438,
0.015571288764476776,
-0.07384362071752548,
0.20051708817481995,
-0.13037700951099396,
-0.1514166295528412,
-0.1553313136100769,
0.011187885887920856,
-0.09081382304430008,
-0.031832944601774216,
0.029821062460541725,
-0.07641957700252533,
-0.06415115296840668,
-0.05696600675582886,
0.029731417074799538,
-0.09579633921384811,
0.004413241986185312,
0.03995160758495331,
0.01525011844933033,
0.10333777219057083,
-0.13376882672309875,
0.01258599478751421,
-0.006509530823677778,
-0.08995196968317032,
-0.007284580264240503,
0.04890323057770729,
0.10598458349704742,
0.10873281955718994,
-0.017422441393136978,
0.01901986449956894,
-0.03530074656009674,
0.20604027807712555,
-0.050319790840148926,
-0.046031031757593155,
0.17204758524894714,
0.030037499964237213,
0.05938703939318657,
0.06980789452791214,
0.045244403183460236,
-0.057704467326402664,
0.023670895025134087,
0.05013154819607735,
-0.006938570644706488,
-0.2547397315502167,
-0.046933144330978394,
-0.033979885280132294,
-0.06295368075370789,
0.11111771315336227,
0.04772258549928665,
0.01944783329963684,
0.0891784131526947,
-0.040565259754657745,
0.05808514729142189,
-0.05632855370640755,
0.11703816056251526,
0.14453323185443878,
0.037578318268060684,
0.1072225421667099,
-0.02384479157626629,
-0.03580082952976227,
0.06577488034963608,
0.005920561030507088,
0.25257551670074463,
-0.015027984045445919,
0.081993468105793,
0.042903557419776917,
0.15740545094013214,
0.007451792247593403,
0.07283157110214233,
0.010185631923377514,
0.0029262083116918802,
0.004709264729171991,
-0.05409374460577965,
-0.026094015687704086,
0.014861758798360825,
0.0004913780721835792,
0.07931690663099289,
-0.1447625756263733,
-0.008799221366643906,
-0.009209346026182175,
0.2994346618652344,
0.007137936539947987,
-0.2982103228569031,
-0.11401497572660446,
0.013866334222257137,
-0.04524208977818489,
-0.10066728293895721,
0.02475772425532341,
0.0686512440443039,
-0.1336568295955658,
0.03887258470058441,
-0.05746803060173988,
0.10909893363714218,
0.014043610543012619,
-0.0015117910224944353,
0.0432250089943409,
0.16043686866760254,
0.011604535393416882,
0.1153707504272461,
-0.24638675153255463,
0.20234373211860657,
0.0163833349943161,
0.10318323224782944,
-0.05890897661447525,
0.045305147767066956,
0.006984700448811054,
0.07211963087320328,
0.05303872004151344,
-0.005705042742192745,
-0.04546412080526352,
-0.1647844910621643,
-0.08018285781145096,
0.052042748779058456,
0.08438931405544281,
-0.010561550967395306,
0.09989709407091141,
-0.045236267149448395,
0.02871733158826828,
0.049725789576768875,
-0.04555206745862961,
-0.1950235515832901,
-0.1288309395313263,
0.02753295935690403,
0.026755526661872864,
0.013638678938150406,
-0.11537271738052368,
-0.11743050068616867,
-0.0017047249712049961,
0.15013666450977325,
0.0044723618775606155,
-0.054677803069353104,
-0.1478642225265503,
0.07670710980892181,
0.141878142952919,
-0.06437338888645172,
0.019166629761457443,
-0.014726932160556316,
0.17520257830619812,
0.023050449788570404,
-0.1095166802406311,
0.06686816364526749,
-0.0917128399014473,
-0.15349708497524261,
-0.02094525471329689,
0.12554234266281128,
0.03726648539304733,
0.046457041054964066,
-0.006189865525811911,
0.006355279590934515,
-0.026519201695919037,
-0.09327788650989532,
-0.0062557305209338665,
0.06368157267570496,
0.06419336050748825,
0.07700219005346298,
-0.093257836997509,
0.04536823183298111,
-0.020486218854784966,
0.0030258626211434603,
0.12648427486419678,
0.1884065717458725,
-0.06444674730300903,
0.048070721328258514,
0.11480412632226944,
-0.07561328262090683,
-0.17098353803157806,
0.056096263229846954,
0.11039359122514725,
0.009950672276318073,
0.003201174782589078,
-0.2327761948108673,
0.13290727138519287,
0.11712579429149628,
-0.028225112706422806,
0.025604868307709694,
-0.277815043926239,
-0.09997391700744629,
0.12862306833267212,
0.11836841702461243,
0.07024358212947845,
-0.1443548947572708,
-0.035217881202697754,
-0.042300738394260406,
-0.12576201558113098,
0.1437273770570755,
-0.11218694597482681,
0.09034282714128494,
-0.03395659103989601,
0.10836204886436462,
0.022546488791704178,
-0.043155573308467865,
0.15632110834121704,
0.00237217266112566,
0.06847941875457764,
-0.026781070977449417,
0.061851195991039276,
0.14158771932125092,
-0.05738929659128189,
0.07498802989721298,
-0.010315615683794022,
0.055270612239837646,
-0.1682072877883911,
-0.024625547230243683,
-0.07702076435089111,
0.08133291453123093,
-0.04736432060599327,
-0.05238452926278114,
-0.027764014899730682,
0.030228212475776672,
0.035546042025089264,
-0.031513456255197525,
0.140287384390831,
0.058338090777397156,
0.12123917788267136,
0.08183931559324265,
0.114674873650074,
-0.05378507077693939,
-0.1348981410264969,
-0.006912593264132738,
-0.03460980951786041,
0.0991777554154396,
-0.10544751584529877,
0.024808557704091072,
0.11203271150588989,
0.03780239075422287,
0.11112352460622787,
0.060369014739990234,
-0.07175298035144806,
0.007533477619290352,
0.031509678810834885,
-0.10366509109735489,
-0.1609097272157669,
-0.054834019392728806,
0.09551718086004257,
-0.16137108206748962,
0.0576210543513298,
0.09997425973415375,
-0.08475416153669357,
-0.02727820910513401,
-0.0045553226955235004,
-0.006198869552463293,
-0.02275841496884823,
0.16225624084472656,
0.061181534081697464,
0.06846167147159576,
-0.09462332725524902,
0.13846707344055176,
0.07858964055776596,
-0.12293002754449844,
0.049583278596401215,
0.04262644797563553,
-0.06743182986974716,
-0.02899206057190895,
0.0338277667760849,
0.11838001012802124,
-0.09548303484916687,
-0.081100232899189,
-0.09873969852924347,
-0.13469499349594116,
0.0667387992143631,
0.07441151142120361,
0.055075258016586304,
-0.022866161540150642,
-0.049439627677202225,
0.04381101578474045,
-0.15475118160247803,
0.08362185209989548,
0.0311740655452013,
0.09273254126310349,
-0.1633981615304947,
0.09197114408016205,
0.014928564429283142,
0.07400843501091003,
-0.016176367178559303,
-0.02154768444597721,
-0.10142410546541214,
-0.006534785497933626,
-0.1966232806444168,
-0.02482447773218155,
-0.055466458201408386,
0.0074666282162070274,
-0.010664233937859535,
-0.050780072808265686,
-0.06477416306734085,
0.07000411301851273,
-0.08635256439447403,
-0.04601897671818733,
0.027481572702527046,
0.052827730774879456,
-0.11981784552335739,
0.013733688741922379,
0.03091273084282875,
-0.10473337769508362,
0.07443436980247498,
0.08118920028209686,
0.03343819081783295,
0.04576778784394264,
-0.03697749227285385,
-0.037044260650873184,
0.014894449152052402,
0.04090799018740654,
0.07244786620140076,
-0.11203762143850327,
0.008296560496091843,
0.0029258178547024727,
0.043420419096946716,
-0.0026154876686632633,
0.044265005737543106,
-0.13496632874011993,
-0.06917041540145874,
-0.0135478675365448,
-0.048159126192331314,
-0.07861271500587463,
0.03181052953004837,
0.09888577461242676,
0.0501408688724041,
0.1763531118631363,
-0.05620577186346054,
0.016022583469748497,
-0.17425982654094696,
0.0009611943387426436,
-0.03803199157118797,
-0.039770547300577164,
-0.05249080806970596,
-0.021114327013492584,
0.05364832654595375,
-0.05570381134748459,
0.1354835033416748,
-0.03688256815075874,
0.12202706187963486,
0.04498790577054024,
-0.05327347666025162,
0.02575475163757801,
0.015950676053762436,
0.2583365738391876,
0.07515028119087219,
-0.0005599646246992052,
0.10829632729291916,
0.0032319084275513887,
0.05980697646737099,
0.10450270771980286,
0.13971877098083496,
0.12058962136507034,
-0.008585470728576183,
0.087958425283432,
0.08626555651426315,
-0.07209859043359756,
-0.1347237080335617,
0.055256038904190063,
0.030475376173853874,
0.11054900288581848,
-0.02704920619726181,
0.19759896397590637,
0.12128539383411407,
-0.14180436730384827,
0.04566314443945885,
-0.03229604288935661,
-0.08616858720779419,
-0.10792479664087296,
-0.05357789620757103,
-0.07099387794733047,
-0.1584710329771042,
0.009842950850725174,
-0.14899015426635742,
-0.004559037741273642,
0.06302370876073837,
0.0008484940626658499,
-0.00434849364683032,
0.11635316908359528,
-0.008027118630707264,
-0.005101361311972141,
0.06478177011013031,
-0.011719397269189358,
-0.007882144302129745,
-0.024382496252655983,
-0.05738165229558945,
0.048920899629592896,
-0.005296010058373213,
0.07151956111192703,
-0.03146445378661156,
-0.02603301964700222,
0.0481342114508152,
-0.009140518493950367,
-0.07166456431150436,
0.020729463547468185,
0.002832187106832862,
0.03506606072187424,
0.05254625156521797,
0.01974877342581749,
-0.015265361405909061,
-0.044578298926353455,
0.27586498856544495,
-0.0910399779677391,
-0.056649237871170044,
-0.14259642362594604,
0.2586115300655365,
0.0411333367228508,
-0.03934000805020332,
0.0839158371090889,
-0.08621398359537125,
-0.052393317222595215,
0.1546122431755066,
0.12161113321781158,
-0.018799643963575363,
-0.033656567335128784,
-0.003019244410097599,
-0.03079642727971077,
-0.08964505791664124,
0.13372331857681274,
0.1227479949593544,
0.03899795934557915,
-0.05988446623086929,
-0.001868930528871715,
-0.04033491015434265,
-0.013283957727253437,
-0.11059771478176117,
0.07447414845228195,
0.02620142698287964,
-0.03164377436041832,
-0.029144197702407837,
0.05182802304625511,
-0.016622677445411682,
-0.10360149294137955,
-0.018682124093174934,
-0.09000585973262787,
-0.17367637157440186,
-0.046858880668878555,
0.05028602108359337,
-0.00291279680095613,
0.06750920414924622,
-0.02171497233211994,
0.0006152403075248003,
0.15263210237026215,
-0.00627749552950263,
-0.06740708649158478,
-0.12884382903575897,
0.1502847820520401,
-0.024082940071821213,
0.22147884964942932,
0.004276524763554335,
0.0697602704167366,
0.11312340199947357,
0.023358909413218498,
-0.16329218447208405,
0.020970752462744713,
0.062473636120557785,
-0.10353817790746689,
0.03379309922456741,
0.1587049514055252,
-0.030603455379605293,
0.09763529896736145,
0.004483464639633894,
-0.1358039379119873,
-0.04129571095108986,
-0.08075428009033203,
-0.013577514328062534,
-0.05788463354110718,
0.00217809877358377,
-0.08013537526130676,
0.13098470866680145,
0.19123998284339905,
-0.05622227117419243,
-0.04963607341051102,
-0.08929917961359024,
0.05060715228319168,
0.05803519859910011,
0.07425335794687271,
-0.0062856245785951614,
-0.2050168365240097,
0.00012990320101380348,
0.04237224534153938,
0.010694364085793495,
-0.2583938539028168,
-0.06413166224956512,
0.04407605156302452,
-0.04642738401889801,
-0.054366402328014374,
0.08253920823335648,
0.07441803067922592,
0.03560208901762962,
-0.05759838968515396,
-0.09237872064113617,
-0.08585802465677261,
0.14169584214687347,
-0.1511925756931305,
-0.04910542070865631
] |
null | null |
transformers
|
# Aladdin Bot
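A minimal usage sketch, assuming this checkpoint is a DialoGPT-style conversational GPT-2 as suggested by the repository tags; the model id is taken from the repo metadata, while the prompt and generation settings are illustrative assumptions.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Model id taken from the repository metadata; generation settings are illustrative.
tokenizer = AutoTokenizer.from_pretrained("aplnestrella/Aladdin-Bot")
model = AutoModelForCausalLM.from_pretrained("aplnestrella/Aladdin-Bot")

# Encode a single user turn, ending with the EOS token as in DialoGPT-style chat models.
prompt = "Hello, Aladdin!"
input_ids = tokenizer.encode(prompt + tokenizer.eos_token, return_tensors="pt")

# Generate a reply and decode only the newly generated tokens.
output_ids = model.generate(
    input_ids,
    max_length=100,
    pad_token_id=tokenizer.eos_token_id,
)
reply = tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True)
print(reply)
```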
|
{"tags": ["conversational"]}
|
text-generation
|
aplnestrella/Aladdin-Bot
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Aladdin Bot
|
[
"# Aladdin Bot"
] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Aladdin Bot"
] |
[
51,
4
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Aladdin Bot"
] |
[
0.03243407607078552,
0.000021639516489813104,
-0.00728259515017271,
0.06692399829626083,
0.15213626623153687,
-0.00957894790917635,
0.23337796330451965,
0.12555500864982605,
0.019445504993200302,
-0.06532620638608932,
0.1016397476196289,
0.15122202038764954,
0.061493340879678726,
0.0628434270620346,
-0.016163965687155724,
-0.2498691976070404,
0.05290469154715538,
0.04765111207962036,
-0.027462486177682877,
0.13394342362880707,
0.0924372598528862,
-0.006718708202242851,
0.09748352319002151,
0.008774857968091965,
-0.16664548218250275,
0.01766454428434372,
0.011518123559653759,
-0.15260952711105347,
0.12254969030618668,
0.08695031702518463,
0.07122154533863068,
0.004269007593393326,
-0.03036060370504856,
-0.1057964637875557,
0.04485193267464638,
-0.02661418356001377,
-0.03694780915975571,
0.006907603703439236,
0.026566000655293465,
-0.09650032967329025,
0.13984154164791107,
0.12651190161705017,
0.01563459448516369,
0.07022468745708466,
-0.12682947516441345,
0.039753858000040054,
0.017058996483683586,
0.06110960617661476,
0.11148946732282639,
0.09561353176832199,
-0.017211956903338432,
0.09072429686784744,
-0.05819253623485565,
0.10743677616119385,
0.11698240786790848,
-0.35691457986831665,
-0.04918570816516876,
0.0703040137887001,
0.07317676395177841,
0.07267177850008011,
-0.09925977885723114,
0.07055322825908661,
-0.0031248843297362328,
0.006373957265168428,
0.015765942633152008,
-0.12081485986709595,
-0.1110428050160408,
-0.01955210417509079,
-0.03160875290632248,
-0.0017452300526201725,
0.21861891448497772,
-0.04654191806912422,
0.03489900752902031,
-0.020169749855995178,
-0.04591589421033859,
-0.011677958071231842,
-0.04188370332121849,
0.02496904507279396,
-0.07423131167888641,
0.057142145931720734,
0.02993314526975155,
-0.09002865105867386,
-0.12164599448442459,
-0.021411623805761337,
-0.1461789906024933,
0.11474994570016861,
0.03695235028862953,
0.04363267868757248,
-0.24722687900066376,
0.02945740334689617,
0.0947413221001625,
-0.11187099665403366,
0.0267143864184618,
-0.07459341734647751,
0.022926146164536476,
0.020819857716560364,
-0.00851572584360838,
-0.03389840945601463,
0.04337175190448761,
0.11132711917161942,
0.009319460950791836,
0.09133675694465637,
-0.061073046177625656,
0.10941292345523834,
0.02693076618015766,
0.0786091759800911,
-0.007771820295602083,
-0.020544907078146935,
0.08831814676523209,
-0.03866364061832428,
0.0403556227684021,
-0.08837446570396423,
-0.16841743886470795,
0.022848980501294136,
0.003520771861076355,
0.03156028315424919,
-0.00791848823428154,
0.12779223918914795,
-0.024226302281022072,
0.0011426301207393408,
0.0227451641112566,
-0.0713086798787117,
-0.02073603682219982,
0.022842317819595337,
-0.06304028630256653,
0.02775687165558338,
0.030858589336276054,
0.013492245227098465,
-0.10426368564367294,
0.011339805088937283,
-0.06556406617164612,
0.05328410863876343,
-0.0001875594025477767,
-0.024102069437503815,
0.018193919211626053,
0.015698671340942383,
0.004059535451233387,
-0.16921554505825043,
-0.09199301153421402,
0.033721908926963806,
-0.008922317996621132,
-0.05354253947734833,
-0.07788923382759094,
-0.07108201086521149,
-0.014458373188972473,
0.054671257734298706,
-0.05301748961210251,
-0.05900529399514198,
-0.04938751831650734,
0.1231975257396698,
0.0002296627644682303,
0.14015205204486847,
-0.08238469809293747,
0.06717688590288162,
-0.07977043837308884,
-0.0035284084733575583,
-0.11168783903121948,
0.10120423883199692,
-0.013245423324406147,
0.13845549523830414,
-0.009250990115106106,
-0.0030412860214710236,
-0.0022255824878811836,
0.04008101299405098,
-0.0551266148686409,
0.20508576929569244,
-0.13064926862716675,
-0.13789887726306915,
0.268130362033844,
-0.030827118083834648,
-0.2230224907398224,
0.1273289918899536,
-0.000011769896445912309,
0.04544593766331673,
0.07015015184879303,
0.17250970005989075,
-0.08037054538726807,
0.05560284107923508,
0.054403651505708694,
0.11341545730829239,
-0.019039172679185867,
0.024561259895563126,
0.029510995373129845,
0.0724603682756424,
-0.0455925427377224,
0.01590551994740963,
0.18273398280143738,
0.13363121449947357,
-0.023011203855276108,
-0.038199737668037415,
0.043302156031131744,
-0.016418669372797012,
0.0614398755133152,
-0.0179470032453537,
0.15411266684532166,
-0.0418732613325119,
-0.05308276414871216,
-0.10019678622484207,
0.011216388083994389,
0.005299997515976429,
0.06530468910932541,
-0.12105629593133926,
0.06850013881921768,
0.06734120845794678,
0.07816102355718613,
-0.13455496728420258,
-0.0014509992906823754,
-0.05474252998828888,
0.0647750273346901,
0.07916371524333954,
0.013455200009047985,
0.029704291373491287,
0.0005192511598579586,
-0.018929164856672287,
0.043626707047224045,
0.132109597325325,
-0.01225607842206955,
-0.06098340451717377,
-0.11100620776414871,
0.11033175885677338,
-0.04498758167028427,
0.12819428741931915,
-0.008922307752072811,
0.040028322488069534,
0.00826721265912056,
0.1136159747838974,
-0.027022218331694603,
0.026251796633005142,
0.03876622021198273,
0.004477831069380045,
-0.0899946466088295,
-0.04163726046681404,
0.07452385872602463,
-0.009501681663095951,
-0.15228582918643951,
0.19830112159252167,
-0.20978714525699615,
0.1551739126443863,
0.20195648074150085,
-0.23774705827236176,
-0.016603250056505203,
0.024992847815155983,
-0.01043656561523676,
0.002472394611686468,
0.1060509905219078,
0.03888043761253357,
0.17767184972763062,
0.021000202745199203,
0.22675403952598572,
-0.015805957838892937,
-0.004322053398936987,
-0.03242973983287811,
-0.08608277887105942,
-0.011223340407013893,
0.08093731850385666,
0.12744134664535522,
-0.1372309923171997,
0.1947421431541443,
0.15835325419902802,
0.036750391125679016,
0.2046203315258026,
0.07381671667098999,
0.049015145748853683,
0.08149289339780807,
-0.030167341232299805,
-0.04390855133533478,
-0.027576614171266556,
-0.26865798234939575,
-0.047864463180303574,
0.038791876286268234,
-0.049133867025375366,
0.05117245391011238,
-0.09937577694654465,
-0.060565583407878876,
-0.031954534351825714,
0.017588213086128235,
0.0028486894443631172,
0.08514714986085892,
0.030119629576802254,
0.1701793521642685,
0.005074227228760719,
-0.09557577222585678,
0.04934128001332283,
0.011468697339296341,
-0.08026450872421265,
0.1689828783273697,
-0.12673017382621765,
-0.4258863925933838,
-0.06527898460626602,
-0.21053126454353333,
-0.08916758745908737,
0.05808761715888977,
0.10769058763980865,
-0.1439976841211319,
0.012366266921162605,
0.005270803347229958,
0.07101380079984665,
-0.08745928853750229,
0.0012131535913795233,
0.01916838437318802,
0.0016383593901991844,
-0.09066736698150635,
-0.07678520679473877,
-0.04727843776345253,
-0.00009179468906950206,
-0.12603269517421722,
0.16528312861919403,
-0.1183319091796875,
0.0039810664020478725,
0.16775302588939667,
0.03916139900684357,
0.05090755596756935,
-0.060812100768089294,
0.2212531715631485,
-0.12638725340366364,
0.017858847975730896,
0.13056546449661255,
-0.021125314757227898,
0.05612275004386902,
0.17195871472358704,
0.012071080505847931,
-0.10435162484645844,
0.018818791955709457,
-0.01207304373383522,
-0.06471285969018936,
-0.1672912836074829,
-0.05128121376037598,
-0.1284535825252533,
0.10953263193368912,
0.019332269206643105,
0.06951957195997238,
0.08744554966688156,
0.05321107804775238,
-0.05288533866405487,
0.10597015917301178,
0.05563877522945404,
0.06479669362306595,
0.2045837640762329,
-0.028043895959854126,
0.12211371213197708,
-0.04065495729446411,
-0.12000162154436111,
0.04864664375782013,
0.060000717639923096,
0.011868664994835854,
0.0663963183760643,
0.0126329455524683,
-0.021664300933480263,
0.031179042533040047,
0.12457367777824402,
0.04456816613674164,
0.03577124699950218,
-0.035047635436058044,
-0.03406280651688576,
-0.009562775492668152,
-0.130459725856781,
0.05707879737019539,
0.021039392799139023,
-0.15187042951583862,
0.01616428606212139,
0.02323991246521473,
0.08537538349628448,
0.04660702869296074,
0.01563543640077114,
-0.17491793632507324,
-0.020145492628216743,
0.10831935703754425,
-0.027740478515625,
-0.1334967315196991,
0.09163691848516464,
-0.009562394581735134,
-0.16541226208209991,
0.10700083523988724,
-0.024428073316812515,
0.1091926321387291,
-0.04394406080245972,
0.07732163369655609,
-0.09901333600282669,
-0.18485529720783234,
0.00766295101493597,
0.09612075984477997,
-0.38588106632232666,
0.14855441451072693,
-0.0059148650616407394,
-0.05538864806294441,
-0.06321947276592255,
0.0037180609069764614,
0.04468758776783943,
0.10215681791305542,
0.12741725146770477,
0.00013688358012586832,
0.0803956687450409,
-0.18240930140018463,
-0.054097458720207214,
0.04262901470065117,
0.08649098128080368,
-0.0458553172647953,
-0.025053666904568672,
-0.01764053851366043,
0.007663626689463854,
-0.0769505649805069,
-0.008623939007520676,
0.015122197568416595,
-0.19127823412418365,
0.11841011792421341,
0.03980402275919914,
0.06960686296224594,
0.008780216798186302,
-0.026761164888739586,
-0.01281428150832653,
0.18779779970645905,
-0.07612934708595276,
-0.12078684568405151,
-0.06716711074113846,
-0.016121791675686836,
-0.01785164698958397,
-0.0735326036810875,
0.002271049190312624,
-0.11426741629838943,
0.021489592269062996,
-0.10436588525772095,
-0.1609496921300888,
0.09620053321123123,
-0.05875973775982857,
-0.006028206553310156,
-0.026951272040605545,
0.14660872519016266,
-0.053308211266994476,
0.06589344888925552,
0.0255772452801466,
-0.03908020630478859,
-0.07394690066576004,
-0.0799272358417511,
0.0054723178036510944,
-0.0572156123816967,
-0.06680281460285187,
-0.031953778117895126,
-0.011832368560135365,
-0.05358956381678581,
-0.11227399855852127,
-0.046480514109134674,
0.29821527004241943,
0.1868896335363388,
0.00805188249796629,
0.1663476973772049,
0.0900452509522438,
-0.029930567368865013,
-0.25716134905815125,
-0.175254687666893,
-0.1079254150390625,
-0.046478744596242905,
-0.07220868021249771,
-0.2292962521314621,
0.017511900514364243,
-0.10843773186206818,
-0.01059598196297884,
0.09957863390445709,
-0.265788197517395,
-0.11170767992734909,
0.20072096586227417,
-0.006167161278426647,
0.3892832100391388,
-0.16636130213737488,
-0.03649231046438217,
-0.06909564137458801,
-0.07411271333694458,
0.12933433055877686,
0.07117795199155807,
0.13847430050373077,
-0.04111747443675995,
0.18293772637844086,
0.012680799700319767,
0.03624395653605461,
0.068659707903862,
-0.015455365180969238,
-0.050911713391542435,
-0.1145886555314064,
-0.06352460384368896,
-0.01079098042100668,
0.030587224289774895,
-0.03672579675912857,
-0.09797748923301697,
-0.04414442181587219,
-0.17056086659431458,
-0.01645548827946186,
-0.07259179651737213,
0.020779743790626526,
0.030259516090154648,
-0.04124659672379494,
-0.01295030489563942,
-0.01775387115776539,
-0.013597154058516026,
0.027991440147161484,
0.18150636553764343,
-0.043287359178066254,
0.1742011308670044,
0.02431008778512478,
0.08100015670061111,
-0.17284512519836426,
0.060965362936258316,
-0.04781986400485039,
-0.015477021224796772,
0.07038820534944534,
-0.15387465059757233,
0.014222756959497929,
0.08479619771242142,
-0.04357128590345383,
0.1067517027258873,
0.08393752574920654,
0.02519105188548565,
0.06459605693817139,
0.16290442645549774,
-0.21499286592006683,
-0.0670604407787323,
-0.02875501848757267,
0.11616478860378265,
0.12784002721309662,
0.0022346104960888624,
0.15619869530200958,
-0.032740622758865356,
-0.06666148453950882,
0.010681657120585442,
0.02402949146926403,
-0.008511344902217388,
0.04734830930829048,
-0.037163589149713516,
0.02781090885400772,
-0.14368455111980438,
0.09026860445737839,
0.03801341354846954,
-0.1557031124830246,
0.05174240842461586,
0.21992863714694977,
-0.11249441653490067,
-0.11000873148441315,
-0.09498535096645355,
0.07573597878217697,
-0.10232709348201752,
0.027865178883075714,
-0.032937679439783096,
-0.14221404492855072,
0.018086643889546394,
0.12440360337495804,
0.026917632669210434,
0.06331939250230789,
-0.07222729176282883,
-0.027236737310886383,
0.01925749145448208,
0.002803887240588665,
0.02674179896712303,
-0.02655913680791855,
-0.03184782341122627,
0.03129572048783302,
0.019741959869861603,
0.18500357866287231,
-0.08244449645280838,
-0.1089986190199852,
-0.12917660176753998,
0.054291173815727234,
-0.14318901300430298,
-0.057769112288951874,
-0.11673539131879807,
-0.027171187102794647,
-0.022182201966643333,
-0.047591809183359146,
-0.06577682495117188,
-0.09809824079275131,
-0.10338776558637619,
0.0068886736407876015,
-0.06465700268745422,
0.026853768154978752,
-0.12631885707378387,
0.011329136788845062,
0.11597243696451187,
-0.05438593029975891,
0.16958047449588776,
0.17149806022644043,
-0.10563112050294876,
0.08502273261547089,
-0.1473504900932312,
-0.11299342662096024,
0.07225281000137329,
0.011402432806789875,
0.04624446853995323,
0.12390163540840149,
0.014747824519872665,
0.01786835491657257,
0.05333774909377098,
0.059360675513744354,
0.06769594550132751,
-0.07654081284999847,
-0.008715818636119366,
-0.045314278453588486,
-0.0854983702301979,
-0.03230653703212738,
-0.012059845961630344,
0.06623809039592743,
0.009407350793480873,
0.05816361680626869,
-0.055675216019153595,
0.08417366445064545,
-0.026932889595627785,
0.030689913779497147,
0.021680884063243866,
-0.16238011419773102,
-0.006857534404844046,
-0.1295361965894699,
0.027514806017279625,
-0.024722188711166382,
0.15247005224227905,
0.04296011105179787,
0.006967922672629356,
0.06414750963449478,
0.1171678751707077,
-0.022582413628697395,
-0.0023292554542422295,
0.05774206295609474,
0.1154438853263855,
-0.0675843358039856,
-0.031119277700781822,
0.026757122948765755,
0.07008391618728638,
0.07562127709388733,
0.0937512218952179,
-0.028302617371082306,
-0.04073963686823845,
0.05934188887476921,
-0.058753613382577896,
0.023251622915267944,
-0.08683300763368607,
-0.12230753153562546,
-0.054312337189912796,
0.025848396122455597,
-0.050373777747154236,
0.09078274667263031,
0.19534547626972198,
0.03001718968153,
0.0057357908226549625,
-0.08093973249197006,
-0.062064457684755325,
-0.13381250202655792,
-0.06581059843301773,
-0.04808777570724487,
-0.0763494148850441,
0.03235112503170967,
-0.08067352324724197,
0.05420713499188423,
0.05999305844306946,
0.09942309558391571,
-0.058052148669958115,
0.14475655555725098,
0.05807025730609894,
-0.10767266899347305,
0.07280439883470535,
-0.002969359513372183,
0.03315140679478645,
-0.03655875101685524,
0.02014821395277977,
-0.06987880170345306,
0.019993526861071587,
0.04377894103527069,
0.0854121744632721,
-0.06533212959766388,
0.06746973842382431,
-0.18154747784137726,
-0.1098192036151886,
-0.059395771473646164,
0.03631339967250824,
-0.051388949155807495,
0.13146202266216278,
0.008540050126612186,
0.02352604828774929,
0.004056752659380436,
0.2725054919719696,
-0.026150483638048172,
-0.022047383710741997,
-0.03064841404557228,
0.14349085092544556,
-0.022776959463953972,
0.03541009873151779,
-0.06773095577955246,
0.04262269288301468,
-0.14489620923995972,
0.3413310647010803,
0.24778419733047485,
-0.1709856241941452,
-0.014107874594628811,
-0.059266071766614914,
0.071696937084198,
0.058107879012823105,
0.09247202426195145,
0.1360362470149994,
0.24102802574634552,
-0.12432587891817093,
-0.016417693346738815,
-0.05260203406214714,
-0.04663021117448807,
-0.07500207424163818,
0.08358201384544373,
0.08846033364534378,
-0.0064638652838766575,
-0.07850957661867142,
0.06233717501163483,
-0.2709263265132904,
0.05577068030834198,
-0.12287355214357376,
-0.1709645837545395,
-0.03737100958824158,
0.024436039850115776,
0.08884715288877487,
0.06009827181696892,
0.07239589095115662,
0.038585808128118515,
-0.07405420392751694,
0.008073944598436356,
0.030815623700618744,
-0.18785469233989716,
-0.01435408927500248,
0.0991932824254036,
-0.12031781673431396,
0.06093217432498932,
-0.024113425984978676,
0.02606036886572838,
0.04472466558218002,
0.09042179584503174,
0.01061752624809742,
0.041321318596601486,
0.02977737970650196,
0.04681635648012161,
-0.029170997440814972,
0.13605453073978424,
0.02793808840215206,
-0.05501317232847214,
0.08998849242925644,
0.00018030707724392414,
0.039701174944639206,
-0.07082700729370117,
0.00521848862990737,
0.04420536011457443,
0.03875603899359703,
-0.05680922046303749,
0.037449613213539124,
0.09941021353006363,
-0.04150751605629921,
0.007235556375235319,
-0.04244238883256912,
-0.07784814387559891,
-0.026562638580799103,
-0.14921198785305023,
-0.13686156272888184,
-0.1639532893896103,
-0.10263901203870773,
-0.03853241354227066,
0.017362121492624283,
-0.1755923181772232,
0.026260217651724815,
-0.13203272223472595,
0.08070380985736847,
-0.1467827707529068,
0.11416751891374588,
0.04853696748614311,
0.033156268298625946,
-0.03397738188505173,
-0.08807019889354706,
0.05975784733891487,
0.07428047060966492,
-0.10753463208675385,
-0.10723346471786499
] |
null | null |
transformers
|
## DALL·E mini - Generate images from text
<img style="text-align:center; display:block;" src="https://raw.githubusercontent.com/borisdayma/dalle-mini/main/img/logo.png" width="200">
* [Technical Report](https://wandb.ai/dalle-mini/dalle-mini/reports/DALL-E-mini--Vmlldzo4NjIxODA)
* [Demo](https://huggingface.co/spaces/flax-community/dalle-mini)
### Model Description
This is an attempt to replicate OpenAI's [DALL·E](https://openai.com/blog/dall-e/), a model capable of generating arbitrary images from a text prompt that describes the desired result.

This model's architecture is a simplification of the original, and leverages previous open source efforts and available pre-trained models. Results have lower quality than OpenAI's, but the model can be trained and used on less demanding hardware. Our training was performed on a single TPU v3-8 for a few days.
### Components of the Architecture
The system relies on the Flax/JAX infrastructure, which is ideal for TPU training. TPUs are not required, however; both Flax and JAX run very efficiently on GPU backends.
The main components of the architecture include:
* An encoder, based on [BART](https://arxiv.org/abs/1910.13461). The encoder transforms a sequence of input text tokens to a sequence of image tokens. The input tokens are extracted from the text prompt by using the model's tokenizer. The image tokens are a fixed-length sequence, and they represent indices in a VQGAN-based pre-trained codebook.
* A decoder, which converts the image tokens to image pixels. As mentioned above, the decoder is based on a [VQGAN model](https://compvis.github.io/taming-transformers/).
The model definition we use for the encoder can be downloaded from our [Github repo](https://github.com/borisdayma/dalle-mini). The encoder is represented by the class `CustomFlaxBartForConditionalGeneration`.
To use the decoder, you need to follow the instructions in our accompanying VQGAN model in the hub, [flax-community/vqgan_f16_16384](https://huggingface.co/flax-community/vqgan_f16_16384).
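For orientation, here is a hedged sketch of how these two stages could be wired together at inference time. The import paths, the encoder checkpoint id, and the `decode_code` call are assumptions for illustration only; the authoritative loading code is in the repo and in the inference notebook linked below.
```
# Hedged sketch of the text -> image-token -> pixel pipeline described above.
# Import paths, the encoder checkpoint id, and the decode call are assumptions;
# see the dalle-mini repo and the inference notebook for the actual code.
from transformers import AutoTokenizer
from dalle_mini.model import CustomFlaxBartForConditionalGeneration  # assumed import path
from vqgan_jax.modeling_flax_vqgan import VQModel                     # assumed import path

ENCODER_ID = "flax-community/dalle-mini"       # assumed checkpoint id
DECODER_ID = "flax-community/vqgan_f16_16384"  # VQGAN decoder referenced above

tokenizer = AutoTokenizer.from_pretrained(ENCODER_ID)  # turns the prompt into input tokens
encoder = CustomFlaxBartForConditionalGeneration.from_pretrained(ENCODER_ID)
decoder = VQModel.from_pretrained(DECODER_ID)

prompt = "a watercolor painting of a lighthouse at sunset"
inputs = tokenizer(prompt, return_tensors="jax")

# Stage 1: sample a fixed-length sequence of VQGAN codebook indices from the BART encoder.
image_tokens = encoder.generate(**inputs, do_sample=True).sequences[:, 1:]

# Stage 2: decode the codebook indices into image pixels with the VQGAN.
images = decoder.decode_code(image_tokens)  # assumed method name
```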
### How to Use
The easiest way to get familiar with the code and the models is to follow the inference notebook we provide in our [github repo](https://github.com/borisdayma/dalle-mini/blob/main/dev/inference/inference_pipeline.ipynb). For your convenience, you can open it in Google Colaboratory: [](https://colab.research.google.com/github/borisdayma/dalle-mini/blob/main/dev/inference/inference_pipeline.ipynb)
If you just want to test the trained model and see what it comes up with, please visit [our demo](https://huggingface.co/spaces/flax-community/dalle-mini), available in 🤗 Spaces.
### Additional Details
Our [report](https://wandb.ai/dalle-mini/dalle-mini/reports/DALL-E-mini--Vmlldzo4NjIxODA) contains more details about how the model was trained and shows many examples that demonstrate its capabilities.
|
{"language": ["en"], "pipeline_tag": "text-to-image", "inference": false}
|
text-to-image
|
apol/dalle-mini
|
[
"transformers",
"jax",
"bart",
"text2text-generation",
"text-to-image",
"en",
"arxiv:1910.13461",
"autotrain_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1910.13461"
] |
[
"en"
] |
TAGS
#transformers #jax #bart #text2text-generation #text-to-image #en #arxiv-1910.13461 #autotrain_compatible #region-us
|
## DALL·E mini - Generate images from text
<img style="text-align:center; display:block;" src="URL" width="200">
* Technical Report
* Demo
### Model Description
This is an attempt to replicate OpenAI's DALL·E, a model capable of generating arbitrary images from a text prompt that describes the desired result.
!DALL·E mini demo screenshot
This model's architecture is a simplification of the original, and leverages previous open source efforts and available pre-trained models. Results have lower quality than OpenAI's, but the model can be trained and used on less demanding hardware. Our training was performed on a single TPU v3-8 for a few days.
### Components of the Architecture
The system relies on the Flax/JAX infrastructure, which is ideal for TPU training. TPUs are not required, however; both Flax and JAX run very efficiently on GPU backends.
The main components of the architecture include:
* An encoder, based on BART. The encoder transforms a sequence of input text tokens to a sequence of image tokens. The input tokens are extracted from the text prompt by using the model's tokenizer. The image tokens are a fixed-length sequence, and they represent indices in a VQGAN-based pre-trained codebook.
* A decoder, which converts the image tokens to image pixels. As mentioned above, the decoder is based on a VQGAN model.
The model definition we use for the encoder can be downloaded from our Github repo. The encoder is represented by the class 'CustomFlaxBartForConditionalGeneration'.
To use the decoder, you need to follow the instructions in our accompanying VQGAN model in the hub, flax-community/vqgan_f16_16384.
### How to Use
The easiest way to get familiar with the code and the models is to follow the inference notebook we provide in our github repo. For your convenience, you can open it in Google Colaboratory.
Entity representation was set to the title, and description was used to disambiguate if 2 entities had the same title. If still no disambiguation was possible, we used the wikidata ID (eg. Q123456).
We trained the model on WikiKG90Mv2 for approx 1.5 epochs on 4x1080Ti GPUs. The training time for 1 epoch was approx 5.5 days.
To evaluate the model, we sample 300 times from the decoder for each input (s,r) pair. We then remove predictions which do not map back to a valid entity, and then rank the predictions by their log probabilities. Filtering was performed subsequently. We achieve 0.22 validation MRR (the full leaderboard is here https://ogb.stanford.edu/docs/lsc/leaderboards/#wikikg90mv2)
You can try the following code in an IPython notebook to evaluate the pre-trained model. The full procedure of mapping entities to ids, filtering, etc. is not included here for the sake of simplicity, but can be provided on request if needed. Please contact Apoorv ([email protected]) for clarifications/details.
---------
```
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
tokenizer = AutoTokenizer.from_pretrained("apoorvumang/kgt5-wikikg90mv2")
model = AutoModelForSeq2SeqLM.from_pretrained("apoorvumang/kgt5-wikikg90mv2")
```
```
import torch
def getScores(ids, scores, pad_token_id):
    """get sequence scores from model.generate output"""
    scores = torch.stack(scores, dim=1)
    log_probs = torch.log_softmax(scores, dim=2)
    # remove start token
    ids = ids[:, 1:]
    # gather needed probs
    x = ids.unsqueeze(-1).expand(log_probs.shape)
    needed_logits = torch.gather(log_probs, 2, x)
    final_logits = needed_logits[:, :, 0]
    padded_mask = (ids == pad_token_id)
    final_logits[padded_mask] = 0
    final_scores = final_logits.sum(dim=-1)
    return final_scores.cpu().detach().numpy()


def topkSample(input, model, tokenizer,
               num_samples=5,
               num_beams=1,
               max_output_length=30):
    tokenized = tokenizer(input, return_tensors="pt")
    out = model.generate(**tokenized,
                         do_sample=True,
                         num_return_sequences=num_samples,
                         num_beams=num_beams,
                         eos_token_id=tokenizer.eos_token_id,
                         pad_token_id=tokenizer.pad_token_id,
                         output_scores=True,
                         return_dict_in_generate=True,
                         max_length=max_output_length)
    out_tokens = out.sequences
    out_str = tokenizer.batch_decode(out_tokens, skip_special_tokens=True)
    out_scores = getScores(out_tokens, out.scores, tokenizer.pad_token_id)
    pair_list = [(x[0], x[1]) for x in zip(out_str, out_scores)]
    sorted_pair_list = sorted(pair_list, key=lambda x: x[1], reverse=True)
    return sorted_pair_list


def greedyPredict(input, model, tokenizer):
    input_ids = tokenizer([input], return_tensors="pt").input_ids
    out_tokens = model.generate(input_ids)
    out_str = tokenizer.batch_decode(out_tokens, skip_special_tokens=True)
    return out_str[0]
```
```
# an example from validation set that the model predicts correctly
# you can try your own examples here. what's your noble title?
input = "Sophie Valdemarsdottir| noble title"
out = topkSample(input, model, tokenizer, num_samples=5)
out
```
You can further load the list of entity aliases, filter only those predictions which are valid entities, and then create a reverse mapping from alias -> integer id to get the final predictions in the required format.
However, loading these aliases in memory as a dictionary requires a lot of RAM, and you need to download the aliases file (made available here: https://storage.googleapis.com/kgt5-wikikg90mv2/ent_alias_list.pickle) (relation file: https://storage.googleapis.com/kgt5-wikikg90mv2/rel_alias_list.pickle).
The submitted validation/test results were obtained by sampling 300 times for each input, then applying the above procedure, followed by filtering to known entities. The final MRR can vary slightly due to this sampling nature (we found that although beam search gives deterministic output, the results are inferior to sampling a large number of times).
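As a rough illustration, the sketch below shows one way this filtering and reverse mapping could be done. It assumes the pickle file simply holds a list of alias strings indexed by entity id; the actual layout of `ent_alias_list.pickle` may differ.
```
# Hedged sketch: keep only sampled predictions that are known entity aliases and
# map them back to integer ids. The pickle layout (a list indexed by entity id)
# is an assumption; adapt to the real contents of ent_alias_list.pickle.
import pickle

with open("ent_alias_list.pickle", "rb") as f:
    ent_alias_list = pickle.load(f)

alias_to_id = {alias: idx for idx, alias in enumerate(ent_alias_list)}

def filter_and_rank(predictions, topk=10):
    """predictions: the (string, score) list returned by topkSample, best first."""
    seen, ranked_ids = set(), []
    for alias, _score in predictions:
        ent_id = alias_to_id.get(alias)
        if ent_id is not None and ent_id not in seen:
            seen.add(ent_id)
            ranked_ids.append(ent_id)
        if len(ranked_ids) == topk:
            break
    return ranked_ids
```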
```
# download valid.txt. you can also try same url with test.txt. however test does not contain the correct tails
!wget https://storage.googleapis.com/kgt5-wikikg90mv2/valid.txt
```
```
fname = 'valid.txt'
valid_lines = []
f = open(fname)
for line in f:
    valid_lines.append(line.rstrip())
f.close()
print(valid_lines[0])
```
```
from tqdm.auto import tqdm
# try unfiltered hits@k. this is approximation since model can sample same seq multiple times
# you should run this on gpu if you want to evaluate on all points with 300 samples each
k = 1
count_at_k = 0
max_predictions = k
max_points = 1000
for line in tqdm(valid_lines[:max_points]):
    input, target = line.split('\t')
    model_output = topkSample(input, model, tokenizer, num_samples=max_predictions)
    prediction_strings = [x[0] for x in model_output]
    if target in prediction_strings:
        count_at_k += 1
print('Hits at {0} unfiltered: {1}'.format(k, count_at_k/max_points))
```
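For reference, the reported MRR corresponds to a mean reciprocal rank over the filtered, ranked predictions; a sketch of the metric (not the official OGB-LSC evaluator) looks like this:
```
# Sketch of mean reciprocal rank over ranked predictions; illustrative only,
# the submitted numbers come from the official OGB-LSC evaluator.
def reciprocal_rank(ranked_preds, target):
    for rank, pred in enumerate(ranked_preds, start=1):
        if pred == target:
            return 1.0 / rank
    return 0.0

def mean_reciprocal_rank(all_ranked_preds, all_targets):
    rrs = [reciprocal_rank(p, t) for p, t in zip(all_ranked_preds, all_targets)]
    return sum(rrs) / len(rrs)
```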
|
{"license": "mit", "widget": [{"text": "Apoorv Umang Saxena| family name", "example_title": "Family name prediction"}, {"text": "Apoorv Saxena| country", "example_title": "Country prediction"}, {"text": "World War 2| followed by", "example_title": "followed by"}]}
|
text2text-generation
|
apoorvumang/kgt5-wikikg90mv2
|
[
"transformers",
"pytorch",
"tf",
"t5",
"text2text-generation",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tf #t5 #text2text-generation #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
This is a t5-small model trained from scratch on WikiKG90Mv2 dataset. Please see URL for more details on the method.
This model was trained on the tail entity prediction task ie. given subject entity and relation, predict the object entity. Input should be provided in the form of "\<entity text\>| \<relation text\>".
We used the raw text title and descriptions to get entity and relation textual representations. These raw texts were obtained from ogb dataset itself (dataset/wikikg90m-v2/mapping/URL and URL). Entity representation was set to the title, and description was used to disambiguate if 2 entities had the same title. If still no disambiguation was possible, we used the wikidata ID (eg. Q123456).
We trained the model on WikiKG90Mv2 for approx 1.5 epochs on 4x1080Ti GPUs. The training time for 1 epoch was approx 5.5 days.
To evaluate the model, we sample 300 times from the decoder for each input (s,r) pair. We then remove predictions which do not map back to a valid entity, and then rank the predictions by their log probabilities. Filtering was performed subsequently. We achieve 0.22 validation MRR (the full leaderboard is here URL).
You can try the following code in an IPython notebook to evaluate the pre-trained model. The full procedure of mapping entities to ids, filtering, etc. is not included here for the sake of simplicity, but can be provided on request if needed. Please contact Apoorv (apoorvumang@URL) for clarifications/details.
---------
You can further load the list of entity aliases, filter only those predictions which are valid entities, and then create a reverse mapping from alias -> integer id to get the final predictions in the required format.
However, loading these aliases in memory as a dictionary requires a lot of RAM, and you need to download the aliases file (made available here: URL) (relation file: URL).
The submitted validation/test results were obtained by sampling 300 times for each input, then applying the above procedure, followed by filtering to known entities. The final MRR can vary slightly due to this sampling nature (we found that although beam search gives deterministic output, the results are inferior to sampling a large number of times).
|
[] |
[
"TAGS\n#transformers #pytorch #tf #t5 #text2text-generation #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
56
] |
[
"passage: TAGS\n#transformers #pytorch #tf #t5 #text2text-generation #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
-0.010400908999145031,
0.030327240005135536,
-0.005618195049464703,
0.038645707070827484,
0.14778487384319305,
0.024248642846941948,
0.15961895883083344,
0.14133785665035248,
-0.011818120256066322,
-0.059390369802713394,
0.13366489112377167,
0.23089699447155,
0.007583661936223507,
0.07121007889509201,
-0.09568629413843155,
-0.25710058212280273,
0.033028725534677505,
0.06743486225605011,
0.021591806784272194,
0.1207289919257164,
0.1015511080622673,
-0.036579713225364685,
0.09400439262390137,
-0.022736165672540665,
-0.16881726682186127,
0.035969119518995285,
0.07726415991783142,
-0.13076528906822205,
0.12795425951480865,
0.07151467353105545,
0.0706375315785408,
0.08211265504360199,
-0.029585719108581543,
-0.1197257936000824,
0.023396320641040802,
0.014435168355703354,
-0.10374955832958221,
0.06444274634122849,
0.11291953176259995,
-0.04454924911260605,
0.16197100281715393,
0.08684632182121277,
-0.021068492904305458,
0.08300434798002243,
-0.13671883940696716,
-0.05939897149801254,
-0.05603726580739021,
0.07527060806751251,
0.05484833940863609,
0.07927528768777847,
0.019192568957805634,
0.13888368010520935,
-0.07351943105459213,
0.1146133616566658,
0.12993298470973969,
-0.3674393594264984,
0.015446045435965061,
0.08125357329845428,
0.06020224094390869,
0.04111756384372711,
-0.027324270457029343,
0.05174962058663368,
0.04898485168814659,
0.02728346176445484,
0.049616698175668716,
-0.08038555830717087,
-0.12992750108242035,
0.047101475298404694,
-0.07893506437540054,
-0.06356924027204514,
0.2358061820268631,
-0.04660964012145996,
0.027520179748535156,
-0.0204460509121418,
-0.09524895250797272,
-0.03710542619228363,
0.0066903806291520596,
0.0005931849009357393,
-0.03096538782119751,
0.08331397920846939,
0.02135743759572506,
-0.06664540618658066,
-0.1460975706577301,
-0.01516916137188673,
-0.20698100328445435,
0.08419211208820343,
0.010577467270195484,
0.05976402387022972,
-0.20168955624103546,
0.09773848950862885,
0.03381146490573883,
-0.09874244779348373,
0.021026980131864548,
-0.08033237606287003,
0.0530795194208622,
-0.024766409769654274,
-0.05143652856349945,
-0.08845679461956024,
0.06919810175895691,
0.1450580656528473,
-0.02018275111913681,
-0.00527442479506135,
-0.09371627122163773,
0.09627054631710052,
-0.011579813435673714,
0.043491180986166,
0.00547531433403492,
0.004193686414510012,
0.061188243329524994,
-0.14518293738365173,
-0.0013208419550210238,
-0.048872996121644974,
-0.18200036883354187,
-0.059769049286842346,
0.042724769562482834,
0.09719907492399216,
0.01711413450539112,
0.09855630993843079,
-0.025288229808211327,
-0.01701083406805992,
0.07007258385419846,
-0.07246914505958557,
-0.019327200949192047,
-0.002194150583818555,
0.02371346950531006,
0.10351449996232986,
0.0387425422668457,
0.00494269048795104,
-0.13325008749961853,
0.06392340362071991,
-0.07529893517494202,
-0.03093566931784153,
-0.0176913570612669,
-0.0715644583106041,
0.05734272301197052,
-0.1141379326581955,
0.016190724447369576,
-0.17351391911506653,
-0.18522216379642487,
0.04077652841806412,
0.019887162372469902,
-0.024783186614513397,
-0.05961276590824127,
0.0019211582839488983,
-0.057216860353946686,
0.03374217078089714,
-0.07581625133752823,
0.009048227220773697,
-0.05486573278903961,
0.11557752639055252,
-0.07560385763645172,
0.030965346843004227,
-0.1768040806055069,
0.07262719422578812,
-0.125581756234169,
-0.025136105716228485,
-0.03239711374044418,
0.033603694289922714,
0.00965122226625681,
0.1116521805524826,
-0.039587412029504776,
-0.04899122565984726,
-0.04370981827378273,
0.030037103220820427,
-0.037949252873659134,
0.16890570521354675,
-0.11255815625190735,
-0.08541524410247803,
0.20706135034561157,
-0.11086109280586243,
-0.2008030265569687,
0.07709459960460663,
0.01013500988483429,
0.0776555985212326,
0.08499190211296082,
0.18138065934181213,
0.06290482729673386,
-0.05328071117401123,
0.09534000605344772,
0.12149759382009506,
-0.1187533438205719,
-0.1451372504234314,
0.026577258482575417,
-0.03082885779440403,
-0.10573317110538483,
0.034201350063085556,
0.03422963619232178,
0.07680460065603256,
-0.03920114412903786,
-0.03932051360607147,
-0.0494161993265152,
-0.004555493593215942,
0.049948085099458694,
-0.0011641195742413402,
0.10668271034955978,
-0.06128877401351929,
-0.013547565788030624,
0.044043317437171936,
-0.031710393726825714,
-0.007264792453497648,
0.04655448719859123,
-0.04715730994939804,
0.10840078443288803,
0.02221592329442501,
0.03871484473347664,
-0.15146499872207642,
-0.061904069036245346,
-0.005645389202982187,
0.1227952316403389,
0.025874223560094833,
0.11137690395116806,
0.027458855882287025,
-0.010664556175470352,
-0.022433316335082054,
0.0008893950725905597,
0.13977749645709991,
0.011990252882242203,
-0.04680490121245384,
-0.09371480345726013,
0.039856500923633575,
-0.04271066188812256,
-0.04274376481771469,
-0.06241343542933464,
0.024221500381827354,
0.062386978417634964,
0.09842299669981003,
-0.0026752492412924767,
0.07525410503149033,
-0.03790257126092911,
0.01100082602351904,
-0.08951202034950256,
-0.003793126903474331,
0.11490074545145035,
0.025126071646809578,
-0.06864841282367706,
0.24196578562259674,
-0.15623220801353455,
0.24114659428596497,
0.20108459889888763,
-0.22048184275627136,
-0.00719950208440423,
-0.07420898973941803,
-0.0421326570212841,
0.006143852137029171,
0.0596345029771328,
-0.03073939122259617,
0.04854210838675499,
-0.005075455643236637,
0.19629999995231628,
-0.08465110510587692,
-0.06466507166624069,
0.004562768619507551,
-0.03855902701616287,
-0.01539834775030613,
0.04779201000928879,
0.1420099139213562,
-0.2199341207742691,
0.17850859463214874,
0.23711927235126495,
0.05994483456015587,
0.19056592881679535,
-0.05297784507274628,
-0.03129832074046135,
0.04813560098409653,
0.007187758106738329,
-0.011280511505901814,
-0.06055564060807228,
-0.13704481720924377,
0.010414674878120422,
0.08554119616746902,
0.025774115696549416,
0.07905352115631104,
-0.131109818816185,
-0.05173128470778465,
-0.008696871809661388,
-0.03847317025065422,
-0.03326883167028427,
0.10674235224723816,
0.04537918418645859,
0.14052584767341614,
-0.044455137103796005,
-0.04719926789402962,
0.13954859972000122,
0.018781283870339394,
-0.13523152470588684,
0.1865098774433136,
-0.13269372284412384,
-0.2689357101917267,
-0.1680755615234375,
-0.14702051877975464,
-0.02824341505765915,
0.03086218796670437,
0.133157879114151,
-0.03279132768511772,
-0.025315312668681145,
-0.026836466044187546,
0.02376645803451538,
-0.07843390852212906,
0.01617494784295559,
-0.09393084049224854,
0.05689031258225441,
-0.06109718605875969,
-0.11262251436710358,
-0.057729486376047134,
0.01025623083114624,
-0.07554700970649719,
0.141845241189003,
-0.11268868297338486,
0.057002369314432144,
0.17264147102832794,
-0.025087842717766762,
0.043959882110357285,
-0.08289745450019836,
0.17435091733932495,
-0.041871242225170135,
0.03644343093037605,
0.229295551776886,
-0.030965859070420265,
0.07467816025018692,
0.12638960778713226,
0.013323010876774788,
-0.06558438390493393,
0.029549220576882362,
-0.048272550106048584,
-0.09200919419527054,
-0.2796465754508972,
-0.08485864847898483,
-0.12342948466539383,
0.09493411332368851,
0.055454690009355545,
0.07365357130765915,
0.18461720645427704,
0.0659307986497879,
-0.026611756533384323,
0.03294788673520088,
0.0352121964097023,
0.09338559955358505,
0.24226857721805573,
0.0005982090951874852,
0.12002527713775635,
-0.0907125398516655,
-0.07560344785451889,
0.11104778200387955,
0.05689411237835884,
0.1380208134651184,
0.07002466171979904,
0.07903736084699631,
0.05648213252425194,
0.10008437931537628,
0.11149659752845764,
0.1578504741191864,
0.030876176431775093,
-0.000020617246264009736,
-0.03939064219594002,
-0.04867153614759445,
0.007282386999577284,
0.04077799618244171,
-0.03183300048112869,
-0.14927572011947632,
-0.07737739384174347,
-0.0973675549030304,
0.04800199717283249,
0.13619865477085114,
0.0482996366918087,
-0.21420921385288239,
0.025159407407045364,
0.05019208416342735,
-0.029815593734383583,
-0.0869293361902237,
0.08573560416698456,
-0.0016460627084597945,
-0.12551520764827728,
0.09950688481330872,
-0.05349922552704811,
0.11487841606140137,
0.032480284571647644,
0.07798460125923157,
0.011679619550704956,
-0.09430275112390518,
0.019180575385689735,
0.1161191537976265,
-0.3638226091861725,
0.2176336646080017,
-0.004299261141568422,
-0.04182686656713486,
-0.09542817622423172,
-0.0034446383360773325,
0.0005269189714454114,
0.14870630204677582,
0.11943205446004868,
-0.005180983804166317,
-0.03656614571809769,
-0.05283769220113754,
0.022529279813170433,
0.021913601085543633,
0.10644181072711945,
-0.019992860034108162,
-0.007384238764643669,
-0.06571807712316513,
-0.0031978958286345005,
0.006435716524720192,
0.033300671726465225,
-0.022558053955435753,
-0.1479572355747223,
0.06166790798306465,
0.020484501495957375,
0.060447752475738525,
-0.008195136673748493,
-0.025373172014951706,
-0.09502323716878891,
0.17279210686683655,
-0.10644697397947311,
-0.10776477307081223,
-0.1359422653913498,
-0.08587075024843216,
0.03303546458482742,
-0.06787935644388199,
0.056986816227436066,
-0.0756676197052002,
0.004353662952780724,
-0.07320882380008698,
-0.2491040825843811,
0.1142483577132225,
-0.08468849956989288,
-0.06666645407676697,
-0.03526065871119499,
0.17432934045791626,
-0.10780558735132217,
0.011499040760099888,
0.04198432341217995,
-0.0018220607889816165,
-0.062122050672769547,
-0.092263363301754,
-0.015203078277409077,
-0.06613647192716599,
0.04779784753918648,
-0.03333862125873566,
-0.11249491572380066,
-0.0289743784815073,
-0.014883605763316154,
-0.028142698109149933,
0.2790724039077759,
0.16298332810401917,
-0.06369508057832718,
0.19883881509304047,
0.12317054718732834,
-0.10875054448843002,
-0.2914796471595764,
-0.10152175277471542,
-0.11425692588090897,
-0.058898720890283585,
0.016417307779192924,
-0.15694072842597961,
0.054097771644592285,
0.019061021506786346,
-0.012805704027414322,
0.12434317171573639,
-0.249245747923851,
-0.10220198333263397,
0.13389526307582855,
0.027799135074019432,
0.27985697984695435,
-0.13964363932609558,
-0.09994731098413467,
-0.039618346840143204,
-0.17528410255908966,
0.21726824343204498,
-0.07144275307655334,
0.08699654042720795,
-0.03840286657214165,
0.07318214327096939,
0.02331792749464512,
-0.03762156888842583,
0.040996141731739044,
0.01128156017512083,
0.04441430792212486,
-0.12198595702648163,
-0.024042898789048195,
0.11168624460697174,
0.005297151394188404,
0.0637705996632576,
-0.10623957961797714,
0.04365249350667,
-0.09084102511405945,
-0.03271590173244476,
-0.09357552230358124,
0.07571247220039368,
-0.0010277569526806474,
-0.09210102260112762,
-0.006822858937084675,
-0.04304858297109604,
0.015904925763607025,
-0.04386959224939346,
0.18430320918560028,
-0.027440091595053673,
0.1613653302192688,
0.18290382623672485,
0.1274365782737732,
-0.12628880143165588,
0.028552083298563957,
-0.09379295259714127,
-0.08350693434476852,
0.054892443120479584,
-0.10759372264146805,
0.03150700405240059,
0.11849368363618851,
-0.025619393214583397,
0.08158595114946365,
0.10762469470500946,
0.0053189038299024105,
-0.02861594781279564,
0.13994278013706207,
-0.21912488341331482,
-0.03212627023458481,
-0.08297178894281387,
-0.02981582097709179,
0.07408997416496277,
0.05760315805673599,
0.1667449176311493,
-0.010729295201599598,
-0.014610202983021736,
0.009393781423568726,
-0.0047876606695353985,
-0.06268644332885742,
0.043045926839113235,
0.0293030496686697,
0.012462055310606956,
-0.1284341961145401,
0.09157611429691315,
0.02953341417014599,
-0.0953683853149414,
0.0061553772538900375,
0.16665291786193848,
-0.13646426796913147,
-0.1199454739689827,
-0.011245685629546642,
0.08294264227151871,
-0.20670506358146667,
-0.03916628286242485,
-0.04569830000400543,
-0.14499416947364807,
0.09112944453954697,
0.2110103964805603,
0.0409313403069973,
0.1100572720170021,
-0.04282877594232559,
-0.060482367873191833,
-0.023703021928668022,
0.008345178328454494,
-0.05585635453462601,
0.033509429544210434,
-0.09977351129055023,
0.11568174511194229,
-0.024602392688393593,
0.12802253663539886,
-0.07691213488578796,
-0.031751848757267,
-0.13846005499362946,
0.01815444976091385,
-0.13628746569156647,
-0.03462497144937515,
-0.07147474586963654,
-0.042573414742946625,
-0.004395403899252415,
-0.0189741812646389,
-0.04895327612757683,
-0.019831962883472443,
-0.11818289756774902,
0.014571699313819408,
-0.018436431884765625,
0.0707891508936882,
-0.09568431973457336,
-0.015372022986412048,
0.048218030482530594,
-0.023110736161470413,
0.12384765595197678,
0.07535143196582794,
-0.11460145562887192,
0.11703387647867203,
-0.17924611270427704,
-0.07929440587759018,
0.09314094483852386,
0.022003769874572754,
0.033739857375621796,
0.06678267568349838,
0.025464288890361786,
0.09565230458974838,
-0.01380913145840168,
0.04849580302834511,
-0.025279002264142036,
-0.13537056744098663,
-0.007838325574994087,
-0.03395137935876846,
-0.12341073155403137,
-0.05784644931554794,
-0.033660005778074265,
0.05114578455686569,
0.0029290751554071903,
0.14896175265312195,
-0.05179247632622719,
0.0753285363316536,
-0.09274496883153915,
0.0062860711477696896,
0.01422881055623293,
-0.14944453537464142,
-0.11611710488796234,
-0.08705127239227295,
-0.00723585719242692,
-0.000707013881765306,
0.2100067138671875,
0.047704700380563736,
-0.03727366775274277,
0.05010445788502693,
0.0803009644150734,
0.014942214824259281,
-0.003418553387746215,
0.2868421971797943,
0.04953340068459511,
-0.03402232751250267,
-0.13065895438194275,
0.049880869686603546,
-0.029003385454416275,
-0.030639592558145523,
0.17966292798519135,
0.055457644164562225,
-0.06197310984134674,
0.07022786140441895,
0.049444522708654404,
0.010007078759372234,
-0.08252488076686859,
-0.15061989426612854,
0.04782331734895706,
0.09217233955860138,
-0.029668016359210014,
0.09610818326473236,
0.18208535015583038,
-0.05392603948712349,
0.03561517223715782,
-0.0029833675362169743,
-0.03875579684972763,
-0.17567971348762512,
-0.18872787058353424,
-0.062173031270504,
-0.0994025394320488,
0.01834859699010849,
-0.08753403276205063,
0.0754406526684761,
0.04557480290532112,
0.07566115260124207,
-0.08074881881475449,
0.03012710064649582,
0.02179568260908127,
-0.11582353711128235,
0.06483571976423264,
-0.02569747157394886,
0.05691526085138321,
-0.05119806155562401,
-0.013239623978734016,
-0.06691212207078934,
-0.0227663554251194,
-0.041307881474494934,
0.052486784756183624,
-0.000504806637763977,
0.021834203973412514,
-0.14895935356616974,
-0.07907760888338089,
-0.014520417898893356,
0.05745677277445793,
-0.013035708107054234,
0.15633606910705566,
0.010867889970541,
-0.015952402725815773,
0.05161011591553688,
0.1790798008441925,
-0.06035992130637169,
-0.13009387254714966,
-0.01958288997411728,
0.24928845465183258,
0.06460783630609512,
0.07127625495195389,
0.020546454936265945,
0.0007426023366861045,
-0.056697335094213486,
0.3224284052848816,
0.2972385287284851,
-0.047523658722639084,
0.01980678178369999,
0.004891893360763788,
0.02848607674241066,
0.12411483377218246,
0.18672749400138855,
0.0684453696012497,
0.25167930126190186,
-0.054805222898721695,
-0.0008410413865931332,
-0.03337394818663597,
0.023292144760489464,
-0.09549257904291153,
0.14042766392230988,
0.030990850180387497,
-0.09906206279993057,
0.0011220300802960992,
0.10242535173892975,
-0.20011860132217407,
0.10835469514131546,
-0.06855195015668869,
-0.1056647077202797,
-0.024343622848391533,
-0.012998586520552635,
0.10175330191850662,
-0.0004780044255312532,
0.05250420421361923,
-0.030540207400918007,
-0.050433799624443054,
0.05701101943850517,
0.018861759454011917,
-0.2413271963596344,
0.024031009525060654,
0.06466919928789139,
-0.10091456770896912,
0.0205613411962986,
-0.01431333739310503,
0.07010407000780106,
0.07148387283086777,
0.07711625844240189,
-0.05866909772157669,
0.08574520796537399,
0.013701244257390499,
-0.01871314086019993,
0.039073046296834946,
-0.02740251086652279,
0.019353341311216354,
-0.08846577256917953,
0.032806362956762314,
-0.11477576196193695,
0.04752068966627121,
0.008882478810846806,
-0.055589158087968826,
-0.029798349365592003,
0.011081325821578503,
-0.05416538938879967,
0.06530793756246567,
0.07470475882291794,
-0.015573441982269287,
-0.00723332678899169,
-0.10244368016719818,
-0.02811812423169613,
0.042270228266716,
-0.16011832654476166,
-0.05166933313012123,
-0.06645841151475906,
-0.076084665954113,
0.0929279625415802,
0.0018609067192301154,
-0.21268977224826813,
0.013191210106015205,
-0.1193525493144989,
0.029662128537893295,
-0.22521032392978668,
0.08891984820365906,
0.10301489382982254,
0.008610409684479237,
0.014246947132050991,
-0.02936064451932907,
0.026570552960038185,
0.0696641057729721,
-0.11519238352775574,
-0.06799925863742828
] |
null | null | null |
1
|
{}
| null |
app-test-user/test-tensorboard
|
[
"tensorboard",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#tensorboard #region-us
|
1
|
[] |
[
"TAGS\n#tensorboard #region-us \n"
] |
[
10
] |
[
"passage: TAGS\n#tensorboard #region-us \n"
] |
[
0.006534305866807699,
0.040082383900880814,
-0.009093299508094788,
-0.008539018221199512,
0.0775318369269371,
0.03559385612607002,
0.13800844550132751,
0.0633106678724289,
0.24619461596012115,
0.053988438099622726,
0.16592620313167572,
0.061295218765735626,
-0.033082760870456696,
-0.09275762736797333,
0.0019954948220402002,
-0.24875524640083313,
-0.017190655693411827,
0.0222176443785429,
-0.08897604048252106,
0.03018813394010067,
-0.09696152061223984,
-0.11958344280719757,
0.0063727120868861675,
-0.06779000908136368,
-0.12876015901565552,
0.10196448862552643,
0.05243263393640518,
-0.016738098114728928,
0.10779542475938797,
0.003737490391358733,
0.2274874895811081,
0.039320170879364014,
-0.10886677354574203,
-0.11122076958417892,
0.04605049639940262,
0.020880023017525673,
-0.1199897974729538,
0.06093762442469597,
0.10028446465730667,
-0.07296576350927353,
-0.08250174671411514,
0.05501723289489746,
0.0024101075250655413,
0.01748649962246418,
-0.20268984138965607,
-0.04973244294524193,
-0.06384602934122086,
-0.10129387676715851,
0.03687706217169762,
0.002544021001085639,
-0.006732790730893612,
0.16736938059329987,
-0.13431771099567413,
0.03194912150502205,
0.12027584761381149,
-0.3891479969024658,
0.004831512924283743,
0.25729992985725403,
0.062398020178079605,
0.16237030923366547,
-0.06207888945937157,
0.11647333204746246,
0.06223743036389351,
-0.0372677743434906,
-0.010793986730277538,
-0.0860324576497078,
-0.0134416613727808,
0.13656353950500488,
-0.0992596298456192,
0.010151981376111507,
0.16808846592903137,
-0.007409875281155109,
0.08775166422128677,
0.10316295176744461,
-0.09438575059175491,
-0.09470876306295395,
0.05297732725739479,
-0.028867973014712334,
0.010549216531217098,
0.10153214633464813,
0.07364087551832199,
-0.15527833998203278,
-0.16690029203891754,
0.017301948741078377,
-0.2492087185382843,
0.16391491889953613,
-0.029064442962408066,
0.09715521335601807,
-0.2366093546152115,
-0.004513080697506666,
-0.17467094957828522,
-0.02737189643085003,
0.11374334990978241,
-0.039747267961502075,
-0.0560590960085392,
-0.009355051442980766,
-0.023107007145881653,
-0.2604934573173523,
0.09106821566820145,
0.006501164752990007,
0.017551714554429054,
0.0967627540230751,
-0.058897826820611954,
0.17515255510807037,
0.01934587024152279,
0.10916710644960403,
0.042747627943754196,
0.06371613591909409,
-0.01483276579529047,
-0.11632966250181198,
0.04595458135008812,
-0.10044647753238678,
-0.1764654964208603,
0.00488180061802268,
-0.03902721405029297,
0.07065213471651077,
-0.011064891703426838,
-0.06043834239244461,
-0.05692503601312637,
0.038319941610097885,
-0.03820028901100159,
-0.029242465272545815,
0.03474550321698189,
-0.01922685280442238,
0.03722156956791878,
0.09039126336574554,
-0.09299355000257492,
-0.023605596274137497,
0.09000182151794434,
0.0805899053812027,
-0.13292676210403442,
0.005910138599574566,
-0.09631680697202682,
-0.020702853798866272,
0.08199933916330338,
-0.2154531031847,
0.013668173924088478,
-0.0901302918791771,
-0.028829973191022873,
0.017273731529712677,
0.03700585290789604,
-0.050351981073617935,
0.15120579302310944,
0.02488921582698822,
0.007136741187423468,
0.017314443364739418,
-0.03396094590425491,
-0.09684359282255173,
-0.04611826688051224,
0.019119465723633766,
-0.06857654452323914,
0.0940660685300827,
-0.20151877403259277,
0.014020966365933418,
-0.045337218791246414,
0.10880597680807114,
-0.19163331389427185,
-0.038327693939208984,
-0.08018021285533905,
0.12477368116378784,
0.014924260787665844,
0.07737784087657928,
-0.24104569852352142,
0.024682575836777687,
0.021142926067113876,
0.10906314849853516,
-0.21569545567035675,
-0.09515051543712616,
0.17103171348571777,
-0.07748881727457047,
-0.06575267761945724,
0.09529823064804077,
-0.0071313041262328625,
-0.0059824129566550255,
0.019351733848452568,
0.4640624523162842,
-0.04613873362541199,
-0.09412027150392532,
0.061130642890930176,
0.1470334678888321,
-0.13006140291690826,
-0.17436769604682922,
0.002574512967839837,
-0.09879869967699051,
-0.060220204293727875,
-0.01074350904673338,
0.16528618335723877,
0.06612241268157959,
-0.056118451058864594,
0.0036165399942547083,
0.035037774592638016,
-0.006802915129810572,
0.1320066750049591,
0.09333138912916183,
0.16147242486476898,
-0.08355621248483658,
0.03908928856253624,
0.09178625047206879,
-0.0378684476017952,
0.04662730172276497,
0.02733496204018593,
-0.036234620958566666,
0.1591699868440628,
-0.14644141495227814,
-0.014790006913244724,
-0.17186471819877625,
-0.24140551686286926,
0.026623079553246498,
0.004069615621119738,
0.07785973697900772,
0.2129012644290924,
0.144130676984787,
-0.08816449344158173,
-0.012039556168019772,
0.04673686623573303,
0.08228056132793427,
0.028907516971230507,
-0.06073082610964775,
-0.07915239781141281,
0.09059498459100723,
-0.126937597990036,
-0.11751066893339157,
-0.17272667586803436,
0.02603067271411419,
0.1707337200641632,
0.0026946559082716703,
0.09990855306386948,
-0.01345545332878828,
0.00767552712932229,
0.004900574684143066,
0.01618284359574318,
-0.007279687561094761,
0.0670071691274643,
-0.03823050856590271,
-0.12483922392129898,
0.09316583722829819,
-0.12221335619688034,
0.19972530007362366,
0.15858328342437744,
-0.1818452626466751,
0.006076057441532612,
-0.08268936723470688,
0.005037080030888319,
-0.017202559858560562,
0.07732049375772476,
-0.01259735506027937,
0.04409071430563927,
0.0076699513010680676,
0.027103567495942116,
0.005676160100847483,
0.0031201732344925404,
-0.024541517719626427,
-0.04467027261853218,
-0.10407831519842148,
0.09488637000322342,
0.18285910785198212,
-0.07667400687932968,
0.14565077424049377,
0.3194742202758789,
-0.037658147513866425,
0.243926003575325,
-0.044235710054636,
-0.040325827896595,
-0.0035806619562208652,
0.01663234643638134,
0.0030329942237585783,
0.1472717523574829,
-0.21030889451503754,
-0.04524451121687889,
0.006807155907154083,
-0.02152351476252079,
0.10580825060606003,
-0.1721770018339157,
-0.07431776076555252,
-0.04492230340838432,
0.027951331809163094,
0.019052904099225998,
0.06744155287742615,
-0.03777891770005226,
0.03921189159154892,
0.04400498792529106,
-0.0700550302863121,
0.0836176946759224,
-0.019940676167607307,
-0.030445000156760216,
0.11028234660625458,
-0.10444431751966476,
-0.14896880090236664,
-0.14283819496631622,
-0.006238630972802639,
-0.020885249599814415,
0.01706763170659542,
-0.027181608602404594,
-0.13327232003211975,
0.029786938801407814,
0.04076489806175232,
0.07693277299404144,
-0.12285503000020981,
0.05448165163397789,
-0.0461324080824852,
0.028856927528977394,
-0.12792721390724182,
-0.03312709555029869,
-0.035204388201236725,
-0.13709917664527893,
0.0022407269570976496,
0.07620684057474136,
-0.12048324197530746,
0.09188514947891235,
0.27380040287971497,
0.04448940232396126,
0.06134362891316414,
-0.032678768038749695,
0.02956867590546608,
-0.09890352934598923,
-0.007705071475356817,
0.02162044867873192,
-0.07093516737222672,
0.062233779579401016,
0.11935889720916748,
0.09269148856401443,
-0.10527072846889496,
-0.04987248033285141,
0.034401196986436844,
-0.1883547604084015,
-0.25223538279533386,
-0.013318047858774662,
-0.10664031654596329,
0.12765152752399445,
-0.010244959965348244,
0.08595813065767288,
0.0968375951051712,
0.03425378352403641,
0.19851519167423248,
-0.06700027734041214,
-0.051622986793518066,
-0.02809843420982361,
0.08555478602647781,
-0.057794809341430664,
0.039914410561323166,
-0.08637060225009918,
-0.06656038016080856,
0.0951273962855339,
0.17627309262752533,
0.18293945491313934,
0.2262241542339325,
0.12174708396196365,
0.05237455293536186,
0.07176508009433746,
0.16263093054294586,
0.04854431748390198,
0.009847785346210003,
-0.08180344849824905,
0.003412329126149416,
-0.012104667723178864,
0.048380542546510696,
0.04426087811589241,
0.17464037239551544,
-0.19392864406108856,
0.07483029365539551,
-0.1622130125761032,
0.08953690528869629,
-0.034756071865558624,
0.10043641179800034,
-0.09362896531820297,
0.06334324181079865,
0.09393108636140823,
0.06349558383226395,
0.021082786843180656,
0.1126493290066719,
0.14653423428535461,
0.009260405786335468,
-0.0002819515357259661,
-0.04511117935180664,
0.04465794190764427,
-0.06152704730629921,
0.04610077664256096,
-0.08046658337116241,
-0.09758474677801132,
-0.014994517900049686,
0.006666599772870541,
-0.10874383896589279,
0.2857326567173004,
0.02909725345671177,
-0.10487885028123856,
-0.007926207035779953,
-0.0647149607539177,
0.03559763357043266,
0.12385048717260361,
0.12570184469223022,
0.026917073875665665,
-0.12048140913248062,
-0.02957685850560665,
-0.04124798625707626,
-0.002007595496252179,
0.1673530638217926,
-0.035056523978710175,
-0.12182541936635971,
0.04882395267486572,
0.025706930086016655,
0.005599671509116888,
0.05435368791222572,
0.044002898037433624,
-0.090218186378479,
-0.002929751295596361,
0.027573231607675552,
-0.2637743651866913,
0.03755902871489525,
-0.02500910870730877,
-0.10820615291595459,
0.14254985749721527,
-0.03448507562279701,
0.043258000165224075,
-0.08007816225290298,
-0.1022154912352562,
0.07443202286958694,
-0.05512619763612747,
0.017121823504567146,
-0.053266968578100204,
-0.0823264792561531,
-0.10395817458629608,
-0.18965312838554382,
0.18064510822296143,
-0.0017855956684798002,
0.11902352422475815,
-0.10145214945077896,
0.14987261593341827,
-0.038852643221616745,
0.06894510239362717,
-0.04297667741775513,
0.035979658365249634,
0.017632311210036278,
-0.07902280986309052,
0.16651985049247742,
-0.07523375749588013,
0.004555414896458387,
-0.011774768121540546,
0.0038204749580472708,
0.0699034109711647,
0.07120711356401443,
0.02326379157602787,
0.23549024760723114,
0.29437851905822754,
-0.07385604828596115,
0.11396823823451996,
0.20734500885009766,
-0.05291252210736275,
-0.30148735642433167,
0.10052309185266495,
-0.20030371844768524,
-0.04426584765315056,
0.09160439670085907,
-0.18596161901950836,
0.13254234194755554,
0.11458098888397217,
-0.07521940022706985,
0.3238363265991211,
-0.2565675675868988,
-0.06857524812221527,
0.13920506834983826,
0.05302320793271065,
0.5374566912651062,
-0.19167585670948029,
-0.14127697050571442,
0.04441896080970764,
0.013318231329321861,
0.12890884280204773,
-0.16842475533485413,
0.0728289932012558,
0.014546197839081287,
0.01566125638782978,
0.03629394993185997,
-0.06888817250728607,
0.16679249703884125,
-0.014936079271137714,
0.07837609946727753,
-0.052132926881313324,
-0.2210337370634079,
0.12277866899967194,
-0.04610269144177437,
-0.08079098165035248,
0.08026248216629028,
-0.08035223186016083,
-0.060518406331539154,
0.01706060767173767,
-0.04668021202087402,
0.07656943798065186,
0.06021571159362793,
-0.08627637475728989,
-0.08714277297258377,
0.003907916601747274,
-0.14477157592773438,
0.009725140407681465,
0.40106749534606934,
-0.04313361644744873,
0.14455673098564148,
0.13014432787895203,
-0.007941126823425293,
-0.10473056882619858,
0.0027659651823341846,
-0.02052965760231018,
-0.047221653163433075,
0.10192827135324478,
-0.16705648601055145,
0.014505027793347836,
0.14206208288669586,
-0.006061878055334091,
0.03216275945305824,
0.09032479673624039,
-0.0998530238866806,
0.04135839268565178,
0.13470561802387238,
-0.2372065931558609,
-0.2276301085948944,
0.019489768892526627,
-0.1661357432603836,
0.14527848362922668,
0.13003119826316833,
0.10338665544986725,
0.10191649943590164,
0.05746182054281235,
0.05109937861561775,
-0.0356719084084034,
-0.03869183734059334,
-0.025615138933062553,
0.1204795390367508,
0.00017679110169410706,
-0.04285447672009468,
0.171995609998703,
0.08510071039199829,
-0.21411463618278503,
-0.02836952544748783,
0.16565562784671783,
-0.0385296605527401,
-0.10193949192762375,
-0.11772961169481277,
0.17325401306152344,
-0.011026784777641296,
-0.037115562707185745,
-0.02811659686267376,
-0.007137268781661987,
-0.01016274094581604,
0.2518148124217987,
0.0386812798678875,
0.038403142243623734,
-0.0008132343064062297,
0.016317283734679222,
0.08361152559518814,
-0.05280783772468567,
-0.1497855931520462,
0.02281280979514122,
-0.08427359908819199,
-0.1410912722349167,
-0.017312582582235336,
0.11129982769489288,
-0.11773894727230072,
-0.10161316394805908,
-0.2491796314716339,
0.05327007547020912,
-0.0910675898194313,
-0.05487034097313881,
-0.044580671936273575,
-0.08928315341472626,
0.03381570056080818,
-0.02812962420284748,
-0.07149658352136612,
-0.07236111164093018,
-0.1551908254623413,
0.06809859722852707,
0.04554399102926254,
0.011201856657862663,
-0.06434933096170425,
-0.03963441029191017,
0.08687692135572433,
0.025220230221748352,
0.12771764397621155,
0.082013800740242,
0.04152253642678261,
0.18243716657161713,
-0.1501743495464325,
-0.01761900819838047,
0.10919995605945587,
-0.01925143040716648,
0.09462504088878632,
0.19312314689159393,
-0.06751598417758942,
-0.03679061681032181,
0.0510183647274971,
0.07104808837175369,
-0.059433627873659134,
-0.060671303421258926,
0.03877999633550644,
-0.06834893673658371,
-0.2284054011106491,
-0.009992959909141064,
-0.06539954990148544,
0.09991227090358734,
0.058634016662836075,
-0.00027098399004898965,
0.020929796621203423,
0.06235665827989578,
-0.008794697932898998,
0.02410396933555603,
0.04511919245123863,
-0.11267971992492676,
0.14046703279018402,
0.0027178891468793154,
-0.0252967718988657,
-0.0649188905954361,
0.27104905247688293,
0.005602406803518534,
-0.07680805772542953,
0.01836322620511055,
0.05288991332054138,
0.004493705928325653,
0.0449032187461853,
0.11125405132770538,
0.06976531445980072,
-0.09063713997602463,
-0.13550062477588654,
0.10190224647521973,
0.022734636440873146,
0.07267526537179947,
0.1796099692583084,
0.0359480194747448,
-0.12333399057388306,
0.1269669383764267,
0.05699457600712776,
0.03636976331472397,
-0.046730343252420425,
0.03961396589875221,
-0.03227224200963974,
0.05634014680981636,
0.01666443422436714,
0.05149773880839348,
0.19840560853481293,
0.0055618309415876865,
0.04533267021179199,
-0.053552042692899704,
-0.036609239876270294,
-0.15399031341075897,
-0.20292934775352478,
-0.006851530633866787,
-0.08021155744791031,
0.04627020284533501,
0.0034772257786244154,
-0.0693468302488327,
0.1849307119846344,
0.07155299931764603,
-0.011455461382865906,
0.173631951212883,
0.013396657072007656,
-0.015936201438307762,
0.011102026328444481,
0.024454845115542412,
-0.024571284651756287,
-0.07362693548202515,
-0.046804286539554596,
-0.11532551050186157,
-0.06637564301490784,
-0.13220487534999847,
0.00648867804557085,
0.006814941763877869,
-0.061106301844120026,
-0.10313733667135239,
-0.07081706076860428,
-0.05106806755065918,
0.08828683197498322,
-0.06221487745642662,
0.04690957069396973,
0.013155735097825527,
-0.0325162373483181,
0.004265311639755964,
0.13354544341564178,
-0.04418211802840233,
0.19520826637744904,
0.01733444817364216,
0.057325683534145355,
-0.08655567467212677,
0.14671412110328674,
-0.11510784924030304,
-0.04702206328511238,
-0.049668941646814346,
0.21314097940921783,
0.2697785198688507,
-0.10950656235218048,
0.029643947258591652,
0.05186738818883896,
0.04509321227669716,
0.021570339798927307,
0.13584278523921967,
-0.022682486101984978,
0.22315897047519684,
-0.07343065738677979,
-0.08755403012037277,
0.0019292806973680854,
0.000021470044885063544,
-0.059498004615306854,
0.10911418497562408,
0.10399774461984634,
0.014686492271721363,
-0.15425388514995575,
0.12033320963382721,
-0.19316740334033966,
0.041566263884305954,
0.11095796525478363,
-0.2682863473892212,
-0.08381310850381851,
-0.009971278719604015,
0.15978001058101654,
-0.14378440380096436,
0.13917388021945953,
-0.08626604080200195,
-0.15523628890514374,
-0.2506062388420105,
0.02322297915816307,
-0.3218576908111572,
-0.032319702208042145,
0.04714911803603172,
0.04720594361424446,
0.14400415122509003,
-0.04924777150154114,
-0.03366464748978615,
0.0530356727540493,
0.058395866304636,
0.020577644929289818,
-0.0020431592129170895,
0.05586469918489456,
-0.031111473217606544,
-0.21589411795139313,
0.005784761626273394,
0.01558147557079792,
-0.1156139224767685,
0.125509113073349,
-0.010575098916888237,
0.010035510174930096,
-0.09306186437606812,
-0.08910681307315826,
-0.00035987369483336806,
0.0038179580587893724,
-0.1274755746126175,
0.043872538954019547,
0.0076584769412875175,
0.0704798549413681,
-0.014906637370586395,
-0.01729007065296173,
-0.06253746151924133,
0.0846986249089241,
-0.05800727754831314,
-0.15307888388633728,
0.07692579180002213,
-0.046785108745098114,
0.1000940129160881,
-0.029083101078867912,
-0.18937113881111145,
-0.0031897935550659895,
-0.049009356647729874,
0.11895490437746048,
-0.11694790422916412,
-0.011713004671037197,
0.14187675714492798,
0.016867628321051598,
-0.016983844339847565,
-0.24412201344966888,
0.06687481701374054,
-0.03824077174067497,
-0.08858684450387955,
-0.06839192658662796
] |
null | null |
transformers
|
# DialoGPT-medium-simpsons
This is a version of [DialoGPT-medium](https://huggingface.co/microsoft/DialoGPT-medium) fine-tuned on The Simpsons scripts.
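A minimal way to chat with it, following the standard DialoGPT usage pattern (the decoding settings below are illustrative defaults, not the values used during fine-tuning):
```
# Single-turn chat using the usual DialoGPT pattern; decoding settings are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("arampacha/DialoGPT-medium-simpsons")
model = AutoModelForCausalLM.from_pretrained("arampacha/DialoGPT-medium-simpsons")

user_input = "Hey Homer, want a donut?"
input_ids = tokenizer.encode(user_input + tokenizer.eos_token, return_tensors="pt")

reply_ids = model.generate(
    input_ids,
    max_length=200,
    pad_token_id=tokenizer.eos_token_id,
    do_sample=True,
    top_p=0.95,
    top_k=50,
)
reply = tokenizer.decode(reply_ids[:, input_ids.shape[-1]:][0], skip_special_tokens=True)
print(reply)
```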
|
{"tags": ["conversational"]}
|
text-generation
|
arampacha/DialoGPT-medium-simpsons
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
# DialoGPT-medium-simpsons
This is a version of DialoGPT-medium fine-tuned on The Simpsons scripts.
|
[
"# DialoGPT-medium-simpsons\n\nThis is a version of DialoGPT-medium fine-tuned on The Simpsons scripts."
] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"# DialoGPT-medium-simpsons\n\nThis is a version of DialoGPT-medium fine-tuned on The Simpsons scripts."
] |
[
55,
35
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n# DialoGPT-medium-simpsons\n\nThis is a version of DialoGPT-medium fine-tuned on The Simpsons scripts."
] |
[
0.03159426152706146,
-0.007857142947614193,
-0.0052665905095636845,
0.001265264581888914,
0.14387908577919006,
0.05930405482649803,
0.12986093759536743,
0.1544528305530548,
-0.06198405101895332,
-0.056023284792900085,
0.08727718144655228,
0.10909537225961685,
0.0013077766634523869,
0.0810343325138092,
-0.01732805371284485,
-0.35024890303611755,
0.03850138559937477,
0.05049142614006996,
0.04988817125558853,
0.12569041550159454,
0.0862487405538559,
0.011219204403460026,
0.017794596031308174,
0.043236084282398224,
-0.2707901895046234,
-0.002058143960312009,
0.09432031959295273,
-0.11202963441610336,
0.09513868391513824,
0.08781851083040237,
0.055747140198946,
0.004170849919319153,
-0.05315735936164856,
-0.054315101355314255,
0.04158063605427742,
-0.0017027170397341251,
-0.0066938940435647964,
0.037933241575956345,
-0.018624700605869293,
-0.004055751953274012,
0.229192852973938,
0.1061856672167778,
-0.02455776371061802,
0.04470429569482803,
-0.11801082640886307,
-0.1252926141023636,
0.013901710510253906,
0.05547273904085159,
0.07222025096416473,
0.08979161083698273,
-0.04431440308690071,
0.11111466586589813,
-0.11652360111474991,
0.06691848486661911,
0.18494857847690582,
-0.37164947390556335,
-0.04150458425283432,
0.10906179994344711,
0.06625813990831375,
-0.08673818409442902,
-0.06732816994190216,
0.09270158410072327,
-0.02270546182990074,
-0.02473641186952591,
-0.054226092994213104,
-0.08552578836679459,
-0.06289535015821457,
-0.04228302836418152,
-0.0905374065041542,
-0.009923038072884083,
0.21568416059017181,
-0.0037013862747699022,
0.020357392728328705,
-0.11618097126483917,
0.0058840615674853325,
-0.026653936132788658,
-0.046020545065402985,
-0.0414879247546196,
-0.11135580390691757,
0.030312519520521164,
-0.07054881006479263,
-0.09899713844060898,
-0.15548600256443024,
-0.05013072118163109,
-0.1408192217350006,
0.2856535017490387,
0.03603363409638405,
0.040814872831106186,
-0.20746515691280365,
0.09610395133495331,
-0.029649624601006508,
-0.08208867907524109,
0.06748522073030472,
-0.06526171416044235,
0.04259685426950455,
0.04957650974392891,
-0.11344980448484421,
-0.03140377625823021,
0.06894387304782867,
0.1686123162508011,
0.11315786093473434,
-0.005590121727436781,
-0.04499038681387901,
0.08023609220981598,
0.03977898508310318,
0.09042153507471085,
0.01342477835714817,
-0.05212869867682457,
0.04462195560336113,
-0.05821184813976288,
0.05196128040552139,
-0.07040314376354218,
-0.19996927678585052,
-0.008411794900894165,
-0.003945683594793081,
0.020673876628279686,
0.03788350149989128,
0.15417394042015076,
0.014233903959393501,
-0.03326096385717392,
0.13873137533664703,
-0.06663165986537933,
0.010488157160580158,
-0.01963798515498638,
0.022624880075454712,
0.15406012535095215,
-0.035204801708459854,
0.01591317541897297,
-0.0884731337428093,
0.05735335499048233,
-0.01611408032476902,
0.025836827233433723,
-0.014983798377215862,
-0.04338628053665161,
-0.004912008531391621,
0.04259651154279709,
-0.0038366171065717936,
-0.171845942735672,
-0.015430808067321777,
0.0345102958381176,
-0.04128960520029068,
-0.05128977820277214,
-0.06674077361822128,
-0.030074147507548332,
-0.054435886442661285,
0.09643442928791046,
-0.03514734283089638,
-0.013012557290494442,
-0.012141947634518147,
0.09248965978622437,
-0.0005967772449366748,
0.10670198500156403,
-0.11838503926992416,
0.05939081311225891,
-0.05841665714979172,
-0.0402035117149353,
-0.2078721821308136,
0.02196660451591015,
0.0386246033012867,
-0.04877414181828499,
0.014636663720011711,
-0.04242387041449547,
-0.15547214448451996,
0.07775894552469254,
-0.053245190531015396,
0.1694265455007553,
-0.13329429924488068,
-0.0758170336484909,
0.24751675128936768,
-0.06815150380134583,
-0.021810399368405342,
0.16098354756832123,
0.019492750987410545,
0.05994967371225357,
0.0720609799027443,
0.2513923943042755,
0.02783956006169319,
-0.018118610605597496,
0.08021479845046997,
0.0965467244386673,
-0.09495100378990173,
0.09789586812257767,
0.02768215537071228,
0.0010729610221460462,
-0.06552744656801224,
0.04823346808552742,
0.03197816386818886,
0.11614535748958588,
-0.009695593267679214,
0.015292402356863022,
-0.005158798303455114,
0.02427958883345127,
0.18158358335494995,
-0.03976869210600853,
0.09863893687725067,
-0.06319298595190048,
-0.11405088752508163,
0.02629510872066021,
0.026710432022809982,
0.007304959464818239,
0.07004796713590622,
-0.07372082769870758,
0.009455015882849693,
0.0900978147983551,
0.055698517709970474,
-0.13013425469398499,
0.0012363638961687684,
-0.07979236543178558,
0.2275015264749527,
0.08589144051074982,
0.09092602878808975,
0.044938862323760986,
-0.1170586422085762,
-0.0349104106426239,
0.08970234543085098,
0.088900625705719,
0.0057019698433578014,
0.03223617374897003,
-0.0642850399017334,
0.06130369007587433,
-0.051566388458013535,
0.11961299926042557,
0.061095621436834335,
0.02252967096865177,
-0.04157402738928795,
0.1590949147939682,
-0.03427686542272568,
-0.0018194470321759582,
0.024224795401096344,
-0.04042988643050194,
-0.049835480749607086,
-0.002704452723264694,
0.03607746213674545,
0.033947668969631195,
-0.07725431770086288,
0.2451559156179428,
-0.22395853698253632,
0.14107467234134674,
0.22273170948028564,
-0.18370303511619568,
-0.06266653537750244,
-0.04661419615149498,
0.023860884830355644,
0.007893349044024944,
0.02408383972942829,
-0.13824333250522614,
0.12979410588741302,
-0.06360967457294464,
0.14342957735061646,
0.003646606346592307,
-0.018677404150366783,
-0.046215616166591644,
-0.05706516280770302,
-0.016997303813695908,
0.005423982162028551,
0.12509647011756897,
-0.08557630330324173,
0.13807886838912964,
0.1540471762418747,
0.06731218844652176,
0.2408272624015808,
0.0407322458922863,
0.004831074271351099,
0.008846299722790718,
-0.03885547071695328,
-0.020968150347471237,
-0.06235470622777939,
-0.23737263679504395,
-0.050090618431568146,
0.07573912292718887,
0.023741435259580612,
0.04679398983716965,
-0.11055449396371841,
-0.08130253106355667,
0.003545790910720825,
-0.013802946545183659,
0.046051785349845886,
0.15321533381938934,
0.04159615561366081,
0.11497262120246887,
0.00984635017812252,
-0.06621654331684113,
0.07996722310781479,
0.0014488096348941326,
-0.04492073878645897,
0.08675254881381989,
-0.14015232026576996,
-0.38086479902267456,
-0.13053904473781586,
-0.07837949693202972,
-0.037081945687532425,
0.01044392865151167,
0.14559108018875122,
-0.09393729269504547,
0.01968061551451683,
-0.027509232982993126,
0.1320650279521942,
-0.12879662215709686,
0.03698485344648361,
-0.05785426124930382,
-0.06881292164325714,
-0.12710417807102203,
-0.10051915794610977,
-0.04891437292098999,
-0.047815944999456406,
-0.09405455738306046,
0.09524920582771301,
-0.051331114023923874,
0.019729888066649437,
0.2247273325920105,
0.018731819465756416,
0.058849480003118515,
-0.10539139807224274,
0.17795409262180328,
-0.13145162165164948,
0.032989807426929474,
0.18359744548797607,
-0.07168471813201904,
-0.017734330147504807,
0.08603028208017349,
0.02373521216213703,
-0.0350152850151062,
0.041113950312137604,
-0.040227919816970825,
-0.09411437809467316,
-0.07515010237693787,
-0.051787301898002625,
-0.11373429745435715,
0.09797727316617966,
0.010813061147928238,
0.04956064000725746,
0.11265923827886581,
-0.0025057694874703884,
-0.060964081436395645,
0.011723557487130165,
0.05321836099028587,
0.044232483953237534,
0.3539104461669922,
-0.04352186247706413,
0.15561923384666443,
-0.016771225258708,
-0.1276782900094986,
0.02531886287033558,
0.05678165704011917,
0.03719038516283035,
0.10272298008203506,
0.09027869999408722,
0.0032220506109297276,
0.0010633683996275067,
0.11425691097974777,
-0.0035039326176047325,
0.047651611268520355,
-0.06404732912778854,
-0.022461680695414543,
-0.0262789074331522,
-0.04190636798739433,
0.005336516536772251,
0.04614068940281868,
-0.1553228497505188,
0.019315868616104126,
-0.042039379477500916,
0.07717376202344894,
-0.015348637476563454,
0.0023485333658754826,
-0.05146418884396553,
-0.0009245327091775835,
-0.009471824392676353,
-0.03172798827290535,
-0.05278211832046509,
0.06824346631765366,
0.09087793529033661,
-0.10015059262514114,
-0.013993890956044197,
0.012327943928539753,
0.04948176071047783,
0.012120614759624004,
0.07194232195615768,
-0.09442424029111862,
-0.06717905402183533,
0.00870593823492527,
0.06703472137451172,
-0.2463250309228897,
0.0993674024939537,
-0.015403227880597115,
-0.0012546840589493513,
-0.06625181436538696,
-0.03688180446624756,
0.05737071484327316,
0.04486070200800896,
0.06317155808210373,
0.011254562065005302,
-0.06603986024856567,
-0.045242927968502045,
-0.05380094423890114,
0.0012915286934003234,
0.13185350596904755,
-0.012104837223887444,
-0.04791836068034172,
-0.08493471890687943,
-0.015710821375250816,
0.032029639929533005,
0.09023428708314896,
-0.014086617156863213,
-0.17910602688789368,
0.12184777110815048,
0.18453368544578552,
-0.004960233811289072,
0.02776291035115719,
-0.05492926016449928,
-0.08208757638931274,
0.18536639213562012,
0.04624006524682045,
-0.06959544867277145,
-0.07689817994832993,
-0.01248163916170597,
0.018079865723848343,
-0.04492189362645149,
0.04042831435799599,
-0.02658885531127453,
0.06219911947846413,
-0.05489085242152214,
-0.1772594451904297,
0.11571203172206879,
-0.0643574669957161,
-0.08205417543649673,
-0.022055238485336304,
0.20544135570526123,
-0.043340906500816345,
0.09589850902557373,
0.06298422813415527,
0.02850901335477829,
-0.030562013387680054,
-0.09935390204191208,
-0.012765606865286827,
0.0038050024304538965,
-0.03180699050426483,
-0.05325186625123024,
-0.009665158577263355,
-0.038424547761678696,
-0.05425921827554703,
-0.015644539147615433,
0.3339548110961914,
0.11718356609344482,
-0.11812033504247665,
0.08751832693815231,
-0.039668381214141846,
-0.021282901987433434,
-0.21671046316623688,
-0.09715741872787476,
-0.07730688154697418,
-0.033479295670986176,
-0.024883607402443886,
-0.1191108450293541,
0.0829351544380188,
0.040576834231615067,
-0.036460135132074356,
0.15274295210838318,
-0.32277458906173706,
-0.14597515761852264,
0.17864388227462769,
-0.05056796595454216,
0.3286101520061493,
-0.16018369793891907,
-0.04588651284575462,
0.0071703107096254826,
-0.19329504668712616,
0.16427791118621826,
-0.009762933477759361,
0.1387585997581482,
-0.004978014621883631,
0.1365828663110733,
0.017232276499271393,
0.012260274961590767,
0.03763990476727486,
0.03537622466683388,
-0.06206393241882324,
-0.07662602514028549,
-0.13029851019382477,
-0.06410038471221924,
0.012896002270281315,
0.09138898551464081,
-0.1291487216949463,
-0.054860085248947144,
-0.1433481127023697,
-0.043349701911211014,
-0.13166868686676025,
0.047728098928928375,
-0.013323948718607426,
-0.046716898679733276,
-0.03782251477241516,
-0.048582274466753006,
-0.06004229933023453,
0.005324583034962416,
0.06662163138389587,
-0.1479138731956482,
0.16991601884365082,
0.030359143391251564,
0.05738718435168266,
-0.09356697648763657,
-0.06953009963035583,
-0.039209067821502686,
-0.06457311660051346,
0.06980433315038681,
-0.04128704592585564,
-0.00031161695369519293,
0.050781842321157455,
0.01732456684112549,
0.11202549189329147,
0.07351667433977127,
0.0016935765743255615,
0.0307158250361681,
0.1237616315484047,
-0.16561083495616913,
-0.055941835045814514,
-0.05599389225244522,
0.08884859085083008,
0.11163591593503952,
0.02894751913845539,
0.1700805127620697,
-0.0262224730104208,
-0.05568612739443779,
0.035567231476306915,
-0.016664743423461914,
0.0003183991357218474,
-0.001995439175516367,
0.03461068868637085,
0.0381498783826828,
-0.11632972210645676,
0.005121368449181318,
0.0024872443173080683,
-0.11831187456846237,
0.08560884743928909,
0.1538143903017044,
-0.06327487528324127,
-0.12468070536851883,
-0.14669989049434662,
-0.028368234634399414,
-0.030608616769313812,
-0.009669654071331024,
-0.04800555482506752,
-0.11941798031330109,
0.03803332895040512,
0.05486727133393288,
0.03690504655241966,
0.044920869171619415,
-0.06484547257423401,
0.019070366397500038,
-0.04302762448787689,
-0.018744712695479393,
0.026111610233783722,
0.05418815091252327,
0.010570550337433815,
0.15063919126987457,
0.0238910261541605,
0.0545944981276989,
-0.09006386250257492,
-0.09794007241725922,
-0.10268031805753708,
-0.00742408586665988,
-0.009997398592531681,
-0.07501745969057083,
-0.12348946183919907,
-0.048594895750284195,
0.05668094381690025,
-0.015944717451930046,
-0.03271479904651642,
0.004175493028014898,
-0.09287551790475845,
-0.032488733530044556,
-0.05980038270354271,
-0.06281685084104538,
-0.08882232755422592,
0.04610266536474228,
0.05339933931827545,
-0.032931044697761536,
0.07467164099216461,
0.13475528359413147,
-0.09845318645238876,
0.09660272300243378,
-0.10008124262094498,
-0.13821350038051605,
0.06510747969150543,
-0.011086743324995041,
0.02086850255727768,
0.10065054148435593,
0.009850057773292065,
0.05876491591334343,
0.10347366333007812,
0.049818966537714005,
0.0876702144742012,
-0.10537068545818329,
0.0333983488380909,
-0.01225526537746191,
-0.07301590591669083,
-0.04373548924922943,
-0.08570726960897446,
0.08338874578475952,
0.003719086991623044,
0.06247954070568085,
-0.05817283317446709,
0.05947847664356232,
-0.06260550022125244,
0.026984643191099167,
-0.030719105154275894,
-0.1363757997751236,
0.057361435145139694,
-0.042797576636075974,
0.025743640959262848,
0.02620537206530571,
0.14651386439800262,
0.05132579430937767,
-0.07220982760190964,
0.04183666408061981,
0.022870494052767754,
0.01162636000663042,
-0.03671063110232353,
0.09897099435329437,
0.06567451357841492,
-0.11109945923089981,
-0.11402633041143417,
0.05673756077885628,
0.07361418008804321,
0.007262684404850006,
0.16226811707019806,
-0.011084206402301788,
-0.011828756891191006,
0.0671200230717659,
-0.008893615566194057,
0.025665562599897385,
-0.1905973106622696,
-0.1687120944261551,
-0.0615897998213768,
0.1119823083281517,
-0.05988478660583496,
0.09798945486545563,
0.16267406940460205,
0.01942135952413082,
0.00913257896900177,
-0.11850205063819885,
0.00462558725848794,
-0.12069883942604065,
-0.17094546556472778,
-0.025416746735572815,
-0.20893123745918274,
-0.026128381490707397,
-0.05808842554688454,
0.041171349585056305,
0.0447310134768486,
0.0754498541355133,
-0.021933680400252342,
0.14539748430252075,
-0.0042032524943351746,
-0.09220125526189804,
0.10120868682861328,
-0.05483408272266388,
0.05737512558698654,
-0.06146653741598129,
0.0005322127253748477,
-0.0051628896035254,
0.003908427432179451,
0.08056186139583588,
0.04365108534693718,
-0.045882467180490494,
-0.046129610389471054,
-0.1654968559741974,
-0.08852235972881317,
-0.04052260145545006,
0.06503116339445114,
0.07495806366205215,
0.17967455089092255,
0.06574133783578873,
0.005974783096462488,
-0.016157975420355797,
0.18973565101623535,
-0.02435889095067978,
-0.11433646082878113,
-0.0830375924706459,
0.11072487384080887,
0.008561520837247372,
0.006306306459009647,
-0.10739659518003464,
-0.040221475064754486,
-0.1068262979388237,
0.2799336910247803,
0.29517629742622375,
-0.042851705104112625,
0.029339276254177094,
-0.0018565068021416664,
0.039278097450733185,
0.06652113050222397,
0.16848158836364746,
0.13462130725383759,
0.26107257604599,
-0.008344202302396297,
-0.0979142114520073,
-0.015023007988929749,
-0.06081857904791832,
-0.03598567470908165,
-0.020189298316836357,
0.04358945041894913,
-0.07481083273887634,
-0.0036774449981749058,
0.07236845791339874,
-0.16021278500556946,
0.027056457474827766,
-0.15808412432670593,
-0.1636388748884201,
-0.03963647782802582,
0.01747496798634529,
0.12474186718463898,
0.01829719915986061,
0.12431545555591583,
0.014032289385795593,
-0.008817515335977077,
0.037821076810359955,
0.01786409318447113,
-0.1522456705570221,
0.021010158583521843,
0.0584956631064415,
-0.13265173137187958,
0.005785368848592043,
0.00005065126606496051,
0.05742533504962921,
0.05515994504094124,
0.12189967930316925,
-0.006564759183675051,
0.06497154384851456,
-0.0017111472552642226,
-0.014148200862109661,
0.014325611293315887,
0.13577331602573395,
0.008729156106710434,
-0.06817536801099777,
0.06886245310306549,
-0.10699884593486786,
0.05799049884080887,
-0.032185185700654984,
0.026136159896850586,
-0.06436916440725327,
0.047309957444667816,
-0.08881256729364395,
0.1022334173321724,
0.12692421674728394,
-0.01616082526743412,
-0.04222564026713371,
0.03564850240945816,
-0.04873483628034592,
0.025686321780085564,
-0.04924183338880539,
-0.11966890841722488,
-0.15110483765602112,
-0.10557050257921219,
0.05373958498239517,
0.000926103035453707,
-0.14460931718349457,
0.004549428354948759,
-0.11822465062141418,
0.011435422115027905,
-0.08178597688674927,
0.044119615107774734,
0.05396717041730881,
0.01472296193242073,
-0.008938363753259182,
-0.07946302741765976,
0.07084079086780548,
0.09001760929822922,
-0.18080902099609375,
-0.06048384681344032
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Czech
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Czech using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "cs", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("arampacha/wav2vec2-large-xlsr-czech")
model = Wav2Vec2ForCTC.from_pretrained("arampacha/wav2vec2-large-xlsr-czech")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Czech test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "cs", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("arampacha/wav2vec2-large-xlsr-czech")
model = Wav2Vec2ForCTC.from_pretrained("arampacha/wav2vec2-large-xlsr-czech")
model.to("cuda")
chars_to_ignore = [",", "?", ".", "!", "-", ";", ":", '""', "%", "'", '"', "�", '«', '»', '—', '…', '(', ')', '*', '”', '“']
chars_to_ignore_regex = f'[{"".join(chars_to_ignore)}]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
# Note: this model is trained ignoring accents on letters, as shown below
def speech_file_to_array_fn(batch):
    batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower().strip()
    batch["sentence"] = re.sub(re.compile('[äá]'), 'a', batch['sentence'])
    batch["sentence"] = re.sub(re.compile('[öó]'), 'o', batch['sentence'])
    batch["sentence"] = re.sub(re.compile('[èé]'), 'e', batch['sentence'])
    batch["sentence"] = re.sub(re.compile("[ïí]"), 'i', batch['sentence'])
    batch["sentence"] = re.sub(re.compile("[üů]"), 'u', batch['sentence'])
    batch['sentence'] = re.sub(' +', ' ', batch['sentence'])  # collapse repeated spaces
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Run batched inference on the preprocessed speech arrays
def evaluate(batch):
    inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
    pred_ids = torch.argmax(logits, dim=-1)
    batch["pred_strings"] = processor.batch_decode(pred_ids)
    return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 24.56
## Training
The model was trained on the Common Voice `train` and `validation` splits.
The script used for training will be available [here](https://github.com/arampacha/hf-sprint-xlsr) soon.
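A minimal sketch (not from the original card) of how the training splits described above could be loaded with the `datasets` library; the concatenated `train+validation` split string is an assumption about how the two splits were combined.
```python
from datasets import load_dataset

# Common Voice Czech train and validation splits, concatenated for fine-tuning (assumed setup).
train_dataset = load_dataset("common_voice", "cs", split="train+validation")
print(train_dataset)
```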
|
{"language": "cs", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "metrics": "wer", "dataset": "common_voice", "model-index": [{"name": "Czech XLSR Wav2Vec2 Large 53", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice cs", "type": "common_voice", "args": "cs"}, "metrics": [{"type": "wer", "value": 24.56, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
arampacha/wav2vec2-large-xlsr-czech
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"cs",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"cs"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #cs #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Chech
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Czech using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Czech test data of Common Voice.
Test Result: 24.56
## Training
The Common Voice 'train', 'validation'.
The script used for training will be available here soon.
|
[
"# Wav2Vec2-Large-XLSR-53-Chech\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Czech using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Czech test data of Common Voice.\n\n\n\n\nTest Result: 24.56",
"## Training\n\nThe Common Voice 'train', 'validation'.\n\nThe script used for training will be available here soon."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #cs #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Chech\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Czech using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Czech test data of Common Voice.\n\n\n\n\nTest Result: 24.56",
"## Training\n\nThe Common Voice 'train', 'validation'.\n\nThe script used for training will be available here soon."
] |
[
71,
64,
20,
26,
27
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #cs #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Chech\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Czech using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Czech test data of Common Voice.\n\n\n\n\nTest Result: 24.56## Training\n\nThe Common Voice 'train', 'validation'.\n\nThe script used for training will be available here soon."
] |
[
-0.17848896980285645,
-0.015683816745877266,
-0.0008451314060948789,
-0.028655830770730972,
0.11681173741817474,
-0.028369205072522163,
0.1355915367603302,
0.07454129308462143,
0.060524024069309235,
0.022581806406378746,
0.08583618700504303,
0.026863839477300644,
0.00040422790334559977,
0.12149392813444138,
0.032267890870571136,
-0.20088855922222137,
0.0581476092338562,
-0.008140289224684238,
0.07384449988603592,
0.09465772658586502,
0.10068083554506302,
-0.05153806880116463,
-0.039253801107406616,
0.12341354787349701,
-0.12844106554985046,
0.010238049551844597,
0.050343580543994904,
-0.10468073189258575,
0.14360879361629486,
0.06478472054004669,
0.028315788134932518,
0.04211888834834099,
0.08499967306852341,
-0.14045342803001404,
0.014548517763614655,
-0.02590501867234707,
-0.028043055906891823,
0.007840855978429317,
0.050079524517059326,
-0.041444920003414154,
0.21574819087982178,
0.0011752384016290307,
-0.04167682304978371,
0.05371973663568497,
-0.019666137173771858,
-0.20685343444347382,
0.007593276910483837,
-0.03179626911878586,
0.12039104849100113,
0.1511010229587555,
-0.06544064730405807,
0.12458624690771103,
-0.11433949321508408,
0.10226714611053467,
0.023414399474859238,
-0.2591198980808258,
0.0023020138032734394,
0.0713588073849678,
0.07045220583677292,
0.09536199271678925,
-0.028904061764478683,
0.08521232008934021,
0.0449938103556633,
0.012587839737534523,
0.05138256773352623,
-0.020355969667434692,
-0.1291847825050354,
-0.013785460032522678,
-0.18433021008968353,
-0.035171203315258026,
0.16915716230869293,
-0.03462742641568184,
-0.056558601558208466,
-0.13508495688438416,
0.0272687841206789,
-0.004022391512989998,
-0.0275941900908947,
-0.034428223967552185,
-0.010311882011592388,
0.010934359394013882,
0.03670024126768112,
-0.052159085869789124,
-0.09765267372131348,
-0.14184433221817017,
0.04678831622004509,
0.11899203062057495,
0.0015708098653703928,
0.02854808233678341,
-0.07861797511577606,
0.025470422580838203,
-0.054433200508356094,
-0.0382835790514946,
-0.037711963057518005,
0.03939443454146385,
-0.10753609240055084,
0.009895674884319305,
-0.1396101415157318,
-0.08916258811950684,
0.04658322408795357,
-0.036718662828207016,
0.10848807543516159,
0.030528998002409935,
0.03533849120140076,
0.05132607743144035,
0.0201950091868639,
0.1769939661026001,
-0.06699877232313156,
0.012841718271374702,
-0.00044921517837792635,
-0.054557304829359055,
-0.018957920372486115,
-0.02566542476415634,
-0.1603611409664154,
-0.13207587599754333,
0.03595795854926109,
0.07427959889173508,
-0.08132576942443848,
0.032011132687330246,
0.015936966985464096,
-0.01065350603312254,
0.06833377480506897,
-0.058704257011413574,
-0.01533173955976963,
0.07044247537851334,
-0.010866908356547356,
0.16023346781730652,
0.005736339371651411,
0.014020772650837898,
-0.11063018441200256,
0.006579498760402203,
0.022326895967125893,
0.05983685702085495,
-0.05534972622990608,
-0.13660478591918945,
0.02442767843604088,
-0.03878021612763405,
0.005238346755504608,
-0.10014716535806656,
-0.002438995521515608,
-0.07040564715862274,
-0.010433231480419636,
0.03741197660565376,
0.006650492083281279,
-0.13362085819244385,
-0.04258700832724571,
0.011250942014157772,
-0.046253859996795654,
0.0180519949644804,
-0.04532180726528168,
0.02188107743859291,
-0.017331499606370926,
0.06183696165680885,
-0.025083143264055252,
0.035968709737062454,
-0.0810275748372078,
-0.07334185391664505,
0.011384719051420689,
0.09770244359970093,
-0.08388666808605194,
-0.06002294272184372,
-0.04724748432636261,
-0.06890405714511871,
0.010458729229867458,
0.08295713365077972,
0.04425271600484848,
0.09821701049804688,
-0.22011521458625793,
-0.07768943905830383,
0.2007523775100708,
-0.11482332646846771,
-0.05230856314301491,
0.19328579306602478,
0.02857843041419983,
0.03271223604679108,
0.12604594230651855,
0.180109441280365,
0.16581064462661743,
-0.24888648092746735,
0.029990890994668007,
0.057709719985723495,
-0.04807395860552788,
-0.12247465550899506,
0.07637699693441391,
-0.023041315376758575,
0.009791133925318718,
0.0421566404402256,
-0.13837142288684845,
0.08292218297719955,
-0.01577671803534031,
-0.04849068447947502,
-0.014176031574606895,
-0.08650128543376923,
0.03778738155961037,
-0.005444323178380728,
0.03903532773256302,
-0.04538571089506149,
-0.0777144506573677,
0.04431203752756119,
0.10593730956315994,
-0.11082988977432251,
0.07109970599412918,
-0.12173604220151901,
0.08698996901512146,
-0.09805813431739807,
0.006037360988557339,
-0.14359112083911896,
0.14440889656543732,
-0.0049626389518380165,
-0.009136887267231941,
0.032835979014635086,
0.1608683168888092,
0.011812250129878521,
-0.012919188477098942,
-0.01453650277107954,
0.01657090336084366,
0.030509578064084053,
-0.02149074897170067,
-0.033127717673778534,
-0.0820331946015358,
-0.041291244328022,
-0.0714285671710968,
0.1280451864004135,
-0.07213487476110458,
-0.014238882809877396,
0.04036053642630577,
-0.03873787075281143,
-0.0701984167098999,
0.018729113042354584,
0.01566820964217186,
0.10658556967973709,
0.021036969497799873,
0.025299284607172012,
0.07036179304122925,
0.04225412756204605,
-0.0880042314529419,
0.13669537007808685,
-0.13226212561130524,
-0.0745764747262001,
0.12311059236526489,
-0.022670334205031395,
-0.0044318740256130695,
0.022580495104193687,
-0.002677913522347808,
0.025394290685653687,
-0.020163754001259804,
-0.018742084503173828,
0.262639582157135,
-0.013185851275920868,
0.08370914310216904,
-0.0603383406996727,
0.01263588946312666,
0.05917693302035332,
-0.08437185734510422,
0.002038949402049184,
0.04566173255443573,
0.07031025737524033,
0.021867673844099045,
0.0034612531308084726,
-0.07514646649360657,
-0.1434701532125473,
0.19452185928821564,
-0.008224247954785824,
-0.12528939545154572,
0.027637002989649773,
-0.03629805147647858,
0.007280174642801285,
0.12371477484703064,
-0.18156090378761292,
-0.08126818388700485,
0.03548523783683777,
0.039350539445877075,
0.09045686572790146,
-0.10369571298360825,
0.03716687858104706,
0.006907486356794834,
-0.1433439999818802,
-0.1889336109161377,
0.08337484300136566,
-0.0972723513841629,
0.0336008183658123,
-0.06450239568948746,
-0.0531775988638401,
-0.001828666660003364,
-0.0332174189388752,
-0.18321581184864044,
0.11278603225946426,
-0.10769007354974747,
-0.2348777800798416,
-0.17929697036743164,
0.0457257479429245,
0.027720842510461807,
0.041293952614068985,
0.08621867001056671,
-0.12390446662902832,
-0.026999881491065025,
0.014742609113454819,
0.07657910138368607,
-0.013056674040853977,
-0.03179499879479408,
-0.039795882999897,
0.01007271558046341,
0.0401805080473423,
-0.11206483840942383,
0.001935821259394288,
-0.03540944308042526,
-0.026573901996016502,
-0.0031489282846450806,
-0.032373636960983276,
0.07134215533733368,
0.1598919779062271,
0.0013538607163354754,
0.034353144466876984,
-0.027580657973885536,
0.14111728966236115,
-0.09389194846153259,
-0.048853494226932526,
0.16406923532485962,
-0.022088853642344475,
-0.06362801045179367,
0.014964209869503975,
-0.0007367233629338443,
-0.04985799640417099,
-0.009651179425418377,
-0.06106366962194443,
-0.09976576268672943,
-0.17906856536865234,
-0.19303642213344574,
-0.08779595047235489,
-0.043043408542871475,
-0.03717997670173645,
-0.020475931465625763,
0.04620867595076561,
0.009937076829373837,
-0.03377709537744522,
-0.1514575332403183,
0.005630252417176962,
0.01744457706809044,
0.1538819968700409,
0.009874401614069939,
0.08065668493509293,
-0.0595175065100193,
0.000843761779833585,
0.000771392835304141,
-0.019160093739628792,
0.09995004534721375,
0.0699298083782196,
0.1488865166902542,
0.048665206879377365,
0.1368773728609085,
0.08830000460147858,
0.10531680285930634,
-0.027408983558416367,
0.006607913412153721,
0.04303024336695671,
-0.05522814393043518,
-0.04166561737656593,
0.033386725932359695,
0.10328754037618637,
-0.11345735937356949,
-0.08770650625228882,
-0.004010492470115423,
0.040141936391592026,
0.1564708948135376,
0.06099221482872963,
-0.17835813760757446,
-0.12205831706523895,
-0.058967433869838715,
-0.03167401999235153,
-0.015760105103254318,
0.035098545253276825,
0.10818570852279663,
-0.12360577285289764,
0.038707803934812546,
0.017447201535105705,
0.0793650671839714,
0.06872552633285522,
0.04218989238142967,
-0.046925030648708344,
0.022558681666851044,
0.027214709669351578,
0.12066304683685303,
-0.2943657338619232,
0.2051534205675125,
0.004762193188071251,
0.1228560283780098,
-0.10207267850637436,
-0.00026124404394067824,
0.01155080646276474,
0.08602799475193024,
0.1146930456161499,
0.014795711264014244,
-0.012705190107226372,
-0.009401332587003708,
-0.06877122819423676,
0.07595434784889221,
0.0002744477824307978,
0.039757873862981796,
0.06008681282401085,
-0.004375113639980555,
-0.022398153319954872,
0.007688730955123901,
-0.05150260031223297,
-0.053901251405477524,
-0.09310582280158997,
0.009678089059889317,
0.1577443927526474,
0.035749178379774094,
0.02381834387779236,
-0.10428525507450104,
-0.14006611704826355,
0.055299900472164154,
-0.06807311624288559,
-0.03572186827659607,
-0.07456238567829132,
-0.0334964320063591,
0.11270575225353241,
-0.0801391750574112,
0.03847147524356842,
0.08370399475097656,
0.15853822231292725,
-0.035645920783281326,
-0.03643890470266342,
0.04378644749522209,
-0.11876548826694489,
-0.05229215696454048,
0.049900852143764496,
0.1476018875837326,
0.08853952586650848,
0.07593178749084473,
0.0529136136174202,
0.03690502792596817,
-0.008172784000635147,
-0.04496457427740097,
-0.006039096508175135,
0.07481755316257477,
-0.137423574924469,
-0.0366060696542263,
-0.007557700853794813,
-0.145205557346344,
-0.11251699924468994,
-0.03513098508119583,
0.1574721783399582,
0.07836424559354782,
-0.10746947675943375,
0.18291565775871277,
0.14866648614406586,
-0.06123143061995506,
-0.2288551926612854,
0.004195087123662233,
0.09387113153934479,
0.16570617258548737,
-0.07099199295043945,
-0.16001591086387634,
0.053956039249897,
-0.033891696482896805,
-0.04908659681677818,
-0.05756969004869461,
-0.17562001943588257,
-0.14621807634830475,
0.14112472534179688,
-0.03980898857116699,
0.13269300758838654,
0.0263370294123888,
0.006482473574578762,
-0.04182911291718483,
0.023611564189195633,
-0.026863643899559975,
-0.08372586965560913,
0.08272943645715714,
0.0527757853269577,
0.06944534182548523,
0.06148942559957504,
-0.027510374784469604,
0.11004044115543365,
0.07823888212442398,
-0.06212713569402695,
0.003195896977558732,
0.07882922887802124,
0.022634893655776978,
0.0075395056046545506,
0.15560965240001678,
-0.14661827683448792,
0.04103325307369232,
-0.05547236651182175,
-0.11110015958547592,
-0.07850831747055054,
0.09216367453336716,
0.010657965205609798,
-0.013294560834765434,
0.02209441363811493,
-0.03724409639835358,
0.022801827639341354,
-0.00972477812319994,
-0.05539388954639435,
-0.14395646750926971,
-0.03594614937901497,
0.0852816179394722,
0.21806709468364716,
-0.04291258379817009,
0.006012601777911186,
0.028785670176148415,
-0.03625062108039856,
0.12931303679943085,
-0.008612147532403469,
0.021531708538532257,
0.06646490097045898,
0.03359922021627426,
0.05439494550228119,
0.0011569923954084516,
-0.07191107422113419,
0.04012182354927063,
0.031675029546022415,
-0.06063218042254448,
-0.07743049412965775,
0.0015757723012939095,
-0.03083065338432789,
0.009612289257347584,
0.03506019338965416,
0.12404396384954453,
-0.094101682305336,
-0.015169626101851463,
-0.0579879991710186,
-0.023221466690301895,
-0.12976431846618652,
0.17666581273078918,
0.0005055056535638869,
0.06282909214496613,
-0.10508931428194046,
-0.02635834738612175,
-0.05412470921874046,
-0.0938863530755043,
0.05689074471592903,
-0.05875395983457565,
-0.11313382536172867,
-0.07437975704669952,
0.05092552676796913,
0.09425725042819977,
-0.049418505281209946,
-0.14691384136676788,
-0.06684763729572296,
-0.08537491410970688,
0.0012793909991160035,
0.16088014841079712,
0.10075905174016953,
-0.013143057934939861,
-0.12883196771144867,
-0.028697971254587173,
-0.13404682278633118,
0.04317694529891014,
0.046715039759874344,
-0.05589478462934494,
-0.08762308210134506,
0.2236272245645523,
0.039011429995298386,
0.01700252667069435,
-0.06199201941490173,
-0.030223336070775986,
0.0007816986762918532,
0.06536252051591873,
-0.14305876195430756,
-0.017470739781856537,
-0.04941825568675995,
-0.003719503292813897,
0.010213770903646946,
-0.06426335871219635,
-0.009516839869320393,
0.08050192892551422,
-0.09273497015237808,
0.05510348081588745,
-0.04595699906349182,
0.029385924339294434,
-0.04625355824828148,
0.05330144241452217,
-0.005513641983270645,
-0.05031058192253113,
0.07782676070928574,
0.1969105452299118,
-0.07534318417310715,
0.17347697913646698,
-0.19177989661693573,
0.011731804348528385,
0.0606868751347065,
0.0819907933473587,
0.005748342256993055,
-0.07626083493232727,
0.04436860978603363,
0.11746856570243835,
0.06216884031891823,
0.004009997006505728,
0.06753359735012054,
-0.0027734802570194006,
0.009272821247577667,
-0.030182696878910065,
0.005475564859807491,
-0.024293001741170883,
0.05462167412042618,
0.07766179740428925,
0.14662322402000427,
0.17353346943855286,
-0.10842102766036987,
0.08565285801887512,
-0.03204985707998276,
0.02157154679298401,
-0.0742570012807846,
0.015260872431099415,
-0.1713215857744217,
-0.08521953225135803,
0.08258681744337082,
-0.02227178029716015,
0.1357409656047821,
0.007791523821651936,
0.12298615276813507,
-0.00995211023837328,
-0.06366970390081406,
0.06742388010025024,
0.005792443640530109,
0.2541362941265106,
0.04360196739435196,
0.04721047729253769,
-0.028460349887609482,
0.03604593127965927,
-0.001839919132180512,
0.06810660660266876,
0.0008368039270862937,
0.1104283407330513,
0.027592049911618233,
0.12799116969108582,
0.12072284519672394,
-0.05629623308777809,
-0.04048779606819153,
-0.12914711236953735,
-0.14360161125659943,
0.020255841314792633,
-0.0607270710170269,
0.20588411390781403,
0.12035668641328812,
-0.09179999679327011,
0.07953216135501862,
-0.0027945064939558506,
-0.09292219579219818,
-0.1475886106491089,
-0.09363231807947159,
-0.04534098133444786,
-0.16305376589298248,
0.012230584397912025,
-0.08687393367290497,
0.006363235879689455,
-0.007939763367176056,
0.040181610733270645,
-0.03990868851542473,
0.18822816014289856,
-0.03214084357023239,
-0.07582484930753708,
0.08132539689540863,
-0.08182980865240097,
0.019473601132631302,
-0.04558433219790459,
0.010073230601847172,
0.1294623762369156,
-0.015584755688905716,
0.0655585303902626,
0.00008910395263228565,
-0.06655821949243546,
-0.0021258429624140263,
-0.06488610059022903,
-0.04807816073298454,
-0.002820137422531843,
-0.015427788719534874,
0.1077798381447792,
0.18077564239501953,
0.1011795848608017,
-0.09248172491788864,
-0.026319049298763275,
0.11087118089199066,
-0.03979407995939255,
-0.15305033326148987,
-0.16205830872058868,
0.14256688952445984,
0.011631225235760212,
0.048969704657793045,
-0.042696863412857056,
0.008544259704649448,
-0.022092966362833977,
0.25670748949050903,
0.18420766294002533,
0.05879506096243858,
0.006290126591920853,
-0.05112834647297859,
-0.0004651553463190794,
0.021033596247434616,
0.07830174267292023,
0.038810767233371735,
0.23108617961406708,
0.005648798309266567,
-0.009469899348914623,
-0.10291151702404022,
-0.0312562920153141,
-0.002457819879055023,
0.014364110305905342,
-0.0905832052230835,
-0.13861709833145142,
0.010841652750968933,
0.13833744823932648,
-0.06648451834917068,
-0.05144009366631508,
-0.1442507654428482,
-0.019864501431584358,
-0.06984280794858932,
0.02557583525776863,
0.0372881256043911,
0.11206407845020294,
0.09127450734376907,
-0.0792911946773529,
0.010008464567363262,
0.13121330738067627,
-0.010791931301355362,
-0.042580217123031616,
-0.05924829840660095,
0.0387352891266346,
-0.129054993391037,
0.005174733232706785,
0.00047069232095964253,
0.1657256931066513,
0.04429154098033905,
0.12700694799423218,
-0.012389695271849632,
0.13018189370632172,
-0.03192105144262314,
-0.1110977828502655,
0.10214813798666,
0.088203065097332,
-0.07566423714160919,
0.07055997103452682,
0.022517096251249313,
-0.16127590835094452,
0.09045399725437164,
-0.09428834170103073,
0.005051358137279749,
-0.04080691561102867,
0.10201618820428848,
-0.05788751691579819,
0.07587496191263199,
0.12427378445863724,
-0.07337726652622223,
-0.06991078704595566,
-0.02641540952026844,
0.06450801342725754,
0.021018659695982933,
-0.0396789088845253,
-0.03704481199383736,
-0.24632181227207184,
-0.022620877251029015,
-0.1410718858242035,
-0.0781625434756279,
-0.1678238809108734,
-0.009309723973274231,
-0.037010617554187775,
-0.06603184342384338,
0.02927561290562153,
0.011040900833904743,
0.025147706270217896,
-0.0026181149296462536,
0.003806253196671605,
0.09222503751516342,
0.08035282790660858,
0.1306082010269165,
-0.17545819282531738,
-0.13897603750228882
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Ukrainian
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Ukrainian using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset and a sample of the [M-AILABS Ukrainian Corpus](https://www.caito.de/2019/01/the-m-ailabs-speech-dataset/).
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "uk", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("arampacha/wav2vec2-large-xlsr-ukrainian")
model = Wav2Vec2ForCTC.from_pretrained("arampacha/wav2vec2-large-xlsr-ukrainian")
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = torchaudio.transforms.Resample(sampling_rate, 16_000)(speech_array).squeeze().numpy()
    return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Ukrainian test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "uk", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("arampacha/wav2vec2-large-xlsr-ukrainian")
model = Wav2Vec2ForCTC.from_pretrained("arampacha/wav2vec2-large-xlsr-ukrainian")
model.to("cuda")
chars_to_ignore = [",", "?", ".", "!", "-", ";", ":", '""', "%", "'", '"', "�", '«', '»', '—', '…', '(', ')', '*', '”', '“']
chars_to_ignore_regex = f'[{"".join(chars_to_ignore)}]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays and normalize characters
def speech_file_to_array_fn(batch):
    batch["sentence"] = re.sub(re.compile("['`]"), '’', batch['sentence'])
    batch["sentence"] = re.sub(re.compile(chars_to_ignore_regex), '', batch["sentence"]).lower().strip()
    batch["sentence"] = re.sub(re.compile('i'), 'і', batch['sentence'])
    batch["sentence"] = re.sub(re.compile('o'), 'о', batch['sentence'])
    batch["sentence"] = re.sub(re.compile('a'), 'а', batch['sentence'])
    batch["sentence"] = re.sub(re.compile('ы'), 'и', batch['sentence'])
    batch["sentence"] = re.sub(re.compile("–"), '', batch['sentence'])
    batch['sentence'] = re.sub(' +', ' ', batch['sentence'])  # collapse repeated spaces
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = torchaudio.transforms.Resample(sampling_rate, 16_000)(speech_array).squeeze().numpy()
    return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
def evaluate(batch):
    inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
    pred_ids = torch.argmax(logits, dim=-1)
    batch["pred_strings"] = processor.batch_decode(pred_ids)
    return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 29.89
## Training
The model was trained on the Common Voice `train` and `validation` splits and a sample of the M-AILABS Ukrainian corpus.
The script used for training will be available [here](https://github.com/arampacha/hf-sprint-xlsr) soon.
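As a rough illustration (not from the original card), the Common Voice part of the training data could be loaded as below; the `train+validation` split string is an assumption, and the M-AILABS sample mentioned above would have to be prepared and concatenated separately.
```python
from datasets import load_dataset

# Common Voice Ukrainian train and validation splits (assumed setup);
# the M-AILABS Ukrainian sample is not available through this loader and is omitted here.
train_dataset = load_dataset("common_voice", "uk", split="train+validation")
print(train_dataset)
```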
|
{"language": "uk", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "metrics": "wer", "dataset": "common_voice", "model-index": [{"name": "Ukrainian XLSR Wav2Vec2 Large 53", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice uk", "type": "common_voice", "args": "uk"}, "metrics": [{"type": "wer", "value": 29.89, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
arampacha/wav2vec2-large-xlsr-ukrainian
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"uk",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"uk"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #uk #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Ukrainian
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Ukrainian using the Common Voice and sample of M-AILABS Ukrainian Corpus datasets.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Ukrainian test data of Common Voice.
Test Result: 29.89
## Training
The Common Voice 'train', 'validation' and the M-AILABS Ukrainian corpus.
The script used for training will be available here soon.
|
[
"# Wav2Vec2-Large-XLSR-53-Ukrainian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Ukrainian using the Common Voice and sample of M-AILABS Ukrainian Corpus datasets.\n\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Ukrainian test data of Common Voice.\n\n\n\nTest Result: 29.89",
"## Training\n\nThe Common Voice 'train', 'validation' and the M-AILABS Ukrainian corpus.\n\nThe script used for training will be available here soon."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #uk #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Ukrainian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Ukrainian using the Common Voice and sample of M-AILABS Ukrainian Corpus datasets.\n\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Ukrainian test data of Common Voice.\n\n\n\nTest Result: 29.89",
"## Training\n\nThe Common Voice 'train', 'validation' and the M-AILABS Ukrainian corpus.\n\nThe script used for training will be available here soon."
] |
[
71,
79,
20,
27,
37
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #uk #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Ukrainian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Ukrainian using the Common Voice and sample of M-AILABS Ukrainian Corpus datasets.\n\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Ukrainian test data of Common Voice.\n\n\n\nTest Result: 29.89## Training\n\nThe Common Voice 'train', 'validation' and the M-AILABS Ukrainian corpus.\n\nThe script used for training will be available here soon."
] |
[
-0.15776360034942627,
0.006267589516937733,
-0.005296268500387669,
-0.06046145409345627,
0.1178831160068512,
-0.046480920165777206,
0.19585223495960236,
0.09804362803697586,
0.06975635141134262,
0.0013508818810805678,
0.014821887947618961,
-0.018387436866760254,
0.046648234128952026,
0.0914294645190239,
0.07922690361738205,
-0.18634770810604095,
0.027446327731013298,
-0.04284267500042915,
0.010611564852297306,
0.04945977032184601,
0.1417289674282074,
-0.034917283803224564,
-0.05509968474507332,
0.04903339967131615,
-0.06897290050983429,
0.06670450419187546,
0.06534191966056824,
-0.14189626276493073,
0.08188819885253906,
0.0402543731033802,
0.021068427711725235,
-0.0013659622054547071,
0.054464418441057205,
-0.1917840838432312,
0.029190881177783012,
0.0017981318524107337,
0.019416632130742073,
0.012355761602520943,
0.1153002455830574,
-0.03897811472415924,
0.2529103755950928,
-0.028929581865668297,
-0.05949172377586365,
0.07391858100891113,
-0.04699695110321045,
-0.20140045881271362,
-0.020422937348484993,
0.06770258396863937,
0.09830433875322342,
0.18861457705497742,
-0.07197090238332748,
0.15131638944149017,
-0.01100863330066204,
0.08544323593378067,
0.09922490268945694,
-0.13438284397125244,
0.024113252758979797,
0.01851225085556507,
0.01084807887673378,
0.10247673839330673,
-0.019137142226099968,
0.0481693297624588,
-0.02757945843040943,
-0.009387131780385971,
-0.034781813621520996,
0.005226167384535074,
-0.16755886375904083,
-0.01934942416846752,
-0.16226531565189362,
-0.07215693593025208,
0.24834875762462616,
-0.0432572178542614,
-0.01898411475121975,
-0.08941875398159027,
0.040269043296575546,
0.03526250645518303,
-0.038884855806827545,
-0.01070675253868103,
-0.025600168853998184,
-0.007437422405928373,
0.01920267939567566,
-0.09512705355882645,
-0.12782129645347595,
-0.07803253829479218,
0.010245459154248238,
0.20594894886016846,
0.033584095537662506,
0.04105981066823006,
-0.08136333525180817,
0.02816057577729225,
-0.06330585479736328,
-0.04258555546402931,
0.007211907766759396,
0.06976932287216187,
-0.1020982563495636,
0.007228748872876167,
-0.08817888796329498,
-0.13388413190841675,
0.06852046400308609,
-0.04926661401987076,
-0.03531908616423607,
-0.009819223545491695,
0.0718081071972847,
0.07213158905506134,
-0.009493274614214897,
0.046415142714977264,
-0.05608111247420311,
0.04840530827641487,
-0.009214334189891815,
-0.04784427955746651,
-0.029412904754281044,
0.030582979321479797,
-0.08940888196229935,
-0.1204533725976944,
0.012664970010519028,
0.09833717346191406,
-0.020671293139457703,
-0.02811632864177227,
0.013106542639434338,
-0.04849463701248169,
0.029011033475399017,
-0.11900308728218079,
-0.044963642954826355,
0.026710594072937965,
-0.052177973091602325,
0.07338664680719376,
0.022259002551436424,
0.03125176951289177,
-0.1566023826599121,
-0.03498552367091179,
0.06326945126056671,
0.08756820112466812,
-0.02212345600128174,
-0.08789659291505814,
0.03158501908183098,
0.029535261914134026,
-0.004755796864628792,
-0.1241544559597969,
-0.043050624430179596,
-0.059489645063877106,
-0.017756786197423935,
0.029185473918914795,
0.05181580036878586,
-0.055046044290065765,
-0.03113286755979061,
-0.029548082500696182,
-0.04699549451470375,
0.08506873995065689,
-0.06213017925620079,
0.05620856583118439,
0.014086209237575531,
0.05973722040653229,
0.028057103976607323,
0.06637226045131683,
-0.08072900772094727,
-0.1352311372756958,
0.000584190827794373,
0.09167647361755371,
-0.05694509670138359,
-0.059953443706035614,
-0.06559061259031296,
-0.0212248582392931,
-0.005363290663808584,
0.07681430876255035,
0.04849746823310852,
0.08235753327608109,
-0.13356252014636993,
-0.09054930508136749,
0.1866965889930725,
-0.12171512842178345,
-0.017877044156193733,
0.1827511042356491,
0.0225814376026392,
0.015176893211901188,
0.14468532800674438,
0.19843310117721558,
0.1290784329175949,
-0.23083458840847015,
0.008021097630262375,
-0.015336166135966778,
-0.07736534625291824,
-0.11689215153455734,
0.14461566507816315,
-0.04494737088680267,
0.043497972190380096,
0.039834726601839066,
-0.11373231559991837,
0.07703596353530884,
-0.02049744315445423,
-0.08399409055709839,
-0.0014431874733418226,
-0.0660814568400383,
-0.05656055361032486,
0.025541676208376884,
0.008217710070312023,
-0.09749439358711243,
-0.06184237450361252,
0.00967884436249733,
0.13341274857521057,
-0.053083185106515884,
0.0762234777212143,
-0.11338561028242111,
0.027495166286826134,
-0.025460589677095413,
0.03083708882331848,
-0.16111886501312256,
0.17347672581672668,
-0.03972742334008217,
-0.02446945384144783,
0.13058854639530182,
0.06530481576919556,
-0.007508109323680401,
0.02877148799598217,
-0.040065474808216095,
0.022367188706994057,
0.03479958325624466,
-0.05858486890792847,
-0.05865807831287384,
-0.12442409247159958,
0.01964203640818596,
-0.035969145596027374,
0.024694154039025307,
-0.06762687861919403,
-0.010316180065274239,
-0.03592122718691826,
0.03952876478433609,
-0.02468048594892025,
0.009766076691448689,
0.04143848270177841,
0.08624988049268723,
-0.00978069193661213,
0.04967126250267029,
0.07833734899759293,
-0.012372056022286415,
-0.03662918135523796,
0.14938701689243317,
-0.13995695114135742,
-0.07695930451154709,
0.1378079354763031,
0.018871905282139778,
0.003544757142663002,
-0.0007836798904463649,
-0.005408806726336479,
0.0077619291841983795,
-0.007517009507864714,
-0.010824953205883503,
0.2240859568119049,
-0.01325990166515112,
0.08418222516775131,
-0.05626502260565758,
0.0010328004136681557,
0.03202475607395172,
-0.0994655042886734,
-0.05077280476689339,
0.0677236020565033,
-0.04755181446671486,
-0.014153962954878807,
-0.003318479750305414,
-0.10578753799200058,
-0.07482120394706726,
0.2820279598236084,
-0.03391198068857193,
-0.14015528559684753,
0.018204720690846443,
-0.030596377328038216,
0.0333053395152092,
0.1424700766801834,
-0.18367530405521393,
-0.06881942600011826,
0.046872131526470184,
0.03831809014081955,
0.09322924166917801,
-0.11571135371923447,
0.011321920901536942,
-0.023451989516615868,
-0.16414965689182281,
-0.13796815276145935,
0.06459319591522217,
-0.04977914318442345,
0.07692617923021317,
-0.09092768281698227,
0.03362679481506348,
-0.07047152519226074,
-0.09157656878232956,
-0.16800662875175476,
0.0711217075586319,
-0.10629294067621231,
-0.18611042201519012,
-0.20483370125293732,
0.026366764679551125,
0.03184495493769646,
0.041119009256362915,
0.09693723917007446,
-0.14855337142944336,
0.00500560412183404,
0.0008371190051548183,
0.18434184789657593,
-0.0052228160202503204,
-0.03980904817581177,
-0.04475874453783035,
0.13265030086040497,
0.061664700508117676,
-0.10379643738269806,
0.003608301980420947,
-0.05378216505050659,
-0.009555850178003311,
0.02175051160156727,
-0.017381910234689713,
0.04732668399810791,
0.10649137198925018,
-0.040144264698028564,
0.020370693877339363,
-0.033364858478307724,
0.07376579940319061,
-0.12556196749210358,
-0.06575217843055725,
0.20688410103321075,
0.05731230601668358,
-0.054640211164951324,
0.04327605292201042,
-0.0064735934138298035,
-0.013731728307902813,
0.038692034780979156,
-0.06018359586596489,
-0.07017495483160019,
-0.2532497048377991,
-0.19902412593364716,
-0.055763766169548035,
-0.08023882657289505,
-0.03874773904681206,
-0.020323170349001884,
-0.02920859307050705,
-0.0009262891835533082,
-0.12897460162639618,
-0.15230673551559448,
0.07194960117340088,
0.02540980838239193,
0.07642599195241928,
0.04671715945005417,
0.05609406158328056,
-0.06929755955934525,
-0.004833289887756109,
0.07296916097402573,
-0.0011013606563210487,
0.1835925132036209,
0.046195078641176224,
0.08515465259552002,
0.07657437771558762,
0.11209052801132202,
0.04131621494889259,
0.05244337022304535,
0.039934441447257996,
0.02662048302590847,
0.041836366057395935,
-0.10287519544363022,
0.046066779643297195,
0.029297519475221634,
0.11090230941772461,
-0.0834452286362648,
-0.08522073179483414,
0.008965646848082542,
0.026408573612570763,
0.13763926923274994,
0.07325691729784012,
-0.13751503825187683,
-0.13796205818653107,
-0.016901792958378792,
-0.0670270249247551,
0.025566160678863525,
0.023106878623366356,
0.06390751898288727,
-0.11189690977334976,
0.04203749820590019,
0.027794305235147476,
0.08681308478116989,
0.04755415767431259,
0.039399653673172,
-0.06320643424987793,
0.06526874750852585,
0.007886727340519428,
0.14406456053256989,
-0.29406991600990295,
0.27995678782463074,
0.007868963293731213,
0.16102050244808197,
-0.06630636751651764,
-0.017451444640755653,
0.06731636077165604,
-0.03615785017609596,
0.08627086877822876,
0.012489506043493748,
-0.09365266561508179,
-0.14534756541252136,
-0.0924074724316597,
0.04454211890697479,
0.033465128391981125,
0.027302509173750877,
0.10410669445991516,
-0.000673870847094804,
0.008940457366406918,
-0.018777349963784218,
-0.05484715849161148,
-0.19392085075378418,
-0.09174651652574539,
0.008778066374361515,
0.16735130548477173,
0.06001322716474533,
-0.011276155710220337,
-0.09586554765701294,
-0.054353829473257065,
-0.043573569506406784,
-0.10783085227012634,
-0.09519185870885849,
-0.06272842735052109,
0.007516541983932257,
0.11463575810194016,
-0.10018879920244217,
-0.06389468908309937,
0.08020901679992676,
0.16381429135799408,
-0.0687062218785286,
-0.0594276525080204,
0.032480638474226,
-0.10139944404363632,
-0.08847366273403168,
0.01674356311559677,
0.1906803995370865,
0.1294541358947754,
0.08026258647441864,
0.008191751316189766,
0.0429498590528965,
0.008134633302688599,
-0.01011374406516552,
0.030333539471030235,
-0.0054391114972531796,
-0.02558000199496746,
0.11745096743106842,
-0.05975718051195145,
-0.270566463470459,
-0.21187163889408112,
-0.1271631121635437,
0.11561143398284912,
0.13884282112121582,
-0.06618881225585938,
0.164091095328331,
0.18329143524169922,
-0.095295250415802,
-0.22283633053302765,
-0.04409962520003319,
0.10477030277252197,
0.12260285019874573,
-0.031808823347091675,
-0.2081717848777771,
0.027252336964011192,
-0.009620082564651966,
-0.03054378367960453,
-0.03659828007221222,
-0.22048111259937286,
-0.1835755854845047,
0.12630674242973328,
-0.0062720621936023235,
0.04972216114401817,
-0.02773442678153515,
0.009954087436199188,
-0.059954915195703506,
0.053377825766801834,
-0.007888132706284523,
-0.07889826595783234,
0.1149214580655098,
0.023474151268601418,
0.08674803376197815,
0.052591606974601746,
-0.0073617505840957165,
0.10181873291730881,
0.08574347198009491,
-0.017506277188658714,
-0.02884368598461151,
0.10376480221748352,
0.11054746061563492,
0.005434185266494751,
0.17858988046646118,
-0.011115806177258492,
0.013045480474829674,
-0.14086423814296722,
-0.08403318375349045,
-0.07111234217882156,
0.07833847403526306,
0.021007992327213287,
-0.008540326729416847,
-0.011343605816364288,
0.0014562857104465365,
-0.006564402487128973,
-0.027594424784183502,
0.045481059700250626,
-0.06806018948554993,
0.03769144415855408,
0.055035993456840515,
0.15509045124053955,
0.00002878298437281046,
-0.0394119918346405,
0.0196754839271307,
-0.022812839597463608,
0.09196105599403381,
-0.008507657796144485,
-0.02036784030497074,
0.03802783787250519,
0.023605747148394585,
0.025280488654971123,
-0.03547196090221405,
-0.13878564536571503,
0.06391440331935883,
0.06487483531236649,
-0.05690985918045044,
-0.08539991825819016,
-0.03450584411621094,
0.04807453975081444,
0.027763521298766136,
-0.003236830933019519,
0.08875581622123718,
-0.10417801141738892,
0.0035553404595702887,
-0.041528813540935516,
-0.008181232027709484,
-0.10288949310779572,
0.2159554362297058,
0.0346069373190403,
0.0632588118314743,
-0.08693066239356995,
0.09724755585193634,
0.01907411217689514,
-0.09767995774745941,
0.045454222708940506,
0.03908146545290947,
-0.09904477000236511,
-0.08274795860052109,
0.01850605197250843,
0.09141900390386581,
-0.07992346584796906,
-0.15123315155506134,
-0.06967175751924515,
-0.06764055043458939,
0.03892822191119194,
0.041881024837493896,
0.0845513716340065,
-0.07839394360780716,
-0.060836587101221085,
-0.017762871459126472,
-0.10203345865011215,
0.1046948954463005,
0.11558143049478531,
-0.06849659234285355,
-0.0507044643163681,
0.2063080370426178,
0.040706466883420944,
0.014090792275965214,
-0.048261214047670364,
-0.059731774032115936,
0.019870521500706673,
0.10248596221208572,
-0.1383647322654724,
-0.004272621124982834,
-0.1092151626944542,
-0.03888487443327904,
-0.016230667009949684,
-0.0968344658613205,
-0.010453679598867893,
0.10401347279548645,
-0.05384264513850212,
0.036582328379154205,
-0.08558869361877441,
0.07384486496448517,
-0.0724950060248375,
0.021401934325695038,
-0.029223604127764702,
-0.021738778799772263,
0.08173394203186035,
0.19333596527576447,
-0.06966492533683777,
0.15529677271842957,
-0.12980559468269348,
-0.0184987373650074,
0.021352525800466537,
0.08043450862169266,
-0.03715825453400612,
-0.11294069141149521,
0.04467989131808281,
0.09261903911828995,
0.12429675459861755,
0.00891275517642498,
0.10532421618700027,
-0.04946776106953621,
-0.024517696350812912,
-0.06900317221879959,
0.03957298398017883,
-0.04021892696619034,
0.09214714914560318,
0.060815420001745224,
0.10669904202222824,
0.2227116972208023,
-0.16303768754005432,
0.06714002043008804,
-0.06629082560539246,
0.034590303897857666,
-0.0665336549282074,
-0.017863493412733078,
-0.16437575221061707,
-0.029530523344874382,
0.07649146020412445,
-0.04444529861211777,
0.12629900872707367,
0.04853987321257591,
0.13280242681503296,
0.07253489643335342,
-0.0665230005979538,
0.0164823979139328,
0.004947919398546219,
0.19050820171833038,
0.0761466696858406,
0.020861685276031494,
-0.034475695341825485,
0.0383606031537056,
0.029948381707072258,
0.03355317935347557,
0.015638135373592377,
0.20894144475460052,
0.13159389793872833,
0.11348937451839447,
0.09464441239833832,
-0.058901481330394745,
-0.04086488112807274,
-0.10881729423999786,
-0.09455233067274094,
-0.017684364691376686,
-0.04106561467051506,
0.13301871716976166,
0.22108210623264313,
-0.14126333594322205,
0.092890165746212,
0.012471672147512436,
-0.08180024474859238,
-0.10743492096662521,
-0.14943015575408936,
-0.027305161580443382,
-0.09216401726007462,
-0.009327852167189121,
-0.09551672637462616,
0.008792865090072155,
0.05202634260058403,
0.05700662359595299,
-0.05662411078810692,
0.11813456565141678,
0.00837018620222807,
-0.1181112602353096,
0.09670372307300568,
-0.07642240077257156,
0.04827120527625084,
-0.005367566365748644,
0.034205351024866104,
0.17016315460205078,
-0.08196072280406952,
0.0717615857720375,
0.07180175185203552,
-0.09260497987270355,
0.018059350550174713,
-0.07945790141820908,
-0.09267923980951309,
0.020262112841010094,
-0.006637652404606342,
0.13937675952911377,
0.19343486428260803,
0.09924758225679398,
-0.07254671305418015,
0.004918200429528952,
0.17974428832530975,
-0.0026802702341228724,
-0.16076116263866425,
-0.07039370387792587,
0.15011727809906006,
0.0506361685693264,
0.014975656755268574,
-0.01824374869465828,
-0.043625541031360626,
-0.08067835867404938,
0.22186845541000366,
0.23034529387950897,
0.08812279999256134,
0.00808790698647499,
-0.04147079959511757,
0.0022764832247048616,
-0.011291585862636566,
0.09807610511779785,
0.038222577422857285,
0.2136271893978119,
0.026527604088187218,
0.021038172766566277,
-0.11200734972953796,
-0.014453374780714512,
-0.09108832478523254,
0.01566910557448864,
-0.06531193852424622,
-0.11578090488910675,
0.03491413593292236,
0.14233426749706268,
-0.05804026126861572,
-0.14767412841320038,
-0.15721407532691956,
-0.031066281720995903,
-0.07369492202997208,
0.018316563218832016,
0.05918911099433899,
0.154801145195961,
0.05569016933441162,
-0.03449402004480362,
0.056552499532699585,
0.024968033656477928,
0.02009214460849762,
-0.09216540306806564,
-0.07394566386938095,
0.030151864513754845,
-0.04231121763586998,
0.01947270892560482,
-0.005576144903898239,
0.20812244713306427,
-0.006012342404574156,
0.06103702262043953,
-0.04146184027194977,
0.14449913799762726,
-0.01666058413684368,
-0.025462985038757324,
0.10418716073036194,
0.13968485593795776,
-0.04995633661746979,
0.06581319868564606,
0.08122552931308746,
-0.10471519827842712,
0.030804932117462158,
-0.08727781474590302,
0.009635024704039097,
-0.06812892854213715,
0.12179134786128998,
-0.08015530556440353,
0.04109546169638634,
0.12290873378515244,
-0.04506431519985199,
-0.04464085400104523,
-0.03587232902646065,
0.034539077430963516,
0.08230326324701309,
-0.0016748362686485052,
-0.01791713386774063,
-0.21957212686538696,
-0.03806929662823677,
-0.1425921469926834,
-0.0009761675028130412,
-0.22679853439331055,
-0.013117263093590736,
-0.026433173567056656,
-0.070423923432827,
0.022297613322734833,
0.07626691460609436,
0.11547207087278366,
0.02367766574025154,
-0.0004149387532379478,
-0.05962133780121803,
0.10177673399448395,
0.10923019051551819,
-0.19941668212413788,
-0.14124727249145508
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-xls-r-1b-hy-cv
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - HY-AM dataset.
It achieves the following results on the evaluation set:
- Loss: **0.4521**
- Wer: **0.5141**
- Cer: **0.1100**
- Wer+LM: **0.2756**
- Cer+LM: **0.0866**
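
The WER+LM / CER+LM figures come from decoding with an external language model; the bare checkpoint can be tried directly with the `transformers` ASR pipeline. A minimal inference sketch (the audio file name is an assumption; input should be 16 kHz mono speech):

```python
from transformers import pipeline

# Greedy CTC decoding without the external language model.
asr = pipeline(
    "automatic-speech-recognition",
    model="arampacha/wav2vec2-xls-r-1b-hy-cv",
)
result = asr("sample_hy.wav")  # hypothetical path to a 16 kHz mono recording
print(result["text"])
```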
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 8e-05
- train_batch_size: 16
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
- lr_scheduler_type: tristage
- lr_scheduler_ratios: [0.1, 0.4, 0.5]
- training_steps: 1400
- mixed_precision_training: Native AMP
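
`tristage` is not one of the built-in `transformers` schedulers; with ratios [0.1, 0.4, 0.5] it warms up for 10% of the steps, holds the peak learning rate for 40%, and decays over the remaining 50%. A rough PyTorch sketch of that schedule (the exponential decay shape and final scale are assumptions, not taken from the training script):

```python
import torch

def tristage_lambda(total_steps, ratios=(0.1, 0.4, 0.5), final_scale=0.05):
    warmup = int(total_steps * ratios[0])
    hold = int(total_steps * ratios[1])
    decay = max(1, total_steps - warmup - hold)

    def lr_lambda(step):
        if step < warmup:
            return step / max(1, warmup)          # linear warmup to peak LR
        if step < warmup + hold:
            return 1.0                            # hold at peak LR
        progress = (step - warmup - hold) / decay
        return final_scale ** min(progress, 1.0)  # exponential decay

    return lr_lambda

# Dummy parameter just to make the sketch runnable end to end.
params = [torch.zeros(1, requires_grad=True)]
optimizer = torch.optim.Adam(params, lr=8e-5, betas=(0.9, 0.98), eps=1e-8)
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, tristage_lambda(1400))
```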
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|
| 6.1298 | 19.87 | 100 | 3.1204 | 1.0 | 1.0 |
| 2.7269 | 39.87 | 200 | 0.6200 | 0.7592 | 0.1755 |
| 1.4643 | 59.87 | 300 | 0.4796 | 0.5921 | 0.1277 |
| 1.1242 | 79.87 | 400 | 0.4637 | 0.5359 | 0.1145 |
| 0.9592 | 99.87 | 500 | 0.4521 | 0.5141 | 0.1100 |
| 0.8704 | 119.87 | 600 | 0.4736 | 0.4914 | 0.1045 |
| 0.7908 | 139.87 | 700 | 0.5394 | 0.5250 | 0.1124 |
| 0.7049 | 159.87 | 800 | 0.4822 | 0.4754 | 0.0985 |
| 0.6299 | 179.87 | 900 | 0.4890 | 0.4809 | 0.1028 |
| 0.5832 | 199.87 | 1000 | 0.5233 | 0.4813 | 0.1028 |
| 0.5145 | 219.87 | 1100 | 0.5350 | 0.4781 | 0.0994 |
| 0.4604 | 239.87 | 1200 | 0.5223 | 0.4715 | 0.0984 |
| 0.4226 | 259.87 | 1300 | 0.5167 | 0.4625 | 0.0953 |
| 0.3946 | 279.87 | 1400 | 0.5248 | 0.4614 | 0.0950 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2.dev0
- Tokenizers 0.11.0
|
{"language": ["hy"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_8_0", "generated_from_trainer", "robust-speech-event", "hy", "hf-asr-leaderboard"], "datasets": ["mozilla-foundation/common_voice_8_0"], "model-index": [{"name": "wav2vec2-xls-r-1b-hy-cv", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice hy-AM", "type": "mozilla-foundation/common_voice_8_0", "args": "hy-AM"}, "metrics": [{"type": "wer", "value": 0.2755659640905542, "name": "WER LM"}, {"type": "cer", "value": 0.08659585230146687, "name": "CER LM"}]}]}]}
|
automatic-speech-recognition
|
arampacha/wav2vec2-xls-r-1b-hy-cv
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_8_0",
"generated_from_trainer",
"robust-speech-event",
"hy",
"hf-asr-leaderboard",
"dataset:mozilla-foundation/common_voice_8_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"hy"
] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #robust-speech-event #hy #hf-asr-leaderboard #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on the MOZILLA-FOUNDATION/COMMON\_VOICE\_8\_0 - HY-AM dataset.
It achieves the following results on the evaluation set:
* Loss: 0.4521
* Wer: 0.5141
* Cer: 0.1100
* Wer+LM: 0.2756
* Cer+LM: 0.0866
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 8e-05
* train\_batch\_size: 16
* eval\_batch\_size: 64
* seed: 42
* gradient\_accumulation\_steps: 8
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
* lr\_scheduler\_type: tristage
* lr\_scheduler\_ratios: [0.1, 0.4, 0.5]
* training\_steps: 1400
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2+cu102
* Datasets 1.18.2.dev0
* Tokenizers 0.11.0
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 8e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 64\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: tristage\n* lr\\_scheduler\\_ratios: [0.1, 0.4, 0.5]\n* training\\_steps: 1400\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #robust-speech-event #hy #hf-asr-leaderboard #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 8e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 64\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: tristage\n* lr\\_scheduler\\_ratios: [0.1, 0.4, 0.5]\n* training\\_steps: 1400\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0"
] |
[
115,
162,
4,
39
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #robust-speech-event #hy #hf-asr-leaderboard #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 8e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 64\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: tristage\n* lr\\_scheduler\\_ratios: [0.1, 0.4, 0.5]\n* training\\_steps: 1400\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0"
] |
[
-0.12427755445241928,
0.0939052551984787,
-0.006090680602937937,
0.04269460588693619,
0.1118260994553566,
0.019063856452703476,
0.09133483469486237,
0.16858915984630585,
-0.08584283292293549,
0.09327121078968048,
0.06354662775993347,
0.06338301301002502,
0.08293169736862183,
0.0890369638800621,
-0.016006357967853546,
-0.28210166096687317,
0.015954555943608284,
-0.017542613670229912,
-0.11869984865188599,
0.08891072869300842,
0.10079743713140488,
-0.09753147512674332,
0.021654359996318817,
0.020064741373062134,
-0.07069739699363708,
0.016169417649507523,
-0.030920611694455147,
-0.046904485672712326,
0.09025036543607712,
0.039545994251966476,
0.05398952215909958,
0.035757288336753845,
0.0913311317563057,
-0.24100971221923828,
0.008491283282637596,
0.07774943113327026,
0.036802228540182114,
0.06379572302103043,
0.10419979691505432,
0.008325825445353985,
0.10396881401538849,
-0.06328445672988892,
0.037663549184799194,
0.057431962341070175,
-0.09658979624509811,
-0.271047979593277,
-0.0941615179181099,
0.033576417714357376,
0.12622302770614624,
0.08239316195249557,
-0.02820943109691143,
0.02745162695646286,
-0.08289185166358948,
0.08399564027786255,
0.2231408804655075,
-0.18670888245105743,
-0.06501288712024689,
-0.022709187120199203,
0.03179818019270897,
0.03039398603141308,
-0.10017214715480804,
-0.020692121237516403,
0.016863994300365448,
0.01903500221669674,
0.09011241048574448,
0.014956815168261528,
0.018440838903188705,
0.003888859413564205,
-0.12973593175411224,
-0.05937604233622551,
0.12075234949588776,
0.0682104229927063,
0.0026314666029065847,
-0.09983506053686142,
-0.030732527375221252,
-0.17690899968147278,
-0.05043875798583031,
0.038152143359184265,
0.01812787726521492,
-0.0417998768389225,
-0.0126902861520648,
0.021498242393136024,
-0.05108681693673134,
-0.07912731170654297,
0.07132575660943985,
0.08781229704618454,
0.05556125566363335,
-0.03221923112869263,
-0.004542355425655842,
0.11091461777687073,
0.065055251121521,
-0.17349284887313843,
-0.021393505856394768,
0.043010082095861435,
-0.10892385989427567,
0.0010842380579560995,
-0.006770037580281496,
0.004895061254501343,
0.04521822929382324,
0.1095847487449646,
-0.031158773228526115,
0.09701484441757202,
-0.0012355365324765444,
0.0204470157623291,
-0.08883070200681686,
0.1632547378540039,
-0.07121371477842331,
-0.04950492084026337,
-0.034253574907779694,
0.13761043548583984,
-0.004645611625164747,
-0.010269686579704285,
-0.07016249746084213,
0.02512456104159355,
0.10103625804185867,
0.05041653290390968,
-0.011497916653752327,
0.025902055203914642,
-0.06183028221130371,
-0.026945916935801506,
0.012899118475615978,
-0.1372157484292984,
0.0369432270526886,
0.06206503510475159,
-0.08387308567762375,
-0.007253972813487053,
-0.005976811051368713,
0.0019166928250342607,
-0.0550093799829483,
0.08093607425689697,
-0.042536985129117966,
0.00029809746774844825,
-0.08879666030406952,
-0.08829426765441895,
0.037039101123809814,
-0.04325659200549126,
-0.005985306575894356,
-0.04951179027557373,
-0.1163831576704979,
-0.06467978656291962,
0.04667314514517784,
-0.07702868431806564,
-0.07085946202278137,
-0.06337481737136841,
-0.09538348764181137,
0.05541537329554558,
-0.026449380442500114,
0.16322560608386993,
-0.05475806072354317,
0.08254005014896393,
0.03393308445811272,
0.04096435010433197,
0.09612661600112915,
0.07359568774700165,
-0.010662099346518517,
0.06313726305961609,
-0.1405111402273178,
0.10273145884275436,
-0.12095066159963608,
0.06123458594083786,
-0.11673369258642197,
-0.10818707942962646,
-0.009046551771461964,
-0.002864454174414277,
0.10416844487190247,
0.11586907505989075,
-0.16780516505241394,
-0.08904467523097992,
0.15130172669887543,
-0.05607721209526062,
-0.07438182830810547,
0.1165318489074707,
-0.011908363550901413,
-0.04398170858621597,
0.01953248679637909,
0.1730402708053589,
0.14986030757427216,
-0.09527826309204102,
0.016865933313965797,
-0.04546995833516121,
0.1197376698255539,
0.05079503357410431,
0.07438185811042786,
-0.029484476894140244,
0.06142827123403549,
0.00881196465343237,
-0.03248262405395508,
0.056688226759433746,
-0.06753692775964737,
-0.08114583790302277,
-0.020837603136897087,
-0.06532017141580582,
-0.0015074852854013443,
0.07190243154764175,
0.019193602725863457,
-0.06689407676458359,
-0.12942779064178467,
0.02735961601138115,
0.1125175878405571,
-0.10078930109739304,
0.021054282784461975,
-0.08053971827030182,
0.06832332164049149,
0.00438154861330986,
-0.001350183505564928,
-0.1445036083459854,
-0.009453793056309223,
0.03147585317492485,
-0.059243373572826385,
0.003652272280305624,
0.011788986623287201,
0.07125360518693924,
0.03232793137431145,
-0.04198283329606056,
-0.07458676397800446,
-0.05871176719665527,
-0.00011696536967065185,
-0.0400458499789238,
-0.23051026463508606,
-0.08248307555913925,
-0.014220372773706913,
0.17306146025657654,
-0.20904728770256042,
0.015212017111480236,
0.06619422137737274,
0.15089166164398193,
0.010858429595828056,
-0.036246441304683685,
0.015237818472087383,
0.05367046967148781,
-0.01827302761375904,
-0.07732795923948288,
0.028652919456362724,
0.004347709473222494,
-0.08677201718091965,
0.015593177638947964,
-0.11677703261375427,
0.06374956667423248,
0.07815703749656677,
0.005772389471530914,
-0.05452448129653931,
-0.022903725504875183,
-0.05770895257592201,
-0.06800311803817749,
-0.015160233713686466,
-0.027784453704953194,
0.16641102731227875,
0.017771262675523758,
0.12368144094944,
-0.07981064915657043,
-0.06138351559638977,
0.03718657046556473,
0.025003934279084206,
-0.0035817790776491165,
0.15236356854438782,
0.04420220106840134,
-0.02903871051967144,
0.08602770417928696,
0.015828615054488182,
-0.06346423923969269,
0.1743442416191101,
-0.09119007736444473,
-0.09868215769529343,
-0.01716403104364872,
0.024255676195025444,
0.02628013677895069,
0.11981078237295151,
-0.1812785416841507,
-0.0218184944242239,
0.01858450286090374,
0.023609621450304985,
0.0328652523458004,
-0.1887000948190689,
0.0012512542307376862,
0.025354541838169098,
-0.08998025208711624,
-0.00031162131926976144,
-0.0008834355394355953,
0.004940717946738005,
0.08432219177484512,
0.00001436694037693087,
-0.08689940720796585,
-0.02714413031935692,
-0.05157559737563133,
-0.08301777392625809,
0.15244176983833313,
-0.09380070120096207,
-0.14794787764549255,
-0.11726126074790955,
-0.009084525518119335,
-0.009873701259493828,
-0.028178304433822632,
0.029159951955080032,
-0.10003190487623215,
-0.034829724580049515,
-0.0673515573143959,
0.006061581429094076,
-0.036820974200963974,
0.002868061186745763,
0.04369853436946869,
0.011783384718000889,
0.09090837091207504,
-0.10619243234395981,
0.015380465425550938,
-0.010339058004319668,
-0.03219359368085861,
-0.0007238106918521225,
0.019206684082746506,
0.07678007334470749,
0.169521763920784,
0.050157755613327026,
0.04929763451218605,
-0.026778694242239,
0.17216818034648895,
-0.13390988111495972,
0.016429923474788666,
0.10998284071683884,
0.014575950801372528,
0.050440024584531784,
0.14978571236133575,
0.0477755181491375,
-0.06019238010048866,
0.0012892753584310412,
0.04429750144481659,
-0.01528983935713768,
-0.23215363919734955,
-0.02742796577513218,
-0.07363506406545639,
-0.03319711983203888,
0.09658600389957428,
0.039696719497442245,
0.018366456031799316,
0.00005204170520300977,
-0.03302760422229767,
-0.011955822817981243,
0.04855704680085182,
0.03782828152179718,
0.0863262265920639,
0.03645562753081322,
0.10352332890033722,
-0.011742858216166496,
-0.02807375229895115,
0.016269933432340622,
0.008991967886686325,
0.22157296538352966,
-0.02892261929810047,
0.18481537699699402,
0.04471772164106369,
0.14871682226657867,
-0.007617776282131672,
0.040159981697797775,
-0.010374471545219421,
-0.0023470676969736814,
0.03612107038497925,
-0.05317104980349541,
-0.0035358748864382505,
0.030346041545271873,
0.08484356105327606,
0.0267690047621727,
-0.08772283792495728,
0.00525940116494894,
0.0221974216401577,
0.34678858518600464,
0.08828438818454742,
-0.26657506823539734,
-0.06031836196780205,
0.013789191842079163,
-0.06590532511472702,
-0.04349708557128906,
0.030301431193947792,
0.11349459737539291,
-0.06751727312803268,
0.08509498834609985,
-0.055385757237672806,
0.089631088078022,
-0.05455131456255913,
0.0053183576092123985,
0.1082429364323616,
0.09633317589759827,
0.018511340022087097,
0.07410018146038055,
-0.26257774233818054,
0.254835307598114,
-0.015897957608103752,
0.07685662806034088,
-0.04195287078619003,
0.05157855898141861,
0.038880135864019394,
-0.01629853993654251,
0.05959901586174965,
-0.000832303601782769,
-0.09697984904050827,
-0.15836863219738007,
-0.08185790479183197,
0.003950163256376982,
0.10021112859249115,
-0.03772158548235893,
0.11108353734016418,
-0.039833784103393555,
-0.04576384648680687,
0.041269540786743164,
-0.0865369662642479,
-0.10907593369483948,
-0.09775959700345993,
0.06185263767838478,
0.02056354098021984,
0.0636260136961937,
-0.09643708914518356,
-0.09491226822137833,
-0.07999344170093536,
0.12985862791538239,
-0.13015925884246826,
-0.037455786019563675,
-0.12824814021587372,
0.0708722248673439,
0.15484873950481415,
-0.0566333569586277,
0.03445250913500786,
0.030499041080474854,
0.1398320198059082,
0.04223266616463661,
-0.022494401782751083,
0.09183944761753082,
-0.07296284288167953,
-0.20923738181591034,
-0.04089600220322609,
0.18153803050518036,
0.02826220728456974,
0.06568118184804916,
-0.019502589479088783,
0.021497994661331177,
-0.0026388452388346195,
-0.06613267213106155,
0.0716826468706131,
0.037770263850688934,
0.010502178221940994,
0.04647652804851532,
-0.032727327197790146,
0.008888958021998405,
-0.07592103630304337,
-0.05215034633874893,
0.0846184641122818,
0.220145583152771,
-0.08303173631429672,
0.0555797703564167,
0.008081483654677868,
-0.07351398468017578,
-0.15801219642162323,
-0.0010347218485549092,
0.1378587931394577,
0.04219023138284683,
-0.02500452846288681,
-0.20940223336219788,
0.010812508873641491,
0.06576083600521088,
-0.02410825714468956,
0.08192180842161179,
-0.3294132351875305,
-0.1315716952085495,
0.08501777052879333,
0.05198867246508598,
-0.051000241190195084,
-0.1573473960161209,
-0.06352255493402481,
-0.017869433388113976,
-0.08000636845827103,
0.01871299371123314,
-0.041631922125816345,
0.10964901000261307,
-0.0066445451229810715,
0.034796323627233505,
0.01714266650378704,
-0.04731770604848862,
0.14553169906139374,
0.004124350845813751,
0.04475286975502968,
-0.012976973317563534,
0.029819203540682793,
0.043952248990535736,
-0.07653908431529999,
0.014153852127492428,
-0.06015513464808464,
0.024939144030213356,
-0.16027921438217163,
-0.013095431961119175,
-0.10926070064306259,
0.013078280724585056,
-0.05229179933667183,
-0.007826797664165497,
-0.009535610675811768,
0.046869225800037384,
0.08563907444477081,
0.01544385775923729,
0.11888004094362259,
-0.06455127894878387,
0.1281200349330902,
0.13030123710632324,
0.11430901288986206,
-0.010863910429179668,
-0.10533503443002701,
-0.011424044147133827,
0.02333919145166874,
0.03489202260971069,
-0.11937589198350906,
0.05193379148840904,
0.14552044868469238,
0.04543711990118027,
0.14267586171627045,
0.049998171627521515,
-0.0969337671995163,
0.0029618770349770784,
0.062000155448913574,
-0.07110091298818588,
-0.12938259541988373,
-0.034990157932043076,
0.03513525426387787,
-0.11431051045656204,
-0.004123305901885033,
0.1107051894068718,
-0.04176819697022438,
-0.0008470499305985868,
0.027545081451535225,
0.05067853257060051,
-0.03916669264435768,
0.2285563349723816,
0.036912526935338974,
0.1065237894654274,
-0.09549364447593689,
0.07007729262113571,
0.047814540565013885,
-0.09729386866092682,
0.039130136370658875,
0.11728198826313019,
-0.042392272502183914,
-0.02201247215270996,
-0.0008455913048237562,
0.08456411212682724,
0.06351599097251892,
-0.06431879103183746,
-0.1316094547510147,
-0.1552603840827942,
0.09188529849052429,
0.07336612045764923,
0.018033722415566444,
0.025630773976445198,
-0.023742949590086937,
0.022174794226884842,
-0.08940200507640839,
0.1129605770111084,
0.08945969492197037,
0.062238290905952454,
-0.139412060379982,
0.09635881334543228,
0.011673462577164173,
0.017853233963251114,
0.000636336684692651,
-0.010540028102695942,
-0.09067437797784805,
0.038146909326314926,
-0.14725326001644135,
-0.0037394315004348755,
-0.03616557642817497,
-0.00370038696564734,
0.003302735975012183,
-0.05375266820192337,
-0.04899568855762482,
0.03789370134472847,
-0.11258723586797714,
-0.03532642871141434,
-0.03256019577383995,
0.06292106956243515,
-0.08820997923612595,
-0.0190016757696867,
0.02447211742401123,
-0.12984181940555573,
0.09083595871925354,
0.038077954202890396,
0.005873409099876881,
0.024842604994773865,
-0.07005634158849716,
-0.01988549903035164,
0.022912079468369484,
0.014208811335265636,
0.039023689925670624,
-0.1712116152048111,
-0.0078035565093159676,
-0.02709650807082653,
0.006975966971367598,
-0.022641487419605255,
-0.02822175808250904,
-0.11285112798213959,
0.002540471265092492,
-0.025897875428199768,
-0.05119723081588745,
-0.049670591950416565,
0.08022627234458923,
0.06847603619098663,
0.02444440871477127,
0.13985668122768402,
-0.07743444293737411,
0.05760384351015091,
-0.23474670946598053,
0.003457912476733327,
-0.006158597767353058,
-0.0680709183216095,
-0.03983532264828682,
-0.01853162981569767,
0.10631440579891205,
-0.06463269889354706,
0.08557843416929245,
-0.011890599504113197,
0.041785117238759995,
0.030606012791395187,
-0.12046115100383759,
0.01270665880292654,
0.06245391443371773,
0.15551944077014923,
0.03516354039311409,
-0.010904740542173386,
0.08454372733831406,
-0.03723059222102165,
0.04435863345861435,
0.14716219902038574,
0.13280686736106873,
0.14590255916118622,
0.07906398177146912,
0.06815560162067413,
0.10393235087394714,
-0.1409611850976944,
-0.12673427164554596,
0.15932334959506989,
-0.07037817686796188,
0.16040784120559692,
-0.038333773612976074,
0.1821133941411972,
0.11010782420635223,
-0.19037151336669922,
0.07529858499765396,
-0.06085701286792755,
-0.09714054316282272,
-0.09247695654630661,
-0.09311293810606003,
-0.06851307302713394,
-0.1820891946554184,
0.015181967988610268,
-0.10262501984834671,
0.06919052451848984,
0.06729584187269211,
0.044964633882045746,
0.022953737527132034,
0.09962145984172821,
0.10422250628471375,
0.0009416003013029695,
0.12058069556951523,
0.015386868268251419,
-0.011890360154211521,
-0.04742356017231941,
-0.08900164067745209,
0.04958676919341087,
-0.0197613388299942,
0.05250335484743118,
-0.03614071384072304,
-0.08736360818147659,
0.044924698770046234,
0.023935561999678612,
-0.09771819412708282,
0.03834432736039162,
-0.025146739557385445,
0.04982453212141991,
0.0750877782702446,
0.03874474763870239,
-0.00459198048338294,
-0.017996832728385925,
0.20223622024059296,
-0.08949066698551178,
-0.063667893409729,
-0.12218151986598969,
0.19631440937519073,
0.0013391656102612615,
-0.011636501178145409,
0.03276378661394119,
-0.07133762538433075,
-0.013250700198113918,
0.1653054803609848,
0.1389540433883667,
-0.016808917745947838,
-0.022754939272999763,
0.0014737537130713463,
-0.006577038671821356,
-0.038521554321050644,
0.07706204801797867,
0.12295152992010117,
0.042744461447000504,
-0.03472135588526726,
-0.022848263382911682,
-0.017477067187428474,
-0.06717660278081894,
-0.03478020429611206,
0.07467587292194366,
0.01679156720638275,
0.0022287066094577312,
-0.022155046463012695,
0.09735467284917831,
-0.057303328067064285,
-0.16520576179027557,
0.02555495873093605,
-0.17313508689403534,
-0.18776936829090118,
-0.0440969280898571,
0.07019203156232834,
0.03092268481850624,
0.0479498952627182,
-0.011268357746303082,
-0.028714671730995178,
0.12208735197782516,
-0.003975246101617813,
-0.02316964417695999,
-0.097613625228405,
0.07585020363330841,
-0.15180958807468414,
0.1696641743183136,
-0.05271312594413757,
0.022608360275626183,
0.11047610640525818,
0.0457429364323616,
-0.0856669545173645,
0.016538793221116066,
0.08794564008712769,
-0.1320473849773407,
0.024862099438905716,
0.2104412317276001,
-0.029698222875595093,
0.11250841617584229,
0.0338912233710289,
-0.10041273385286331,
0.005073286592960358,
-0.06349322199821472,
-0.04541231691837311,
-0.0574926882982254,
-0.007132974453270435,
-0.03225793316960335,
0.13597463071346283,
0.2111182063817978,
-0.06721193343400955,
-0.013973135501146317,
-0.057866036891937256,
-0.0031532160937786102,
0.008734145201742649,
0.1325763463973999,
-0.03591368347406387,
-0.2584979832172394,
0.021345701068639755,
0.005150961223989725,
0.02121759206056595,
-0.18953078985214233,
-0.07972577214241028,
0.04182957857847214,
-0.06567570567131042,
-0.056121163070201874,
0.12528708577156067,
0.059611670672893524,
0.05695352330803871,
-0.04999281466007233,
-0.08612818270921707,
-0.034803926944732666,
0.18193493783473969,
-0.17832013964653015,
-0.04747173562645912
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-xls-r-1b-hy
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the /WORKSPACE/DATA/HY/NOIZY_STUDENT_4/ - NA dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1693
- Wer: 0.2373
- Cer: 0.0429
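
A lower-level inference sketch, assuming the checkpoint ships a standard `Wav2Vec2Processor` and using a hypothetical local 16 kHz recording:

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "arampacha/wav2vec2-xls-r-1b-hy"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

waveform, sr = torchaudio.load("sample_hy.wav")                  # assumed local file
waveform = torchaudio.functional.resample(waveform, sr, 16_000)  # model expects 16 kHz

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted_ids = torch.argmax(logits, dim=-1)    # greedy CTC decoding
print(processor.batch_decode(predicted_ids)[0])
```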
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 64
- seed: 842
- gradient_accumulation_steps: 8
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 5000
- mixed_precision_training: Native AMP
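
For reference, these settings map roughly onto `transformers` `TrainingArguments` as sketched below (a sketch, not the exact training script; the output directory is an assumption):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-1b-hy",    # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=8,        # 16 x 8 = 128 effective batch size
    seed=842,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    max_steps=5000,
    fp16=True,                            # Native AMP
)
```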
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 1.255 | 7.24 | 500 | 0.2978 | 0.4294 | 0.0758 |
| 1.0058 | 14.49 | 1000 | 0.1883 | 0.2838 | 0.0483 |
| 0.9371 | 21.73 | 1500 | 0.1813 | 0.2627 | 0.0457 |
| 0.8999 | 28.98 | 2000 | 0.1693 | 0.2373 | 0.0429 |
| 0.8814 | 36.23 | 2500 | 0.1760 | 0.2420 | 0.0435 |
| 0.8364 | 43.47 | 3000 | 0.1765 | 0.2416 | 0.0419 |
| 0.8019 | 50.72 | 3500 | 0.1758 | 0.2311 | 0.0398 |
| 0.7665 | 57.96 | 4000 | 0.1745 | 0.2240 | 0.0399 |
| 0.7376 | 65.22 | 4500 | 0.1717 | 0.2190 | 0.0385 |
| 0.716 | 72.46 | 5000 | 0.1700 | 0.2147 | 0.0382 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2
- Datasets 1.18.4.dev0
- Tokenizers 0.11.0
|
{"language": ["hy"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "generated_from_trainer", "hf-asr-leaderboard", "hy", "mozilla-foundation/common_voice_8_0", "robust-speech-event"], "datasets": ["common_voice"], "model-index": [{"name": "wav2vec2-xls-r-1b-hy-cv", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice hy-AM", "type": "mozilla-foundation/common_voice_8_0", "args": "hy-AM"}, "metrics": [{"type": "wer", "value": 10.811865729898516, "name": "WER LM"}, {"type": "cer", "value": 2.2205361659079412, "name": "CER LM"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "hy"}, "metrics": [{"type": "wer", "value": 18.219363037089988, "name": "Test WER"}, {"type": "cer", "value": 7.075988867335752, "name": "Test CER"}]}]}]}
|
automatic-speech-recognition
|
arampacha/wav2vec2-xls-r-1b-hy
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"hf-asr-leaderboard",
"hy",
"mozilla-foundation/common_voice_8_0",
"robust-speech-event",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"hy"
] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #hy #mozilla-foundation/common_voice_8_0 #robust-speech-event #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on the /WORKSPACE/DATA/HY/NOIZY\_STUDENT\_4/ - NA dataset.
It achieves the following results on the evaluation set:
* Loss: 0.1693
* Wer: 0.2373
* Cer: 0.0429
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 16
* eval\_batch\_size: 64
* seed: 842
* gradient\_accumulation\_steps: 8
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
* lr\_scheduler\_type: cosine
* lr\_scheduler\_warmup\_ratio: 0.1
* training\_steps: 5000
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2
* Datasets 1.18.4.dev0
* Tokenizers 0.11.0
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 64\n* seed: 842\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 5000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #hy #mozilla-foundation/common_voice_8_0 #robust-speech-event #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 64\n* seed: 842\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 5000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0"
] |
[
105,
160,
4,
36
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #hy #mozilla-foundation/common_voice_8_0 #robust-speech-event #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 64\n* seed: 842\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 5000\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0"
] |
[
-0.1434638351202011,
0.09852688014507294,
-0.004679653327912092,
0.03766827657818794,
0.10030479729175568,
0.016110572963953018,
0.10541713237762451,
0.1671248972415924,
-0.08230260014533997,
0.11081942915916443,
0.06990891695022583,
0.04980700463056564,
0.0996708944439888,
0.13196101784706116,
-0.01826825551688671,
-0.28081831336021423,
0.011402809992432594,
-0.03825055807828903,
-0.1539580225944519,
0.08440636843442917,
0.11002512276172638,
-0.11155758798122406,
0.045309290289878845,
0.02683108299970627,
-0.09073536843061447,
-0.011957494542002678,
-0.038113515824079514,
-0.043131910264492035,
0.08187972754240036,
0.05374014377593994,
0.04626227170228958,
0.048580102622509,
0.08911038935184479,
-0.24880139529705048,
0.011120989918708801,
0.07384567707777023,
0.04197954013943672,
0.057604238390922546,
0.10199425369501114,
-0.0037121016066521406,
0.078047014772892,
-0.08282328397035599,
0.02791704796254635,
0.0762677863240242,
-0.09833908081054688,
-0.2705111801624298,
-0.09680956602096558,
0.026272745802998543,
0.1266476809978485,
0.08074744790792465,
-0.027711302042007446,
0.04227343574166298,
-0.05222892761230469,
0.07582627981901169,
0.22049619257450104,
-0.18935926258563995,
-0.08326633274555206,
-0.0035050564911216497,
0.03760523721575737,
0.04318533465266228,
-0.1133350059390068,
0.006599128246307373,
0.021761804819107056,
0.005254611372947693,
0.07829804718494415,
-0.005702258087694645,
0.013379101641476154,
-0.0031449550297111273,
-0.1156492829322815,
-0.045711301267147064,
0.17417041957378387,
0.07709482312202454,
-0.013435562141239643,
-0.09382922947406769,
-0.015880264341831207,
-0.16266342997550964,
-0.05863003432750702,
0.030344298109412193,
0.03567899018526077,
-0.04282926395535469,
-0.056563764810562134,
0.0019036601297557354,
-0.07002056390047073,
-0.06747056543827057,
0.07389809191226959,
0.09239406883716583,
0.02126440405845642,
-0.02374456077814102,
0.004536068066954613,
0.09163835644721985,
0.051814865320920944,
-0.18338210880756378,
-0.036450013518333435,
0.037586670368909836,
-0.10318466275930405,
-0.011294622905552387,
-0.02279847487807274,
0.02369001880288124,
0.07868406176567078,
0.11103249341249466,
-0.045060545206069946,
0.0941072553396225,
0.00786259863525629,
0.013269761577248573,
-0.07684551924467087,
0.17364713549613953,
-0.06626632064580917,
-0.08847702294588089,
-0.0300239659845829,
0.12416357547044754,
0.0038164034485816956,
-0.016615169122815132,
-0.0826253890991211,
0.03886227682232857,
0.08500289916992188,
0.05581004545092583,
-0.0019252730999141932,
0.01883213222026825,
-0.05709267780184746,
-0.034541551023721695,
0.013841748237609863,
-0.11882998049259186,
0.036015745252370834,
0.06585680693387985,
-0.08331407606601715,
-0.013076207600533962,
-0.022423306480050087,
0.0218166783452034,
-0.03913753107190132,
0.055744536221027374,
-0.06737551838159561,
-0.008341418579220772,
-0.0766061544418335,
-0.0850885733962059,
0.04507589712738991,
-0.04040658473968506,
-0.0099493358284235,
-0.05098805949091911,
-0.09568775445222855,
-0.07539181411266327,
0.04574953392148018,
-0.06536858528852463,
-0.07089191675186157,
-0.07360456138849258,
-0.09163781255483627,
0.05723007768392563,
-0.010401641950011253,
0.16493664681911469,
-0.05956413596868515,
0.06354891508817673,
0.024400748312473297,
0.03834353759884834,
0.07847907394170761,
0.06333330273628235,
-0.013136949390172958,
0.06453061103820801,
-0.14201393723487854,
0.0946984514594078,
-0.12603680789470673,
0.08264070004224777,
-0.1300261914730072,
-0.09890559315681458,
0.007321733515709639,
-0.01022983156144619,
0.08357347548007965,
0.13254497945308685,
-0.14765973389148712,
-0.09486202150583267,
0.1468455195426941,
-0.050932396203279495,
-0.10202368348836899,
0.11703329533338547,
-0.0033069017808884382,
-0.03417785465717316,
0.02400387078523636,
0.1597096472978592,
0.16331882774829865,
-0.09899590164422989,
0.0011724847136065364,
-0.02589328959584236,
0.14495068788528442,
0.047763891518116,
0.0943416953086853,
-0.05341647192835808,
0.06072709709405899,
0.009219036437571049,
-0.030755111947655678,
0.03439590334892273,
-0.07568395137786865,
-0.09141682833433151,
-0.012012904509902,
-0.08272489160299301,
-0.021955057978630066,
0.06123891472816467,
0.019813720136880875,
-0.06959588080644608,
-0.12356552481651306,
-0.024753907695412636,
0.11963925510644913,
-0.09661176800727844,
0.012310127727687359,
-0.07958008348941803,
0.061231911182403564,
0.013898874633014202,
-0.00046087970258668065,
-0.1342993676662445,
-0.040283456444740295,
0.042106665670871735,
-0.05535224452614784,
0.01581438072025776,
-0.013673298060894012,
0.06301459670066833,
0.03925593942403793,
-0.03836717829108238,
-0.054101068526506424,
-0.035899270325899124,
-0.011916205286979675,
-0.038991089910268784,
-0.25081491470336914,
-0.07717915624380112,
-0.012146182358264923,
0.2388271689414978,
-0.19044335186481476,
0.016127362847328186,
0.09166386723518372,
0.14276885986328125,
0.023378191515803337,
-0.03919180855154991,
0.022778887301683426,
0.05255955830216408,
-0.023799818009138107,
-0.08466842025518417,
0.026777105405926704,
0.012045955285429955,
-0.1004621833562851,
0.01990704983472824,
-0.14952035248279572,
0.05367278307676315,
0.08634896576404572,
0.00944034568965435,
-0.04274242743849754,
-0.0633355975151062,
-0.05476773902773857,
-0.053742773830890656,
-0.00823409017175436,
-0.024155043065547943,
0.12652628123760223,
0.004164456389844418,
0.11047512292861938,
-0.08863577246665955,
-0.05752190575003624,
0.05148845538496971,
0.020224185660481453,
-0.0030468616168946028,
0.12217827886343002,
0.042633626610040665,
-0.05147154629230499,
0.09967994689941406,
0.019230924546718597,
-0.070327028632164,
0.1807510256767273,
-0.08834615349769592,
-0.079119972884655,
-0.03414349630475044,
0.02527805231511593,
0.038479872047901154,
0.13406313955783844,
-0.1896083950996399,
-0.028231337666511536,
0.026569988578557968,
0.005680459551513195,
0.023809071630239487,
-0.17405496537685394,
0.013762986287474632,
0.021795984357595444,
-0.08620142191648483,
0.008659778162837029,
-0.003444054163992405,
-0.015791555866599083,
0.07320933043956757,
0.010253294371068478,
-0.05917242914438248,
-0.03847184777259827,
-0.05316125974059105,
-0.08630690723657608,
0.1798337996006012,
-0.11119531840085983,
-0.1485888957977295,
-0.12718795239925385,
-0.0031926450319588184,
-0.036943137645721436,
-0.006736040115356445,
0.026981422677636147,
-0.09138429909944534,
-0.031253840774297714,
-0.061185065656900406,
0.008621650747954845,
-0.03360738232731819,
0.02629556506872177,
0.021870622411370277,
0.020123209804296494,
0.07697562873363495,
-0.10630639642477036,
0.008036289364099503,
0.0016975820763036609,
-0.03576924651861191,
-0.008040225133299828,
0.0017486907308921218,
0.09827319532632828,
0.16970132291316986,
0.06450731307268143,
0.04407515376806259,
-0.03525836393237114,
0.17605702579021454,
-0.15604853630065918,
0.020812280476093292,
0.10568246990442276,
0.028672432526946068,
0.0459270179271698,
0.14956369996070862,
0.045155271887779236,
-0.06100444495677948,
-0.005582957062870264,
0.04123914986848831,
-0.019423341378569603,
-0.22032210230827332,
-0.03494122996926308,
-0.08670110255479813,
-0.008237162604928017,
0.09825461357831955,
0.028203321620821953,
0.05225208401679993,
0.02567727491259575,
-0.05304230749607086,
-0.002608934883028269,
0.05922764167189598,
0.04678105190396309,
0.08269844949245453,
0.05489792302250862,
0.11514323204755783,
-0.008537298999726772,
-0.016878534108400345,
0.03143313527107239,
0.012735358439385891,
0.22803102433681488,
-0.006279184482991695,
0.2247353047132492,
0.04771135747432709,
0.1325940042734146,
-0.014338560402393341,
0.04138569533824921,
0.014153098687529564,
-0.0034992077853530645,
0.03296823427081108,
-0.06941476464271545,
-0.0026311015244573355,
0.042968831956386566,
0.1108015850186348,
0.016884172335267067,
-0.10029599815607071,
0.030773000791668892,
0.03300132974982262,
0.36079490184783936,
0.09202424436807632,
-0.28512564301490784,
-0.07696851342916489,
0.023813001811504364,
-0.04299582168459892,
-0.035669904202222824,
0.01896650344133377,
0.12082715332508087,
-0.07936441898345947,
0.10893824696540833,
-0.04617549479007721,
0.08920278400182724,
-0.04450611770153046,
0.003950899932533503,
0.09065373986959457,
0.10046933591365814,
0.0037815384566783905,
0.0669165775179863,
-0.23868170380592346,
0.2625313103199005,
0.002890021540224552,
0.07091252505779266,
-0.05033383145928383,
0.05905895680189133,
0.0494389533996582,
0.026582401245832443,
0.06733270734548569,
-0.003891187720000744,
-0.10029177367687225,
-0.1537724733352661,
-0.07873241603374481,
0.01439722627401352,
0.12327782064676285,
-0.07132737338542938,
0.11676107347011566,
-0.036855313926935196,
-0.051742199808359146,
0.03423584625124931,
-0.06483597308397293,
-0.10730177164077759,
-0.11175159364938736,
0.06610386818647385,
0.02458869107067585,
0.06669733673334122,
-0.09790560603141785,
-0.08325608819723129,
-0.04112325608730316,
0.13929857313632965,
-0.13136231899261475,
-0.03445257991552353,
-0.1402706354856491,
0.07724297791719437,
0.152301624417305,
-0.06014985963702202,
0.03448672220110893,
0.0022647827863693237,
0.16524925827980042,
0.02824966236948967,
-0.009664393030107021,
0.0900501012802124,
-0.0857432410120964,
-0.22443094849586487,
-0.04519372060894966,
0.18266835808753967,
0.015007728710770607,
0.062485214322805405,
-0.01124507188796997,
0.02309398539364338,
-0.012669484131038189,
-0.07762333750724792,
0.07362078130245209,
0.04120134562253952,
0.0012315536150708795,
0.035769566893577576,
-0.013785927556455135,
0.012512004934251308,
-0.07676780223846436,
-0.044673092663288116,
0.07259073108434677,
0.24437330663204193,
-0.08186271041631699,
0.03496386855840683,
0.03746527433395386,
-0.06602154672145844,
-0.16450433433055878,
0.004831571131944656,
0.13294903934001923,
0.04096704348921776,
-0.059189487248659134,
-0.21057064831256866,
0.030112791806459427,
0.07392032444477081,
-0.03146631643176079,
0.09929317981004715,
-0.2989424467086792,
-0.14224688708782196,
0.07807290554046631,
0.024865029379725456,
-0.051459845155477524,
-0.17944279313087463,
-0.07425934076309204,
-0.04117797687649727,
-0.06240667775273323,
0.036281704902648926,
-0.0712890475988388,
0.11406788229942322,
0.010063519701361656,
0.014138291589915752,
0.014918620698153973,
-0.04068118706345558,
0.1652427762746811,
0.012097838334739208,
0.03412700071930885,
-0.010126697830855846,
0.01768883876502514,
0.0701041892170906,
-0.07735703140497208,
0.003334855893626809,
-0.08553874492645264,
0.02512732893228531,
-0.1433316171169281,
-0.016530003398656845,
-0.09422279894351959,
0.03721780702471733,
-0.04912671074271202,
-0.015842773020267487,
-0.02805027924478054,
0.04747826233506203,
0.07426933199167252,
0.019803833216428757,
0.1334332525730133,
-0.06925112754106522,
0.13984185457229614,
0.1740381419658661,
0.1315242350101471,
-0.03323863819241524,
-0.07709977775812149,
-0.0035302459727972746,
0.0005495482473634183,
0.040030770003795624,
-0.12219162285327911,
0.04532356932759285,
0.13331010937690735,
0.049874551594257355,
0.1366877406835556,
0.045681972056627274,
-0.0902126207947731,
0.009615682065486908,
0.06945938616991043,
-0.06948772817850113,
-0.1515524983406067,
-0.045036040246486664,
0.040039096027612686,
-0.14152999222278595,
0.004935596603900194,
0.13084718585014343,
-0.02698122337460518,
0.004760495387017727,
0.015544173307716846,
0.044956643134355545,
-0.031379468739032745,
0.23764923214912415,
0.03749070316553116,
0.11200064420700073,
-0.1065690740942955,
0.07339151948690414,
0.05478104203939438,
-0.0968947634100914,
0.026192976161837578,
0.116379514336586,
-0.05809444561600685,
-0.030544079840183258,
0.007287926506251097,
0.07816367596387863,
0.022447960451245308,
-0.06569815427064896,
-0.13277524709701538,
-0.16386102139949799,
0.08467555046081543,
0.09850740432739258,
0.024989137426018715,
0.027645554393529892,
-0.0016819180455058813,
0.02956526167690754,
-0.08616651594638824,
0.11867571622133255,
0.06355088204145432,
0.06570510566234589,
-0.12410937249660492,
0.10676801204681396,
0.004241108428686857,
0.014347271993756294,
0.0025695941876620054,
-0.01582970842719078,
-0.09649285674095154,
0.013598832301795483,
-0.14720889925956726,
0.00025976781034842134,
-0.043753042817115784,
-0.0064514377154409885,
0.015594713389873505,
-0.05849772319197655,
-0.0702938660979271,
0.03479684516787529,
-0.11072739213705063,
-0.04850132390856743,
-0.04784074053168297,
0.06986724585294724,
-0.09165284782648087,
-0.015712764114141464,
0.02555128000676632,
-0.12991583347320557,
0.09521900862455368,
0.03190533444285393,
0.017659902572631836,
0.013827579095959663,
-0.056096483021974564,
-0.011455812491476536,
0.020668871700763702,
0.01657341793179512,
0.04536694660782814,
-0.15964284539222717,
-0.014187868684530258,
-0.032010164111852646,
0.01212838850915432,
-0.01963018998503685,
-0.0008759806514717638,
-0.09881283342838287,
0.00673302449285984,
-0.0241148229688406,
-0.04761679098010063,
-0.04976093769073486,
0.08247511833906174,
0.08139174431562424,
0.011696962639689445,
0.1347992867231369,
-0.06559736281633377,
0.050248369574546814,
-0.23703396320343018,
0.010247956030070782,
-0.007157167419791222,
-0.0780760645866394,
-0.05328528955578804,
-0.02809854969382286,
0.1116262823343277,
-0.06808994710445404,
0.07283676415681839,
-0.02545105665922165,
0.08213022351264954,
0.03035188652575016,
-0.0989292562007904,
0.02961554192006588,
0.07540426403284073,
0.15510791540145874,
0.059943169355392456,
-0.012408869341015816,
0.09549017995595932,
-0.021707996726036072,
0.06364184617996216,
0.11983811855316162,
0.15741321444511414,
0.11273334920406342,
0.0755431056022644,
0.0878913551568985,
0.12112260609865189,
-0.14753380417823792,
-0.1215226948261261,
0.1474733203649521,
-0.07234405726194382,
0.15771937370300293,
-0.03765936940908432,
0.14815232157707214,
0.12902148067951202,
-0.2091788649559021,
0.05837518721818924,
-0.06136950105428696,
-0.09226392209529877,
-0.10163760185241699,
-0.07999011129140854,
-0.08497712016105652,
-0.19403214752674103,
-0.00010363636101828888,
-0.11871440708637238,
0.06198754534125328,
0.0525643490254879,
0.04575567692518234,
0.03088219277560711,
0.106958769261837,
0.05434843897819519,
-0.012449057772755623,
0.11997342109680176,
0.008990486152470112,
-0.008280893787741661,
-0.046666115522384644,
-0.1134718731045723,
0.04663864150643349,
-0.03469742834568024,
0.05308575555682182,
-0.0435158871114254,
-0.07928844541311264,
0.05912112072110176,
0.01739160157740116,
-0.10309003293514252,
0.039258699864149094,
-0.029057009145617485,
0.041737187653779984,
0.05854100361466408,
0.02109660394489765,
-0.0016469163820147514,
0.0004610166943166405,
0.196833997964859,
-0.09319163113832474,
-0.06192312389612198,
-0.14387498795986176,
0.18913796544075012,
-0.013205314055085182,
-0.007237720303237438,
0.029548322781920433,
-0.07058010995388031,
-0.025138065218925476,
0.16962023079395294,
0.16564112901687622,
-0.02111954055726528,
-0.01810391992330551,
0.013831461779773235,
-0.010104672983288765,
-0.043801724910736084,
0.07457730174064636,
0.10118354856967926,
0.04203405603766441,
-0.03725631162524223,
-0.011973402462899685,
0.008601353503763676,
-0.0696575716137886,
-0.04409048333764076,
0.08555280417203903,
0.015803994610905647,
-0.0003803216095548123,
-0.007613527588546276,
0.10905198007822037,
-0.055270615965127945,
-0.13974903523921967,
0.05074899271130562,
-0.19225531816482544,
-0.18596018850803375,
-0.045442234724760056,
0.03403913974761963,
0.037332698702812195,
0.06015399843454361,
-0.007886109873652458,
-0.04492468759417534,
0.11819642037153244,
0.001143686124123633,
-0.04388512670993805,
-0.12398932874202728,
0.0927494466304779,
-0.1638180911540985,
0.17508482933044434,
-0.06418107450008392,
0.011279101483523846,
0.12332817167043686,
0.036110568791627884,
-0.08401050418615341,
0.002771839266642928,
0.10419648140668869,
-0.135173499584198,
0.03697434067726135,
0.1944488286972046,
-0.04526545852422714,
0.14022240042686462,
0.044908978044986725,
-0.09268227219581604,
0.0011293701827526093,
-0.06081671640276909,
-0.024140363559126854,
-0.055880531668663025,
-0.010546079836785793,
-0.05231711268424988,
0.12999141216278076,
0.19931696355342865,
-0.07124780118465424,
-0.023435786366462708,
-0.0342617891728878,
0.012451414950191975,
0.033641841262578964,
0.12633919715881348,
-0.05108014866709709,
-0.27580738067626953,
0.021915357559919357,
-0.012527677230536938,
0.02048170380294323,
-0.18165524303913116,
-0.06431867182254791,
0.03163076937198639,
-0.04757586494088173,
-0.0359361469745636,
0.13253267109394073,
0.05928732454776764,
0.02923201210796833,
-0.057196103036403656,
-0.09981603175401688,
-0.03168006241321564,
0.17638635635375977,
-0.17928661406040192,
-0.057442113757133484
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-xls-r-1b-ka
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the /WORKSPACE/DATA/KA/NOIZY_STUDENT_2/ - KA dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1022
- Wer: 0.1527
- Cer: 0.0221
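
Once predictions are collected, the WER/CER above can be recomputed offline with the `evaluate` library; a sketch with placeholder transcripts (the CER metric additionally requires `jiwer`):

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["transcript produced by the model"]  # placeholder
references = ["ground-truth transcript"]            # placeholder

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```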
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
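
The model index lists Common Voice 8.0 (Georgian) as the evaluation set; a loading sketch follows (the dataset is gated on the Hub, so an auth token is needed, and this is not confirmed to be the exact split used):

```python
from datasets import load_dataset

# Common Voice 8.0 is gated: accept the terms on the Hub and pass an auth token.
cv_ka_test = load_dataset(
    "mozilla-foundation/common_voice_8_0", "ka", split="test", use_auth_token=True
)
print(cv_ka_test[0]["sentence"])
```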
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7e-05
- train_batch_size: 16
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 4000
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 1.2839 | 6.45 | 400 | 0.2229 | 0.3609 | 0.0557 |
| 0.9775 | 12.9 | 800 | 0.1271 | 0.2202 | 0.0317 |
| 0.9045 | 19.35 | 1200 | 0.1268 | 0.2030 | 0.0294 |
| 0.8652 | 25.8 | 1600 | 0.1211 | 0.1940 | 0.0287 |
| 0.8505 | 32.26 | 2000 | 0.1192 | 0.1912 | 0.0276 |
| 0.8168 | 38.7 | 2400 | 0.1086 | 0.1763 | 0.0260 |
| 0.7737 | 45.16 | 2800 | 0.1098 | 0.1753 | 0.0256 |
| 0.744 | 51.61 | 3200 | 0.1054 | 0.1646 | 0.0239 |
| 0.7114 | 58.06 | 3600 | 0.1034 | 0.1573 | 0.0228 |
| 0.6773 | 64.51 | 4000 | 0.1022 | 0.1527 | 0.0221 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2
- Datasets 1.18.4.dev0
- Tokenizers 0.11.0
|
{"language": ["ka"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_8_0", "generated_from_trainer", "robust-speech-event", "hf-asr-leaderboard"], "datasets": ["common_voice"], "model-index": [{"name": "wav2vec2-xls-r-1b-ka", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice ka", "type": "mozilla-foundation/common_voice_8_0", "args": "ka"}, "metrics": [{"type": "wer", "value": 7.39778066580026, "name": "WER LM"}, {"type": "cer", "value": 1.1882089427096434, "name": "CER LM"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "ka"}, "metrics": [{"type": "wer", "value": 22.61, "name": "Test WER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Test Data", "type": "speech-recognition-community-v2/eval_data", "args": "ka"}, "metrics": [{"type": "wer", "value": 21.58, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
arampacha/wav2vec2-xls-r-1b-ka
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_8_0",
"generated_from_trainer",
"robust-speech-event",
"hf-asr-leaderboard",
"ka",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"ka"
] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #ka #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
wav2vec2-xls-r-1b-ka
====================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on the /WORKSPACE/DATA/KA/NOIZY\_STUDENT\_2/ - KA dataset.
It achieves the following results on the evaluation set:
* Loss: 0.1022
* Wer: 0.1527
* Cer: 0.0221
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7e-05
* train\_batch\_size: 16
* eval\_batch\_size: 64
* seed: 42
* gradient\_accumulation\_steps: 8
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
* lr\_scheduler\_type: cosine
* lr\_scheduler\_warmup\_ratio: 0.1
* training\_steps: 4000
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2
* Datasets 1.18.4.dev0
* Tokenizers 0.11.0
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 64\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 4000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #ka #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 64\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 4000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0"
] |
[
105,
159,
4,
36
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #ka #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 64\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 4000\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0"
] |
[
-0.13129888474941254,
0.09767179191112518,
-0.004943055100739002,
0.042217664420604706,
0.09875945746898651,
0.017679782584309578,
0.11430191993713379,
0.15731622278690338,
-0.07356996834278107,
0.11729719489812851,
0.07281719893217087,
0.05303691327571869,
0.08814992755651474,
0.13446082174777985,
-0.01632075197994709,
-0.3024098873138428,
0.019482340663671494,
-0.028705276548862457,
-0.14517590403556824,
0.09228334575891495,
0.10089357197284698,
-0.10628414899110794,
0.040006231516599655,
0.016004016622900963,
-0.08970198035240173,
-0.019279006868600845,
-0.029835831373929977,
-0.05807797983288765,
0.086299829185009,
0.046632200479507446,
0.05425049364566803,
0.03878777474164963,
0.09190044552087784,
-0.25113046169281006,
0.011053838767111301,
0.06866590678691864,
0.045098334550857544,
0.06541627645492554,
0.10229918360710144,
0.014986147172749043,
0.08731617778539658,
-0.06972675025463104,
0.04661351814866066,
0.06601416319608688,
-0.10365816205739975,
-0.2620413899421692,
-0.08617112040519714,
0.052533216774463654,
0.12820349633693695,
0.07660509645938873,
-0.026016199961304665,
0.04518193379044533,
-0.04767357185482979,
0.0883309617638588,
0.18547898530960083,
-0.2117059975862503,
-0.08542585372924805,
-0.006830040831118822,
0.04321742057800293,
0.05634019151329994,
-0.11039537936449051,
-0.0025775707326829433,
0.00572313042357564,
0.014963162131607533,
0.077931247651577,
-0.011947275139391422,
0.0023940729442983866,
-0.01905369572341442,
-0.12010326236486435,
-0.04606602340936661,
0.15942473709583282,
0.07656237483024597,
-0.014863919466733932,
-0.10694181174039841,
-0.020565228536725044,
-0.1720321923494339,
-0.05882417783141136,
0.006693657021969557,
0.03173941746354103,
-0.03815920650959015,
-0.05970442667603493,
-0.00300900312140584,
-0.06321320682764053,
-0.073719322681427,
0.06691082566976547,
0.1083504781126976,
0.02545800432562828,
-0.02773427963256836,
0.010849160142242908,
0.08555051684379578,
0.03088279440999031,
-0.17797249555587769,
-0.026702843606472015,
0.036032743752002716,
-0.0972188264131546,
-0.017188651487231255,
-0.013730518519878387,
0.02573230303823948,
0.06030040979385376,
0.12893584370613098,
-0.04540136829018593,
0.09556102007627487,
0.00251385779120028,
0.013446657918393612,
-0.09045001119375229,
0.19647350907325745,
-0.051382776349782944,
-0.09246064722537994,
-0.0330108217895031,
0.13084611296653748,
0.0040629832074046135,
-0.009226208552718163,
-0.08539796620607376,
0.027582233771681786,
0.08041160553693771,
0.05655491352081299,
-0.007295053917914629,
0.028303081169724464,
-0.05155566707253456,
-0.03622468560934067,
0.03736826777458191,
-0.11055392771959305,
0.040140870958566666,
0.05953044444322586,
-0.07050617784261703,
-0.00209529185667634,
-0.010456482879817486,
0.011511930264532566,
-0.03896559774875641,
0.056367162615060806,
-0.060488246381282806,
-0.012168426997959614,
-0.07018956542015076,
-0.08494289964437485,
0.04160119965672493,
-0.05583677068352699,
-0.005626798141747713,
-0.0558261014521122,
-0.09504289925098419,
-0.07143731415271759,
0.03750523179769516,
-0.06122991070151329,
-0.08506113290786743,
-0.06646891683340073,
-0.09669836610555649,
0.058706868439912796,
-0.018377752974629402,
0.14915460348129272,
-0.05541693791747093,
0.06556089222431183,
0.030612073838710785,
0.033388156443834305,
0.0753558874130249,
0.06353280693292618,
-0.019940396770834923,
0.06286950409412384,
-0.17611098289489746,
0.09631611406803131,
-0.11952096223831177,
0.05662520229816437,
-0.13002347946166992,
-0.09600388258695602,
-0.004979385528713465,
-0.0035148842725902796,
0.07695602625608444,
0.12098155170679092,
-0.1429179459810257,
-0.09564471989870071,
0.1557115912437439,
-0.0628344789147377,
-0.09538161009550095,
0.11429047584533691,
0.0008051714976318181,
-0.039403509348630905,
0.02384641207754612,
0.16561686992645264,
0.16821254789829254,
-0.097071573138237,
0.00006296753417700529,
-0.028242675587534904,
0.14075584709644318,
0.05244908854365349,
0.0899721160531044,
-0.05419676750898361,
0.06348136067390442,
0.00876276008784771,
-0.036870408803224564,
0.04395585134625435,
-0.06567629426717758,
-0.07915320992469788,
-0.013187547214329243,
-0.08042997121810913,
-0.011261646635830402,
0.05981522053480148,
0.01865517906844616,
-0.06695656478404999,
-0.13102389872074127,
-0.015693936496973038,
0.11639845371246338,
-0.10002429038286209,
0.01271553710103035,
-0.07559996843338013,
0.051793769001960754,
0.00529644126072526,
0.0020994527731090784,
-0.14800207316875458,
-0.04200204461812973,
0.05111508443951607,
-0.04881450906395912,
0.02867218665778637,
-0.012956229969859123,
0.06971563398838043,
0.04339388757944107,
-0.043984249234199524,
-0.06207269802689552,
-0.037578560411930084,
-0.013063455931842327,
-0.050753120332956314,
-0.23010024428367615,
-0.07443253695964813,
-0.02099839597940445,
0.2083967924118042,
-0.20018212497234344,
0.014902175404131413,
0.06460493057966232,
0.13246223330497742,
0.03383253887295723,
-0.04455010965466499,
0.020706402137875557,
0.06076109781861305,
-0.019682830199599266,
-0.08452722430229187,
0.02386804111301899,
0.005113156046718359,
-0.10092088580131531,
0.02122754603624344,
-0.16754500567913055,
0.059518203139305115,
0.08430551737546921,
0.014250884763896465,
-0.06209452077746391,
-0.05061258375644684,
-0.051325973123311996,
-0.05285773053765297,
0.003512180410325527,
-0.015334432944655418,
0.15776918828487396,
0.015704164281487465,
0.11199602484703064,
-0.0728212222456932,
-0.04421132057905197,
0.04579920694231987,
0.024647003039717674,
0.0024615107104182243,
0.12423622608184814,
0.035394251346588135,
-0.029822085052728653,
0.09993721544742584,
0.036012567579746246,
-0.06283245235681534,
0.15614593029022217,
-0.09079970419406891,
-0.08281189203262329,
-0.03130301460623741,
0.020982010290026665,
0.037194088101387024,
0.1130816861987114,
-0.17169681191444397,
-0.02427798882126808,
0.0258353129029274,
0.013520471751689911,
0.014116368256509304,
-0.16679486632347107,
0.017645299434661865,
0.02840466983616352,
-0.0908006802201271,
0.01743088848888874,
-0.007553095929324627,
-0.008711185306310654,
0.07086759805679321,
0.010772335343062878,
-0.07009384036064148,
-0.0368163026869297,
-0.041600074619054794,
-0.0773264467716217,
0.17538896203041077,
-0.10232561826705933,
-0.15186049044132233,
-0.14009489119052887,
0.003543819533661008,
-0.026789281517267227,
0.0011412021704018116,
0.022523989900946617,
-0.0827789455652237,
-0.03693216294050217,
-0.05968455970287323,
0.024601003155112267,
-0.025112271308898926,
0.021232519298791885,
0.019004719331860542,
0.008317581377923489,
0.0820636972784996,
-0.09770648926496506,
0.013297717086970806,
0.0035286250058561563,
-0.04975372925400734,
-0.012102682143449783,
0.009919775649905205,
0.0898691937327385,
0.1764133870601654,
0.06308655440807343,
0.029915064573287964,
-0.03672587871551514,
0.1661725789308548,
-0.15562348067760468,
0.0241613257676363,
0.12113354355096817,
0.007231971714645624,
0.03737345337867737,
0.14583544433116913,
0.041579995304346085,
-0.05596933886408806,
-0.006234299391508102,
0.03386741504073143,
-0.016500554978847504,
-0.21856963634490967,
-0.034441087394952774,
-0.07546074688434601,
0.007548486348241568,
0.1054760292172432,
0.03095986880362034,
0.05002580210566521,
0.03763556480407715,
-0.041981033980846405,
0.0002913936332333833,
0.06202247738838196,
0.05541323870420456,
0.10832659155130386,
0.04724234715104103,
0.11823362112045288,
-0.009512518532574177,
-0.026859669014811516,
0.02895706705749035,
0.017417730763554573,
0.2077503502368927,
0.011334863491356373,
0.2197204828262329,
0.028864778578281403,
0.11571541428565979,
-0.005564616993069649,
0.04378870129585266,
0.0248404648154974,
-0.004648216534405947,
0.03059861809015274,
-0.06191443279385567,
-0.012971843592822552,
0.034315261989831924,
0.11206190288066864,
0.016765549778938293,
-0.0945238322019577,
0.028255494311451912,
0.02743571065366268,
0.35865384340286255,
0.08329983800649643,
-0.2843531370162964,
-0.09048371016979218,
0.017177129164338112,
-0.0651644915342331,
-0.04432574659585953,
0.028610844165086746,
0.12343405187129974,
-0.08433444052934647,
0.09617340564727783,
-0.05178777873516083,
0.09040408581495285,
-0.05981195345520973,
0.008216011337935925,
0.09668634086847305,
0.09266364574432373,
0.015426256693899632,
0.05285094678401947,
-0.22345547378063202,
0.2565514147281647,
0.000819993088953197,
0.06626058369874954,
-0.04113622382283211,
0.059953413903713226,
0.03889991343021393,
0.017259197309613228,
0.0708559975028038,
-0.0160718634724617,
-0.1099809855222702,
-0.17099054157733917,
-0.09538587927818298,
0.009602389298379421,
0.1281188577413559,
-0.07214841991662979,
0.11927354335784912,
-0.03682776540517807,
-0.05169438570737839,
0.03767198324203491,
-0.09057716280221939,
-0.09829415380954742,
-0.10362157225608826,
0.04921245947480202,
0.03662421181797981,
0.06362573802471161,
-0.08759351819753647,
-0.07920524477958679,
-0.05020708963274956,
0.13339996337890625,
-0.14096271991729736,
-0.024696217849850655,
-0.13594786822795868,
0.06163632497191429,
0.16514664888381958,
-0.06574080139398575,
0.03761792555451393,
0.000634689349681139,
0.15380018949508667,
0.03023158758878708,
-0.009523275308310986,
0.09163954854011536,
-0.08195097744464874,
-0.23089776933193207,
-0.03828607499599457,
0.18404826521873474,
0.011218010447919369,
0.06800809502601624,
-0.01536612119525671,
0.03736620768904686,
0.0005934509099461138,
-0.07425855845212936,
0.06124885380268097,
0.024728672578930855,
-0.0035209055058658123,
0.041122715920209885,
-0.020079413428902626,
0.01388662587851286,
-0.07749390602111816,
-0.027377285063266754,
0.08260273933410645,
0.24439920485019684,
-0.0895756334066391,
0.03226982057094574,
0.03311808779835701,
-0.06547562777996063,
-0.15966400504112244,
-0.00026040725060738623,
0.1391715556383133,
0.03550871089100838,
-0.0379454605281353,
-0.21713261306285858,
0.040170732885599136,
0.07565929740667343,
-0.031577300280332565,
0.10098356008529663,
-0.2851484417915344,
-0.14235632121562958,
0.09592021256685257,
0.022362222895026207,
-0.06976931542158127,
-0.17877674102783203,
-0.07349471002817154,
-0.01995546743273735,
-0.05280435457825661,
0.03542908653616905,
-0.048050377517938614,
0.11917980760335922,
0.005805862136185169,
0.023835230618715286,
0.022970333695411682,
-0.037361450493335724,
0.1546434462070465,
0.011308913119137287,
0.036295969039201736,
-0.009344707243144512,
0.010996920056641102,
0.04294455423951149,
-0.0755813866853714,
0.011304382234811783,
-0.07743534445762634,
0.01967705599963665,
-0.1391608715057373,
-0.02018718793988228,
-0.08984244614839554,
0.026663724333047867,
-0.04140635207295418,
-0.012605306692421436,
-0.021955745294690132,
0.04354586452245712,
0.06014348939061165,
0.019644660875201225,
0.1513846218585968,
-0.06808505207300186,
0.14523471891880035,
0.181778684258461,
0.12384116649627686,
0.012234479188919067,
-0.0888160914182663,
-0.006201294716447592,
-0.00903288647532463,
0.034209705889225006,
-0.11085465550422668,
0.03853752464056015,
0.13378122448921204,
0.036564379930496216,
0.13704514503479004,
0.041384100914001465,
-0.07805662602186203,
0.00972745195031166,
0.06263294070959091,
-0.07222223281860352,
-0.1511644572019577,
-0.03210805729031563,
0.02294315956532955,
-0.1433226466178894,
-0.004548217169940472,
0.13747483491897583,
-0.0333743542432785,
0.0043158158659935,
0.01881638914346695,
0.04094870761036873,
-0.03485117852687836,
0.23027479648590088,
0.03541212156414986,
0.1039995402097702,
-0.09594514966011047,
0.07959787547588348,
0.05821347236633301,
-0.11075425148010254,
0.02717851847410202,
0.11304785311222076,
-0.05471639707684517,
-0.025662198662757874,
0.012613958679139614,
0.06618817895650864,
0.022753233090043068,
-0.06027517840266228,
-0.12024927139282227,
-0.16040411591529846,
0.08049511164426804,
0.10910606384277344,
0.020113861188292503,
0.029311152175068855,
-0.01822855696082115,
0.045872416347265244,
-0.09532160311937332,
0.11641354113817215,
0.06167362630367279,
0.07038489729166031,
-0.136611208319664,
0.1154639944434166,
0.003834022907540202,
0.005149240605533123,
-0.0007179188542068005,
-0.009704750031232834,
-0.09646894782781601,
0.007394568528980017,
-0.1347157210111618,
0.002789281541481614,
-0.03565428406000137,
-0.003352977568283677,
0.014181542210280895,
-0.05562951788306236,
-0.06290063261985779,
0.02607533149421215,
-0.10845208913087845,
-0.044322747737169266,
-0.04414529353380203,
0.06886262446641922,
-0.09211043268442154,
-0.013135330751538277,
0.028542175889015198,
-0.12321408838033676,
0.09674978256225586,
0.03641939535737038,
0.020116591826081276,
0.020634474232792854,
-0.054327305406332016,
0.0009843145962804556,
0.03258697688579559,
0.01788530871272087,
0.03307056054472923,
-0.15258529782295227,
-0.0064018238335847855,
-0.03766198083758354,
0.013056832365691662,
-0.013522481545805931,
0.011460448615252972,
-0.10747180134057999,
0.00887297373265028,
-0.027687812224030495,
-0.0549217127263546,
-0.05709395185112953,
0.07458194345235825,
0.07470912486314774,
0.014580446295440197,
0.13656581938266754,
-0.0774790421128273,
0.05580112338066101,
-0.2383316457271576,
0.013963967561721802,
-0.01139655988663435,
-0.07958357781171799,
-0.05051323026418686,
-0.026326345279812813,
0.10702057182788849,
-0.06168345361948013,
0.07328952103853226,
-0.033765457570552826,
0.08575385063886642,
0.029094692319631577,
-0.10925078392028809,
0.020817499607801437,
0.0703076720237732,
0.14777134358882904,
0.06912942975759506,
-0.024989858269691467,
0.09122531116008759,
-0.017456278204917908,
0.05808861553668976,
0.1130296066403389,
0.15155398845672607,
0.12082257866859436,
0.04107270389795303,
0.08852757513523102,
0.11858086287975311,
-0.13609729707241058,
-0.1166759580373764,
0.1271669715642929,
-0.07652854174375534,
0.148737370967865,
-0.0395379438996315,
0.16727939248085022,
0.10553845018148422,
-0.2049868255853653,
0.05884387344121933,
-0.05941024422645569,
-0.08835005015134811,
-0.11063840985298157,
-0.06975884735584259,
-0.08402490615844727,
-0.18212401866912842,
0.0030268076807260513,
-0.11612409353256226,
0.05657380446791649,
0.05109606683254242,
0.04021759703755379,
0.031275853514671326,
0.11788170039653778,
0.03494344279170036,
-0.0013667717576026917,
0.12327571958303452,
0.01796291582286358,
-0.0008159377030096948,
-0.056478019803762436,
-0.10352711379528046,
0.04655372351408005,
-0.02449083887040615,
0.05905606225132942,
-0.043091095983982086,
-0.08112110942602158,
0.05770864710211754,
0.008682183921337128,
-0.10041540861129761,
0.03814826160669327,
-0.017721038311719894,
0.0419185534119606,
0.06886427104473114,
0.016700826585292816,
-0.010591046884655952,
-0.008834383450448513,
0.20257523655891418,
-0.09792176634073257,
-0.05943385511636734,
-0.1378217190504074,
0.20163114368915558,
-0.02052575722336769,
-0.006775329355150461,
0.021167490631341934,
-0.06040744483470917,
-0.02692422643303871,
0.16119232773780823,
0.16097718477249146,
-0.023919282481074333,
-0.018817780539393425,
0.02149740606546402,
-0.008879286237061024,
-0.039251163601875305,
0.07794659584760666,
0.10630548000335693,
0.05089486017823219,
-0.03261744603514671,
-0.017113232985138893,
0.016675826162099838,
-0.06539091467857361,
-0.03644714504480362,
0.07310140132904053,
0.012423193082213402,
0.0024249020498245955,
-0.017333976924419403,
0.1044156551361084,
-0.06870271265506744,
-0.13032075762748718,
0.04678400605916977,
-0.19442994892597198,
-0.1849101185798645,
-0.04188847914338112,
0.03268332779407501,
0.037918251007795334,
0.05565668269991875,
-0.002755343448370695,
-0.050790559500455856,
0.1101423054933548,
-0.004503464791923761,
-0.03566692769527435,
-0.10950299352407455,
0.08307821303606033,
-0.14373637735843658,
0.17663776874542236,
-0.0563928447663784,
0.021604569628834724,
0.11459950357675552,
0.036405887454748154,
-0.08497681468725204,
0.01136641763150692,
0.09403260052204132,
-0.13921506702899933,
0.03559302166104317,
0.19535674154758453,
-0.04748355969786644,
0.14244265854358673,
0.04782099276781082,
-0.1008741706609726,
0.0029635753016918898,
-0.06932801008224487,
-0.04316646233201027,
-0.05801524966955185,
-0.01963280327618122,
-0.04843268170952797,
0.14034415781497955,
0.20861627161502838,
-0.06616077572107315,
-0.028698431327939034,
-0.03419353812932968,
0.011134025640785694,
0.04354235157370567,
0.1319754421710968,
-0.05169886723160744,
-0.2637244462966919,
0.015589863993227482,
-0.016198256984353065,
0.01937495358288288,
-0.19157595932483673,
-0.07309085875749588,
0.024242043495178223,
-0.04273870214819908,
-0.04913197457790375,
0.12722118198871613,
0.046633824706077576,
0.03672968968749046,
-0.05561384931206703,
-0.09549032896757126,
-0.024593481793999672,
0.1842077225446701,
-0.1904916763305664,
-0.05833807215094566
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-xls-r-1b-uk-cv
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - UK dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1747
- Wer: 0.2107
- Cer: 0.0408
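
The WER and CER figures above can be reproduced in principle with the `evaluate` library; the snippet below is a generic sketch with placeholder strings, not the project's actual evaluation code.

```python
# Generic sketch of computing WER and CER with the `evaluate` library.
# The reference and prediction lists are placeholders, not project data.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["приклад речення для перевірки"]
predictions = ["приклад речення для перевірки"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```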
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 8e-05
- train_batch_size: 16
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 8000
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 1.3719 | 4.35 | 500 | 0.3389 | 0.4236 | 0.0833 |
| 1.1361 | 8.7 | 1000 | 0.2309 | 0.3162 | 0.0630 |
| 1.0517 | 13.04 | 1500 | 0.2166 | 0.3056 | 0.0597 |
| 1.0118 | 17.39 | 2000 | 0.2141 | 0.2784 | 0.0557 |
| 0.9922 | 21.74 | 2500 | 0.2231 | 0.2941 | 0.0594 |
| 0.9929 | 26.09 | 3000 | 0.2171 | 0.2892 | 0.0587 |
| 0.9485 | 30.43 | 3500 | 0.2236 | 0.2956 | 0.0599 |
| 0.9573 | 34.78 | 4000 | 0.2314 | 0.3043 | 0.0616 |
| 0.9195 | 39.13 | 4500 | 0.2169 | 0.2812 | 0.0580 |
| 0.8915 | 43.48 | 5000 | 0.2109 | 0.2780 | 0.0560 |
| 0.8449 | 47.83 | 5500 | 0.2050 | 0.2534 | 0.0514 |
| 0.8028 | 52.17 | 6000 | 0.2032 | 0.2456 | 0.0492 |
| 0.7881 | 56.52 | 6500 | 0.1890 | 0.2380 | 0.0469 |
| 0.7423 | 60.87 | 7000 | 0.1816 | 0.2245 | 0.0442 |
| 0.7248 | 65.22 | 7500 | 0.1789 | 0.2165 | 0.0422 |
| 0.6993 | 69.57 | 8000 | 0.1747 | 0.2107 | 0.0408 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2.dev0
- Tokenizers 0.11.0
|
{"language": ["uk"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "generated_from_trainer", "hf-asr-leaderboard", "mozilla-foundation/common_voice_8_0", "robust-speech-event"], "datasets": ["mozilla-foundation/common_voice_8_0"], "model-index": [{"name": "wav2vec2-xls-r-1b-hy-cv", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice uk", "type": "mozilla-foundation/common_voice_8_0", "args": "uk"}, "metrics": [{"type": "wer", "value": 12.246920571994902, "name": "WER LM"}, {"type": "cer", "value": 2.513653497966816, "name": "CER LM"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "uk"}, "metrics": [{"type": "wer", "value": 46.56, "name": "Test WER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Test Data", "type": "speech-recognition-community-v2/eval_data", "args": "uk"}, "metrics": [{"type": "wer", "value": 35.98, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
arampacha/wav2vec2-xls-r-1b-uk-cv
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"hf-asr-leaderboard",
"mozilla-foundation/common_voice_8_0",
"robust-speech-event",
"uk",
"dataset:mozilla-foundation/common_voice_8_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"uk"
] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #uk #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us
|
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on the MOZILLA-FOUNDATION/COMMON\_VOICE\_8\_0 - UK dataset.
It achieves the following results on the evaluation set:
* Loss: 0.1747
* Wer: 0.2107
* Cer: 0.0408
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 8e-05
* train\_batch\_size: 16
* eval\_batch\_size: 64
* seed: 42
* gradient\_accumulation\_steps: 8
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
* lr\_scheduler\_type: cosine
* lr\_scheduler\_warmup\_ratio: 0.1
* training\_steps: 8000
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2+cu102
* Datasets 1.18.2.dev0
* Tokenizers 0.11.0
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 8e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 64\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 8000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #uk #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 8e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 64\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 8000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0"
] |
[
119,
159,
4,
39
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #uk #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 8e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 64\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 8000\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0"
] |
[
-0.11287152767181396,
0.08894600719213486,
-0.006536797620356083,
0.042657461017370224,
0.10231440514326096,
0.025525910779833794,
0.10657408833503723,
0.1442563384771347,
-0.07836417108774185,
0.10270284861326218,
0.06819736957550049,
0.06527378410100937,
0.08784760534763336,
0.07675649225711823,
-0.011193470098078251,
-0.2899024784564972,
0.015107767656445503,
-0.028182953596115112,
-0.14307674765586853,
0.09193523228168488,
0.10955445468425751,
-0.1025102436542511,
0.020584257319569588,
0.022700130939483643,
-0.08703663945198059,
0.0017928642919287086,
-0.04007508233189583,
-0.04427443817257881,
0.09560501575469971,
0.05303272604942322,
0.06831750273704529,
0.05037649720907211,
0.09629832208156586,
-0.26409992575645447,
0.00935029610991478,
0.0696716159582138,
0.026458963751792908,
0.06107846274971962,
0.10647746920585632,
-0.0000790173071436584,
0.09299544245004654,
-0.05020768567919731,
0.05091800168156624,
0.04768363758921623,
-0.10957830399274826,
-0.26694580912590027,
-0.08605864644050598,
0.02589433081448078,
0.12773185968399048,
0.08316586166620255,
-0.03031160496175289,
0.015541134402155876,
-0.08851604163646698,
0.07317305356264114,
0.2334606796503067,
-0.1931820511817932,
-0.07353422790765762,
-0.013207163661718369,
0.051532451063394547,
0.03520398959517479,
-0.10920580476522446,
-0.022988172248005867,
0.019995208829641342,
0.009153836406767368,
0.07688011229038239,
0.028168020769953728,
0.011917044408619404,
0.003924323711544275,
-0.1338149905204773,
-0.05536581948399544,
0.11868671327829361,
0.06443832069635391,
-0.00502421148121357,
-0.08711039274930954,
-0.02594166062772274,
-0.16426071524620056,
-0.038281504064798355,
0.021507252007722855,
0.028865298256278038,
-0.022435162216424942,
-0.021622847765684128,
0.01958080381155014,
-0.06081537902355194,
-0.07347918301820755,
0.05735982954502106,
0.12248855829238892,
0.04439390078186989,
-0.02732580527663231,
-0.008008787408471107,
0.10921720415353775,
0.08003146201372147,
-0.18355096876621246,
-0.02192297950387001,
0.04900839924812317,
-0.1049000546336174,
-0.012273651547729969,
-0.026048870757222176,
0.029774393886327744,
0.03607166185975075,
0.13461242616176605,
-0.015399415045976639,
0.07359331846237183,
0.004461138509213924,
0.03256994113326073,
-0.08035562932491302,
0.1641128659248352,
-0.06384850293397903,
-0.054030776023864746,
-0.04777761921286583,
0.131422221660614,
-0.01954086869955063,
-0.0169222392141819,
-0.06757135689258575,
0.015668949112296104,
0.07329759001731873,
0.046910904347896576,
-0.007078018505126238,
0.01783212088048458,
-0.07448935508728027,
-0.012727036140859127,
-0.0011811290169134736,
-0.12955839931964874,
0.03691355139017105,
0.06689321249723434,
-0.08368780463933945,
0.0032491108868271112,
-0.014894641935825348,
0.025137290358543396,
-0.029035281389951706,
0.08659734576940536,
-0.04379316046833992,
-0.008360516279935837,
-0.0897396057844162,
-0.07978666573762894,
0.04720386117696762,
-0.04936220124363899,
0.011330364271998405,
-0.04615653306245804,
-0.10707700252532959,
-0.06855998188257217,
0.05542941763997078,
-0.06904811412096024,
-0.06700093299150467,
-0.0714094415307045,
-0.09222537279129028,
0.055768534541130066,
-0.02519247867166996,
0.18013188242912292,
-0.0457204133272171,
0.0805160328745842,
0.022275423631072044,
0.05036485567688942,
0.10802292078733444,
0.0686998963356018,
-0.01227409578859806,
0.055124372243881226,
-0.11280693858861923,
0.09255573153495789,
-0.10143136233091354,
0.04101121425628662,
-0.1174754798412323,
-0.09279080480337143,
-0.014597885310649872,
0.004281016532331705,
0.09528008848428726,
0.11168623715639114,
-0.1675700694322586,
-0.08960927277803421,
0.14758117496967316,
-0.062211230397224426,
-0.0832529366016388,
0.13014008104801178,
-0.018681231886148453,
-0.03956806659698486,
0.012832853943109512,
0.14101098477840424,
0.13037413358688354,
-0.08577407151460648,
0.014198141172528267,
-0.056944429874420166,
0.12422258406877518,
0.07815810292959213,
0.1047946959733963,
-0.0335928276181221,
0.07016990333795547,
-0.006582578178495169,
-0.043319255113601685,
0.048466507345438004,
-0.08008521050214767,
-0.08672147989273071,
-0.006775887217372656,
-0.06312176585197449,
0.014693201519548893,
0.06279847025871277,
0.013181845657527447,
-0.07223435491323471,
-0.1433374434709549,
0.006194457411766052,
0.10315112769603729,
-0.09559247642755508,
0.023857243359088898,
-0.08523081243038177,
0.06084732711315155,
0.008320416323840618,
-0.0068380264565348625,
-0.1457684487104416,
-0.0288636963814497,
0.02523934468626976,
-0.04396553337574005,
-0.003967066295444965,
0.019643116742372513,
0.06476763635873795,
0.04877566546201706,
-0.03399733826518059,
-0.05854678153991699,
-0.0743594691157341,
-0.012307877652347088,
-0.04972197487950325,
-0.23852364718914032,
-0.07704376429319382,
-0.025800462812185287,
0.1660250425338745,
-0.2030135542154312,
0.015457541681826115,
0.060924921184778214,
0.14971494674682617,
0.016490604728460312,
-0.03683633357286453,
-0.011427660472691059,
0.048545658588409424,
-0.030436955392360687,
-0.07106593251228333,
0.01670183055102825,
-0.00258255610242486,
-0.08404110372066498,
0.0338774174451828,
-0.13495856523513794,
0.07893956452608109,
0.08899933099746704,
-0.01365626510232687,
-0.04941895976662636,
-0.026412278413772583,
-0.06095946580171585,
-0.04767797514796257,
-0.022707071155309677,
-0.0236363522708416,
0.18348947167396545,
0.005424567963927984,
0.10853227227926254,
-0.0861620306968689,
-0.07078523188829422,
0.03253060579299927,
0.01637374609708786,
-0.008897594176232815,
0.14408975839614868,
0.05781105160713196,
-0.02539590187370777,
0.09439667314291,
0.046171873807907104,
-0.059984996914863586,
0.1632576286792755,
-0.07539855688810349,
-0.09243687987327576,
-0.03385305404663086,
0.008174081332981586,
0.024271728470921516,
0.11486544460058212,
-0.18078218400478363,
-0.014177118428051472,
0.020701052621006966,
0.023932909592986107,
0.03221287205815315,
-0.19415389001369476,
-0.009905043989419937,
0.0521482452750206,
-0.0829290896654129,
-0.01630781963467598,
-0.0050524878315627575,
-0.0090687470510602,
0.08624883741140366,
0.006527132820338011,
-0.08845407515764236,
-0.02948857843875885,
-0.053416043519973755,
-0.09157425910234451,
0.1562768667936325,
-0.09006337076425552,
-0.1140102669596672,
-0.11906467378139496,
-0.018768135458230972,
0.0177470613270998,
-0.008534570224583149,
0.02981271781027317,
-0.10540831834077835,
-0.034901510924100876,
-0.06234363466501236,
0.029342398047447205,
-0.05100695416331291,
0.013492568396031857,
0.05474121868610382,
0.00968250073492527,
0.05075869336724281,
-0.09410136193037033,
0.018816178664565086,
-0.017272505909204483,
-0.029952272772789,
-0.006759701296687126,
0.005423002410680056,
0.07699637860059738,
0.17008042335510254,
0.04019561782479286,
0.03792627528309822,
-0.028350748121738434,
0.1706862449645996,
-0.14050514996051788,
0.00813156645745039,
0.1255398988723755,
-0.0013969031861051917,
0.042392753064632416,
0.14509320259094238,
0.04420551657676697,
-0.07032604515552521,
0.008757397532463074,
0.028905827552080154,
-0.021825382485985756,
-0.2548688054084778,
-0.01444312185049057,
-0.07434731721878052,
-0.03658831864595413,
0.0906480923295021,
0.028804093599319458,
0.017198674380779266,
0.020902520045638084,
-0.03841761499643326,
-0.020505350083112717,
0.07097113877534866,
0.03902328386902809,
0.059709660708904266,
0.04160449281334877,
0.10367313772439957,
-0.006397439166903496,
-0.027688050642609596,
0.013749039731919765,
-0.0014035613276064396,
0.21792949736118317,
-0.001842616475187242,
0.17026446759700775,
0.04610365629196167,
0.12595431506633759,
-0.0058830357156693935,
0.04358592629432678,
0.0014714142307639122,
-0.00593223050236702,
0.02136571891605854,
-0.044898416846990585,
-0.013641615398228168,
0.03735104575753212,
0.10002047568559647,
0.015060081146657467,
-0.10292764008045197,
0.014544935896992683,
0.01577574573457241,
0.3247906565666199,
0.07789727300405502,
-0.27113065123558044,
-0.06008739396929741,
0.017869727686047554,
-0.056266531348228455,
-0.024425411596894264,
0.028372541069984436,
0.12065885215997696,
-0.06355762481689453,
0.06665406376123428,
-0.040296804159879684,
0.09210743755102158,
-0.04494309425354004,
0.0018669688142836094,
0.10011305660009384,
0.08502213656902313,
0.013869604095816612,
0.06383811682462692,
-0.245321124792099,
0.2603475749492645,
-0.01960901729762554,
0.07490123808383942,
-0.043755967170000076,
0.054129138588905334,
0.02773752063512802,
-0.01265022438019514,
0.0663706511259079,
-0.00853618886321783,
-0.12505567073822021,
-0.15205518901348114,
-0.08077029883861542,
0.009694200940430164,
0.12174808233976364,
-0.08267221599817276,
0.1140604168176651,
-0.04017222672700882,
-0.03450744226574898,
0.05325978621840477,
-0.05005881190299988,
-0.1030086800456047,
-0.10954941064119339,
0.04637087509036064,
0.0028718477115035057,
0.0630594789981842,
-0.094316765666008,
-0.10201065987348557,
-0.07892510294914246,
0.1355145275592804,
-0.13699579238891602,
-0.015183670446276665,
-0.13702431321144104,
0.07528803497552872,
0.1542586386203766,
-0.05539511889219284,
0.025881575420498848,
0.03578239306807518,
0.15546497702598572,
0.018306776881217957,
-0.007581531535834074,
0.10234194993972778,
-0.08450474590063095,
-0.22610534727573395,
-0.06074434146285057,
0.17277109622955322,
0.04033093899488449,
0.07187379896640778,
-0.00983620248734951,
0.028119541704654694,
-0.013530935160815716,
-0.06101804971694946,
0.07976505160331726,
0.05434694141149521,
0.01193958055227995,
0.03731699660420418,
-0.02699636109173298,
0.019715825095772743,
-0.07033241540193558,
-0.0564706064760685,
0.08265243470668793,
0.24675238132476807,
-0.08206326514482498,
0.039990875869989395,
0.017967713996767998,
-0.07660553604364395,
-0.15071649849414825,
-0.0012739298399537802,
0.13561750948429108,
0.05123412609100342,
-0.02797551266849041,
-0.20235282182693481,
0.00751950778067112,
0.06650526076555252,
-0.02177593670785427,
0.052578963339328766,
-0.328224241733551,
-0.13257361948490143,
0.07639885693788528,
0.04844626039266586,
-0.030922455713152885,
-0.15017327666282654,
-0.052124470472335815,
-0.020719071850180626,
-0.09827467054128647,
0.03850284963846207,
-0.03278606757521629,
0.10365717858076096,
0.007229195442050695,
0.008503149263560772,
0.015230952762067318,
-0.049126431345939636,
0.16053950786590576,
0.0209296103566885,
0.042196642607450485,
-0.010152042843401432,
0.012847699224948883,
0.0394308939576149,
-0.06319998949766159,
-0.010245164856314659,
-0.05899611860513687,
0.019015124067664146,
-0.1469191014766693,
-0.03264915198087692,
-0.08870567381381989,
0.00487403804436326,
-0.05898769199848175,
-0.012916415929794312,
-0.022680196911096573,
0.047714147716760635,
0.09239652752876282,
0.024938320741057396,
0.10929714888334274,
-0.06485331058502197,
0.1294662207365036,
0.12447339296340942,
0.12463875114917755,
0.025504138320684433,
-0.08877932280302048,
-0.006615190766751766,
0.018139801919460297,
0.04402440786361694,
-0.12303481996059418,
0.040606774389743805,
0.14650419354438782,
0.0373694933950901,
0.1381808966398239,
0.062452636659145355,
-0.09625744074583054,
0.008672085590660572,
0.06255503743886948,
-0.05969646945595741,
-0.12822748720645905,
-0.021361878141760826,
0.03928050398826599,
-0.11756111681461334,
-0.029763324186205864,
0.11241963505744934,
-0.03844379261136055,
0.0020789781119674444,
0.016517186537384987,
0.044596899300813675,
-0.036139413714408875,
0.22334875166416168,
0.028137877583503723,
0.1017051562666893,
-0.08136850595474243,
0.06911574304103851,
0.053912680596113205,
-0.093205526471138,
0.027235737070441246,
0.09327203780412674,
-0.03795073553919792,
-0.02485341764986515,
0.009621462784707546,
0.07343299686908722,
0.03209810331463814,
-0.054821234196424484,
-0.12840473651885986,
-0.15814964473247528,
0.0949237123131752,
0.08484403789043427,
0.021974844858050346,
0.034626543521881104,
-0.012091397307813168,
0.04137737676501274,
-0.08113586902618408,
0.10391887277364731,
0.08328497409820557,
0.0625900998711586,
-0.10772296786308289,
0.10631298273801804,
0.00982136931270361,
0.005730051547288895,
0.0012782472185790539,
0.0019514750456437469,
-0.08689390867948532,
0.03498828783631325,
-0.11881466209888458,
-0.003909815102815628,
-0.028197292238473892,
-0.01249607466161251,
0.0068596405908465385,
-0.07140714675188065,
-0.05761098861694336,
0.023093758150935173,
-0.10989580303430557,
-0.03114202246069908,
-0.03756024315953255,
0.06023327261209488,
-0.09505403786897659,
-0.02371894009411335,
0.04617646709084511,
-0.1370040774345398,
0.09662521630525589,
0.031739626079797745,
0.023536693304777145,
0.01805504970252514,
-0.08914372324943542,
-0.010624686256051064,
0.011430392973124981,
0.021807760000228882,
0.02948462776839733,
-0.17049062252044678,
-0.002431808738037944,
-0.03420232981443405,
0.003064769087359309,
-0.02152320370078087,
-0.02552221156656742,
-0.10801083594560623,
0.018902838230133057,
-0.02272649109363556,
-0.07423096150159836,
-0.04193410277366638,
0.06098880246281624,
0.07323160767555237,
0.019003190100193024,
0.14183196425437927,
-0.06639817357063293,
0.07842059433460236,
-0.23466233909130096,
0.014431154355406761,
0.0015986555954441428,
-0.05039623752236366,
-0.033195704221725464,
-0.021334059536457062,
0.09782706946134567,
-0.07256706058979034,
0.08471453189849854,
-0.015552903525531292,
0.03671837970614433,
0.035666290670633316,
-0.11842844635248184,
0.03850226849317551,
0.07467195391654968,
0.14681895077228546,
0.058510128408670425,
-0.012740901671350002,
0.07073140889406204,
-0.038931284099817276,
0.05765845254063606,
0.10087795555591583,
0.14905992150306702,
0.14805404841899872,
0.10041027516126633,
0.07857590168714523,
0.11273916065692902,
-0.1447608470916748,
-0.1214446872472763,
0.15019161999225616,
-0.06563443690538406,
0.11965537071228027,
-0.026391467079520226,
0.18031200766563416,
0.13177403807640076,
-0.2029045671224594,
0.06141616404056549,
-0.04688519984483719,
-0.07843106985092163,
-0.10725563764572144,
-0.07496929913759232,
-0.07397545129060745,
-0.16227415204048157,
0.024114491418004036,
-0.10726961493492126,
0.06243327260017395,
0.0775551125407219,
0.045611388981342316,
0.03338787332177162,
0.07500968128442764,
0.08519138395786285,
0.006016593426465988,
0.10076771676540375,
0.017046473920345306,
0.004302981309592724,
-0.06559208035469055,
-0.10735004395246506,
0.0529712475836277,
-0.019710009917616844,
0.0512053444981575,
-0.03753818944096565,
-0.08696483820676804,
0.04761703684926033,
0.030086472630500793,
-0.09846193343400955,
0.037039775401353836,
-0.032842960208654404,
0.05907592549920082,
0.05602419748902321,
0.03112044744193554,
0.01022290624678135,
-0.008426395244896412,
0.1933576911687851,
-0.07165227830410004,
-0.0460604652762413,
-0.12740597128868103,
0.16449393332004547,
0.009784102439880371,
0.014393601566553116,
0.03557032719254494,
-0.0679916962981224,
-0.026934856548905373,
0.1503354012966156,
0.1370329111814499,
-0.03589024394750595,
-0.021596696227788925,
0.030875306576490402,
-0.004777638241648674,
-0.026261739432811737,
0.06159253790974617,
0.12269667536020279,
0.04976048320531845,
-0.046840254217386246,
-0.03293999657034874,
-0.02070126123726368,
-0.055088091641664505,
-0.010232338681817055,
0.08344356715679169,
0.011408013291656971,
-0.001587252481840551,
-0.025584474205970764,
0.09962217509746552,
-0.05350901186466217,
-0.15616366267204285,
0.047861989587545395,
-0.18407487869262695,
-0.1843661367893219,
-0.03508878871798515,
0.06600753962993622,
0.029931310564279556,
0.06977608799934387,
-0.004374620504677296,
-0.059799984097480774,
0.10303271561861038,
0.0058956146240234375,
-0.03530766814947128,
-0.08307377249002457,
0.07812289148569107,
-0.12617138028144836,
0.19338944554328918,
-0.05453641340136528,
0.03821592777967453,
0.11409658938646317,
0.05107336491346359,
-0.07970772683620453,
0.00004009394979220815,
0.09078706055879593,
-0.11876244843006134,
0.02646598219871521,
0.1911218762397766,
-0.04257607087492943,
0.12052428722381592,
0.0589265376329422,
-0.1021142527461052,
0.025078440085053444,
-0.06091391295194626,
-0.04743087664246559,
-0.05546562746167183,
-0.00009923785546561703,
-0.038864683359861374,
0.12850545346736908,
0.20355959236621857,
-0.06402895599603653,
-0.018707316368818283,
-0.03206009417772293,
0.009103575721383095,
0.011186896823346615,
0.15491150319576263,
-0.032283682376146317,
-0.2555520236492157,
0.02048519253730774,
-0.009798398241400719,
0.016063669696450233,
-0.17817926406860352,
-0.09017035365104675,
0.03932870179414749,
-0.0673617273569107,
-0.05996786803007126,
0.10826525092124939,
0.07349609583616257,
0.049592792987823486,
-0.05991261824965477,
-0.111496701836586,
-0.029404090717434883,
0.17511415481567383,
-0.16959112882614136,
-0.053162820637226105
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-xls-r-1b-uk
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the /WORKSPACE/DATA/UK/COMPOSED_DATASET/ - NA dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1092
- Wer: 0.1752
- Cer: 0.0323
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 12000
- mixed_precision_training: Native AMP
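
With `lr_scheduler_warmup_ratio: 0.1` over 12000 training steps, warmup covers roughly the first 1200 optimizer steps. The sketch below shows how such a cosine-with-warmup schedule could be constructed on its own; the dummy parameter and optimizer exist only to illustrate the scheduler and are not part of the original setup.

```python
# Illustrative sketch of the cosine schedule with 10% warmup implied above.
# The single dummy parameter is only there so an optimizer can be built.
import torch
from transformers import get_cosine_schedule_with_warmup

dummy_param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.Adam([dummy_param], lr=5e-5, betas=(0.9, 0.98), eps=1e-8)

total_steps = 12000
scheduler = get_cosine_schedule_with_warmup(
    optimizer,
    num_warmup_steps=int(0.1 * total_steps),  # 1200 warmup steps
    num_training_steps=total_steps,
)
```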
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|
| 1.7005 | 1.61 | 500 | 0.4082 | 0.5584 | 0.1164 |
| 1.1555 | 3.22 | 1000 | 0.2020 | 0.2953 | 0.0557 |
| 1.0927 | 4.82 | 1500 | 0.1708 | 0.2584 | 0.0480 |
| 1.0707 | 6.43 | 2000 | 0.1563 | 0.2405 | 0.0450 |
| 1.0728 | 8.04 | 2500 | 0.1620 | 0.2442 | 0.0463 |
| 1.0268 | 9.65 | 3000 | 0.1588 | 0.2378 | 0.0458 |
| 1.0328 | 11.25 | 3500 | 0.1466 | 0.2352 | 0.0442 |
| 1.0249 | 12.86 | 4000 | 0.1552 | 0.2341 | 0.0449 |
| 1.016 | 14.47 | 4500 | 0.1602 | 0.2435 | 0.0473 |
| 1.0164 | 16.08 | 5000 | 0.1491 | 0.2337 | 0.0444 |
| 0.9935 | 17.68 | 5500 | 0.1539 | 0.2373 | 0.0458 |
| 0.9626 | 19.29 | 6000 | 0.1458 | 0.2305 | 0.0434 |
| 0.9505 | 20.9 | 6500 | 0.1368 | 0.2157 | 0.0407 |
| 0.9389 | 22.51 | 7000 | 0.1437 | 0.2231 | 0.0426 |
| 0.9129 | 24.12 | 7500 | 0.1313 | 0.2076 | 0.0394 |
| 0.9118 | 25.72 | 8000 | 0.1292 | 0.2040 | 0.0384 |
| 0.8848 | 27.33 | 8500 | 0.1299 | 0.2028 | 0.0384 |
| 0.8667 | 28.94 | 9000 | 0.1228 | 0.1945 | 0.0367 |
| 0.8641 | 30.55 | 9500 | 0.1223 | 0.1939 | 0.0364 |
| 0.8516 | 32.15 | 10000 | 0.1184 | 0.1876 | 0.0349 |
| 0.8379 | 33.76 | 10500 | 0.1137 | 0.1821 | 0.0338 |
| 0.8235 | 35.37 | 11000 | 0.1127 | 0.1779 | 0.0331 |
| 0.8112 | 36.98 | 11500 | 0.1103 | 0.1766 | 0.0327 |
| 0.8069 | 38.59 | 12000 | 0.1092 | 0.1752 | 0.0323 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2
- Datasets 1.18.4.dev0
- Tokenizers 0.11.0
|
{"language": ["uk"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "generated_from_trainer", "hf-asr-leaderboard", "mozilla-foundation/common_voice_8_0", "robust-speech-event"], "datasets": ["common_voice"], "model-index": [{"name": "wav2vec2-xls-r-1b-hy", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice uk", "type": "mozilla-foundation/common_voice_8_0", "args": "uk"}, "metrics": [{"type": "wer", "value": 10.406342913776015, "name": "WER LM"}, {"type": "cer", "value": 2.0387492208601703, "name": "CER LM"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "uk"}, "metrics": [{"type": "wer", "value": 40.57, "name": "Test WER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Test Data", "type": "speech-recognition-community-v2/eval_data", "args": "uk"}, "metrics": [{"type": "wer", "value": 28.95, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
arampacha/wav2vec2-xls-r-1b-uk
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"hf-asr-leaderboard",
"mozilla-foundation/common_voice_8_0",
"robust-speech-event",
"uk",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"uk"
] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #uk #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us
|
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on the /WORKSPACE/DATA/UK/COMPOSED\_DATASET/ - NA dataset.
It achieves the following results on the evaluation set:
* Loss: 0.1092
* Wer: 0.1752
* Cer: 0.0323
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 16
* eval\_batch\_size: 64
* seed: 42
* gradient\_accumulation\_steps: 8
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
* lr\_scheduler\_type: cosine
* lr\_scheduler\_warmup\_ratio: 0.1
* training\_steps: 12000
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2
* Datasets 1.18.4.dev0
* Tokenizers 0.11.0
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 64\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 12000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #uk #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 64\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 12000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0"
] |
[
109,
160,
4,
36
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #uk #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 64\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 12000\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0"
] |
[
-0.13464748859405518,
0.09752877056598663,
-0.004913621582090855,
0.05576420575380325,
0.09661780297756195,
0.0188152939081192,
0.1052614077925682,
0.1574903428554535,
-0.09406475722789764,
0.11446955800056458,
0.08079302310943604,
0.05070704221725464,
0.09059906750917435,
0.12384342402219772,
-0.012790446169674397,
-0.29758837819099426,
0.01800215058028698,
-0.033166032284498215,
-0.14046244323253632,
0.09673725068569183,
0.10836353152990341,
-0.1071801483631134,
0.03655014932155609,
0.01897318661212921,
-0.08842969685792923,
-0.012777798809111118,
-0.048387836664915085,
-0.047506775707006454,
0.0834476426243782,
0.048411983996629715,
0.06669137626886368,
0.03869909793138504,
0.08864111453294754,
-0.2523813545703888,
0.008226106874644756,
0.06808452308177948,
0.0391814224421978,
0.0621577687561512,
0.10496653616428375,
-0.0034422026947140694,
0.09123250097036362,
-0.06499312818050385,
0.04785918444395065,
0.056429073214530945,
-0.11603844165802002,
-0.2471543848514557,
-0.07989203929901123,
0.04121444746851921,
0.12306129932403564,
0.0869145542383194,
-0.03391613811254501,
0.04084044322371483,
-0.07583095878362656,
0.08649816364049911,
0.20438122749328613,
-0.21118201315402985,
-0.07847009599208832,
0.006993492133915424,
0.050890419632196426,
0.056591738015413284,
-0.11762852966785431,
-0.018150221556425095,
0.01224011741578579,
0.012721856124699116,
0.06913328170776367,
0.004349933005869389,
0.002373995492234826,
-0.0006808101898059249,
-0.12495078891515732,
-0.06443032622337341,
0.15003852546215057,
0.07884005457162857,
-0.007869732566177845,
-0.11091255396604538,
-0.012881222181022167,
-0.19367636740207672,
-0.05587103217840195,
0.02410997450351715,
0.027296189218759537,
-0.024177081882953644,
-0.054136913269758224,
0.023706266656517982,
-0.07182351499795914,
-0.06798238307237625,
0.06531082838773727,
0.06753874570131302,
0.022916138172149658,
-0.028125541284680367,
0.005080772563815117,
0.10167965292930603,
0.03806939721107483,
-0.17631499469280243,
-0.025945089757442474,
0.03460679203271866,
-0.11169029027223587,
-0.017505912110209465,
-0.014122474007308483,
0.03239666670560837,
0.05203210934996605,
0.11678589135408401,
-0.0431516095995903,
0.08948051184415817,
0.0028708348982036114,
0.025347433984279633,
-0.07400046288967133,
0.17898954451084137,
-0.0627046599984169,
-0.08415592461824417,
-0.0564396008849144,
0.13564719259738922,
-0.020034238696098328,
-0.011048798449337482,
-0.07484303414821625,
0.02701149694621563,
0.09304620325565338,
0.052463598549366,
-0.0068327621556818485,
0.03167763352394104,
-0.07149814814329147,
-0.020729893818497658,
0.03536728397011757,
-0.11536752432584763,
0.04904915392398834,
0.06789769232273102,
-0.07704684883356094,
-0.017692599445581436,
-0.005668906960636377,
0.01585632748901844,
-0.03767473250627518,
0.06372535973787308,
-0.056202732026576996,
-0.010001345537602901,
-0.07883760333061218,
-0.08656927943229675,
0.04472416266798973,
-0.05367041751742363,
-0.004888435825705528,
-0.04947253689169884,
-0.10427245497703552,
-0.07822786271572113,
0.045398980379104614,
-0.06712939590215683,
-0.07668742537498474,
-0.07886812090873718,
-0.101485975086689,
0.05463090538978577,
-0.018050700426101685,
0.1781056821346283,
-0.0591132715344429,
0.07504604011774063,
0.016877569258213043,
0.04154001548886299,
0.08952201157808304,
0.06929832696914673,
-0.024337826296687126,
0.058014221489429474,
-0.14599820971488953,
0.10923829674720764,
-0.11688816547393799,
0.06193193420767784,
-0.12735062837600708,
-0.09917949140071869,
-0.022070618346333504,
0.0020426874980330467,
0.08085019886493683,
0.1176953911781311,
-0.17933955788612366,
-0.08710453659296036,
0.16391777992248535,
-0.052309054881334305,
-0.08643278479576111,
0.11699756234884262,
-0.015457423403859138,
-0.03226780146360397,
0.020238524302840233,
0.15674042701721191,
0.15667447447776794,
-0.09284418821334839,
0.0021735941991209984,
-0.034621402621269226,
0.11880656331777573,
0.05567306652665138,
0.08804652094841003,
-0.05175773799419403,
0.07033620774745941,
0.0042985714972019196,
-0.01140230055898428,
0.04025541618466377,
-0.0743548795580864,
-0.08898667991161346,
-0.011119276285171509,
-0.07633259147405624,
-0.014841101132333279,
0.06784574687480927,
0.01842772588133812,
-0.07397980242967606,
-0.13574056327342987,
-0.01772163435816765,
0.10568269342184067,
-0.09308288246393204,
0.01419094204902649,
-0.08009751886129379,
0.05248973146080971,
0.00658664433285594,
-0.003033735090866685,
-0.15667735040187836,
-0.02899753861129284,
0.03777462989091873,
-0.034646421670913696,
0.015411075204610825,
0.01043606922030449,
0.07620465010404587,
0.03470844402909279,
-0.03945554420351982,
-0.06139165163040161,
-0.030215952545404434,
-0.00928036030381918,
-0.05933062359690666,
-0.23641014099121094,
-0.07217429578304291,
-0.02799343504011631,
0.2111338973045349,
-0.2185785174369812,
0.014700173400342464,
0.07384047657251358,
0.13056695461273193,
0.030230127274990082,
-0.04758407548069954,
0.01886618696153164,
0.05707140266895294,
-0.0238651055842638,
-0.08244746178388596,
0.02305443398654461,
0.00771362753584981,
-0.09019811451435089,
0.020288385450839996,
-0.15964500606060028,
0.051343102008104324,
0.09418008476495743,
0.00797924492508173,
-0.06896153092384338,
-0.053068291395902634,
-0.06165335327386856,
-0.05001115798950195,
-0.006443747319281101,
-0.013612937182188034,
0.15047936141490936,
0.014025717042386532,
0.11067160218954086,
-0.0771046131849289,
-0.0582236684858799,
0.039731092751026154,
0.02220097742974758,
-0.00331153254956007,
0.14842736721038818,
0.03332405164837837,
-0.04231123998761177,
0.10176531225442886,
0.03262767195701599,
-0.060960568487644196,
0.1609712392091751,
-0.08892940729856491,
-0.09646129608154297,
-0.02948150783777237,
0.030057517811655998,
0.03835618123412132,
0.1130625382065773,
-0.18232199549674988,
-0.017306046560406685,
0.02889505960047245,
0.021903369575738907,
0.023076174780726433,
-0.18069571256637573,
0.007320324890315533,
0.03304740786552429,
-0.08919341117143631,
0.0037543885409832,
-0.01381490658968687,
-0.00047480850480496883,
0.07774752378463745,
0.0031374110840260983,
-0.0838032141327858,
-0.032215941697359085,
-0.04475272074341774,
-0.08475638180971146,
0.17324073612689972,
-0.09774652868509293,
-0.14763644337654114,
-0.12650704383850098,
0.001002618344500661,
-0.004916358273476362,
-0.0008381257066503167,
0.028635717928409576,
-0.0883232057094574,
-0.03571111336350441,
-0.07870551198720932,
0.0015387985622510314,
-0.03907841071486473,
0.02064322493970394,
0.012116105295717716,
0.017407815903425217,
0.07639272511005402,
-0.10412920266389847,
0.01724068447947502,
-0.0005573098314926028,
-0.03559477999806404,
0.00799525436013937,
0.0004953544703312218,
0.08785519003868103,
0.17661280930042267,
0.054731015115976334,
0.03971157595515251,
-0.03379777446389198,
0.18019971251487732,
-0.1533825546503067,
0.01479419320821762,
0.08340074121952057,
0.00908035971224308,
0.04177836328744888,
0.15397290885448456,
0.03773556649684906,
-0.0679578110575676,
-0.0015069765504449606,
0.03675233572721481,
-0.021228676661849022,
-0.2236713021993637,
-0.028151849284768105,
-0.08037069439888,
0.0007012055139057338,
0.10900503396987915,
0.03635350242257118,
0.03942649066448212,
0.02559315599501133,
-0.026227809488773346,
-0.01541365496814251,
0.06823714077472687,
0.04275936633348465,
0.058752864599227905,
0.044129710644483566,
0.11523228883743286,
-0.017516443505883217,
-0.02878444455564022,
0.025660010054707527,
0.005523717030882835,
0.21703335642814636,
-0.0007305651088245213,
0.19903822243213654,
0.04015573114156723,
0.1313985288143158,
0.00608258368447423,
0.053149618208408356,
0.012542287819087505,
-0.008670537732541561,
0.025128109380602837,
-0.05428491532802582,
-0.01318560354411602,
0.039143647998571396,
0.12886057794094086,
0.004853100515902042,
-0.09778225421905518,
0.011964607983827591,
0.02362458035349846,
0.34205362200737,
0.09372895210981369,
-0.2953726649284363,
-0.07327882945537567,
0.01735055446624756,
-0.062030129134655,
-0.045601993799209595,
0.025822538882493973,
0.1162465363740921,
-0.0817766860127449,
0.08200091123580933,
-0.04653465747833252,
0.09277556836605072,
-0.062114305794239044,
0.0021449143532663584,
0.08631545305252075,
0.07483137398958206,
0.0056817554868757725,
0.06451091170310974,
-0.24696509540081024,
0.27196431159973145,
-0.005069155246019363,
0.06629237532615662,
-0.046362634748220444,
0.05336323752999306,
0.028707128018140793,
0.0001331724924966693,
0.07646223902702332,
-0.012597634457051754,
-0.10961513966321945,
-0.16783034801483154,
-0.09389213472604752,
0.005556873977184296,
0.12043958157300949,
-0.06571044027805328,
0.1166062280535698,
-0.029205938801169395,
-0.04525039717555046,
0.04514734074473381,
-0.07958788424730301,
-0.09921956807374954,
-0.12322571128606796,
0.041505493223667145,
0.02758467011153698,
0.07002066820859909,
-0.08870209008455276,
-0.08145572990179062,
-0.06654728204011917,
0.14532582461833954,
-0.12165113538503647,
-0.01640941947698593,
-0.14016889035701752,
0.07477269321680069,
0.1679227501153946,
-0.05420361086726189,
0.028165485709905624,
0.008638841100037098,
0.14772145450115204,
0.031200675293803215,
0.005585398990660906,
0.09954191744327545,
-0.0753469169139862,
-0.2202923446893692,
-0.04704386368393898,
0.18034180998802185,
0.03325565531849861,
0.07330526411533356,
-0.0144979702308774,
0.03188387677073479,
-0.014390701428055763,
-0.06658443063497543,
0.0772717073559761,
0.0361485555768013,
0.002943933941423893,
0.04428400844335556,
-0.011814314872026443,
0.006484086159616709,
-0.0643114298582077,
-0.03249373659491539,
0.09565272927284241,
0.2433660328388214,
-0.08540410548448563,
0.03845437616109848,
0.030676495283842087,
-0.06228926032781601,
-0.15557484328746796,
0.008308161981403828,
0.13369201123714447,
0.045430988073349,
-0.04077405855059624,
-0.19681432843208313,
0.01953922025859356,
0.06730476766824722,
-0.024021044373512268,
0.07617692649364471,
-0.3079565167427063,
-0.13418440520763397,
0.0986110121011734,
0.019871968775987625,
-0.06627479195594788,
-0.1693369448184967,
-0.06429044157266617,
-0.016898537054657936,
-0.07030756771564484,
0.03568795695900917,
-0.041590698063373566,
0.11931135505437851,
0.010203281417489052,
0.02242138981819153,
0.01836100034415722,
-0.0434846431016922,
0.1545202136039734,
0.020826216787099838,
0.045957036316394806,
-0.012369866482913494,
0.016198232769966125,
0.043782930821180344,
-0.07469382137060165,
0.010974266566336155,
-0.07143156975507736,
0.0170602984726429,
-0.15700754523277283,
-0.022649161517620087,
-0.08937416225671768,
0.02262760140001774,
-0.04866603761911392,
-0.00871992763131857,
-0.025874411687254906,
0.049018342047929764,
0.07370227575302124,
0.0219207014888525,
0.1178009882569313,
-0.07054812461137772,
0.128097802400589,
0.15919671952724457,
0.12832002341747284,
0.008358174003660679,
-0.08903065323829651,
-0.004315221682190895,
0.0061074490658938885,
0.035264890640974045,
-0.1138325035572052,
0.038852352648973465,
0.13847847282886505,
0.03128296136856079,
0.14735117554664612,
0.05153900757431984,
-0.08883465826511383,
0.006364181637763977,
0.06812191009521484,
-0.06961990892887115,
-0.1363694965839386,
-0.01776873879134655,
0.022621409967541695,
-0.1410120129585266,
-0.024150405079126358,
0.11734543740749359,
-0.027594855055212975,
0.008467699401080608,
0.0190613754093647,
0.044517092406749725,
-0.03939442336559296,
0.2307373285293579,
0.03333486244082451,
0.10294298827648163,
-0.09603467583656311,
0.06860128045082092,
0.061531681567430496,
-0.1086975634098053,
0.03171779587864876,
0.11018897593021393,
-0.051529232412576675,
-0.028803331777453423,
0.019292939454317093,
0.09028184413909912,
0.03724104166030884,
-0.058024391531944275,
-0.10573610663414001,
-0.15265502035617828,
0.09795908629894257,
0.07898754626512527,
0.021489741280674934,
0.037031181156635284,
-0.015985550358891487,
0.03755713999271393,
-0.09097771346569061,
0.11070699244737625,
0.08669465780258179,
0.0672476589679718,
-0.11944862455129623,
0.11299861967563629,
-0.0007839135942049325,
-0.006426837760955095,
0.004547785501927137,
-0.010186013765633106,
-0.09801149368286133,
0.023727910593152046,
-0.12381144613027573,
-0.0001286296028411016,
-0.03779219090938568,
-0.00418355967849493,
0.014859066344797611,
-0.057104792445898056,
-0.05390850454568863,
0.01444499846547842,
-0.11414399743080139,
-0.044430553913116455,
-0.041539859026670456,
0.07361209392547607,
-0.10065016895532608,
-0.015679774805903435,
0.033121638000011444,
-0.12258517742156982,
0.09378242492675781,
0.03591978922486305,
0.009415736421942711,
0.008480047807097435,
-0.07752402871847153,
-0.008786498568952084,
0.03179065138101578,
0.012040114030241966,
0.04351654648780823,
-0.15233534574508667,
-0.007020722609013319,
-0.028378594666719437,
0.013082791119813919,
-0.009965993463993073,
-0.009509234689176083,
-0.10799290239810944,
0.01951693370938301,
-0.02527463249862194,
-0.06250280886888504,
-0.05339917540550232,
0.0757482498884201,
0.08241693675518036,
0.014626854099333286,
0.14863525331020355,
-0.07109655439853668,
0.06022787094116211,
-0.23908673226833344,
0.009924260899424553,
-0.002666323445737362,
-0.060489341616630554,
-0.035798080265522,
-0.027737019583582878,
0.11043597757816315,
-0.07014092803001404,
0.08460060507059097,
-0.017171915620565414,
0.05247313529253006,
0.026674536988139153,
-0.11563804745674133,
0.03721649572253227,
0.07558023929595947,
0.1471175104379654,
0.067363440990448,
-0.025160133838653564,
0.09483717381954193,
-0.020360836759209633,
0.059470344334840775,
0.117717444896698,
0.1561950296163559,
0.1292179673910141,
0.05812647566199303,
0.08703053742647171,
0.10095056891441345,
-0.1456758826971054,
-0.11043189465999603,
0.1490439772605896,
-0.07984055578708649,
0.13905414938926697,
-0.04275386035442352,
0.17969901859760284,
0.11170554906129837,
-0.20801201462745667,
0.06487113982439041,
-0.0626252293586731,
-0.09587206691503525,
-0.09881126135587692,
-0.08190186321735382,
-0.08552799373865128,
-0.17188391089439392,
0.011267447844147682,
-0.1117674931883812,
0.06732591986656189,
0.0649227499961853,
0.04650487005710602,
0.038593098521232605,
0.09949936717748642,
0.053827133029699326,
0.013970906846225262,
0.1097385361790657,
0.011691139079630375,
0.007140045985579491,
-0.064603790640831,
-0.09782724827528,
0.04647703096270561,
-0.02697625383734703,
0.057304203510284424,
-0.0378512479364872,
-0.08077486604452133,
0.05995791405439377,
0.01521162036806345,
-0.09915133565664291,
0.03155320882797241,
-0.024877378717064857,
0.04269961267709732,
0.07206912338733673,
0.03202955797314644,
-0.005664890632033348,
-0.008548840880393982,
0.19347862899303436,
-0.08932018280029297,
-0.07004687935113907,
-0.14246614277362823,
0.174964040517807,
-0.016141245141625404,
-0.0015136526199057698,
0.018392235040664673,
-0.06845465302467346,
-0.01529741007834673,
0.16766802966594696,
0.175308957695961,
-0.029553508386015892,
-0.01475059986114502,
0.021440304815769196,
-0.007766416762024164,
-0.0323077067732811,
0.06635407358407974,
0.11420182138681412,
0.05562322959303856,
-0.04926075041294098,
-0.02133476734161377,
-0.001203815103508532,
-0.06919983774423599,
-0.02991621382534504,
0.0855817124247551,
0.0205819234251976,
0.002529134275391698,
-0.025806032121181488,
0.10542039573192596,
-0.07405555993318558,
-0.1428915113210678,
0.038658201694488525,
-0.18295982480049133,
-0.18077343702316284,
-0.04082515835762024,
0.044252172112464905,
0.03615059331059456,
0.057124797254800797,
-0.008478991687297821,
-0.05203587934374809,
0.09664862602949142,
-0.006721113342791796,
-0.04257335141301155,
-0.10356198996305466,
0.0748317688703537,
-0.13979066908359528,
0.1775413155555725,
-0.0544792115688324,
0.017455432564020157,
0.12527187168598175,
0.041294705122709274,
-0.08606752753257751,
0.0045340172946453094,
0.09257213771343231,
-0.11420220136642456,
0.03602687269449234,
0.1880549192428589,
-0.040676865726709366,
0.13084743916988373,
0.052712101489305496,
-0.0946599692106247,
0.017128484323620796,
-0.08151216059923172,
-0.028726112097501755,
-0.06822001934051514,
-0.02151348441839218,
-0.0343477688729763,
0.13511629402637482,
0.20849184691905975,
-0.06309661269187927,
-0.019100109115242958,
-0.03618742525577545,
0.006140972021967173,
0.019788259640336037,
0.1269010454416275,
-0.05324844643473625,
-0.25916680693626404,
0.015971556305885315,
0.0008971959468908608,
0.019653746858239174,
-0.18853667378425598,
-0.08155748248100281,
0.033383674919605255,
-0.0542759895324707,
-0.0466635562479496,
0.12897460162639618,
0.05890697240829468,
0.04698454216122627,
-0.05656189098954201,
-0.07988396286964417,
-0.02514641359448433,
0.18327824771404266,
-0.1766946166753769,
-0.05705359950661659
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-xls-r-300m-hy-cv
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - HY-AM dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5891
- Wer: 0.6569
**Note**: If you aim for the best performance, use [this model](https://huggingface.co/arampacha/wav2vec2-xls-r-300m-hy) instead. It is trained with the noisy student procedure and achieves considerably better results.
## Model description
More information needed
## Intended uses & limitations
More information needed
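## How to use
The original card does not include a usage snippet, so the following is a minimal inference sketch rather than an official example: the checkpoint id is taken from this repository, while the file name `sample.wav`, the 16 kHz mono assumption, and the greedy CTC decoding are illustrative choices.
```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "arampacha/wav2vec2-xls-r-300m-hy-cv"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load a local recording (placeholder file name) and resample it to the 16 kHz rate wav2vec2 expects
waveform, sample_rate = torchaudio.load("sample.wav")
waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000).squeeze(0)

inputs = processor(waveform.numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding of the most likely token ids
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```
For long recordings, chunked inference through the `automatic-speech-recognition` pipeline is a possible alternative to this manual loop.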
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 1200
- mixed_precision_training: Native AMP
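As a hedged illustration only (the fine-tuning script itself is not part of this card), the values above map onto `transformers.TrainingArguments` roughly as follows; `output_dir` is a placeholder and data loading, model setup, and collation are omitted.
```python
from transformers import TrainingArguments

# Approximate mapping of the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="./wav2vec2-xls-r-300m-hy-cv",
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    gradient_accumulation_steps=2,   # 64 x 2 = 128 effective train batch size
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=1200,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```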
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:------:|:----:|:---------------:|:------:|
| 9.167 | 16.67 | 100 | 3.5599 | 1.0 |
| 3.2645 | 33.33 | 200 | 3.1771 | 1.0 |
| 3.1509 | 50.0 | 300 | 3.1321 | 1.0 |
| 3.0757 | 66.67 | 400 | 2.8594 | 1.0 |
| 2.5274 | 83.33 | 500 | 1.5286 | 0.9797 |
| 1.6826 | 100.0 | 600 | 0.8058 | 0.7974 |
| 1.2868 | 116.67 | 700 | 0.6713 | 0.7279 |
| 1.1262 | 133.33 | 800 | 0.6308 | 0.7034 |
| 1.0408 | 150.0 | 900 | 0.6056 | 0.6745 |
| 0.9617 | 166.67 | 1000 | 0.5891 | 0.6569 |
| 0.9196 | 183.33 | 1100 | 0.5913 | 0.6432 |
| 0.8853 | 200.0 | 1200 | 0.5924 | 0.6347 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2.dev0
- Tokenizers 0.11.0
|
{"language": ["hy-AM"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_8_0", "generated_from_trainer", "hy"], "datasets": ["common_voice"], "model-index": [{"name": "", "results": []}]}
|
automatic-speech-recognition
|
arampacha/wav2vec2-xls-r-300m-hy-cv
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_8_0",
"generated_from_trainer",
"hy",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"hy-AM"
] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #hy #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us
|
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the MOZILLA-FOUNDATION/COMMON\_VOICE\_8\_0 - HY-AM dataset.
It achieves the following results on the evaluation set:
* Loss: 0.5891
* Wer: 0.6569
Note: If you aim for the best performance, use this model instead. It is trained with the noisy student procedure and achieves considerably better results.
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0001
* train\_batch\_size: 64
* eval\_batch\_size: 64
* seed: 42
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.1
* training\_steps: 1200
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2+cu102
* Datasets 1.18.2.dev0
* Tokenizers 0.11.0
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 1200\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #hy #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 1200\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0"
] |
[
83,
157,
4,
39
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #hy #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 1200\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0"
] |
[
-0.13575373589992523,
0.13489344716072083,
-0.003951790276914835,
0.04049893468618393,
0.10317390412092209,
0.012720158323645592,
0.08469774574041367,
0.17779508233070374,
-0.07519304007291794,
0.10888035595417023,
0.09518972039222717,
0.094859778881073,
0.08609843999147415,
0.13181012868881226,
-0.01539403386414051,
-0.29416850209236145,
0.011679910123348236,
-0.02198682352900505,
-0.11645516008138657,
0.10230311751365662,
0.09743358194828033,
-0.10587601363658905,
0.01310394611209631,
0.004120827652513981,
-0.08485067635774612,
-0.027073441073298454,
-0.02984905242919922,
-0.06492035835981369,
0.10895069688558578,
0.02472391538321972,
0.0348307304084301,
0.04656223580241203,
0.0899452343583107,
-0.2687889635562897,
0.009426403790712357,
0.05874527245759964,
0.054578930139541626,
0.06502405554056168,
0.09150172770023346,
-0.007253650110214949,
0.08296086639165878,
-0.07658328860998154,
0.04718497395515442,
0.05381903052330017,
-0.07761272042989731,
-0.29662802815437317,
-0.09101193398237228,
0.020794594660401344,
0.13677076995372772,
0.08407928049564362,
-0.034060198813676834,
0.02222621999680996,
-0.0601707398891449,
0.07672242075204849,
0.2374035269021988,
-0.2213583141565323,
-0.061144713312387466,
-0.00004371038812678307,
0.04186028614640236,
0.04325960576534271,
-0.10935427248477936,
0.003507286310195923,
0.01858058013021946,
0.01723242923617363,
0.10258040577173233,
0.01338978298008442,
0.06193872541189194,
0.008287162519991398,
-0.13042402267456055,
-0.058470938354730606,
0.13018307089805603,
0.07595887780189514,
-0.021902786567807198,
-0.12348413467407227,
-0.018338752910494804,
-0.19569064676761627,
-0.05683065205812454,
0.021541357040405273,
0.015077619813382626,
-0.03268744423985481,
-0.08328289538621902,
0.0015469867503270507,
-0.04689629375934601,
-0.08466359227895737,
0.05834339186549187,
0.12459395825862885,
0.04228607192635536,
-0.04668065533041954,
0.03713906928896904,
0.10740586370229721,
0.07032611221075058,
-0.15929566323757172,
-0.009214088320732117,
0.05299737676978111,
-0.10852757841348648,
0.0028878364246338606,
-0.028051741421222687,
0.01852988451719284,
0.04881038889288902,
0.12458813190460205,
0.019244136288762093,
0.09604071825742722,
0.022785883396863937,
0.01758439652621746,
-0.08522535860538483,
0.1587226688861847,
-0.0663926899433136,
-0.0993337631225586,
-0.030082987621426582,
0.12725841999053955,
0.009477524086833,
-0.028112124651670456,
-0.08469478785991669,
0.022713208571076393,
0.11651451885700226,
0.062443722039461136,
-0.001240274403244257,
0.013897733762860298,
-0.07853632420301437,
-0.030023621395230293,
-0.004798500798642635,
-0.11910227686166763,
0.042562685906887054,
0.04306282848119736,
-0.06427076458930969,
-0.027911745011806488,
-0.022692451253533363,
0.019092561677098274,
-0.036531001329422,
0.08335395902395248,
-0.04109941050410271,
-0.013828402385115623,
-0.07037720829248428,
-0.07872550189495087,
0.02789423242211342,
-0.061426371335983276,
-0.003847990185022354,
-0.05055543780326843,
-0.08617056906223297,
-0.06591004133224487,
0.06209249049425125,
-0.07380747050046921,
-0.08253125101327896,
-0.09863109886646271,
-0.0797831118106842,
0.057245273143053055,
-0.023511245846748352,
0.1824045330286026,
-0.05395050719380379,
0.113685742020607,
-0.0046986485831439495,
0.06742452830076218,
0.09558002650737762,
0.0812636986374855,
0.0027557197026908398,
0.05011225864291191,
-0.16428536176681519,
0.12401412427425385,
-0.116362564265728,
0.03839047625660896,
-0.15501143038272858,
-0.08952013403177261,
0.006237383466213942,
-0.004923656117171049,
0.10177603363990784,
0.12979188561439514,
-0.18648841977119446,
-0.0898076593875885,
0.14823755621910095,
-0.04184655472636223,
-0.06404440850019455,
0.1394416242837906,
-0.024429798126220703,
-0.02265441045165062,
0.020382259041070938,
0.20320764183998108,
0.11978932470083237,
-0.0870015099644661,
0.014862829819321632,
-0.04152267798781395,
0.12642283737659454,
0.03216705843806267,
0.10181178897619247,
-0.06042204052209854,
0.031645167618989944,
-0.003497662488371134,
-0.04281450808048248,
0.05829712748527527,
-0.07459705322980881,
-0.07754985243082047,
-0.00976464431732893,
-0.07702409476041794,
0.02305592969059944,
0.05410468950867653,
0.013619794510304928,
-0.07802785187959671,
-0.13461847603321075,
-0.028081178665161133,
0.12193962186574936,
-0.11916738003492355,
0.031834498047828674,
-0.08123824000358582,
0.07557957619428635,
-0.017011363059282303,
-0.001369228819385171,
-0.1492718607187271,
0.03536546602845192,
0.04421704262495041,
-0.029368015006184578,
-0.0015863942680880427,
-0.023518389090895653,
0.061563972383737564,
0.025092607364058495,
-0.04734296724200249,
-0.07210893929004669,
-0.03939102590084076,
-0.0008327299146912992,
-0.055790409445762634,
-0.2352338284254074,
-0.07149459421634674,
-0.030474424362182617,
0.20019866526126862,
-0.18269787728786469,
0.008471707813441753,
0.06429171562194824,
0.12475983798503876,
0.024630814790725708,
-0.05297219008207321,
0.025450613349676132,
0.053348153829574585,
-0.01828848198056221,
-0.09115596860647202,
0.0372980497777462,
0.014833446592092514,
-0.12172836810350418,
0.008965754881501198,
-0.09946669638156891,
0.04619818553328514,
0.09234561026096344,
0.02857230044901371,
-0.06314744800329208,
-0.0698571652173996,
-0.0491892546415329,
-0.04930855706334114,
-0.030105628073215485,
0.0016105091199278831,
0.17212291061878204,
0.02664080262184143,
0.09920524805784225,
-0.08298380672931671,
-0.046690069139003754,
0.045142579823732376,
0.0193096362054348,
-0.010188894346356392,
0.14035017788410187,
0.05706861987709999,
-0.02540677785873413,
0.08291208744049072,
0.05368845909833908,
-0.0484803132712841,
0.14403051137924194,
-0.07933511584997177,
-0.094664566218853,
-0.030423929914832115,
0.032483361661434174,
0.03205665573477745,
0.11620251834392548,
-0.1855476051568985,
-0.029297109693288803,
0.026250846683979034,
0.025256549939513206,
0.012385282665491104,
-0.19128042459487915,
-0.00150122563354671,
0.035646215081214905,
-0.0880323275923729,
0.0011536362580955029,
-0.0077489945106208324,
-0.025741904973983765,
0.07505903393030167,
0.021248893812298775,
-0.08286413550376892,
-0.010045863687992096,
-0.043147459626197815,
-0.08728936314582825,
0.15873754024505615,
-0.10769130289554596,
-0.1425202339887619,
-0.10619176924228668,
-0.042174458503723145,
-0.011096905916929245,
-0.008968941867351532,
0.04408881440758705,
-0.1021571010351181,
-0.02207253687083721,
-0.05720922350883484,
0.014385613612830639,
-0.05353774502873421,
0.04595043882727623,
0.026018083095550537,
0.010520301759243011,
0.04942050203680992,
-0.08554673939943314,
0.01696513406932354,
-0.01397081557661295,
-0.0160401351749897,
0.009287139400839806,
0.0051417360082268715,
0.09149080514907837,
0.18692487478256226,
0.056916508823633194,
0.049683213233947754,
-0.04395446181297302,
0.15131093561649323,
-0.14505231380462646,
0.020339010283350945,
0.10856783390045166,
0.022971006110310555,
0.045757636427879333,
0.15161050856113434,
0.03526092320680618,
-0.0865897387266159,
0.01744747906923294,
0.01671784371137619,
-0.019497694447636604,
-0.22407887876033783,
-0.034401845186948776,
-0.06817939877510071,
-0.028144923970103264,
0.10608990490436554,
0.02970488741993904,
0.0017290068790316582,
0.01855907030403614,
-0.007363513577729464,
-0.007394266780465841,
0.038906410336494446,
0.04934591427445412,
0.06769000738859177,
0.03834781050682068,
0.10753990709781647,
-0.005824729800224304,
-0.010647588409483433,
0.028344284743070602,
0.021560808643698692,
0.25980862975120544,
-0.012793513014912605,
0.21201038360595703,
0.04190276935696602,
0.16586460173130035,
0.0008626745548099279,
0.04174859821796417,
0.005260249134153128,
-0.0016546695260331035,
0.007400859147310257,
-0.049201034009456635,
-0.045748934149742126,
0.028820961713790894,
0.12158279120922089,
0.009763535112142563,
-0.10934506356716156,
0.02711525373160839,
0.016148589551448822,
0.37185734510421753,
0.09838273376226425,
-0.26401227712631226,
-0.07260358333587646,
0.013822484761476517,
-0.07179706543684006,
-0.03852006047964096,
0.038134824484586716,
0.12440939247608185,
-0.07018417119979858,
0.06664736568927765,
-0.061877280473709106,
0.07381488382816315,
-0.10152199864387512,
-0.0014759795740246773,
0.07065188139677048,
0.08250079303979874,
0.004766743630170822,
0.06397140026092529,
-0.2711087465286255,
0.26620495319366455,
-0.010407112538814545,
0.07913629710674286,
-0.06449760496616364,
0.04049468785524368,
0.03999984264373779,
-0.026204457506537437,
0.09615211933851242,
-0.004689086228609085,
-0.10868066549301147,
-0.14213477075099945,
-0.10736124217510223,
0.003520670346915722,
0.11594101786613464,
-0.07699772715568542,
0.10319315642118454,
-0.04155321419239044,
-0.04274454712867737,
0.03229275718331337,
-0.0859750434756279,
-0.07398266345262527,
-0.11694511026144028,
0.03585202246904373,
0.022198831662535667,
0.08434700220823288,
-0.0954451933503151,
-0.0948440283536911,
-0.05025868117809296,
0.14819620549678802,
-0.09954576194286346,
-0.02326475828886032,
-0.1378391832113266,
0.07697319239377975,
0.14834800362586975,
-0.0612742118537426,
0.05498791113495827,
0.019824529066681862,
0.13179168105125427,
0.018222754821181297,
0.005335815250873566,
0.10685894638299942,
-0.07560385018587112,
-0.20081540942192078,
-0.06324395537376404,
0.18243470788002014,
0.022373871877789497,
0.07624903321266174,
-0.0197877436876297,
0.02206258848309517,
-0.006739938631653786,
-0.07363951206207275,
0.08990081399679184,
0.042680613696575165,
0.017524613067507744,
0.031218411400914192,
-0.013487434014678001,
0.008150555193424225,
-0.08855491876602173,
-0.07803581655025482,
0.12651710212230682,
0.25584855675697327,
-0.09476917237043381,
0.06353624910116196,
0.04932231083512306,
-0.04323260486125946,
-0.1730722039937973,
-0.017131254076957703,
0.12750905752182007,
0.045361388474702835,
-0.027602998539805412,
-0.2182091921567917,
-0.0041346121579408646,
0.058109838515520096,
-0.03155265375971794,
0.07754559069871902,
-0.33840492367744446,
-0.12909777462482452,
0.0791461318731308,
0.07302149385213852,
-0.01609092578291893,
-0.14661750197410583,
-0.07652943581342697,
-0.009314251132309437,
-0.05289012938737869,
0.015909940004348755,
-0.0030430927872657776,
0.13438764214515686,
0.013608227483928204,
0.01956380158662796,
0.017971381545066833,
-0.04993033781647682,
0.13885881006717682,
0.013312818482518196,
0.022861000150442123,
-0.0058908239006996155,
0.021624930202960968,
-0.0048636216670274734,
-0.07431910187005997,
-0.006462855264544487,
-0.06261937320232391,
0.033266931772232056,
-0.1480216383934021,
-0.03382125496864319,
-0.08484413474798203,
0.006513088010251522,
-0.04007277637720108,
-0.036817505955696106,
-0.02642226591706276,
0.05008316785097122,
0.08542005717754364,
0.010581454262137413,
0.10628112405538559,
-0.06069042161107063,
0.09781362116336823,
0.12109467387199402,
0.10616998374462128,
-0.018083803355693817,
-0.08962709456682205,
-0.012945402413606644,
-0.014522688463330269,
0.03441677242517471,
-0.14296583831310272,
0.034224655479192734,
0.12763383984565735,
0.04886675998568535,
0.14338211715221405,
0.050119634717702866,
-0.08962561935186386,
0.01977735199034214,
0.054997093975543976,
-0.06580054759979248,
-0.16182631254196167,
-0.025797462090849876,
0.03766821697354317,
-0.11452966183423996,
0.005975918844342232,
0.10339758545160294,
-0.020467199385166168,
-0.008881310001015663,
0.01766243763267994,
0.03957275301218033,
-0.026834219694137573,
0.22069108486175537,
0.029611218720674515,
0.09630900621414185,
-0.1051248088479042,
0.08664241433143616,
0.05265399068593979,
-0.12484389543533325,
0.04488782212138176,
0.08052749186754227,
-0.07578469812870026,
-0.016810638830065727,
0.011575986631214619,
0.07540225237607956,
0.06529313325881958,
-0.04733525961637497,
-0.11724212020635605,
-0.14544276893138885,
0.09972304105758667,
0.030699104070663452,
0.031087417155504227,
0.02058248221874237,
-0.038672469556331635,
0.02128949947655201,
-0.10628736764192581,
0.11032287776470184,
0.08914152532815933,
0.051690518856048584,
-0.1246226504445076,
0.09441307932138443,
0.015175594948232174,
0.015817848965525627,
-0.004249784629791975,
-0.014108099043369293,
-0.0804811343550682,
0.032745786011219025,
-0.10131408274173737,
-0.01746591366827488,
-0.06140759214758873,
-0.005116734653711319,
0.0055594677105546,
-0.06302286684513092,
-0.03969714418053627,
0.022880813106894493,
-0.11480844020843506,
-0.05297357961535454,
-0.03831813856959343,
0.06012076511979103,
-0.07546766847372055,
-0.0252958033233881,
0.02418089658021927,
-0.13518019020557404,
0.11667833477258682,
0.03212551772594452,
0.023106373846530914,
0.008761417120695114,
-0.037701476365327835,
-0.013226042501628399,
0.02390669845044613,
-0.003724646056070924,
0.03272350877523422,
-0.20165279507637024,
-0.011873875744640827,
-0.03282427787780762,
0.005231996066868305,
-0.013219338841736317,
0.021232597529888153,
-0.119097501039505,
0.005521663464605808,
-0.06070385128259659,
-0.0397692546248436,
-0.060182973742485046,
0.08069317042827606,
0.08595056086778641,
0.02497284859418869,
0.1490039825439453,
-0.07513992488384247,
0.05435429885983467,
-0.2095922976732254,
0.008830023929476738,
-0.030264152213931084,
-0.06769783049821854,
-0.06948412209749222,
-0.03017912432551384,
0.10175332427024841,
-0.05469662323594093,
0.05346585810184479,
-0.0495951808989048,
0.04864208400249481,
0.020835891366004944,
-0.11573538184165955,
0.021298479288816452,
0.05270576477050781,
0.18753230571746826,
0.05237001180648804,
-0.017392881214618683,
0.09240081161260605,
-0.0035352902486920357,
0.0639820396900177,
0.17550957202911377,
0.11541077494621277,
0.1422833502292633,
0.10779861360788345,
0.08701572567224503,
0.05713529884815216,
-0.13799525797367096,
-0.16454599797725677,
0.20569904148578644,
-0.05947298929095268,
0.1410360187292099,
-0.017318015918135643,
0.2046537846326828,
0.10160098224878311,
-0.20642584562301636,
0.05811591446399689,
-0.02433517388999462,
-0.08834704011678696,
-0.0936424508690834,
-0.09364981949329376,
-0.08108382672071457,
-0.19265714287757874,
0.0022982305381447077,
-0.08966035395860672,
0.05691983923316002,
0.04635122790932655,
0.04519125074148178,
0.04368910938501358,
0.0859518051147461,
0.06253638863563538,
-0.002308832947164774,
0.1385669857263565,
0.013849295675754547,
-0.026552489027380943,
-0.0529642291367054,
-0.1173277497291565,
0.03931094706058502,
-0.036368682980537415,
0.06745696812868118,
-0.03484653681516647,
-0.10314271599054337,
0.05524042993783951,
0.028250837698578835,
-0.1044948622584343,
0.026814734563231468,
-0.023650528863072395,
0.05958840250968933,
0.11760836094617844,
0.0430731438100338,
-0.018258903175592422,
-0.000379907462047413,
0.21902018785476685,
-0.09697792679071426,
-0.060749247670173645,
-0.1343778371810913,
0.19151583313941956,
-0.001984617905691266,
0.0031104586087167263,
0.03176266700029373,
-0.07876434177160263,
-0.015444745309650898,
0.17193502187728882,
0.13725142180919647,
-0.02065030112862587,
-0.026808500289916992,
0.03988698124885559,
-0.01329406350851059,
-0.046474918723106384,
0.07653159648180008,
0.12337619811296463,
0.029968710616230965,
-0.04966103658080101,
-0.03412759676575661,
-0.02827928029000759,
-0.07329662144184113,
-0.03753138333559036,
0.08490876108407974,
0.014477520249783993,
-0.007638133130967617,
-0.01415115687996149,
0.1111222356557846,
-0.05019927769899368,
-0.17949840426445007,
0.042823318392038345,
-0.18541863560676575,
-0.19634844362735748,
-0.02988502010703087,
0.04183125123381615,
0.03547096252441406,
0.05587819591164589,
0.003957771230489016,
-0.010143499821424484,
0.1294259876012802,
0.002469330094754696,
-0.03527314215898514,
-0.08090342581272125,
0.08017656952142715,
-0.10677051544189453,
0.1684529334306717,
-0.05097804591059685,
0.003279461059719324,
0.12598860263824463,
0.0672384575009346,
-0.09415215253829956,
0.03849842771887779,
0.09117941558361053,
-0.10070553421974182,
0.05932565778493881,
0.1950235515832901,
-0.04151323437690735,
0.13661397993564606,
0.04599369317293167,
-0.08263380080461502,
0.022059373557567596,
-0.09994503855705261,
-0.04031822830438614,
-0.06519123911857605,
-0.0029286339413374662,
-0.03786676004528999,
0.13335639238357544,
0.17756235599517822,
-0.07505808770656586,
-0.009444964118301868,
-0.043840691447257996,
0.0010814918205142021,
0.027438687160611153,
0.13588209450244904,
-0.039283979684114456,
-0.2603992223739624,
0.012934190221130848,
0.005935043562203646,
0.01997014507651329,
-0.20629219710826874,
-0.07815347611904144,
0.027378346771001816,
-0.05770828574895859,
-0.069269098341465,
0.1276051104068756,
0.043894410133361816,
0.04060007259249687,
-0.057017214596271515,
-0.0907888263463974,
-0.03499605506658554,
0.1767970770597458,
-0.18869677186012268,
-0.04507880657911301
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-xls-r-300m-hy
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the /WORKSPACE/DATA/HY/NOIZY_STUDENT_3/ - NA dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2293
- Wer: 0.3333
- Cer: 0.0602
## Model description
More information needed
## Intended uses & limitations
More information needed
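## How to use
Inference can be run with the high-level pipeline. This sketch is not part of the original card: the checkpoint id comes from this repository, `speech_hy.wav` is a placeholder 16 kHz recording, and any language-model-boosted decoding behind the reported LM scores is not shown here.
```python
from transformers import pipeline

# Plain CTC decoding without an external language model (placeholder audio file).
asr = pipeline("automatic-speech-recognition", model="arampacha/wav2vec2-xls-r-300m-hy")
result = asr("speech_hy.wav")
print(result["text"])
```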
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 842
- gradient_accumulation_steps: 2
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 4000
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 3.1471 | 7.02 | 400 | 3.1599 | 1.0 | 1.0 |
| 1.8691 | 14.04 | 800 | 0.7674 | 0.7361 | 0.1686 |
| 1.3227 | 21.05 | 1200 | 0.3849 | 0.5336 | 0.1007 |
| 1.163 | 28.07 | 1600 | 0.3015 | 0.4559 | 0.0823 |
| 1.0768 | 35.09 | 2000 | 0.2721 | 0.4032 | 0.0728 |
| 1.0224 | 42.11 | 2400 | 0.2586 | 0.3825 | 0.0691 |
| 0.9817 | 49.12 | 2800 | 0.2458 | 0.3653 | 0.0653 |
| 0.941 | 56.14 | 3200 | 0.2306 | 0.3388 | 0.0605 |
| 0.9235 | 63.16 | 3600 | 0.2315 | 0.3380 | 0.0615 |
| 0.9141 | 70.18 | 4000 | 0.2293 | 0.3333 | 0.0602 |
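The WER and CER columns in the table above are standard word- and character-error-rate metrics. As a minimal sketch (not from the original card, and assuming the `evaluate` library with its `wer` and `cer` metrics is available), they can be computed from paired transcripts like this:
```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Placeholder transcripts, only to illustrate the metric calls.
predictions = ["բարեվ ձեզ"]
references = ["բարեւ ձեզ"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```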
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2
- Datasets 1.18.4.dev0
- Tokenizers 0.11.0
|
{"language": ["hy"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_8_0", "generated_from_trainer", "robust-speech-event", "hy", "hf-asr-leaderboard"], "datasets": ["common_voice"], "model-index": [{"name": "wav2vec2-xls-r-300m-hy", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice hy-AM", "type": "mozilla-foundation/common_voice_8_0", "args": "hy-AM"}, "metrics": [{"type": "wer", "value": 13.192818110850899, "name": "WER LM"}, {"type": "cer", "value": 2.787051087506323, "name": "CER LM"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "hy"}, "metrics": [{"type": "wer", "value": 22.246048764990867, "name": "Test WER"}, {"type": "cer", "value": 7.59406739840239, "name": "Test CER"}]}]}]}
|
automatic-speech-recognition
|
arampacha/wav2vec2-xls-r-300m-hy
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_8_0",
"generated_from_trainer",
"robust-speech-event",
"hy",
"hf-asr-leaderboard",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"hy"
] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #robust-speech-event #hy #hf-asr-leaderboard #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the /WORKSPACE/DATA/HY/NOIZY\_STUDENT\_3/ - NA dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2293
* Wer: 0.3333
* Cer: 0.0602
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7e-05
* train\_batch\_size: 64
* eval\_batch\_size: 64
* seed: 842
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
* lr\_scheduler\_type: cosine
* lr\_scheduler\_warmup\_ratio: 0.1
* training\_steps: 4000
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2
* Datasets 1.18.4.dev0
* Tokenizers 0.11.0
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 842\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 4000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #robust-speech-event #hy #hf-asr-leaderboard #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 842\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 4000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0"
] |
[
105,
160,
4,
36
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #robust-speech-event #hy #hf-asr-leaderboard #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 842\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* training\\_steps: 4000\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.4.dev0\n* Tokenizers 0.11.0"
] |
[
-0.13624855875968933,
0.11401476711034775,
-0.004548346623778343,
0.047640252858400345,
0.09465013444423676,
0.019716165959835052,
0.10843633860349655,
0.16496054828166962,
-0.08179760724306107,
0.11530119925737381,
0.0775846540927887,
0.05104159563779831,
0.09048489481210709,
0.1386871337890625,
-0.01884542405605316,
-0.294019877910614,
0.013939252123236656,
-0.03197392821311951,
-0.15412430465221405,
0.09099990129470825,
0.10632628947496414,
-0.10883110016584396,
0.04884026572108269,
0.018977267667651176,
-0.0923549234867096,
-0.01591443084180355,
-0.03467801585793495,
-0.0499299094080925,
0.08698409050703049,
0.04843415319919586,
0.058315426111221313,
0.039776962250471115,
0.09446712583303452,
-0.24844595789909363,
0.008373246528208256,
0.07148099690675735,
0.04298616200685501,
0.0691480040550232,
0.0977102518081665,
0.0012020524591207504,
0.08033253252506256,
-0.07781720906496048,
0.04704414680600166,
0.06659932434558868,
-0.10600749403238297,
-0.2697301506996155,
-0.09303431957960129,
0.041291724890470505,
0.12163984775543213,
0.078150674700737,
-0.025549884885549545,
0.05292559415102005,
-0.04207974672317505,
0.08357442915439606,
0.19415156543254852,
-0.20476186275482178,
-0.0822044238448143,
-0.002939165337011218,
0.039249155670404434,
0.051954444497823715,
-0.10696610063314438,
-0.0030359067022800446,
0.011479852721095085,
0.013498526997864246,
0.07576708495616913,
-0.007594859693199396,
-0.00549000408500433,
-0.0049596684984862804,
-0.1191912591457367,
-0.05002022162079811,
0.17129062116146088,
0.07801870256662369,
-0.0138959726318717,
-0.10242152959108353,
-0.02180691435933113,
-0.1765316277742386,
-0.05801225081086159,
0.013208895921707153,
0.036365728825330734,
-0.03661259636282921,
-0.07146081328392029,
0.0034108224790543318,
-0.07063552737236023,
-0.06955133378505707,
0.06377892941236496,
0.08706258237361908,
0.02158643677830696,
-0.01763617992401123,
0.006151922512799501,
0.0894644483923912,
0.029385343194007874,
-0.17825697362422943,
-0.031832512468099594,
0.031183987855911255,
-0.10061965882778168,
-0.018316514790058136,
-0.016830390319228172,
0.031524658203125,
0.06924877315759659,
0.13340437412261963,
-0.05749473720788956,
0.09569322317838669,
0.006652785465121269,
0.011150787584483624,
-0.07847059518098831,
0.186466246843338,
-0.05329484865069389,
-0.09226445853710175,
-0.03481895104050636,
0.1211172565817833,
0.0038636329118162394,
-0.010275132954120636,
-0.08295715600252151,
0.03567672520875931,
0.08410127460956573,
0.05321910232305527,
-0.00148605031426996,
0.023192401975393295,
-0.05325605720281601,
-0.032851673662662506,
0.04696366935968399,
-0.10947602242231369,
0.040352996438741684,
0.05864020809531212,
-0.07556957006454468,
-0.013898082077503204,
-0.008911211043596268,
0.017931701615452766,
-0.03150572255253792,
0.05343247205018997,
-0.06574006378650665,
-0.01711682230234146,
-0.07345575094223022,
-0.0900777280330658,
0.04269707202911377,
-0.0437743179500103,
-0.00677886139601469,
-0.05492309853434563,
-0.10249977558851242,
-0.07355692237615585,
0.04130930453538895,
-0.06647051870822906,
-0.07806982100009918,
-0.06602923572063446,
-0.09303595870733261,
0.05966620147228241,
-0.012299914844334126,
0.1548236459493637,
-0.059826772660017014,
0.06251300871372223,
0.026307910680770874,
0.03590238839387894,
0.07047687470912933,
0.06044662371277809,
-0.02443920262157917,
0.06442674994468689,
-0.15943126380443573,
0.0966477170586586,
-0.11906670778989792,
0.06684674322605133,
-0.1360425353050232,
-0.09727837145328522,
0.0005900777177885175,
-0.007416951935738325,
0.07566476613283157,
0.12600623071193695,
-0.15365947782993317,
-0.09142273664474487,
0.15919317305088043,
-0.059341441839933395,
-0.10387847572565079,
0.11051905900239944,
-0.007195781916379929,
-0.040287140756845474,
0.02513735368847847,
0.15989002585411072,
0.16127482056617737,
-0.0975758358836174,
-0.0015272980090230703,
-0.025380939245224,
0.13770875334739685,
0.04883105680346489,
0.09222062677145004,
-0.05783946067094803,
0.06132805719971657,
0.007795330602675676,
-0.03300941735506058,
0.033290889114141464,
-0.07146593928337097,
-0.08449260145425797,
-0.01420231070369482,
-0.0809674859046936,
-0.02229340374469757,
0.05860858038067818,
0.022965528070926666,
-0.07153640687465668,
-0.13024544715881348,
-0.014381648041307926,
0.11915502697229385,
-0.09842123091220856,
0.011631373316049576,
-0.07543817162513733,
0.047953952103853226,
0.010899335145950317,
-0.002736369613558054,
-0.14601226150989532,
-0.04941461980342865,
0.0479474700987339,
-0.049582865089178085,
0.0194474458694458,
-0.010273398831486702,
0.07025159150362015,
0.04350519925355911,
-0.04213297739624977,
-0.053161703050136566,
-0.028881488367915154,
-0.01139350701123476,
-0.046270180493593216,
-0.2406480461359024,
-0.07475478202104568,
-0.021285392343997955,
0.22933855652809143,
-0.2011171579360962,
0.01460307091474533,
0.07302111387252808,
0.13668695092201233,
0.03352474793791771,
-0.0460963249206543,
0.01719357632100582,
0.05521578714251518,
-0.023475700989365578,
-0.08721121400594711,
0.02109731175005436,
0.008292355574667454,
-0.09966953098773956,
0.01208378653973341,
-0.1629454791545868,
0.0528436042368412,
0.08778314292430878,
0.0179360993206501,
-0.06360584497451782,
-0.06265391409397125,
-0.055749304592609406,
-0.051039330661296844,
-0.0022922735661268234,
-0.02074921689927578,
0.13550950586795807,
0.012666561640799046,
0.1110735833644867,
-0.0739070475101471,
-0.05534582585096359,
0.04238700121641159,
0.023326335474848747,
0.000751475163269788,
0.12360893934965134,
0.03643936291337013,
-0.04585019499063492,
0.10150288790464401,
0.029666109010577202,
-0.06223684921860695,
0.1582353264093399,
-0.08840566128492355,
-0.08110594004392624,
-0.03445904701948166,
0.019987372681498528,
0.04118037596344948,
0.119597889482975,
-0.16764461994171143,
-0.022017261013388634,
0.030296795070171356,
0.009742164984345436,
0.01815706118941307,
-0.16683006286621094,
0.015929387882351875,
0.027512043714523315,
-0.08533183485269547,
0.011550451628863811,
-0.012361162342131138,
-0.010540804825723171,
0.0712428092956543,
0.012494316324591637,
-0.05989418923854828,
-0.03716035932302475,
-0.042530253529548645,
-0.0842442512512207,
0.1849018782377243,
-0.11076974123716354,
-0.14539813995361328,
-0.13861188292503357,
-0.0005656261346302927,
-0.02270815335214138,
0.0018306357087567449,
0.023899836465716362,
-0.08434164524078369,
-0.03965277597308159,
-0.06691189110279083,
0.015175977721810341,
-0.032089706510305405,
0.02322288602590561,
0.01511821337044239,
0.015552685596048832,
0.08280815929174423,
-0.10669218748807907,
0.010338659398257732,
0.0036952223163098097,
-0.040539130568504333,
-0.0034935770090669394,
0.006739750970155001,
0.09170443564653397,
0.17169885337352753,
0.06485366821289062,
0.034867387264966965,
-0.03562092408537865,
0.17882436513900757,
-0.15020257234573364,
0.024059249088168144,
0.10398975014686584,
0.012077494524419308,
0.03988323733210564,
0.14608602225780487,
0.04548793286085129,
-0.06111147999763489,
-0.002532507758587599,
0.03649907559156418,
-0.02040005475282669,
-0.2171994000673294,
-0.03670595958828926,
-0.07713524997234344,
0.005637588445097208,
0.10956552624702454,
0.03241077437996864,
0.052892591804265976,
0.0337025411427021,
-0.04217059910297394,
-0.0021112633403390646,
0.056209854781627655,
0.05552014708518982,
0.10310361534357071,
0.051588740199804306,
0.11890069395303726,
-0.012288213707506657,
-0.02325858734548092,
0.03533044084906578,
0.015702934935688972,
0.21394823491573334,
0.0049704499542713165,
0.22304224967956543,
0.035106636583805084,
0.12467068433761597,
-0.007806859444826841,
0.04618329554796219,
0.02314024232327938,
-0.003308576066046953,
0.028679266571998596,
-0.06611838191747665,
-0.005336153786629438,
0.044299960136413574,
0.10364199429750443,
0.010593377985060215,
-0.08876065164804459,
0.025930073112249374,
0.02930358238518238,
0.3517150282859802,
0.0922221913933754,
-0.30141764879226685,
-0.08803185820579529,
0.02618550695478916,
-0.05220205709338188,
-0.04113680124282837,
0.020199593156576157,
0.11623081564903259,
-0.08371243625879288,
0.10422194004058838,
-0.04873332753777504,
0.09083466976881027,
-0.057186562567949295,
0.0032322960905730724,
0.09332774579524994,
0.0920005515217781,
0.009257993660867214,
0.05591832846403122,
-0.2164098173379898,
0.2593427896499634,
0.0028102672658860683,
0.06121450290083885,
-0.04456891492009163,
0.06270217150449753,
0.035259488970041275,
0.018236735835671425,
0.07249081879854202,
-0.012637286446988583,
-0.11340415477752686,
-0.16436423361301422,
-0.09239983558654785,
0.013699959963560104,
0.12162531167268753,
-0.07621430605649948,
0.11993157118558884,
-0.03353071212768555,
-0.05304213985800743,
0.04067554697394371,
-0.08066519349813461,
-0.09841907024383545,
-0.11314846575260162,
0.049538835883140564,
0.02475268580019474,
0.06300856173038483,
-0.09028062224388123,
-0.07843741029500961,
-0.04515747353434563,
0.1420123130083084,
-0.13567984104156494,
-0.03271472081542015,
-0.14028126001358032,
0.07729114592075348,
0.16832295060157776,
-0.06298039853572845,
0.03984064981341362,
0.0006172377616167068,
0.1555550992488861,
0.031505994498729706,
-0.012362040579319,
0.0948299691081047,
-0.08417273312807083,
-0.22979594767093658,
-0.04500002786517143,
0.17705921828746796,
0.020358193665742874,
0.06744788587093353,
-0.016445215791463852,
0.04077303782105446,
-0.010443872772157192,
-0.07815591990947723,
0.06871594488620758,
0.03287281468510628,
0.006693792063742876,
0.03192959353327751,
-0.013932774774730206,
0.014587046578526497,
-0.06737100332975388,
-0.03179922699928284,
0.08269178867340088,
0.25067245960235596,
-0.09160688519477844,
0.03591739013791084,
0.03261983394622803,
-0.06003319472074509,
-0.1551147848367691,
0.005533577408641577,
0.1346411257982254,
0.033650822937488556,
-0.04488856717944145,
-0.2143993377685547,
0.04556627571582794,
0.07959973067045212,
-0.03170741721987724,
0.09822921454906464,
-0.2829616665840149,
-0.14364422857761383,
0.09296319633722305,
0.01735169067978859,
-0.06946375966072083,
-0.1786300539970398,
-0.07586589455604553,
-0.029165202751755714,
-0.0705404281616211,
0.04653770476579666,
-0.05727563053369522,
0.10945306718349457,
0.007309831213206053,
0.022912880405783653,
0.019073309376835823,
-0.039304282516241074,
0.1614658683538437,
0.014712408185005188,
0.038678910583257675,
-0.007872864603996277,
0.01056809350848198,
0.05465856194496155,
-0.07537715137004852,
0.010833453387022018,
-0.07408521324396133,
0.0249454565346241,
-0.13989168405532837,
-0.019276101142168045,
-0.08811675757169724,
0.03279297426342964,
-0.04455585032701492,
-0.018243715167045593,
-0.028329741209745407,
0.04266886040568352,
0.062240470200777054,
0.012302367947995663,
0.14518272876739502,
-0.06162027642130852,
0.1423981934785843,
0.18166707456111908,
0.1193956732749939,
0.000027552488973014988,
-0.07546248286962509,
0.001274952432140708,
-0.010607102885842323,
0.03199295327067375,
-0.11950566619634628,
0.03882303833961487,
0.13238848745822906,
0.04011565074324608,
0.13815350830554962,
0.04441692307591438,
-0.08271055668592453,
0.008703834377229214,
0.06670110672712326,
-0.07452527433633804,
-0.14219534397125244,
-0.030697030946612358,
0.02473767101764679,
-0.14830486476421356,
-0.0036271526478230953,
0.13471569120883942,
-0.02720423974096775,
0.0029188855551183224,
0.01605675369501114,
0.043286651372909546,
-0.032973527908325195,
0.2324695885181427,
0.04095907509326935,
0.10408605635166168,
-0.10365857183933258,
0.07586342096328735,
0.05899318680167198,
-0.10714347660541534,
0.027723446488380432,
0.11203784495592117,
-0.05969472602009773,
-0.023975586518645287,
0.013407393358647823,
0.07525008916854858,
0.004497086629271507,
-0.06196688488125801,
-0.12142132967710495,
-0.1555837094783783,
0.08533924072980881,
0.10119066387414932,
0.022504249587655067,
0.03453439474105835,
-0.009858333505690098,
0.04030589386820793,
-0.09187289327383041,
0.1143825575709343,
0.06497321277856827,
0.07048188149929047,
-0.12447299063205719,
0.11217308789491653,
-0.0023848700802773237,
0.0010375130223110318,
0.0012427116744220257,
-0.008482025004923344,
-0.09800254553556442,
0.005402314011007547,
-0.1396234929561615,
0.002863864181563258,
-0.04108966886997223,
-0.0049734218046069145,
0.016400931403040886,
-0.05647934600710869,
-0.06561413407325745,
0.02709485962986946,
-0.10986492037773132,
-0.0489288792014122,
-0.04644589498639107,
0.06802059710025787,
-0.10284250229597092,
-0.011828468181192875,
0.030512729659676552,
-0.12458803504705429,
0.09670117497444153,
0.037617750465869904,
0.022399425506591797,
0.016655661165714264,
-0.049887362867593765,
-0.00505285756662488,
0.03090463951230049,
0.014774591661989689,
0.04118673875927925,
-0.15653648972511292,
-0.012570997700095177,
-0.03597651422023773,
0.0088356863707304,
-0.013241040520370007,
0.012429969385266304,
-0.10280393064022064,
0.010965107940137386,
-0.024117596447467804,
-0.05336882174015045,
-0.05678883194923401,
0.07330510765314102,
0.08012767881155014,
0.014521070756018162,
0.13388387858867645,
-0.07172228395938873,
0.04920421913266182,
-0.24053065478801727,
0.010049022734165192,
-0.009575958363711834,
-0.07622788101434708,
-0.053933266550302505,
-0.021033674478530884,
0.10989735275506973,
-0.06173449382185936,
0.08621817082166672,
-0.03366202861070633,
0.0830930545926094,
0.03158009797334671,
-0.0954107865691185,
0.02668815106153488,
0.07283979654312134,
0.15568406879901886,
0.06700979173183441,
-0.023710619658231735,
0.08875338733196259,
-0.01648753136396408,
0.06298989802598953,
0.10605327785015106,
0.15621712803840637,
0.11885061860084534,
0.04693407937884331,
0.0818023830652237,
0.11339589953422546,
-0.1423138678073883,
-0.11664893478155136,
0.13247859477996826,
-0.07227752357721329,
0.1491357982158661,
-0.038938336074352264,
0.15900766849517822,
0.10721952468156815,
-0.2110278308391571,
0.0545085147023201,
-0.061467647552490234,
-0.09118782728910446,
-0.10861223191022873,
-0.06509757786989212,
-0.08845210075378418,
-0.18570485711097717,
0.0008300304762087762,
-0.12135805934667587,
0.060526806861162186,
0.05684851482510567,
0.04138195514678955,
0.035081733018159866,
0.10790537297725677,
0.041170403361320496,
-0.0020849339198321104,
0.10802080482244492,
0.019353464245796204,
-0.0029640146531164646,
-0.04864395409822464,
-0.1024249941110611,
0.04445338994264603,
-0.021468510851264,
0.05724674090743065,
-0.04363619536161423,
-0.0704517737030983,
0.06155738979578018,
0.011132240295410156,
-0.10251572728157043,
0.032509464770555496,
-0.017047448083758354,
0.03556136414408684,
0.057920996099710464,
0.02066658064723015,
-0.0039062004070729017,
-0.005035900045186281,
0.20793789625167847,
-0.09972178190946579,
-0.06635209172964096,
-0.14698132872581482,
0.198362797498703,
-0.018240569159388542,
-0.005232516676187515,
0.028361311182379723,
-0.06928498297929764,
-0.023875458166003227,
0.1570502668619156,
0.17622238397598267,
-0.019858576357364655,
-0.017367227002978325,
0.021233001723885536,
-0.010739355348050594,
-0.03866930305957794,
0.07288255542516708,
0.10075940191745758,
0.055614639073610306,
-0.040457528084516525,
-0.019613130018115044,
0.015736844390630722,
-0.06572429090738297,
-0.03702489659190178,
0.08108346164226532,
0.01631229557096958,
-0.0011426149867475033,
-0.013224371708929539,
0.10658024251461029,
-0.061021462082862854,
-0.13578885793685913,
0.04990838095545769,
-0.19633540511131287,
-0.18538807332515717,
-0.0397353433072567,
0.029983041808009148,
0.03386848419904709,
0.05663871765136719,
-0.005051047075539827,
-0.049193523824214935,
0.12057634443044662,
-0.0037175586912781,
-0.05032714083790779,
-0.11724892258644104,
0.08878134191036224,
-0.14757046103477478,
0.17824871838092804,
-0.05863461643457413,
0.020472681149840355,
0.12070515006780624,
0.03321068361401558,
-0.08721528947353363,
0.007150130346417427,
0.09807626157999039,
-0.1321704387664795,
0.03658950328826904,
0.18592378497123718,
-0.04758140444755554,
0.1440466195344925,
0.04988456144928932,
-0.10060948878526688,
0.0011452613398432732,
-0.06436234712600708,
-0.035624049603939056,
-0.061669204384088516,
-0.024293283000588417,
-0.0494375005364418,
0.13889440894126892,
0.2035183310508728,
-0.06886390596628189,
-0.02191285789012909,
-0.03304708003997803,
0.015737781301140785,
0.042651690542697906,
0.126248300075531,
-0.05144456773996353,
-0.26630592346191406,
0.020159173756837845,
-0.008074783720076084,
0.01839042454957962,
-0.1922827810049057,
-0.07418207824230194,
0.026975398883223534,
-0.041508082300424576,
-0.039328236132860184,
0.13204897940158844,
0.05200789123773575,
0.03564373403787613,
-0.05669204145669937,
-0.079336017370224,
-0.026743605732917786,
0.17840693891048431,
-0.18320584297180176,
-0.060019638389348984
] |
null | null |
transformers
|
---
datasets:
- squad
widget:
- text: "Which name is also used to describe the Amazon rainforest in English?"
context: "The Amazon rainforest (Portuguese: Floresta Amazônica or Amazônia; Spanish: Selva Amazónica, Amazonía or usually Amazonia; French: Forêt amazonienne; Dutch: Amazoneregenwoud), also known in English as Amazonia or the Amazon Jungle, is a moist broadleaf forest that covers most of the Amazon basin of South America. This basin encompasses 7,000,000 square kilometres (2,700,000 sq mi), of which 5,500,000 square kilometres (2,100,000 sq mi) are covered by the rainforest. This region includes territory belonging to nine nations. The majority of the forest is contained within Brazil, with 60% of the rainforest, followed by Peru with 13%, Colombia with 10%, and with minor amounts in Venezuela, Ecuador, Bolivia, Guyana, Suriname and French Guiana. States or departments in four nations contain \"Amazonas\" in their names. The Amazon represents over half of the planet's remaining rainforests, and comprises the largest and most biodiverse tract of tropical rainforest in the world, with an estimated 390 billion individual trees divided into 16,000 species."
- text: "How many square kilometers of rainforest is covered in the basin?"
context: "The Amazon rainforest (Portuguese: Floresta Amazônica or Amazônia; Spanish: Selva Amazónica, Amazonía or usually Amazonia; French: Forêt amazonienne; Dutch: Amazoneregenwoud), also known in English as Amazonia or the Amazon Jungle, is a moist broadleaf forest that covers most of the Amazon basin of South America. This basin encompasses 7,000,000 square kilometres (2,700,000 sq mi), of which 5,500,000 square kilometres (2,100,000 sq mi) are covered by the rainforest. This region includes territory belonging to nine nations. The majority of the forest is contained within Brazil, with 60% of the rainforest, followed by Peru with 13%, Colombia with 10%, and with minor amounts in Venezuela, Ecuador, Bolivia, Guyana, Suriname and French Guiana. States or departments in four nations contain \"Amazonas\" in their names. The Amazon represents over half of the planet's remaining rainforests, and comprises the largest and most biodiverse tract of tropical rainforest in the world, with an estimated 390 billion individual trees divided into 16,000 species."
|
{}
|
question-answering
|
aravind-812/roberta-train-json
|
[
"transformers",
"pytorch",
"jax",
"roberta",
"question-answering",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #roberta #question-answering #endpoints_compatible #region-us
|
---
datasets:
- squad
widget:
- text: "Which name is also used to describe the Amazon rainforest in English?"
context: "The Amazon rainforest (Portuguese: Floresta Amazônica or Amazônia; Spanish: Selva Amazónica, Amazonía or usually Amazonia; French: Forêt amazonienne; Dutch: Amazoneregenwoud), also known in English as Amazonia or the Amazon Jungle, is a moist broadleaf forest that covers most of the Amazon basin of South America. This basin encompasses 7,000,000 square kilometres (2,700,000 sq mi), of which 5,500,000 square kilometres (2,100,000 sq mi) are covered by the rainforest. This region includes territory belonging to nine nations. The majority of the forest is contained within Brazil, with 60% of the rainforest, followed by Peru with 13%, Colombia with 10%, and with minor amounts in Venezuela, Ecuador, Bolivia, Guyana, Suriname and French Guiana. States or departments in four nations contain \"Amazonas\" in their names. The Amazon represents over half of the planet's remaining rainforests, and comprises the largest and most biodiverse tract of tropical rainforest in the world, with an estimated 390 billion individual trees divided into 16,000 species."
- text: "How many square kilometers of rainforest is covered in the basin?"
context: "The Amazon rainforest (Portuguese: Floresta Amazônica or Amazônia; Spanish: Selva Amazónica, Amazonía or usually Amazonia; French: Forêt amazonienne; Dutch: Amazoneregenwoud), also known in English as Amazonia or the Amazon Jungle, is a moist broadleaf forest that covers most of the Amazon basin of South America. This basin encompasses 7,000,000 square kilometres (2,700,000 sq mi), of which 5,500,000 square kilometres (2,100,000 sq mi) are covered by the rainforest. This region includes territory belonging to nine nations. The majority of the forest is contained within Brazil, with 60% of the rainforest, followed by Peru with 13%, Colombia with 10%, and with minor amounts in Venezuela, Ecuador, Bolivia, Guyana, Suriname and French Guiana. States or departments in four nations contain \"Amazonas\" in their names. The Amazon represents over half of the planet's remaining rainforests, and comprises the largest and most biodiverse tract of tropical rainforest in the world, with an estimated 390 billion individual trees divided into 16,000 species."
|
[] |
[
"TAGS\n#transformers #pytorch #jax #roberta #question-answering #endpoints_compatible #region-us \n"
] |
[
33
] |
[
"passage: TAGS\n#transformers #pytorch #jax #roberta #question-answering #endpoints_compatible #region-us \n"
] |
[
-0.020764736458659172,
0.01961023174226284,
-0.010791548527777195,
-0.007276155054569244,
0.08957497030496597,
0.028539909049868584,
0.022440236061811447,
0.10318277031183243,
0.09738119691610336,
0.004166853614151478,
0.17602722346782684,
0.23654963076114655,
-0.05737054720520973,
-0.05537321791052818,
-0.08501704782247543,
-0.20344415307044983,
0.034290581941604614,
0.07460986077785492,
-0.03984908014535904,
0.1323043256998062,
0.055912308394908905,
-0.12090059369802475,
0.046618569642305374,
-0.02472815290093422,
-0.09361791610717773,
0.052862007170915604,
0.008213457651436329,
-0.05916151404380798,
0.12564034759998322,
0.0118644330650568,
0.15327170491218567,
0.041701678186655045,
-0.11686354875564575,
-0.17408387362957,
0.05060852691531181,
-0.02321775257587433,
-0.05716175585985184,
0.03500483185052872,
0.033279843628406525,
-0.11941194534301758,
0.019148414954543114,
0.0718604326248169,
0.008592159487307072,
0.06325527280569077,
-0.2020985335111618,
-0.1822252720594406,
-0.06838498264551163,
0.017078561708331108,
0.06750181317329407,
0.08316085487604141,
-0.022363808006048203,
0.18111053109169006,
-0.16704052686691284,
0.08658511191606522,
0.1814393252134323,
-0.3028716742992401,
-0.025909388437867165,
0.07791043817996979,
0.11505873501300812,
0.06271645426750183,
-0.016823846846818924,
0.07187379151582718,
0.03863447904586792,
0.015232869423925877,
-0.08752299845218658,
-0.12215379625558853,
-0.07630965113639832,
0.09036257117986679,
-0.07969453930854797,
-0.09280030429363251,
0.23503027856349945,
0.009875967167317867,
0.03854978457093239,
0.03856739401817322,
-0.08365172147750854,
0.0015113684348762035,
0.02180657535791397,
-0.03856024518609047,
-0.029915183782577515,
0.04971330612897873,
0.005404586438089609,
-0.03855803236365318,
-0.10425177216529846,
0.019803037866950035,
-0.21496669948101044,
0.22759385406970978,
0.02714472822844982,
0.09062093496322632,
-0.2393241822719574,
0.049738600850105286,
-0.020745526999235153,
-0.07918302714824677,
0.002756396308541298,
-0.08263976871967316,
-0.0020329179242253304,
-0.0014644080074504018,
-0.051215820014476776,
0.042542099952697754,
0.07131442427635193,
0.19896280765533447,
0.02261265367269516,
0.0231400765478611,
0.024748921394348145,
0.09659721702337265,
0.05768512934446335,
0.09228293597698212,
-0.02296619676053524,
0.0041454085148870945,
-0.005117831286042929,
-0.11919496208429337,
-0.01803724654018879,
-0.03676409274339676,
-0.08711154013872147,
-0.05444178730249405,
-0.00905407965183258,
0.1454620659351349,
0.10929393768310547,
-0.005028031300753355,
-0.07021140307188034,
-0.002077713143080473,
-0.04557899013161659,
-0.02006034553050995,
-0.027720896527171135,
-0.03040854074060917,
0.029260318726301193,
0.1721429079771042,
-0.07436476647853851,
0.04167190194129944,
-0.013513856567442417,
0.0642976462841034,
-0.07927416265010834,
-0.0343143604695797,
-0.023783626034855843,
0.006130971480160952,
0.0699862390756607,
-0.11909998953342438,
0.09235494583845139,
-0.1261984407901764,
-0.06717577576637268,
0.008239922113716602,
0.045363254845142365,
-0.01409397553652525,
-0.003664319636300206,
0.004755525384098291,
-0.034542858600616455,
-0.03956288844347,
-0.048255302011966705,
-0.023143142461776733,
-0.05566376447677612,
0.11430896073579788,
0.039912305772304535,
0.05772212892770767,
-0.04441726952791214,
0.0453539676964283,
-0.08430330455303192,
0.05102309212088585,
-0.07702958583831787,
-0.03412111848592758,
0.005205295514315367,
0.16750752925872803,
-0.01350058801472187,
-0.05380373075604439,
-0.10949444025754929,
0.03090004250407219,
-0.06055610254406929,
0.1871393769979477,
-0.013674082234501839,
-0.06598338484764099,
0.21298567950725555,
-0.03949926048517227,
-0.22605489194393158,
0.07734446227550507,
0.006676843855530024,
0.015816308557987213,
0.07064644992351532,
0.17505620419979095,
-0.027114136144518852,
-0.08377992361783981,
0.06703782081604004,
0.09074418246746063,
-0.1477142870426178,
-0.07583872973918915,
0.05775616690516472,
-0.0588579997420311,
-0.09247890114784241,
0.026952123269438744,
0.03474992886185646,
0.05114384740591049,
-0.0922393649816513,
-0.04386601969599724,
-0.015290370211005211,
-0.0017682829638943076,
0.03719266504049301,
0.07490076869726181,
0.045991264283657074,
-0.0828562080860138,
0.0007644754950888455,
-0.08639374375343323,
-0.009163327515125275,
0.06766004115343094,
0.020808512344956398,
-0.07403245568275452,
0.13537628948688507,
-0.12246942520141602,
0.008217085152864456,
-0.20696979761123657,
-0.09736818820238113,
-0.035763926804065704,
0.11071699857711792,
-0.008318805135786533,
0.21949028968811035,
0.08633019775152206,
-0.15427689254283905,
-0.022295106202363968,
-0.024589180946350098,
0.09850239008665085,
0.004486300051212311,
-0.010239955969154835,
-0.07137835770845413,
0.052954770624637604,
-0.07909728586673737,
-0.07162167876958847,
-0.02751961536705494,
-0.02152078226208687,
0.07488460093736649,
0.10982264578342438,
-0.0007445816299878061,
0.05139731988310814,
-0.0015630907146260142,
0.037622082978487015,
-0.00520124938338995,
0.03214215487241745,
0.09658495336771011,
-0.0455324724316597,
-0.06236215680837631,
0.11369359493255615,
-0.06781546026468277,
0.3021377921104431,
0.1724594682455063,
-0.2954479455947876,
0.004905192647129297,
-0.0005469819298014045,
-0.043335847556591034,
0.020345650613307953,
0.0799308493733406,
0.027209656313061714,
0.08154506236314774,
0.02279004268348217,
0.0787823274731636,
-0.043303679674863815,
-0.06296732276678085,
-0.011655271984636784,
-0.0616653673350811,
-0.035520847886800766,
0.11295445263385773,
0.07091022282838821,
-0.19860105216503143,
0.1492786705493927,
0.2943289279937744,
0.042056310921907425,
0.08543748408555984,
-0.05395924672484398,
-0.0360594280064106,
-0.0064434572122991085,
0.023977819830179214,
-0.044834788888692856,
0.0575965940952301,
-0.21770402789115906,
-0.004221691284328699,
0.07257774472236633,
-0.008594638668000698,
0.06384819000959396,
-0.12650756537914276,
-0.10615872591733932,
0.006197641137987375,
0.03748665750026703,
-0.0760338082909584,
0.12897208333015442,
0.03682927042245865,
0.09472763538360596,
0.03703536093235016,
-0.01896151341497898,
0.10102079808712006,
-0.009145846590399742,
-0.04793278127908707,
0.148935467004776,
-0.08764054626226425,
-0.23469123244285583,
-0.03274642303586006,
-0.09092171490192413,
0.028841177001595497,
-0.003613400273025036,
0.06772568076848984,
-0.09622729569673538,
-0.008362352848052979,
0.1120765432715416,
0.039967864751815796,
-0.19442859292030334,
0.00464168656617403,
-0.046538837254047394,
0.07214164733886719,
-0.0955968052148819,
-0.040477655827999115,
-0.06156347319483757,
-0.06728000193834305,
-0.06014839932322502,
0.11035246402025223,
-0.10330045968294144,
0.10606490075588226,
0.09626391530036926,
0.05433058738708496,
0.06315874308347702,
-0.011728383600711823,
0.2175217717885971,
-0.13635745644569397,
-0.04200872406363487,
0.18945087492465973,
-0.025169914588332176,
0.09554766863584518,
0.15341344475746155,
0.024773484095931053,
-0.07697732746601105,
-0.006401990540325642,
-0.03367644548416138,
-0.07195653766393661,
-0.25300222635269165,
-0.0336080826818943,
-0.1317436844110489,
0.052306801080703735,
-0.007495288737118244,
0.026541193947196007,
0.13510015606880188,
0.08897539228200912,
0.01854069158434868,
-0.13094620406627655,
-0.044999394565820694,
0.06767261028289795,
0.2330804318189621,
-0.05231485143303871,
0.08436768501996994,
-0.060196198523044586,
-0.121989905834198,
0.046509042382240295,
0.081588514149189,
0.1365339159965515,
0.13456954061985016,
-0.0030554274562746286,
0.09407562762498856,
0.17613235116004944,
0.13829034566879272,
0.06411093473434448,
-0.0030833384953439236,
-0.0750943273305893,
-0.025805950164794922,
0.010541236959397793,
-0.05198174715042114,
0.026340758427977562,
0.16037002205848694,
-0.10352399200201035,
-0.026685398072004318,
-0.1894531399011612,
0.07315010577440262,
0.0672268345952034,
0.05891250818967819,
-0.0843222439289093,
0.03853391855955124,
0.08976762741804123,
-0.018895326182246208,
-0.0499417670071125,
0.07946397364139557,
-0.018033701926469803,
-0.16003616154193878,
0.018559863790869713,
-0.04379931464791298,
0.1372683346271515,
0.031225033104419708,
0.07289943844079971,
-0.09392006695270538,
-0.1320929080247879,
0.054336220026016235,
0.09680154174566269,
-0.2714919149875641,
0.2979058027267456,
0.010481705889105797,
-0.0869201123714447,
-0.06080344691872597,
-0.04859201982617378,
-0.036281000822782516,
0.12169506400823593,
0.1805344820022583,
0.011572160758078098,
-0.056642211973667145,
-0.03901322931051254,
0.08413422852754593,
0.05441849306225777,
0.1305391937494278,
-0.04382450506091118,
-0.01264440268278122,
-0.01837112568318844,
0.02011200040578842,
-0.038663625717163086,
0.04889755696058273,
0.06084476783871651,
-0.1108851209282875,
0.04799014702439308,
-0.04187225177884102,
0.011454029940068722,
-0.004483891651034355,
0.005168136674910784,
-0.03681991621851921,
0.11803090572357178,
-0.034253235906362534,
-0.047268096357584,
-0.08282068371772766,
-0.1322200447320938,
0.13897787034511566,
-0.09425705671310425,
0.02235536277294159,
-0.09061553329229355,
-0.07869812846183777,
-0.06757305562496185,
-0.12765507400035858,
0.13097386062145233,
-0.10279438644647598,
-0.005648984108120203,
-0.04643775522708893,
0.18785695731639862,
-0.0746481642127037,
0.017388321459293365,
-0.0009128750534728169,
0.053779128938913345,
-0.16620074212551117,
-0.08486082404851913,
0.033808186650276184,
-0.10076703131198883,
0.07161131501197815,
0.058197058737277985,
0.009906512685120106,
0.09580618143081665,
-0.008122728206217289,
-0.004090220667421818,
0.19751504063606262,
0.24379189312458038,
-0.04098149761557579,
0.09425117075443268,
0.1574190855026245,
-0.007514298893511295,
-0.24383117258548737,
-0.06348484754562378,
-0.15920332074165344,
-0.06579046696424484,
0.003590884618461132,
-0.08624562621116638,
0.09944099932909012,
0.02431620843708515,
-0.04622792452573776,
0.09255626797676086,
-0.25259488821029663,
-0.023172345012426376,
0.13142921030521393,
-0.0076072909869253635,
0.5028795003890991,
-0.13192684948444366,
-0.0813179686665535,
0.030327770859003067,
-0.2478410303592682,
0.0762200579047203,
0.02224806323647499,
0.04143235459923744,
-0.039488255977630615,
0.08109046518802643,
0.03704897314310074,
-0.0746385008096695,
0.15071919560432434,
-0.017582552507519722,
0.012000137008726597,
-0.07021203637123108,
-0.13829073309898376,
0.0692637488245964,
0.017659444361925125,
-0.03714917227625847,
0.03515888378024101,
0.04152056947350502,
-0.16394294798374176,
-0.015243095345795155,
-0.13685184717178345,
0.055781520903110504,
0.014486709609627724,
-0.04303085803985596,
-0.03771449625492096,
-0.019120845943689346,
-0.01983542926609516,
-0.004262825008481741,
0.2776806354522705,
-0.0607423335313797,
0.19549520313739777,
-0.04786691814661026,
0.13222838938236237,
-0.1607922911643982,
-0.09484338015317917,
-0.05675845593214035,
-0.052377693355083466,
0.06999379396438599,
-0.06665780395269394,
0.03691823035478592,
0.16513662040233612,
-0.0075075579807162285,
0.016338057816028595,
0.0943584218621254,
0.011650747619569302,
-0.010236154310405254,
0.10675828158855438,
-0.2010224610567093,
-0.14111113548278809,
-0.01988077163696289,
-0.023796236142516136,
0.05910836160182953,
0.07490081340074539,
0.05228854715824127,
0.10444092005491257,
-0.04573218896985054,
0.012057805433869362,
-0.0341825857758522,
-0.0615251325070858,
-0.02220766805112362,
0.1034143716096878,
0.03120085969567299,
-0.10668440908193588,
0.05766346678137779,
-0.011280323378741741,
-0.24287669360637665,
-0.043256986886262894,
0.10909031331539154,
-0.08981604129076004,
-0.10206161439418793,
-0.09676352143287659,
0.0491475947201252,
-0.166581392288208,
-0.005604840815067291,
-0.0497390478849411,
-0.09960873425006866,
0.05245533585548401,
0.2267705500125885,
0.08582396060228348,
0.06080421060323715,
0.017108121886849403,
-0.03960411995649338,
0.027672825381159782,
-0.02523025870323181,
-0.02932330034673214,
-0.016977185383439064,
-0.07933106273412704,
-0.10022778064012527,
-0.010537462309002876,
0.20104962587356567,
-0.07411596924066544,
-0.07946030795574188,
-0.1659814864397049,
0.09785813093185425,
-0.12149212509393692,
-0.12046615779399872,
-0.12499627470970154,
-0.08136206865310669,
0.002371361944824457,
-0.12973026931285858,
-0.046438220888376236,
-0.03855244815349579,
-0.12921418249607086,
0.07226940989494324,
0.06428318470716476,
0.012280351482331753,
-0.06626791507005692,
-0.054242875427007675,
0.16733774542808533,
-0.026504816487431526,
0.09010526537895203,
0.14838741719722748,
-0.08243750035762787,
0.09731560945510864,
-0.10693705081939697,
-0.1315368413925171,
0.05170515924692154,
0.011642059311270714,
0.05943248048424721,
0.034163448959589005,
-0.0029062472749501467,
0.05955899879336357,
0.048485733568668365,
0.08402123302221298,
-0.08799217641353607,
-0.11261888593435287,
0.018388524651527405,
0.05015111714601517,
-0.19485872983932495,
-0.029843639582395554,
-0.09907194972038269,
0.1203363686800003,
0.0004649812763091177,
0.09007208049297333,
0.030026638880372047,
0.12774862349033356,
-0.05711160972714424,
0.021872514858841896,
-0.013658031821250916,
-0.15394547581672668,
0.019358651712536812,
-0.06486314535140991,
0.008180299773812294,
-0.025288954377174377,
0.24045340716838837,
-0.09197000414133072,
0.06217047944664955,
0.05162407085299492,
0.058479372411966324,
0.04751307889819145,
0.008445111103355885,
0.18269987404346466,
0.09222535043954849,
-0.06454727798700333,
-0.07570473849773407,
0.08938800543546677,
-0.05100926011800766,
-0.074056476354599,
0.12448746711015701,
0.15716853737831116,
0.12027066200971603,
0.0478706955909729,
0.0016460064798593521,
0.05697200819849968,
0.011483953334391117,
-0.21792130172252655,
0.03530324622988701,
-0.016048423945903778,
0.015018912963569164,
0.08485496044158936,
0.19854196906089783,
-0.011169644072651863,
0.0479406863451004,
-0.055824872106313705,
-0.0007418889435939491,
-0.14642541110515594,
-0.06397774815559387,
-0.059497009962797165,
-0.07250252366065979,
0.04904801398515701,
-0.09644749015569687,
-0.008551058359444141,
0.11571028083562851,
0.05495648831129074,
-0.06069279462099075,
0.08475339412689209,
0.09721504896879196,
-0.06794722378253937,
0.01432267390191555,
0.006370881572365761,
0.08349635452032089,
0.03649425506591797,
0.04918183758854866,
-0.12418810278177261,
-0.0940849706530571,
-0.06072999909520149,
0.03249484300613403,
-0.13237567245960236,
-0.046343397349119186,
-0.15584324300289154,
-0.10703645646572113,
-0.051301080733537674,
0.10950037091970444,
-0.02912118285894394,
0.13950477540493011,
-0.02222617343068123,
0.037925977259874344,
0.018658539280295372,
0.24650532007217407,
-0.05998064950108528,
-0.018176263198256493,
-0.0350148007273674,
0.16993817687034607,
0.023281896486878395,
0.08454954624176025,
-0.0033347811549901962,
0.027580000460147858,
-0.03287581354379654,
0.31801170110702515,
0.22296082973480225,
-0.06439770758152008,
0.042398542165756226,
0.05710572004318237,
0.04185853898525238,
0.08801392465829849,
-0.003839813405647874,
0.09943368285894394,
0.27158209681510925,
-0.11271157115697861,
-0.009221675805747509,
-0.026267530396580696,
0.021876543760299683,
-0.04127422720193863,
0.03549040108919144,
0.06168491765856743,
-0.05824999511241913,
-0.05333608388900757,
0.12458038330078125,
-0.15442508459091187,
0.11596741527318954,
0.05861674249172211,
-0.21143366396427155,
-0.0693858414888382,
-0.03635558485984802,
0.15511281788349152,
-0.0057012238539755344,
0.11665001511573792,
-0.035942405462265015,
-0.12757587432861328,
0.012561650946736336,
0.057440198957920074,
-0.23118849098682404,
-0.10241464525461197,
0.15595342218875885,
0.043267905712127686,
-0.009196724742650986,
-0.02350461669266224,
0.026662252843379974,
0.09518370777368546,
0.011068393476307392,
-0.050810229033231735,
0.005546593107283115,
0.0895325317978859,
-0.08883298188447952,
-0.10321615636348724,
-0.005332686938345432,
0.06529538333415985,
-0.10149944573640823,
0.09617508947849274,
-0.1316470503807068,
0.044280149042606354,
-0.04111987352371216,
-0.0041619762778282166,
-0.04155515134334564,
0.08947503566741943,
-0.06439544260501862,
0.025832384824752808,
0.06653797626495361,
-0.010252204723656178,
-0.0259067602455616,
-0.031160809099674225,
-0.004354055039584637,
0.06913192570209503,
-0.05023598670959473,
-0.14811062812805176,
0.019722528755664825,
-0.07064778357744217,
0.09203974157571793,
-0.037189822643995285,
-0.10869020968675613,
-0.031760405749082565,
-0.02257516048848629,
0.03180760145187378,
-0.08358912914991379,
0.016418717801570892,
0.051939912140369415,
0.04715248569846153,
0.027382029220461845,
-0.07958980649709702,
0.041527025401592255,
0.055377695709466934,
-0.12906725704669952,
-0.047657232731580734
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# results
This model is a fine-tuned version of [google/pegasus-large](https://huggingface.co/google/pegasus-large) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 1
### Training results
### Framework versions
- Transformers 4.12.5
- Pytorch 1.10.0
- Datasets 1.15.1
- Tokenizers 0.10.3
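Below is a minimal usage sketch, not part of the original card: it assumes the checkpoint `arawat/pegasus-custom-xsum` named in this record loads with the standard Pegasus classes, and the generation settings are illustrative defaults rather than values taken from training.

```python
from transformers import PegasusTokenizer, PegasusForConditionalGeneration

model_name = "arawat/pegasus-custom-xsum"  # repo id from this record (assumed loadable)
tokenizer = PegasusTokenizer.from_pretrained(model_name)
model = PegasusForConditionalGeneration.from_pretrained(model_name)

text = "PG&E stated it scheduled the blackouts in response to forecasts for high winds."
batch = tokenizer(text, truncation=True, padding="longest", return_tensors="pt")
summary_ids = model.generate(**batch, max_length=64, num_beams=4)  # illustrative settings
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```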
|
{"tags": ["generated_from_trainer"], "model-index": [{"name": "results", "results": []}]}
|
text2text-generation
|
arawat/pegasus-custom-xsum
|
[
"transformers",
"pytorch",
"pegasus",
"text2text-generation",
"generated_from_trainer",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #pegasus #text2text-generation #generated_from_trainer #autotrain_compatible #endpoints_compatible #region-us
|
# results
This model is a fine-tuned version of google/pegasus-large on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 1
### Training results
### Framework versions
- Transformers 4.12.5
- Pytorch 1.10.0
- Datasets 1.15.1
- Tokenizers 0.10.3
|
[
"# results\n\nThis model is a fine-tuned version of google/pegasus-large on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 1\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 1",
"### Training results",
"### Framework versions\n\n- Transformers 4.12.5\n- Pytorch 1.10.0\n- Datasets 1.15.1\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #pegasus #text2text-generation #generated_from_trainer #autotrain_compatible #endpoints_compatible #region-us \n",
"# results\n\nThis model is a fine-tuned version of google/pegasus-large on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 1\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 1",
"### Training results",
"### Framework versions\n\n- Transformers 4.12.5\n- Pytorch 1.10.0\n- Datasets 1.15.1\n- Tokenizers 0.10.3"
] |
[
47,
28,
6,
12,
8,
3,
105,
4,
30
] |
[
"passage: TAGS\n#transformers #pytorch #pegasus #text2text-generation #generated_from_trainer #autotrain_compatible #endpoints_compatible #region-us \n# results\n\nThis model is a fine-tuned version of google/pegasus-large on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 1\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 1### Training results### Framework versions\n\n- Transformers 4.12.5\n- Pytorch 1.10.0\n- Datasets 1.15.1\n- Tokenizers 0.10.3"
] |
[
-0.10768193006515503,
0.10255664587020874,
-0.001930752070620656,
0.0864313393831253,
0.1583857238292694,
0.01620582677423954,
0.12821058928966522,
0.12650321424007416,
-0.10428367555141449,
0.054952021688222885,
0.09751760214567184,
0.011659989133477211,
0.04254121705889702,
0.1406572312116623,
-0.003898421535268426,
-0.3113534152507782,
-0.006257752422243357,
0.01339851226657629,
-0.11642816662788391,
0.11259278655052185,
0.11199280619621277,
-0.09192828088998795,
0.08916052430868149,
0.029750585556030273,
-0.16321703791618347,
0.028656240552663803,
-0.023130692541599274,
-0.08103398233652115,
0.1027936339378357,
0.021650612354278564,
0.08219865709543228,
0.0020547606982290745,
0.10630148649215698,
-0.16901269555091858,
0.0006406510947272182,
0.07092252373695374,
0.04655945673584938,
0.10540500283241272,
0.04278159886598587,
-0.010865308344364166,
0.11330869793891907,
-0.11457665264606476,
0.0710529163479805,
0.04660222306847572,
-0.0829506367444992,
-0.21875794231891632,
-0.06305696815252304,
0.09999138116836548,
0.06272734701633453,
0.10614607483148575,
-0.001016463153064251,
0.1227886751294136,
-0.09683595597743988,
0.060183778405189514,
0.24575956165790558,
-0.2713291347026825,
-0.0749230831861496,
0.021035412326455116,
0.055244896560907364,
0.034365277737379074,
-0.08971337229013443,
0.0019148389110341668,
0.05756683647632599,
0.040342312306165695,
0.08019163459539413,
-0.0026330819819122553,
-0.03956300765275955,
-0.007780811283737421,
-0.11258137971162796,
-0.03650103136897087,
0.20908965170383453,
0.04084989055991173,
-0.05046641081571579,
-0.10628657788038254,
-0.06576108932495117,
-0.11739522218704224,
-0.010244423523545265,
-0.026867209002375603,
0.0484599694609642,
-0.05577315762639046,
-0.0783357173204422,
-0.08084411174058914,
-0.07881226390600204,
-0.07609211653470993,
-0.018313027918338776,
0.16262663900852203,
0.051355160772800446,
0.027836494147777557,
-0.07186290621757507,
0.12606287002563477,
0.012729519046843052,
-0.12134963274002075,
-0.014002223499119282,
-0.013567239977419376,
-0.016679957509040833,
-0.0338699109852314,
-0.051394350826740265,
-0.0019998890347778797,
0.0025343564338982105,
0.15774737298488617,
-0.10362613946199417,
0.049474120140075684,
0.06392376869916916,
0.01203595194965601,
-0.018222693353891373,
0.1847992092370987,
-0.06621848791837692,
-0.006873709615319967,
0.03211016207933426,
0.09261495620012283,
0.03295315057039261,
-0.0025051573757082224,
-0.10332918912172318,
-0.06058761849999428,
0.07615868747234344,
0.06473162025213242,
-0.028338147327303886,
0.03528263047337532,
-0.027989305555820465,
-0.03835310786962509,
0.011608299799263477,
-0.11072777956724167,
0.03716824948787689,
-0.013628322631120682,
-0.1222195252776146,
0.014779066666960716,
0.03519199788570404,
-0.00853228010237217,
-0.059410423040390015,
0.07661157846450806,
-0.10159220546483994,
0.006903405766934156,
-0.09241281449794769,
-0.04808993265032768,
0.014220130629837513,
-0.04976407811045647,
-0.005427492316812277,
-0.08528536558151245,
-0.18347185850143433,
-0.024016546085476875,
0.03410513326525688,
-0.060169097036123276,
-0.07983522117137909,
-0.03346826881170273,
-0.07024522870779037,
0.013295422308146954,
-0.013372411020100117,
0.11990036070346832,
-0.034496650099754333,
0.09276160597801208,
0.0745050311088562,
0.04584883898496628,
-0.00152268644887954,
0.0458952970802784,
-0.10656362026929855,
0.04131416976451874,
-0.14668723940849304,
0.0886356458067894,
-0.06511256843805313,
0.0348939523100853,
-0.10309828072786331,
-0.12821361422538757,
-0.014682266861200333,
-0.007832138799130917,
0.0800083801150322,
0.16441504657268524,
-0.10078324377536774,
-0.07136652618646622,
0.14520755410194397,
-0.058251846581697464,
-0.11309591680765152,
0.10390445590019226,
-0.007997723296284676,
0.04013199359178543,
0.06474771350622177,
0.1245187297463417,
0.08904381096363068,
-0.112340047955513,
-0.020920945331454277,
0.02535826712846756,
0.05993059277534485,
-0.02880391664803028,
0.07026470452547073,
-0.012192579917609692,
0.013192607089877129,
0.04400346055626869,
-0.038167279213666916,
0.016760781407356262,
-0.0993192195892334,
-0.07738497853279114,
-0.06326580792665482,
-0.08807425945997238,
0.06697876751422882,
0.03794688731431961,
0.0804474875330925,
-0.07933872938156128,
-0.11979488283395767,
0.0884142816066742,
0.10992468893527985,
-0.04343753680586815,
0.02044825628399849,
-0.0657050684094429,
0.06775182485580444,
-0.02956916019320488,
-0.0007601258694194257,
-0.1926836222410202,
-0.12833160161972046,
0.03537340089678764,
-0.07367373257875443,
0.04704844951629639,
-0.0436718612909317,
0.07204944640398026,
0.05802705138921738,
-0.060211699455976486,
-0.02254030480980873,
-0.10901685804128647,
-0.04351404309272766,
-0.10700920969247818,
-0.14026090502738953,
-0.055138345807790756,
0.0005026075523346663,
0.1492888629436493,
-0.18511857092380524,
0.03028446063399315,
-0.015537089668214321,
0.14804598689079285,
0.02175845578312874,
-0.03422018885612488,
0.0031915251165628433,
0.040866605937480927,
-0.025698846206068993,
-0.08646427094936371,
0.046623386442661285,
-0.005475509911775589,
-0.08465296030044556,
-0.043965477496385574,
-0.10341958701610565,
0.10506845265626907,
0.09794290363788605,
0.017518918961286545,
-0.0840364322066307,
-0.018007351085543633,
-0.09234410524368286,
-0.029545828700065613,
-0.0847737118601799,
-0.00033740344224497676,
0.18573619425296783,
0.01453301589936018,
0.15016096830368042,
-0.07568494230508804,
-0.06533067673444748,
0.021564295515418053,
-0.019465727731585503,
-0.017814092338085175,
0.0809941440820694,
0.07728184759616852,
-0.012206023558974266,
0.09519685804843903,
0.07913386076688766,
-0.08703596889972687,
0.14565519988536835,
-0.05009159818291664,
-0.08450841903686523,
-0.009948036633431911,
-0.008534584194421768,
-0.01496520172804594,
0.10793954879045486,
-0.11051616072654724,
-0.019007433205842972,
0.030427923426032066,
0.019227033481001854,
0.05854175612330437,
-0.18350635468959808,
-0.002911067334935069,
0.013324316591024399,
-0.045079831033945084,
-0.06564103811979294,
0.008929003961384296,
0.016178950667381287,
0.089363232254982,
0.04062474146485329,
0.013975678943097591,
0.036583397537469864,
0.016827644780278206,
-0.06528622657060623,
0.1895867884159088,
-0.08571799099445343,
-0.17311780154705048,
-0.1728411465883255,
0.10020900517702103,
-0.09581045806407928,
-0.007972311228513718,
0.021400772035121918,
-0.09221796691417694,
-0.030253540724515915,
-0.05722040683031082,
0.056369565427303314,
-0.047873493283987045,
0.014107784256339073,
0.02088594250380993,
0.034593213349580765,
0.09436029940843582,
-0.12467347830533981,
0.006622293498367071,
-0.018461434170603752,
-0.09301166236400604,
-0.022433454170823097,
0.02545136958360672,
0.09559246897697449,
0.10119739919900894,
-0.02594594843685627,
0.03403206914663315,
-0.031200869008898735,
0.2407054752111435,
-0.0901850089430809,
-0.01880030333995819,
0.16618521511554718,
0.07105689495801926,
0.04692012816667557,
0.06944482773542404,
0.03938879817724228,
-0.0946590006351471,
0.034398071467876434,
0.05929409712553024,
-0.022362112998962402,
-0.2513471245765686,
-0.03057706728577614,
-0.02684718370437622,
-0.028993669897317886,
0.10381368547677994,
0.04597907140851021,
0.0067717586643993855,
0.0730423629283905,
-0.033031295984983444,
0.10589881241321564,
-0.03831097111105919,
0.0764613151550293,
0.15108288824558258,
0.04321853816509247,
0.086808942258358,
-0.038498520851135254,
-0.058945395052433014,
0.06779233366250992,
0.0003156001039315015,
0.23340027034282684,
-0.04611644893884659,
0.12393734604120255,
0.0012714348267763853,
0.1327466368675232,
-0.013005265966057777,
0.0616852268576622,
0.03399215266108513,
-0.004861669614911079,
0.014760688878595829,
-0.05268120765686035,
-0.043330926448106766,
0.021487534046173096,
0.016143133863806725,
0.07307911664247513,
-0.13457174599170685,
0.03206280246376991,
0.025345757603645325,
0.30747655034065247,
0.018108807504177094,
-0.32364019751548767,
-0.12528516352176666,
0.005391266196966171,
-0.033334825187921524,
-0.07143374532461166,
0.03725501149892807,
0.10921213775873184,
-0.14298205077648163,
0.05302971974015236,
-0.061720870435237885,
0.0998283326625824,
-0.05153386667370796,
0.007225592155009508,
0.05289454758167267,
0.14521296322345734,
-0.0005480245454236865,
0.10150349885225296,
-0.22509051859378815,
0.21433380246162415,
0.011214208789169788,
0.10138773918151855,
-0.059542443603277206,
0.03119436651468277,
0.013540449552237988,
0.09696993976831436,
0.08119035512208939,
-0.008955310098826885,
0.01811923459172249,
-0.15272405743598938,
-0.09415112435817719,
0.037033166736364365,
0.12891492247581482,
-0.06384080648422241,
0.0913667231798172,
-0.058119021356105804,
0.0013771853409707546,
0.023333683609962463,
-0.0553673580288887,
-0.14317648112773895,
-0.10284154862165451,
0.0052930633537471294,
0.023058271035552025,
0.010686744935810566,
-0.08081148564815521,
-0.12101852148771286,
0.014175062999129295,
0.13297927379608154,
-0.019336149096488953,
-0.07373686879873276,
-0.14008206129074097,
0.07957886159420013,
0.14578227698802948,
-0.07371098548173904,
0.013190137222409248,
-0.008817988447844982,
0.14017353951931,
0.034709371626377106,
-0.08185581862926483,
0.08120183646678925,
-0.06868777424097061,
-0.20172901451587677,
-0.03410860151052475,
0.161660298705101,
0.010957377962768078,
0.04165991395711899,
-0.027567006647586823,
0.007006854750216007,
-0.02922080270946026,
-0.09715204685926437,
0.02546720579266548,
0.05065407231450081,
0.04010697454214096,
0.059658851474523544,
-0.05979130417108536,
0.08263727277517319,
-0.018512871116399765,
0.0005822870880365372,
0.11169270426034927,
0.20503681898117065,
-0.08080019801855087,
0.04192642122507095,
0.07231063395738602,
-0.05650263652205467,
-0.1856130212545395,
0.05025826394557953,
0.12158067524433136,
0.027146922424435616,
-0.004083997569978237,
-0.22075025737285614,
0.10689794272184372,
0.11584353446960449,
-0.03394215181469917,
0.1114419624209404,
-0.3159691095352173,
-0.13808122277259827,
0.05825888738036156,
0.08602608740329742,
0.07294861972332001,
-0.14052143692970276,
-0.046721842139959335,
-0.05359378084540367,
-0.10782184451818466,
0.10353930294513702,
-0.07783333957195282,
0.12572404742240906,
-0.030493514612317085,
0.13037899136543274,
0.023464644327759743,
-0.0341334193944931,
0.130390927195549,
0.007226831279695034,
0.06703303754329681,
-0.040189944207668304,
0.018642934039235115,
0.10268168151378632,
-0.06440477818250656,
0.08942143619060516,
-0.022758247330784798,
0.07673559337854385,
-0.11890773475170135,
-0.02816096693277359,
-0.0856502503156662,
0.09406422823667526,
-0.04695117846131325,
-0.046158939599990845,
-0.04141087085008621,
0.020054684951901436,
0.008901425637304783,
-0.02530566416680813,
0.118484728038311,
0.0419490672647953,
0.0947776585817337,
0.08545947074890137,
0.11155527085065842,
-0.04328470304608345,
-0.11613469570875168,
0.010465849190950394,
-0.019160157069563866,
0.08175656199455261,
-0.1596432328224182,
0.009847075678408146,
0.12461753934621811,
0.048077210783958435,
0.09937809407711029,
0.06167525425553322,
-0.05756712704896927,
0.008842435665428638,
0.0566733255982399,
-0.12784533202648163,
-0.1726851910352707,
-0.06870774924755096,
-0.07330968976020813,
-0.14790210127830505,
0.07051882892847061,
0.09657055884599686,
-0.09142961353063583,
-0.02460664138197899,
-0.015034735202789307,
-0.007980220019817352,
-0.01791110448539257,
0.1788003295660019,
0.06578503549098969,
0.062169212847948074,
-0.09551707655191422,
0.14258529245853424,
0.06027313321828842,
-0.08445224910974503,
0.04570842161774635,
0.10738993436098099,
-0.07497638463973999,
-0.01046668365597725,
0.02900022827088833,
0.09210033714771271,
-0.06655899435281754,
-0.0474674366414547,
-0.13308565318584442,
-0.1006784588098526,
0.04883969947695732,
0.06279561668634415,
0.061194900423288345,
-0.0006974851130507886,
-0.03577270731329918,
0.049088165163993835,
-0.1577606052160263,
0.0935346856713295,
0.044059671461582184,
0.07710296660661697,
-0.17141680419445038,
0.12165125459432602,
0.013983710668981075,
0.06555331498384476,
-0.02664170414209366,
0.002414283575490117,
-0.0977896898984909,
-0.00919429026544094,
-0.13933458924293518,
-0.04222056642174721,
-0.01571749895811081,
0.005394052714109421,
-0.020378921180963516,
-0.047747522592544556,
-0.05736205354332924,
0.05414651334285736,
-0.08559346199035645,
-0.05923475697636604,
0.009794323705136776,
0.04236621409654617,
-0.1341705322265625,
0.022285036742687225,
0.033454425632953644,
-0.09390177577733994,
0.09895610064268112,
0.08290209621191025,
0.018019357696175575,
0.03627622872591019,
-0.06137821823358536,
-0.015787139534950256,
0.018021538853645325,
0.008229113183915615,
0.06516044586896896,
-0.07509081810712814,
0.0072676860727369785,
-0.020765801891684532,
0.08275312185287476,
0.010188856162130833,
0.03639591857790947,
-0.14483524858951569,
-0.0416717529296875,
-0.024496369063854218,
-0.04721282422542572,
-0.06858904659748077,
0.04295991733670235,
0.08007492870092392,
0.03334406018257141,
0.16963428258895874,
-0.07962437719106674,
0.027799753472208977,
-0.19792179763317108,
-0.010517927818000317,
-0.018471036106348038,
-0.0498841293156147,
-0.08426462113857269,
-0.03217034786939621,
0.08343126624822617,
-0.04523083195090294,
0.12744395434856415,
0.021825937554240227,
0.13719114661216736,
0.03706014156341553,
-0.020543154329061508,
-0.021846061572432518,
0.012893029488623142,
0.16288401186466217,
0.06544607132673264,
-0.007789145689457655,
0.08883363008499146,
0.02595275640487671,
0.08139794319868088,
0.08047538995742798,
0.18578274548053741,
0.062117449939250946,
-0.06453625112771988,
0.10824713855981827,
0.0735476016998291,
-0.06575874239206314,
-0.1755184829235077,
0.046551186591386795,
-0.013844129629433155,
0.108962282538414,
-0.059197526425123215,
0.12327553331851959,
0.10635662078857422,
-0.12800225615501404,
0.0460333526134491,
-0.0595669224858284,
-0.09640178084373474,
-0.1337069272994995,
-0.04507746174931526,
-0.07152346521615982,
-0.1708763837814331,
0.025795692577958107,
-0.1228930801153183,
0.004332482814788818,
0.06777943670749664,
-0.0019154173787683249,
-0.02688046172261238,
0.17916740477085114,
-0.0142283383756876,
-0.014250402338802814,
0.0847112238407135,
0.009701384231448174,
-0.007027924060821533,
-0.04322374239563942,
-0.07666226476430893,
0.01712741144001484,
-0.013810094445943832,
0.06943636387586594,
-0.0357467457652092,
-0.07437626272439957,
0.01897893100976944,
-0.01631287671625614,
-0.06563659012317657,
0.009310000576078892,
0.04135962203145027,
0.035536568611860275,
-0.0026483091060072184,
0.020181981846690178,
-0.024945637211203575,
-0.048381321132183075,
0.3092706501483917,
-0.09974448382854462,
-0.057729557156562805,
-0.13719160854816437,
0.2536170780658722,
0.03429959714412689,
-0.02316032163798809,
0.04922895506024361,
-0.11259625107049942,
-0.05504090338945389,
0.18715330958366394,
0.16765418648719788,
-0.0452093742787838,
-0.039187077432870865,
-0.012431868351995945,
-0.017400827258825302,
-0.03993970528244972,
0.12772488594055176,
0.11739187687635422,
0.07952164858579636,
-0.05561472475528717,
0.004885469563305378,
-0.001559445750899613,
-0.028887979686260223,
-0.09268981963396072,
0.12203005701303482,
0.0007698424160480499,
0.0007510788273066282,
-0.05686928704380989,
0.06298898905515671,
-0.05183308571577072,
-0.12813562154769897,
0.03431938216090202,
-0.17291031777858734,
-0.15924234688282013,
-0.03935715928673744,
0.033847589045763016,
0.006331704091280699,
0.09583676606416702,
0.0028755604289472103,
-0.010471010580658913,
0.13286279141902924,
-0.020913593471050262,
-0.10445601493120193,
-0.1032860055565834,
0.094036765396595,
-0.07956025749444962,
0.2286798506975174,
-0.011616408824920654,
0.049571022391319275,
0.10002729296684265,
0.004414504859596491,
-0.14334863424301147,
0.012648923322558403,
0.056451935321092606,
-0.04941144585609436,
0.025618771091103554,
0.1453191637992859,
-0.021126562729477882,
0.03740962967276573,
0.013303464278578758,
-0.10776004940271378,
-0.026465393602848053,
-0.045680806040763855,
-0.0015065676998347044,
-0.10314226895570755,
-0.008201539516448975,
-0.06838175654411316,
0.1301484853029251,
0.2217681109905243,
-0.04982033371925354,
-0.018457839265465736,
-0.08609393984079361,
0.03584863990545273,
0.06603137403726578,
0.06170469895005226,
-0.01943378336727619,
-0.21238279342651367,
-0.013863402418792248,
0.0541209802031517,
-0.006757398601621389,
-0.24890072643756866,
-0.05655276030302048,
0.023950058966875076,
-0.04097266495227814,
-0.04817638173699379,
0.08881955593824387,
0.06893295794725418,
0.04170047119259834,
-0.0368507020175457,
-0.07571765035390854,
-0.058810632675886154,
0.1556384116411209,
-0.17293938994407654,
-0.06456080079078674
] |
null | null |
transformers
|
# HourAI bot based on DialoGPT
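A minimal chat sketch follows, assuming the checkpoint behaves like a standard DialoGPT model: the repo id `archmagos/HourAI` comes from this record, while the multi-turn loop is the generic DialoGPT recipe rather than anything documented by the author.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("archmagos/HourAI")
model = AutoModelForCausalLM.from_pretrained("archmagos/HourAI")

chat_history_ids = None
for step in range(3):
    # Encode the user turn and append the end-of-sequence token.
    new_ids = tokenizer.encode(input(">> User: ") + tokenizer.eos_token, return_tensors="pt")
    bot_input_ids = new_ids if chat_history_ids is None else torch.cat([chat_history_ids, new_ids], dim=-1)
    # Generate a reply conditioned on the running dialogue history.
    chat_history_ids = model.generate(bot_input_ids, max_length=1000, pad_token_id=tokenizer.eos_token_id)
    reply = tokenizer.decode(chat_history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True)
    print("HourAI:", reply)
```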
|
{"tags": ["conversational"]}
|
text-generation
|
archmagos/HourAI
|
[
"transformers",
"pytorch",
"safetensors",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #safetensors #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# HourAI bot based on DialoGPT
|
[] |
[
"TAGS\n#transformers #pytorch #safetensors #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
56
] |
[
"passage: TAGS\n#transformers #pytorch #safetensors #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
-0.028994612395763397,
0.03717942163348198,
-0.007205516565591097,
0.004361928440630436,
0.14950066804885864,
-0.013941794633865356,
0.11986828595399857,
0.1182805597782135,
-0.03048190474510193,
-0.010174466297030449,
0.14877668023109436,
0.1851094663143158,
-0.013957205228507519,
0.09307502955198288,
-0.09557180106639862,
-0.2245006561279297,
0.08883897960186005,
0.027438592165708542,
0.035971157252788544,
0.1274057924747467,
0.08356550335884094,
-0.06823034584522247,
0.06669498234987259,
-0.03636123612523079,
-0.12532266974449158,
0.0028578240890055895,
0.05958588048815727,
-0.1302003562450409,
0.12235744297504425,
0.03003060445189476,
0.10374733805656433,
0.043939314782619476,
-0.07261740416288376,
-0.15499833226203918,
0.03386516869068146,
0.03463131934404373,
-0.06291534006595612,
0.046142611652612686,
0.07618991285562515,
-0.093361496925354,
0.06937185674905777,
0.06204131990671158,
-0.012534034438431263,
0.05895763263106346,
-0.15713968873023987,
-0.04276531934738159,
-0.014816577546298504,
0.013026992790400982,
0.07869639992713928,
0.10855725407600403,
-0.03201347962021828,
0.14759007096290588,
-0.0851001888513565,
0.12150996178388596,
0.1239863857626915,
-0.32539400458335876,
-0.0010325099574401975,
0.07500352710485458,
0.042798060923814774,
0.0700213834643364,
-0.04765492305159569,
0.051571447402238846,
0.031836334615945816,
0.004194476641714573,
0.014558027498424053,
-0.05954115092754364,
-0.12596048414707184,
0.018643176183104515,
-0.09720765799283981,
-0.06140025332570076,
0.20382985472679138,
-0.0590277723968029,
0.051872193813323975,
-0.07813733071088791,
-0.12088852375745773,
-0.029457518830895424,
-0.018522758036851883,
-0.0014233867404982448,
-0.07191982120275497,
0.0697530135512352,
0.008914402686059475,
-0.048782069236040115,
-0.14170318841934204,
-0.03263361006975174,
-0.16206398606300354,
0.18776893615722656,
0.03174376115202904,
0.04489494860172272,
-0.19151368737220764,
0.09325390309095383,
0.040709663182497025,
-0.10056023299694061,
0.030602479353547096,
-0.10420837253332138,
0.0518723800778389,
0.013859134167432785,
-0.03356955200433731,
-0.07820286601781845,
0.12177305668592453,
0.10705813020467758,
-0.08302552998065948,
0.029910054057836533,
-0.0518684946000576,
0.07312697917222977,
0.02555178292095661,
0.06321559846401215,
0.0005397886270657182,
0.009785126894712448,
0.06927945464849472,
-0.08632822334766388,
0.029176287353038788,
-0.06690286844968796,
-0.12896926701068878,
-0.0064486730843782425,
0.09108948707580566,
0.11376697570085526,
0.012237145565450191,
0.12357603758573532,
-0.043104153126478195,
0.01189747080206871,
0.06860891729593277,
-0.06891996413469315,
-0.022417569532990456,
0.03244988992810249,
0.03606780990958214,
0.06666674464941025,
-0.006634891033172607,
0.02604227140545845,
-0.13079532980918884,
0.04091726988554001,
-0.0643448606133461,
-0.01847873069345951,
-0.02537871152162552,
-0.054979365319013596,
0.021013228222727776,
-0.05385638400912285,
0.0016376969870179892,
-0.17925472557544708,
-0.15229609608650208,
0.01025520171970129,
-0.02450299635529518,
-0.01414579525589943,
-0.04215415567159653,
-0.05549995228648186,
-0.038427844643592834,
0.025479240342974663,
-0.07407741248607635,
-0.0541660450398922,
-0.06597121059894562,
0.11745881289243698,
-0.03170236200094223,
0.0638091042637825,
-0.09621044993400574,
0.055343348532915115,
-0.1376529037952423,
-0.010226898826658726,
-0.07068921625614166,
0.06311444938182831,
-0.015582656487822533,
0.11676833033561707,
0.0014907608274370432,
-0.023834871128201485,
-0.0757647231221199,
0.05509824678301811,
-0.025720594450831413,
0.22712132334709167,
-0.06154930591583252,
-0.09178899973630905,
0.31081268191337585,
-0.11809492111206055,
-0.15338093042373657,
0.13023294508457184,
0.011604922823607922,
0.019147438928484917,
0.12250597029924393,
0.20399682223796844,
0.008194107562303543,
-0.010573046281933784,
0.0578976534307003,
0.09868865460157394,
-0.12109559029340744,
-0.03384355455636978,
0.0019291907083243132,
-0.027920013293623924,
-0.13384109735488892,
0.045637402683496475,
0.08118539303541183,
0.06196809560060501,
-0.05529056116938591,
-0.03310706093907356,
-0.032375507056713104,
-0.007541123311966658,
0.09128022193908691,
0.0010877466993406415,
0.08752452582120895,
-0.08452648669481277,
-0.04017898812890053,
-0.05404124781489372,
-0.005262788850814104,
-0.04702545702457428,
0.018141234293580055,
-0.06270471960306168,
0.11833393573760986,
-0.0026014288887381554,
0.06254255026578903,
-0.14835688471794128,
-0.11584768444299698,
-0.009780634194612503,
0.12789341807365417,
-0.01780831813812256,
0.04813294857740402,
0.06763497740030289,
0.00098732253536582,
-0.0011714716674759984,
-0.03454669937491417,
0.17249971628189087,
-0.012706821784377098,
-0.05934740975499153,
-0.06328863650560379,
0.09847947955131531,
-0.06798075139522552,
0.05744375288486481,
-0.07993299514055252,
0.025816364213824272,
0.06514972448348999,
0.09963318705558777,
0.008724996820092201,
0.028547506779432297,
-0.0013018660247325897,
0.004906942136585712,
-0.05867992341518402,
-0.00611899746581912,
0.10396798700094223,
0.011229002848267555,
-0.0740957260131836,
0.20066019892692566,
-0.21398082375526428,
0.24392159283161163,
0.19056081771850586,
-0.24192918837070465,
0.00981980562210083,
-0.09797576069831848,
-0.035239703953266144,
0.0204030629247427,
0.04077638313174248,
-0.043731484562158585,
0.11199628561735153,
-0.00907069444656372,
0.18191012740135193,
-0.0733426883816719,
-0.04988950863480568,
-0.0024024536833167076,
-0.06362885236740112,
-0.006515666376799345,
0.08168808370828629,
0.0420350655913353,
-0.14488229155540466,
0.19209423661231995,
0.1648620218038559,
0.03738253936171532,
0.18765953183174133,
-0.007599308155477047,
-0.0005212257383391261,
0.08825082331895828,
0.056330155581235886,
-0.0332566499710083,
-0.0683663859963417,
-0.20980527997016907,
-0.015365404076874256,
0.0680062547326088,
0.0452900193631649,
0.10409069806337357,
-0.1285317838191986,
-0.0476088747382164,
-0.02153710089623928,
-0.03222563490271568,
0.021310606971383095,
0.07516232877969742,
0.04943285137414932,
0.1463354378938675,
-0.03315149247646332,
-0.029199428856372833,
0.10530391335487366,
0.0018139067105948925,
-0.10505969077348709,
0.21006526052951813,
-0.12959107756614685,
-0.38478371500968933,
-0.12295942008495331,
-0.12465260177850723,
-0.05466725677251816,
0.05989878252148628,
0.11986995488405228,
-0.12428545951843262,
-0.026873940601944923,
-0.027530189603567123,
0.06363729387521744,
-0.03752683103084564,
0.01928102970123291,
-0.062066853046417236,
0.0373716838657856,
-0.06338763982057571,
-0.10153484344482422,
-0.04917367920279503,
-0.029283059760928154,
-0.10798602551221848,
0.16745351254940033,
-0.07413849234580994,
0.05456981807947159,
0.18967147171497345,
0.024961886927485466,
0.03510000556707382,
-0.053832922130823135,
0.1711120754480362,
-0.08332394063472748,
-0.019603034481406212,
0.19940707087516785,
-0.057942941784858704,
0.08007334917783737,
0.12594826519489288,
-0.015241099521517754,
-0.0892653539776802,
0.050643131136894226,
-0.033212289214134216,
-0.08026967942714691,
-0.23528216779232025,
-0.13507108390331268,
-0.08512603491544724,
0.11327257007360458,
0.018413247540593147,
0.06799782812595367,
0.1529865264892578,
0.07597990334033966,
-0.0492875799536705,
-0.044824402779340744,
0.0672951489686966,
0.07731044292449951,
0.1831343173980713,
-0.048216793686151505,
0.14399567246437073,
-0.04114715754985809,
-0.15805895626544952,
0.07400382310152054,
0.04073971137404442,
0.10478080809116364,
0.037516117095947266,
0.039527639746665955,
0.02364072948694229,
0.09820537269115448,
0.13571609556674957,
0.11845608055591583,
0.007558451034128666,
-0.039035581052303314,
-0.016731536015868187,
-0.03634432330727577,
-0.0670543760061264,
0.011746753938496113,
-0.010773980990052223,
-0.12803319096565247,
-0.06190115585923195,
-0.06618185341358185,
0.11947073042392731,
0.082245372235775,
0.054481759667396545,
-0.21059565246105194,
-0.009993447922170162,
0.09027878940105438,
-0.019997427240014076,
-0.12006835639476776,
0.10920628160238266,
0.04952099546790123,
-0.12109538167715073,
0.03819160908460617,
-0.04868942126631737,
0.09576742351055145,
-0.06145532801747322,
0.0944039523601532,
-0.09457848221063614,
-0.06062381714582443,
0.002222648821771145,
0.11850869655609131,
-0.2925296127796173,
0.2048967033624649,
-0.00892670638859272,
-0.025700481608510017,
-0.1052609458565712,
-0.000636346114333719,
0.002383036306127906,
0.0990152508020401,
0.13844019174575806,
-0.013388436287641525,
-0.018852530047297478,
-0.07502689957618713,
-0.04686504229903221,
0.043244775384664536,
0.12532752752304077,
-0.015467319637537003,
-0.00834830105304718,
-0.04835360869765282,
-0.0023646291811019182,
-0.018859002739191055,
-0.09980562329292297,
-0.006937874481081963,
-0.16253407299518585,
0.06168588250875473,
0.0530114583671093,
0.10794179141521454,
-0.013860982842743397,
-0.00854520220309496,
-0.11860840767621994,
0.22573307156562805,
-0.07790760695934296,
-0.10792146623134613,
-0.09785399585962296,
-0.06658799201250076,
0.020297683775424957,
-0.06326670944690704,
0.04842204228043556,
-0.07062158733606339,
0.04524936527013779,
-0.06655782461166382,
-0.18382880091667175,
0.11711519211530685,
-0.10763727873563766,
-0.07890895754098892,
-0.021923266351222992,
0.22087989747524261,
-0.049244802445173264,
-0.01755031757056713,
0.03608519211411476,
0.016882948577404022,
-0.09164203703403473,
-0.10763014107942581,
0.019699513912200928,
-0.017170218750834465,
0.06290043145418167,
0.04854103550314903,
-0.05951589718461037,
-0.09417112171649933,
-0.04073793813586235,
-0.006547942757606506,
0.32037001848220825,
0.19119004905223846,
-0.04242495819926262,
0.1651287078857422,
0.16958652436733246,
-0.05205022543668747,
-0.3448276221752167,
-0.10939286649227142,
-0.12942685186862946,
-0.06017755717039108,
-0.0542730912566185,
-0.13424967229366302,
0.07734859734773636,
0.03176725283265114,
-0.02984931506216526,
0.11289031058549881,
-0.2495477944612503,
-0.0830443948507309,
0.1708361953496933,
0.02978249453008175,
0.36560356616973877,
-0.15687896311283112,
-0.10042843967676163,
-0.04485407844185829,
-0.11313583701848984,
0.1638716757297516,
-0.10624252259731293,
0.08359898626804352,
0.0005173089448362589,
0.07685405761003494,
0.05813150852918625,
-0.05476165935397148,
0.08167819678783417,
-0.027241511270403862,
0.009587266482412815,
-0.11564526706933975,
-0.02990633435547352,
0.03377249091863632,
0.011620084755122662,
0.03751041740179062,
-0.062149230390787125,
0.04695598781108856,
-0.06142954155802727,
-0.0440547950565815,
-0.08005185425281525,
0.05958639085292816,
0.029754646122455597,
-0.06617877632379532,
0.0050559998489916325,
-0.065475232899189,
-0.00020704269991256297,
0.016612045466899872,
0.2049017995595932,
-0.06282583624124527,
0.19598184525966644,
0.12754030525684357,
0.13884544372558594,
-0.14263051748275757,
0.04750414565205574,
-0.05133480206131935,
-0.06960497051477432,
0.0812823697924614,
-0.06672589480876923,
0.06808197498321533,
0.09437867254018784,
-0.03965180739760399,
0.0742703527212143,
0.10401890426874161,
0.014220776036381721,
-0.0025037124287337065,
0.12674778699874878,
-0.2911945581436157,
-0.06027824059128761,
-0.05841008946299553,
0.0267078448086977,
0.0765918642282486,
0.12912607192993164,
0.18127864599227905,
0.01996750570833683,
-0.03108501434326172,
-0.018294807523489,
0.03991484269499779,
-0.027562161907553673,
0.06695644557476044,
0.004677923396229744,
0.029333269223570824,
-0.1450662910938263,
0.08025793731212616,
0.00001413424797647167,
-0.1466224044561386,
0.024967879056930542,
0.14475053548812866,
-0.13803942501544952,
-0.13489209115505219,
-0.04678435996174812,
0.09309624135494232,
-0.05887448415160179,
-0.04888263717293739,
-0.04676072299480438,
-0.15312238037586212,
0.055112265050411224,
0.14994119107723236,
0.052975382655858994,
0.10418030619621277,
-0.017471758648753166,
-0.017556704580783844,
-0.048256851732730865,
0.017785103991627693,
-0.0009936511050909758,
0.0022541533689945936,
-0.09053029865026474,
0.0744970515370369,
-0.01958519034087658,
0.10916736721992493,
-0.09695246815681458,
-0.06790593266487122,
-0.17211103439331055,
0.03275161236524582,
-0.09633186459541321,
-0.05772961676120758,
-0.09038986265659332,
-0.03978794440627098,
-0.010782967321574688,
-0.014750530011951923,
-0.025427671149373055,
-0.043428484350442886,
-0.09523502737283707,
0.04417814314365387,
-0.03300096467137337,
0.007264059968292713,
-0.10051228106021881,
0.00016086101823020726,
0.07025935500860214,
-0.03757777437567711,
0.16884271800518036,
0.13381977379322052,
-0.10865480452775955,
0.1102571040391922,
-0.21270614862442017,
-0.07441695034503937,
0.12528401613235474,
-0.006233865395188332,
0.015086804516613483,
0.07555849105119705,
0.017773650586605072,
0.09126552194356918,
0.00828908197581768,
0.05829174071550369,
0.03726828843355179,
-0.11326239258050919,
0.07704365253448486,
-0.01599319651722908,
-0.1294003278017044,
-0.04773728922009468,
-0.0730040892958641,
0.02420450933277607,
-0.010505115613341331,
0.12641505897045135,
-0.07865285873413086,
0.0906529352068901,
-0.06671575456857681,
0.024498600512742996,
0.026367289945483208,
-0.1884288191795349,
-0.08302449434995651,
-0.04471737518906593,
0.044647034257650375,
0.006506338249891996,
0.23851554095745087,
0.0235122200101614,
-0.010749533772468567,
0.034232404083013535,
0.05822164937853813,
0.059870053082704544,
0.025883706286549568,
0.19520971179008484,
0.08231765776872635,
-0.072471983730793,
-0.11401300132274628,
0.03745274990797043,
0.01811346225440502,
-0.060790419578552246,
0.12814179062843323,
0.036005791276693344,
-0.027478119358420372,
0.07087187469005585,
-0.01174217090010643,
0.015425745397806168,
-0.0816754475235939,
-0.14454202353954315,
-0.05953683704137802,
0.03253600746393204,
-0.02498267963528633,
0.1120469719171524,
0.1812705099582672,
-0.0013609747402369976,
0.015817290171980858,
-0.03041967749595642,
-0.05926791578531265,
-0.1756991595029831,
-0.14313766360282898,
-0.08775309473276138,
-0.12457139790058136,
0.014500590041279793,
-0.12164600193500519,
0.025044914335012436,
0.053533174097537994,
0.06415745615959167,
-0.06100451946258545,
0.14366786181926727,
0.07371728122234344,
-0.08218565583229065,
0.058793265372514725,
-0.02419847622513771,
0.07684588432312012,
0.01638939045369625,
-0.03992878273129463,
-0.07652272284030914,
0.011380000039935112,
0.005251304712146521,
0.062121931463479996,
-0.03596027195453644,
0.03569943830370903,
-0.1459643691778183,
-0.08668382465839386,
-0.03907736763358116,
0.0812806636095047,
-0.044997770339250565,
0.13921087980270386,
0.018269622698426247,
-0.027931133285164833,
0.05132003873586655,
0.2326163798570633,
-0.06087464466691017,
-0.09179307520389557,
-0.04085228219628334,
0.22599206864833832,
0.022856563329696655,
0.12190650403499603,
-0.009267003275454044,
-0.016304370015859604,
-0.05399281904101372,
0.34328603744506836,
0.2938794195652008,
-0.07042541354894638,
0.038264770060777664,
-0.023749660700559616,
0.036720097064971924,
0.09899816662073135,
0.12958809733390808,
0.11624071002006531,
0.3169666826725006,
-0.06609135121107101,
-0.021776242181658745,
-0.0068347458727657795,
-0.0052537512965500355,
-0.11657443642616272,
0.08425644785165787,
0.025790410116314888,
-0.043594297021627426,
-0.04758237302303314,
0.09097945690155029,
-0.22038395702838898,
0.11581593006849289,
-0.13331608474254608,
-0.15514566004276276,
-0.04885316640138626,
0.0155342323705554,
0.13479961454868317,
-0.0050827935338020325,
0.08605613559484482,
-0.0002486487210262567,
-0.09584292024374008,
0.03018942102789879,
0.021486632525920868,
-0.18027843534946442,
0.01188130583614111,
0.03497105464339256,
-0.05494539812207222,
0.05150381475687027,
-0.010404348373413086,
0.059441130608320236,
0.07335782796144485,
0.027757329866290092,
-0.03716479241847992,
0.0907382071018219,
0.0018811057088896632,
-0.07033063471317291,
0.024655047804117203,
0.03311711549758911,
0.023353727534413338,
-0.08417163044214249,
0.06316978484392166,
-0.15417876839637756,
0.04319261014461517,
-0.0027546107303351164,
-0.0546049028635025,
-0.015047412365674973,
0.028999146074056625,
-0.052553869783878326,
0.04826204851269722,
0.060669150203466415,
-0.015072288922965527,
0.01514197327196598,
-0.05358981341123581,
-0.006741818506270647,
-0.03718702495098114,
-0.08331039547920227,
-0.05798166245222092,
-0.1719702184200287,
-0.08313801884651184,
0.12965653836727142,
0.002070409944280982,
-0.22290737926959991,
0.026626020669937134,
-0.10338432341814041,
0.07114361226558685,
-0.19193696975708008,
0.06663238257169724,
0.08630798757076263,
0.018926870077848434,
-0.002647911896929145,
-0.011747708544135094,
0.04613884165883064,
0.10495155304670334,
-0.07489542663097382,
-0.0890762060880661
] |
null | null |
transformers
|
# Mini-Me
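## Example usage
Per the metadata in this entry (a PyTorch GPT-2 checkpoint tagged `conversational` with the `text-generation` pipeline), the model should load through the standard causal-LM classes. Below is a minimal sketch, assuming the checkpoint is hosted as `ardatasc/miniMe-version1` and that a DialoGPT-style, EOS-separated turn format applies; the conversation format and generation settings are assumptions, not details taken from the card.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumptions: the checkpoint id comes from this entry's metadata; the
# EOS-separated prompt format is a guess at the conversational setup.
model_name = "ardatasc/miniMe-version1"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Encode a single user turn and let the model generate a reply.
prompt = "Hello, how are you today?" + tokenizer.eos_token
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=100,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)
# Strip the prompt tokens so only the generated reply is printed.
reply = tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True)
print(reply)
```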
|
{"tags": ["conversational"]}
|
text-generation
|
ardatasc/miniMe-version1
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Mini-Me
|
[] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
51
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
-0.009697278961539268,
0.03208012506365776,
-0.007204889785498381,
0.004809224978089333,
0.16726240515708923,
0.014898733235895634,
0.09765533357858658,
0.13672804832458496,
-0.007841327227652073,
-0.031050153076648712,
0.14490588009357452,
0.20411323010921478,
-0.006439372431486845,
0.0661218985915184,
-0.07572533935308456,
-0.2683109939098358,
0.05759621039032936,
0.046649303287267685,
0.016515716910362244,
0.1200079694390297,
0.08573378622531891,
-0.05473608896136284,
0.08714032918214798,
-0.014583407901227474,
-0.150366872549057,
0.017733458429574966,
0.043394338339567184,
-0.12260226160287857,
0.11910516023635864,
0.05462685227394104,
0.07063519209623337,
0.014929565601050854,
-0.07541623711585999,
-0.1631229966878891,
0.03031250834465027,
0.01425902172923088,
-0.0594632662832737,
0.04757995903491974,
0.059961482882499695,
-0.10165371745824814,
0.10819483548402786,
0.09530027210712433,
-0.013078106567263603,
0.06798283755779266,
-0.16849711537361145,
-0.020869607105851173,
-0.01446688175201416,
0.009899779222905636,
0.05550243332982063,
0.09964893013238907,
-0.03413357585668564,
0.10497362166643143,
-0.09214533120393753,
0.11017382889986038,
0.10932035744190216,
-0.32057443261146545,
-0.005767723545432091,
0.09167823940515518,
0.039358653128147125,
0.07352814823389053,
-0.04467793554067612,
0.06258884817361832,
0.018015462905168533,
0.017986174672842026,
-0.014015024527907372,
-0.07283061742782593,
-0.11612214148044586,
0.04717336222529411,
-0.08668071031570435,
-0.059868961572647095,
0.2244078367948532,
-0.05464440956711769,
0.06881742179393768,
-0.05281897634267807,
-0.10522868484258652,
-0.04308144748210907,
-0.029833965003490448,
0.00475557055324316,
-0.07660607248544693,
0.08692064881324768,
0.00869679357856512,
-0.09547875821590424,
-0.1376667022705078,
-0.02496783249080181,
-0.1776352822780609,
0.16140350699424744,
0.02465328387916088,
0.05232657864689827,
-0.2027255892753601,
0.09623090922832489,
0.017906051129102707,
-0.08045592904090881,
0.022091427817940712,
-0.10046248883008957,
0.029131146147847176,
0.013760408386588097,
-0.04754498973488808,
-0.061387211084365845,
0.0843690037727356,
0.11199145019054413,
-0.01731434464454651,
0.025486016646027565,
-0.039331406354904175,
0.08100687712430954,
0.03553595021367073,
0.09077847748994827,
0.007288969587534666,
-0.028338588774204254,
0.025842782109975815,
-0.13719046115875244,
-0.003647835226729512,
-0.07116208970546722,
-0.16572439670562744,
-0.021088803187012672,
0.02994808368384838,
0.08289173990488052,
0.015449047088623047,
0.11682453751564026,
-0.03272046521306038,
-0.025152435526251793,
0.03602350503206253,
-0.047656361013650894,
-0.012649794109165668,
0.016648368909955025,
0.013163427822291851,
0.12399329990148544,
-0.0022096503525972366,
0.03235051408410072,
-0.13653022050857544,
0.031423524022102356,
-0.06793295592069626,
-0.003740974934771657,
-0.03486552834510803,
-0.040637075901031494,
0.009043924510478973,
-0.06862333416938782,
0.003486064961180091,
-0.15030112862586975,
-0.15063877403736115,
0.007587034720927477,
-0.007836631499230862,
-0.04107699543237686,
-0.06370922178030014,
-0.06952770054340363,
-0.013550350442528725,
0.04251532256603241,
-0.07093454152345657,
-0.011352915316820145,
-0.06403283774852753,
0.11004766076803207,
-0.03197755664587021,
0.07921615242958069,
-0.11953279376029968,
0.08390819281339645,
-0.11260783672332764,
-0.02386913076043129,
-0.060801517218351364,
0.09317506104707718,
-0.0006014376995153725,
0.09549830108880997,
-0.006563255097717047,
-0.017931854352355003,
-0.07981178909540176,
0.06445012241601944,
-0.042872510850429535,
0.21701598167419434,
-0.0615808479487896,
-0.11181682348251343,
0.28781595826148987,
-0.052628401666879654,
-0.1370542049407959,
0.11647392809391022,
0.008682746440172195,
0.05777018144726753,
0.10703510791063309,
0.19733482599258423,
-0.015276194550096989,
0.004040541127324104,
0.09471915662288666,
0.11263324320316315,
-0.11276852339506149,
-0.033160366117954254,
0.013019153848290443,
-0.04081077128648758,
-0.10867965966463089,
0.04689536616206169,
0.09810488671064377,
0.07090286910533905,
-0.04786505550146103,
-0.03377414867281914,
-0.01366397924721241,
0.0052589005790650845,
0.08885077387094498,
-0.007157256826758385,
0.10962837189435959,
-0.05819983780384064,
-0.03796621412038803,
-0.029282379895448685,
-0.012126247398555279,
-0.03951939567923546,
0.03137664496898651,
-0.043376367539167404,
0.10821941494941711,
-0.011204327456653118,
0.06364280730485916,
-0.16185984015464783,
-0.07691477984189987,
-0.017002692446112633,
0.1581239402294159,
0.024538565427064896,
0.09859629720449448,
0.0552486926317215,
-0.040398042649030685,
-0.0012767292791977525,
0.012792680412530899,
0.15581141412258148,
-0.022091681137681007,
-0.065607450902462,
-0.052166227251291275,
0.08642971515655518,
-0.05641226842999458,
0.04504093527793884,
-0.05937713757157326,
0.012367865070700645,
0.05064384639263153,
0.10342344641685486,
-0.00018274025933351368,
0.03323284164071083,
-0.008164864964783192,
0.002145637758076191,
-0.058205123990774155,
0.007405933458358049,
0.10799351334571838,
0.00036868182360194623,
-0.07365862280130386,
0.22074243426322937,
-0.17796069383621216,
0.1765957772731781,
0.1893044263124466,
-0.299345999956131,
0.017949223518371582,
-0.10759581625461578,
-0.04561871662735939,
0.014407722279429436,
0.05567655712366104,
-0.0454222597181797,
0.1703362911939621,
-0.009871348738670349,
0.18874616920948029,
-0.04946064203977585,
-0.04464937001466751,
-0.0200483538210392,
-0.05118836089968681,
-0.0024189651012420654,
0.07781197130680084,
0.10685696452856064,
-0.13992026448249817,
0.1964332014322281,
0.1621224284172058,
0.048237916082143784,
0.19945049285888672,
0.015346456319093704,
-0.011589210480451584,
0.0909530371427536,
0.005220826715230942,
-0.058739423751831055,
-0.07409929484128952,
-0.2594851851463318,
-0.030033592134714127,
0.07992640137672424,
0.0422382652759552,
0.1212305948138237,
-0.11349532753229141,
-0.038956157863140106,
-0.01763172075152397,
-0.023146281018853188,
0.021672505885362625,
0.0914369598031044,
0.06075398623943329,
0.13201528787612915,
-0.001710098935291171,
-0.007300339173525572,
0.10524573177099228,
0.01783694699406624,
-0.09354141354560852,
0.18308524787425995,
-0.13652534782886505,
-0.37097251415252686,
-0.13911493122577667,
-0.18057456612586975,
-0.05449081212282181,
0.05712554603815079,
0.11679314076900482,
-0.12011238187551498,
-0.018752124160528183,
0.01578843593597412,
0.10931742936372757,
-0.08449502289295197,
0.0021454424131661654,
-0.06880278885364532,
0.0321490578353405,
-0.10310184955596924,
-0.09194442629814148,
-0.055416494607925415,
-0.031392451375722885,
-0.08001253753900528,
0.1423761546611786,
-0.10777941346168518,
0.04476889222860336,
0.20262959599494934,
0.04653622955083847,
0.05625178664922714,
-0.044105201959609985,
0.19377262890338898,
-0.11264272034168243,
-0.01661740615963936,
0.19215328991413116,
-0.048360925167798996,
0.07476246356964111,
0.1232115849852562,
-0.006348740309476852,
-0.08765771239995956,
0.03011748194694519,
-0.02085109055042267,
-0.07988511025905609,
-0.23219464719295502,
-0.13938382267951965,
-0.12429051846265793,
0.09477275609970093,
0.028005298227071762,
0.056365787982940674,
0.17219258844852448,
0.06577219814062119,
-0.038416244089603424,
0.006410336587578058,
0.02959546446800232,
0.08237514644861221,
0.23417828977108002,
-0.06035616248846054,
0.1364797055721283,
-0.03420931473374367,
-0.14982740581035614,
0.08169995993375778,
0.0713929831981659,
0.10213395953178406,
0.06678459793329239,
0.0804823637008667,
0.0149586396291852,
0.06188136339187622,
0.1311223804950714,
0.08191446959972382,
0.019586285576224327,
-0.02480296604335308,
-0.03388110175728798,
-0.025523077696561813,
-0.05937909707427025,
0.040128443390131,
0.06589099019765854,
-0.16763372719287872,
-0.039227183908224106,
-0.09338314831256866,
0.09657008945941925,
0.0873042419552803,
0.06609832495450974,
-0.1842060089111328,
-0.008006223477423191,
0.08488986641168594,
-0.03854905813932419,
-0.13727426528930664,
0.09535189718008041,
0.01523482333868742,
-0.15144726634025574,
0.03139317408204079,
-0.04061909019947052,
0.12188644707202911,
-0.07804752141237259,
0.09809603542089462,
-0.08108244836330414,
-0.07448557764291763,
0.02123199962079525,
0.1261177361011505,
-0.30527687072753906,
0.20240111649036407,
-0.0024993624538183212,
-0.06486981362104416,
-0.1243603527545929,
-0.0032166161108762026,
0.002410882618278265,
0.07357452809810638,
0.10519039630889893,
-0.007196315098553896,
0.001897757756523788,
-0.06300821900367737,
-0.01829923689365387,
0.032471053302288055,
0.13080233335494995,
-0.0401318334043026,
-0.021158374845981598,
-0.050194524228572845,
-0.001653497340157628,
-0.03173094615340233,
-0.06934895366430283,
0.02002747356891632,
-0.19509181380271912,
0.08751901984214783,
0.04166261479258537,
0.09648149460554123,
0.029994789510965347,
0.004265148192644119,
-0.09651939570903778,
0.24698667228221893,
-0.07148019969463348,
-0.10072879493236542,
-0.10919588059186935,
-0.046813901513814926,
0.03569883480668068,
-0.05628936365246773,
0.04309194162487984,
-0.0788632407784462,
0.028997479006648064,
-0.06352769583463669,
-0.19235502183437347,
0.12410202622413635,
-0.09027006477117538,
-0.04412810131907463,
-0.02371402643620968,
0.2110891044139862,
-0.05598580464720726,
0.010335659608244896,
0.02930437959730625,
0.01208863127976656,
-0.11645778268575668,
-0.09678568691015244,
0.031018631532788277,
-0.007351789623498917,
0.050603240728378296,
0.041841957718133926,
-0.05915454775094986,
-0.017138581722974777,
-0.052199993282556534,
-0.022926922887563705,
0.3496883809566498,
0.14231905341148376,
-0.043836336582899094,
0.19347235560417175,
0.12347975373268127,
-0.07452994585037231,
-0.3159443140029907,
-0.1066238060593605,
-0.10937739163637161,
-0.04680149629712105,
-0.07012093812227249,
-0.2002030611038208,
0.06474938243627548,
0.00662544509395957,
-0.013415241613984108,
0.12749312818050385,
-0.2561831772327423,
-0.07571036368608475,
0.15906259417533875,
-0.017980827018618584,
0.3745945692062378,
-0.1168576180934906,
-0.10926306992769241,
-0.03950892388820648,
-0.14175476133823395,
0.16968177258968353,
-0.01989765651524067,
0.11221715062856674,
-0.009765521623194218,
0.14388824999332428,
0.05548359826207161,
-0.023479344323277473,
0.08544106781482697,
0.004999885335564613,
-0.03290518373250961,
-0.10304180532693863,
-0.05676887184381485,
0.007092386484146118,
0.02477436140179634,
0.018026655539870262,
-0.041834570467472076,
0.02227151393890381,
-0.11731979995965958,
-0.04657655209302902,
-0.08982590585947037,
0.04431166127324104,
0.03899754583835602,
-0.07325074821710587,
-0.002380647463724017,
-0.07165111601352692,
-0.012272949330508709,
0.022334342822432518,
0.20356793701648712,
-0.08029330521821976,
0.16448934376239777,
0.09239562600851059,
0.12419285625219345,
-0.14376309514045715,
-0.00019283240544609725,
-0.0762530043721199,
-0.05611240118741989,
0.07737895101308823,
-0.09433035552501678,
0.058893077075481415,
0.10901971161365509,
-0.04567738622426987,
0.08828683942556381,
0.10377411544322968,
0.008936077356338501,
0.003213887568563223,
0.10916902124881744,
-0.2667325437068939,
-0.0296600554138422,
-0.07532413303852081,
0.000883326749317348,
0.09092561900615692,
0.08562852442264557,
0.18840822577476501,
0.025361526757478714,
-0.04293036088347435,
-0.002770674182102084,
0.028597986325621605,
-0.039021048694849014,
0.051667019724845886,
0.001123449532315135,
0.01947369985282421,
-0.1530752182006836,
0.072522833943367,
0.01490565575659275,
-0.15215420722961426,
0.021316176280379295,
0.16572684049606323,
-0.11656328290700912,
-0.1283872276544571,
-0.06520111113786697,
0.08313824236392975,
-0.11755692958831787,
-0.01578943058848381,
-0.03279297426342964,
-0.13145680725574493,
0.07992171496152878,
0.12629036605358124,
0.05557859688997269,
0.0972496047616005,
-0.06061713397502899,
-0.020469192415475845,
-0.018721895292401314,
-0.014099318534135818,
-0.012384648434817791,
-0.007667020428925753,
-0.055978111922740936,
0.0590752474963665,
-0.026677248999476433,
0.1425808072090149,
-0.09221141785383224,
-0.1037059873342514,
-0.16142144799232483,
0.0374140702188015,
-0.11013076454401016,
-0.08825794607400894,
-0.08821134269237518,
-0.050188567489385605,
0.002360827289521694,
-0.019856395199894905,
-0.04037635400891304,
-0.05829505994915962,
-0.12300454825162888,
0.0338277705013752,
-0.040771447122097015,
0.024727050215005875,
-0.07512269169092178,
0.015856385231018066,
0.08507686108350754,
-0.03285100311040878,
0.15655414760112762,
0.1450488418340683,
-0.1006515845656395,
0.10741901397705078,
-0.14806775748729706,
-0.09138492494821548,
0.11116421222686768,
0.015329592861235142,
0.0449691042304039,
0.09723787009716034,
0.013362943194806576,
0.0635865181684494,
0.032776717096567154,
0.05308786407113075,
0.027619892731308937,
-0.11959987878799438,
0.06483134627342224,
-0.03626115620136261,
-0.14700546860694885,
-0.049338050186634064,
-0.05282869189977646,
0.01647452637553215,
0.013054544106125832,
0.09622690081596375,
-0.05301849544048309,
0.10698331147432327,
-0.04055701196193695,
0.0346808135509491,
0.017554637044668198,
-0.1730053424835205,
-0.03816922754049301,
-0.08538098633289337,
0.03681723028421402,
0.014741539023816586,
0.25266793370246887,
0.030072299763560295,
0.012416383251547813,
0.032671261578798294,
0.08285367488861084,
0.03899408504366875,
0.010228337720036507,
0.17482228577136993,
0.1162426546216011,
-0.06621865928173065,
-0.10445023328065872,
0.0729617029428482,
0.016332454979419708,
0.01286179106682539,
0.13617953658103943,
0.008365051820874214,
0.005795429926365614,
0.08649782836437225,
-0.016865963116288185,
0.009968153201043606,
-0.10052056610584259,
-0.13426925241947174,
-0.022176474332809448,
0.05151832848787308,
-0.04655967652797699,
0.11727844923734665,
0.1406494379043579,
-0.01806013658642769,
0.03222079202532768,
-0.021771740168333054,
-0.05699979141354561,
-0.1683429479598999,
-0.1429590880870819,
-0.06883849948644638,
-0.13416796922683716,
0.00897989235818386,
-0.11180389672517776,
0.05395037308335304,
0.06001098081469536,
0.06750501692295074,
-0.06899319589138031,
0.10220931470394135,
0.04626858979463577,
-0.11440542340278625,
0.06264589726924896,
-0.0296088308095932,
0.09430401772260666,
-0.02759445086121559,
-0.019505485892295837,
-0.09039592742919922,
0.014574515633285046,
0.011419114656746387,
0.06245238706469536,
-0.04707273095846176,
0.007463190704584122,
-0.14696238934993744,
-0.08972041308879852,
-0.0523175448179245,
0.0718572810292244,
-0.050409089773893356,
0.14282815158367157,
0.00775480642914772,
-0.0170906875282526,
0.039554283022880554,
0.22787313163280487,
-0.07476283609867096,
-0.04778539761900902,
-0.05269690603017807,
0.20717895030975342,
0.02975541539490223,
0.1171872541308403,
-0.022938819602131844,
-0.006106364540755749,
-0.0919521227478981,
0.3764844834804535,
0.30030161142349243,
-0.09031439572572708,
0.011794124729931355,
0.02137952297925949,
0.04502861574292183,
0.1316293478012085,
0.1216534823179245,
0.10318691283464432,
0.3006802201271057,
-0.07452366501092911,
-0.04653361067175865,
-0.012629742734134197,
-0.023858042433857918,
-0.09059546142816544,
0.1021224707365036,
0.04839762672781944,
-0.06382183730602264,
-0.03313443064689636,
0.0954432487487793,
-0.25862133502960205,
0.1277991235256195,
-0.12311873584985733,
-0.17578600347042084,
-0.06654827296733856,
0.009760108776390553,
0.10465722531080246,
0.015642458572983742,
0.0946015790104866,
0.007128213066607714,
-0.11252258718013763,
0.06305865943431854,
0.03397420793771744,
-0.22762253880500793,
0.0006893770187161863,
0.06642123311758041,
-0.07006710022687912,
-0.0024247700348496437,
-0.026499588042497635,
0.05657242611050606,
0.0656052976846695,
0.054629553109407425,
-0.00971333310008049,
0.03816632181406021,
0.0034184439573436975,
-0.0585215799510479,
0.016623929142951965,
0.05121519789099693,
0.02472509816288948,
-0.09763528406620026,
0.06927435845136642,
-0.1574270874261856,
0.04766253009438515,
-0.0030655991286039352,
-0.04124255105853081,
0.006064958870410919,
0.008823691867291927,
-0.06491616368293762,
0.05165379121899605,
0.07916834205389023,
-0.0016257909592241049,
-0.0062433634884655476,
-0.057178743183612823,
-0.02632102556526661,
-0.027755750343203545,
-0.09291748702526093,
-0.10495562851428986,
-0.14682936668395996,
-0.11640441417694092,
0.09368976950645447,
-0.01011267676949501,
-0.1848134547472,
0.022154374048113823,
-0.08606051653623581,
0.08319322764873505,
-0.1670055389404297,
0.08040720224380493,
0.07041648775339127,
0.013038921169936657,
-0.0031511052511632442,
-0.02002427540719509,
0.054132770746946335,
0.086809903383255,
-0.10407156497240067,
-0.07400695979595184
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# t5-small-finetuned-en-to-ro-dataset_20-input_64
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the wmt16 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4335
- Bleu: 8.6652
- Gen Len: 18.2596
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the configuration sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
- mixed_precision_training: Native AMP
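An illustrative reconstruction of this configuration with `Seq2SeqTrainingArguments` is shown below. It is a sketch only: the output directory and the mapping of `train_batch_size`/`eval_batch_size` onto the per-device arguments are assumptions, not details taken from the original training script.
```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the reported configuration; output_dir and the per-device
# batch-size mapping are assumptions, not taken from the original run.
training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small-finetuned-en-to-ro-dataset_20-input_64",
    learning_rate=2e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=1,
    fp16=True,  # mixed_precision_training: Native AMP
)
```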
### Training results
| Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|
| 0.6351 | 1.0 | 7629 | 1.4335 | 8.6652 | 18.2596 |
### Framework versions
- Transformers 4.12.5
- Pytorch 1.10.0+cu111
- Datasets 1.16.1
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["wmt16"], "metrics": ["bleu"], "model-index": [{"name": "t5-small-finetuned-en-to-ro-dataset_20-input_64", "results": [{"task": {"type": "text2text-generation", "name": "Sequence-to-sequence Language Modeling"}, "dataset": {"name": "wmt16", "type": "wmt16", "args": "ro-en"}, "metrics": [{"type": "bleu", "value": 8.6652, "name": "Bleu"}]}]}]}
|
text2text-generation
|
aretw0/t5-small-finetuned-en-to-ro-dataset_20-input_64
|
[
"transformers",
"pytorch",
"tensorboard",
"t5",
"text2text-generation",
"generated_from_trainer",
"dataset:wmt16",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #t5 #text2text-generation #generated_from_trainer #dataset-wmt16 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
t5-small-finetuned-en-to-ro-dataset\_20-input\_64
=================================================
This model is a fine-tuned version of t5-small on the wmt16 dataset.
It achieves the following results on the evaluation set:
* Loss: 1.4335
* Bleu: 8.6652
* Gen Len: 18.2596
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 1
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.12.5
* Pytorch 1.10.0+cu111
* Datasets 1.16.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.16.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #t5 #text2text-generation #generated_from_trainer #dataset-wmt16 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.16.1\n* Tokenizers 0.10.3"
] |
[
78,
113,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #t5 #text2text-generation #generated_from_trainer #dataset-wmt16 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.16.1\n* Tokenizers 0.10.3"
] |
[
-0.10917399078607559,
0.11471432447433472,
-0.0027396257501095533,
0.09895854443311691,
0.10955965518951416,
-0.0001069965583155863,
0.16124649345874786,
0.16258057951927185,
-0.10628575086593628,
0.053479187190532684,
0.1450633406639099,
0.1308496743440628,
0.051844220608472824,
0.1639290302991867,
-0.0668860450387001,
-0.24423490464687347,
0.04065472632646561,
0.056784600019454956,
-0.011823700740933418,
0.1331363320350647,
0.08877413719892502,
-0.12101919949054718,
0.09074299782514572,
0.031635139137506485,
-0.18749457597732544,
-0.020282115787267685,
0.002061503706499934,
-0.0736994668841362,
0.11226610839366913,
0.02977660670876503,
0.0911487564444542,
0.038374725729227066,
0.049766991287469864,
-0.16297008097171783,
0.010093905963003635,
0.05474463477730751,
0.008931340649724007,
0.1072186529636383,
0.05765332654118538,
-0.008725910447537899,
0.08835668116807938,
-0.06456660479307175,
0.06411945819854736,
0.019242841750383377,
-0.12956532835960388,
-0.2625778913497925,
-0.10905735194683075,
0.050317149609327316,
0.07186374813318253,
0.0861460417509079,
-0.006633952260017395,
0.18090227246284485,
-0.02800736017525196,
0.1068313792347908,
0.2432011365890503,
-0.29884496331214905,
-0.053969480097293854,
-0.014459454454481602,
0.04446632042527199,
0.07036794722080231,
-0.07821860909461975,
-0.03485395759344101,
0.025422576814889908,
0.04641352593898773,
0.14936189353466034,
-0.013171754777431488,
-0.024672530591487885,
-0.013022727333009243,
-0.1324913501739502,
-0.07593471556901932,
0.171248659491539,
0.03810932859778404,
-0.04172251746058464,
-0.0791015475988388,
-0.077671580016613,
-0.1780117154121399,
-0.045623596757650375,
0.010163996368646622,
0.03260093927383423,
-0.03613753989338875,
-0.090866319835186,
-0.011963759548962116,
-0.08345898240804672,
-0.03971271589398384,
-0.0400259830057621,
0.12204766273498535,
0.039326511323451996,
0.021013403311371803,
-0.0648568645119667,
0.08251743018627167,
-0.02115466259419918,
-0.17048144340515137,
-0.002365675289183855,
0.014742737635970116,
0.012092333287000656,
-0.0362180694937706,
-0.03975226357579231,
-0.12854725122451782,
0.0013694885419681668,
0.1508413404226303,
-0.08570656180381775,
0.07209396362304688,
-0.021052835509181023,
0.03757272660732269,
-0.0712432935833931,
0.1862633228302002,
-0.02531667985022068,
0.01199100911617279,
0.014259738847613335,
0.08283204585313797,
0.053004778921604156,
-0.03595377132296562,
-0.11852851510047913,
0.03560039773583412,
0.11873334646224976,
0.01798361726105213,
-0.026281235739588737,
0.05671486258506775,
-0.04950146749615669,
-0.033470869064331055,
0.06428056955337524,
-0.0985192358493805,
0.026263179257512093,
-0.014385460875928402,
-0.06137974187731743,
-0.016946153715252876,
0.014305219054222107,
0.010318907909095287,
-0.03901536390185356,
0.0866943746805191,
-0.09499428421258926,
0.01481225248426199,
-0.07909353822469711,
-0.13312721252441406,
0.032194834202528,
-0.08567824959754944,
0.004750316496938467,
-0.08903170377016068,
-0.15488608181476593,
-0.010155879892408848,
0.059911105781793594,
-0.04203338921070099,
-0.060760945081710815,
-0.04690389707684517,
-0.08462844043970108,
0.05228358507156372,
-0.017197903245687485,
0.09407021850347519,
-0.06994345039129257,
0.09313524514436722,
0.04367922991514206,
0.0699872076511383,
-0.03805668279528618,
0.046048570424318314,
-0.09221658110618591,
0.046713024377822876,
-0.20324106514453888,
0.05810253322124481,
-0.043135568499565125,
0.08136440813541412,
-0.10855108499526978,
-0.0986681804060936,
0.03332415595650673,
-0.0287183690816164,
0.10098884999752045,
0.10032692551612854,
-0.1712362915277481,
-0.0714666098356247,
0.19620312750339508,
-0.08605452626943588,
-0.13960541784763336,
0.13542011380195618,
-0.04866115376353264,
0.015205718576908112,
0.05108737573027611,
0.2256919890642166,
0.06009798124432564,
-0.09466048330068588,
-0.014539822936058044,
-0.043245840817689896,
0.0728016197681427,
-0.07243970781564713,
0.08524001389741898,
0.005715298466384411,
0.06257703900337219,
0.007864327169954777,
0.006257898174226284,
0.033685676753520966,
-0.08180579543113708,
-0.08341662585735321,
-0.05262620374560356,
-0.07464150339365005,
0.018977873027324677,
0.043067850172519684,
0.06436768174171448,
-0.12520940601825714,
-0.10918889939785004,
0.046763889491558075,
0.08006852120161057,
-0.08530648052692413,
0.051584817469120026,
-0.09699808061122894,
0.1122535690665245,
-0.08021575212478638,
-0.00659157196059823,
-0.18090251088142395,
-0.024141639471054077,
0.03383353725075722,
0.0010654424550011754,
0.013559523038566113,
-0.04356803372502327,
0.06793972849845886,
0.07252488285303116,
-0.04066728800535202,
-0.03788644075393677,
-0.02443225122988224,
-0.001314831548370421,
-0.11854644119739532,
-0.1943080872297287,
-0.048003263771533966,
-0.03863260895013809,
0.10233265906572342,
-0.15729813277721405,
0.03872581943869591,
0.056178245693445206,
0.10972263664007187,
0.042117565870285034,
-0.031345024704933167,
-0.003357804147526622,
0.06802110373973846,
-0.04786115139722824,
-0.06688948720693588,
0.06229365989565849,
0.026483003050088882,
-0.09326101094484329,
0.010820088908076286,
-0.158180832862854,
0.1622932404279709,
0.1346219778060913,
0.004879570100456476,
-0.0541812963783741,
-0.019613012671470642,
-0.05237652361392975,
-0.02854287065565586,
-0.028418002650141716,
0.016172830015420914,
0.1588008552789688,
0.025985952466726303,
0.15797199308872223,
-0.10222776979207993,
-0.05566391348838806,
0.0499078594148159,
-0.03026086650788784,
-0.011345218867063522,
0.11625569313764572,
0.039288707077503204,
-0.1284365952014923,
0.14239320158958435,
0.13931645452976227,
-0.042383112013339996,
0.13660340011119843,
-0.06493972986936569,
-0.07175718992948532,
-0.04036037623882294,
-0.013231067918241024,
0.03097875416278839,
0.10596074908971786,
-0.1082935631275177,
-0.015777455642819405,
0.04346192255616188,
0.031063292175531387,
0.0076416125521063805,
-0.19079066812992096,
-0.006651147734373808,
0.041639961302280426,
-0.04832824692130089,
-0.05512874573469162,
-0.00769960880279541,
0.004110692068934441,
0.09769129753112793,
0.014726448804140091,
-0.053582966327667236,
0.032937273383140564,
0.011940195225179195,
-0.07119324058294296,
0.18894925713539124,
-0.09939312934875488,
-0.16942797601222992,
-0.12950117886066437,
-0.10203353315591812,
-0.05805894732475281,
-0.005371432285755873,
0.07649076730012894,
-0.08027179539203644,
-0.04708235338330269,
-0.10186446458101273,
-0.027952581644058228,
-0.008628503419458866,
0.020371589809656143,
0.04548691213130951,
-0.020337754860520363,
0.06656253337860107,
-0.11115580052137375,
-0.02909284643828869,
-0.014477659948170185,
0.023933380842208862,
0.0658642128109932,
0.009455622173845768,
0.11307885497808456,
0.13176853954792023,
-0.016606198623776436,
0.04228051379323006,
-0.038424860686063766,
0.24758224189281464,
-0.07065451145172119,
-0.013800577260553837,
0.13955159485340118,
-0.018120357766747475,
0.09214428067207336,
0.12466620653867722,
0.05316920951008797,
-0.08791717886924744,
-0.0027621444314718246,
-0.001604850753210485,
-0.04535123333334923,
-0.21868807077407837,
-0.015303969383239746,
-0.05466558784246445,
0.001744882669299841,
0.10508858412504196,
0.021423574537038803,
0.03096095658838749,
0.05431736260652542,
0.012180328369140625,
0.05940742418169975,
-0.021445728838443756,
0.10620433837175369,
0.12473221868276596,
0.060851361602544785,
0.14198057353496552,
-0.05322448909282684,
-0.029268380254507065,
0.044931281358003616,
0.021764623001217842,
0.2137097716331482,
-0.008698340505361557,
0.21417513489723206,
0.04102635011076927,
0.15865211188793182,
0.02834424562752247,
0.07572963833808899,
-0.024131296202540398,
-0.008849830366671085,
-0.017770491540431976,
-0.04835456237196922,
-0.04220596328377724,
0.017860157415270805,
-0.05530538037419319,
0.03240496665239334,
-0.11530701071023941,
0.019909173250198364,
0.04589752107858658,
0.29963046312332153,
0.04798771068453789,
-0.3718622028827667,
-0.11109720170497894,
0.0094932671636343,
-0.04341591149568558,
-0.04128250107169151,
0.0008628775249235332,
0.09455103427171707,
-0.08159216493368149,
0.06703955680131912,
-0.08436376601457596,
0.11149004846811295,
-0.06510557234287262,
0.03551100939512253,
0.04669375345110893,
0.08505182713270187,
-0.012717513367533684,
0.05641089752316475,
-0.28100720047950745,
0.2738114297389984,
0.026290807873010635,
0.06374993175268173,
-0.07667809724807739,
0.012611317448318005,
0.006969583220779896,
0.03766192868351936,
0.05853402242064476,
-0.005911902990192175,
-0.10967565327882767,
-0.16153457760810852,
-0.1041492447257042,
0.011242196895182133,
0.07969959080219269,
0.0035379109904170036,
0.12099135667085648,
-0.016085704788565636,
-0.0014859960647299886,
0.04691782966256142,
-0.006847220938652754,
-0.033275824040174484,
-0.11474140733480453,
0.029838385060429573,
0.03969845548272133,
-0.032073017209768295,
-0.07454714924097061,
-0.10639364272356033,
-0.054155658930540085,
0.15956473350524902,
0.02727699838578701,
-0.07064219564199448,
-0.1298806518316269,
0.04128815978765488,
0.08371434360742569,
-0.09511010348796844,
0.023329902440309525,
-0.013915752992033958,
0.12196256965398788,
-0.007698898669332266,
-0.07569239288568497,
0.11352983862161636,
-0.05473337322473526,
-0.16862133145332336,
-0.05112864449620247,
0.12031980603933334,
0.008478439413011074,
0.06110162287950516,
-0.010009197518229485,
0.04102516546845436,
-0.03768019378185272,
-0.06940557807683945,
0.034696925431489944,
-0.004047933965921402,
0.09349345415830612,
-0.044153280556201935,
-0.003014791291207075,
0.028921516612172127,
-0.07135061919689178,
-0.03159482404589653,
0.1824987530708313,
0.2620856761932373,
-0.08438511192798615,
0.06301905959844589,
0.04395594820380211,
-0.05072613060474396,
-0.15080669522285461,
0.008553997613489628,
0.0609603077173233,
0.005049607250839472,
0.0060432893224060535,
-0.17499805986881256,
0.03296128660440445,
0.08296515792608261,
-0.01535357628017664,
0.07664306461811066,
-0.3164437413215637,
-0.12653249502182007,
0.08367202430963516,
0.12841640412807465,
0.09094066917896271,
-0.15086670219898224,
-0.0497749038040638,
-0.028958914801478386,
-0.15875592827796936,
0.137518048286438,
-0.0920708104968071,
0.11437015235424042,
-0.03256872668862343,
0.1096072793006897,
0.012829836457967758,
-0.0646963119506836,
0.11883208155632019,
-0.005842664744704962,
0.07460785657167435,
-0.06354458630084991,
0.023073723539710045,
0.09836554527282715,
-0.08192788809537888,
0.04517729952931404,
-0.10450837016105652,
0.038500525057315826,
-0.1263103038072586,
-0.016559315845370293,
-0.06823816150426865,
0.0007465110975317657,
-0.03655963018536568,
-0.038469184190034866,
-0.03676806017756462,
0.010679833590984344,
0.07058195024728775,
-0.0253180842846632,
0.1829911172389984,
0.02113182656466961,
0.14056538045406342,
0.15983447432518005,
0.09267670661211014,
-0.11937794834375381,
-0.06294651329517365,
-0.0052627376280725,
-0.03338058292865753,
0.043614864349365234,
-0.16437797248363495,
0.03173253685235977,
0.1364196389913559,
0.007336901500821114,
0.12597544491291046,
0.06369921565055847,
-0.0650341808795929,
0.023672858253121376,
0.05299321189522743,
-0.1716981828212738,
-0.10725577175617218,
-0.0063964324072003365,
0.04852115735411644,
-0.1315886229276657,
0.033887263387441635,
0.12701596319675446,
-0.05631670355796814,
-0.023221826180815697,
0.004589783493429422,
0.01895914413034916,
-0.014277997426688671,
0.1820504516363144,
0.035505592823028564,
0.06981107592582703,
-0.1099390760064125,
0.07826583087444305,
0.061465419828891754,
-0.10313957929611206,
0.050818558782339096,
0.10245969891548157,
-0.0971931740641594,
-0.028247613459825516,
0.036512572318315506,
0.17040444910526276,
-0.06097490340471268,
-0.04745258390903473,
-0.15727758407592773,
-0.1188703179359436,
0.0949232429265976,
0.17090941965579987,
0.06709177792072296,
0.008168856613337994,
-0.039086952805519104,
-0.015391355380415916,
-0.12216324359178543,
0.1019618809223175,
0.05355535075068474,
0.0792258083820343,
-0.12606219947338104,
0.11375851184129715,
-0.012528096325695515,
0.04522998631000519,
-0.006996932905167341,
0.015794357284903526,
-0.10961584746837616,
0.005936841946095228,
-0.13510942459106445,
0.004436931572854519,
-0.047851987183094025,
-0.0004405204963404685,
-0.0263049453496933,
-0.040917716920375824,
-0.057577889412641525,
0.017344828695058823,
-0.11474660038948059,
-0.03428823500871658,
0.013691357336938381,
0.02553664892911911,
-0.1210864931344986,
-0.022359712049365044,
0.018432727083563805,
-0.08651149272918701,
0.08237779140472412,
0.04350263625383377,
-0.004744785837829113,
0.02384069934487343,
-0.024593310430645943,
0.0023693498224020004,
0.04206053540110588,
0.008600871078670025,
0.0749109536409378,
-0.1182522103190422,
-0.01614619605243206,
0.006687935907393694,
0.018271127715706825,
0.027700547128915787,
0.11897240579128265,
-0.11851169168949127,
-0.003569161519408226,
0.002949648769572377,
-0.06171497330069542,
-0.07331183552742004,
0.06945384293794632,
0.09609014540910721,
0.019843291491270065,
0.18782134354114532,
-0.0701877698302269,
0.03631741181015968,
-0.2041826695203781,
-0.0004824092611670494,
0.011069185100495815,
-0.1444322168827057,
-0.0686158686876297,
-0.04079968109726906,
0.06533107161521912,
-0.07331144064664841,
0.10990822315216064,
0.00029397057369351387,
0.029118673875927925,
0.041680943220853806,
-0.024232879281044006,
-0.023682642728090286,
0.009833776392042637,
0.17776265740394592,
0.02087424322962761,
-0.03993479162454605,
0.08008939027786255,
0.021948911249637604,
0.08164765685796738,
0.1351768523454666,
0.19295553863048553,
0.12377285957336426,
0.05742652714252472,
0.09562498331069946,
0.0246766097843647,
-0.028312064707279205,
-0.18750935792922974,
0.04608301818370819,
-0.03345264121890068,
0.14469313621520996,
-0.0036162484902888536,
0.19875183701515198,
0.12363523989915848,
-0.1616145372390747,
0.049789298325777054,
-0.042546529322862625,
-0.08770232647657394,
-0.10374753177165985,
-0.10902634263038635,
-0.08727183192968369,
-0.13637308776378632,
-0.006051057018339634,
-0.12545187771320343,
0.04457848146557808,
0.06355591863393784,
0.01382653508335352,
-0.006067642010748386,
0.1218736544251442,
0.03791828826069832,
0.01333939004689455,
0.057010456919670105,
-0.0005080885021016002,
-0.040598683059215546,
-0.04785794019699097,
-0.06846243888139725,
0.01792242005467415,
0.0008599443244747818,
0.05296821519732475,
-0.009698429144918919,
-0.012721187435090542,
0.041002094745635986,
-0.024327702820301056,
-0.12153244018554688,
0.007053912151604891,
0.028403956443071365,
0.07168569415807724,
0.04125894606113434,
0.011225602589547634,
0.006194763816893101,
-0.015729404985904694,
0.20045487582683563,
-0.07241246104240417,
-0.058644987642765045,
-0.11636999249458313,
0.235474094748497,
0.01072970312088728,
-0.049952827394008636,
0.03867815434932709,
-0.06700122356414795,
-0.0035528417211025953,
0.203794464468956,
0.1763950139284134,
-0.04246234521269798,
-0.013220349326729774,
-0.01708497293293476,
-0.008600465953350067,
-0.02101289853453636,
0.10620400309562683,
0.12142284214496613,
0.023890454322099686,
-0.07888230681419373,
-0.026393845677375793,
-0.06630735844373703,
-0.0199846550822258,
-0.04438477009534836,
0.07920759916305542,
0.022992458194494247,
-0.004345722496509552,
-0.035063277930021286,
0.054386038333177567,
-0.053618792444467545,
-0.05333730950951576,
0.005299953278154135,
-0.21374726295471191,
-0.1652282476425171,
0.0011250237002968788,
0.07984796911478043,
-0.010978286154568195,
0.05968651920557022,
-0.005668939091265202,
0.011672243475914001,
0.08514123409986496,
-0.015483490191400051,
-0.07583421468734741,
-0.07210621982812881,
0.10283120721578598,
-0.1565060317516327,
0.179169699549675,
-0.03168606758117676,
0.03672231733798981,
0.14063027501106262,
0.04921741038560867,
-0.11283878982067108,
0.05979303643107414,
0.04866620525717735,
-0.040907539427280426,
0.010315114632248878,
0.12839774787425995,
-0.025993308052420616,
0.07548055052757263,
0.04117988795042038,
-0.11852877587080002,
-0.009816375561058521,
-0.07926999032497406,
-0.01329423114657402,
-0.020900364965200424,
-0.04580537602305412,
-0.04396604001522064,
0.13080313801765442,
0.1953880935907364,
-0.046875834465026855,
-0.010142689570784569,
-0.06230320408940315,
0.014063090085983276,
0.06366412341594696,
-0.011526157148182392,
-0.057635754346847534,
-0.2592150568962097,
0.0015044639585539699,
0.0800754725933075,
-0.0048964377492666245,
-0.2722071707248688,
-0.09105357527732849,
-0.0024159944150596857,
-0.05014897882938385,
-0.10266423225402832,
0.09415236860513687,
0.08769173920154572,
0.047062430530786514,
-0.06798126548528671,
0.0034771731588989496,
-0.07054917514324188,
0.1582612246274948,
-0.1364070177078247,
-0.06375838816165924
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# t5-small-finetuned-en-to-ro-dataset_20
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the wmt16 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4052
- Bleu: 7.3293
- Gen Len: 18.2556
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|
| 0.6029 | 1.0 | 7629 | 1.4052 | 7.3293 | 18.2556 |
### Framework versions
- Transformers 4.12.5
- Pytorch 1.10.0+cu111
- Datasets 1.16.1
- Tokenizers 0.10.3
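## Example usage
A minimal inference sketch for this checkpoint, assuming it is published under the id `aretw0/t5-small-finetuned-en-to-ro-dataset_20` listed in this entry and that the standard T5 "translate English to Romanian:" task prefix applies; the prefix and generation settings are assumptions, since the preprocessing used for fine-tuning is not documented above.
```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Assumptions: the checkpoint id comes from this entry's metadata; the task
# prefix mirrors the usual T5 WMT16 en-ro recipe and may differ from the
# actual fine-tuning setup.
model_name = "aretw0/t5-small-finetuned-en-to-ro-dataset_20"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

text = "translate English to Romanian: The committee approved the new budget."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
outputs = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```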
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["wmt16"], "metrics": ["bleu"], "model-index": [{"name": "t5-small-finetuned-en-to-ro-dataset_20", "results": [{"task": {"type": "text2text-generation", "name": "Sequence-to-sequence Language Modeling"}, "dataset": {"name": "wmt16", "type": "wmt16", "args": "ro-en"}, "metrics": [{"type": "bleu", "value": 7.3293, "name": "Bleu"}]}]}]}
|
text2text-generation
|
aretw0/t5-small-finetuned-en-to-ro-dataset_20
|
[
"transformers",
"pytorch",
"tensorboard",
"t5",
"text2text-generation",
"generated_from_trainer",
"dataset:wmt16",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #t5 #text2text-generation #generated_from_trainer #dataset-wmt16 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
t5-small-finetuned-en-to-ro-dataset\_20
=======================================
This model is a fine-tuned version of t5-small on the wmt16 dataset.
It achieves the following results on the evaluation set:
* Loss: 1.4052
* Bleu: 7.3293
* Gen Len: 18.2556
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 1
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.12.5
* Pytorch 1.10.0+cu111
* Datasets 1.16.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.16.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #t5 #text2text-generation #generated_from_trainer #dataset-wmt16 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.16.1\n* Tokenizers 0.10.3"
] |
[
78,
113,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #t5 #text2text-generation #generated_from_trainer #dataset-wmt16 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.16.1\n* Tokenizers 0.10.3"
] |
[
-0.10917399078607559,
0.11471432447433472,
-0.0027396257501095533,
0.09895854443311691,
0.10955965518951416,
-0.0001069965583155863,
0.16124649345874786,
0.16258057951927185,
-0.10628575086593628,
0.053479187190532684,
0.1450633406639099,
0.1308496743440628,
0.051844220608472824,
0.1639290302991867,
-0.0668860450387001,
-0.24423490464687347,
0.04065472632646561,
0.056784600019454956,
-0.011823700740933418,
0.1331363320350647,
0.08877413719892502,
-0.12101919949054718,
0.09074299782514572,
0.031635139137506485,
-0.18749457597732544,
-0.020282115787267685,
0.002061503706499934,
-0.0736994668841362,
0.11226610839366913,
0.02977660670876503,
0.0911487564444542,
0.038374725729227066,
0.049766991287469864,
-0.16297008097171783,
0.010093905963003635,
0.05474463477730751,
0.008931340649724007,
0.1072186529636383,
0.05765332654118538,
-0.008725910447537899,
0.08835668116807938,
-0.06456660479307175,
0.06411945819854736,
0.019242841750383377,
-0.12956532835960388,
-0.2625778913497925,
-0.10905735194683075,
0.050317149609327316,
0.07186374813318253,
0.0861460417509079,
-0.006633952260017395,
0.18090227246284485,
-0.02800736017525196,
0.1068313792347908,
0.2432011365890503,
-0.29884496331214905,
-0.053969480097293854,
-0.014459454454481602,
0.04446632042527199,
0.07036794722080231,
-0.07821860909461975,
-0.03485395759344101,
0.025422576814889908,
0.04641352593898773,
0.14936189353466034,
-0.013171754777431488,
-0.024672530591487885,
-0.013022727333009243,
-0.1324913501739502,
-0.07593471556901932,
0.171248659491539,
0.03810932859778404,
-0.04172251746058464,
-0.0791015475988388,
-0.077671580016613,
-0.1780117154121399,
-0.045623596757650375,
0.010163996368646622,
0.03260093927383423,
-0.03613753989338875,
-0.090866319835186,
-0.011963759548962116,
-0.08345898240804672,
-0.03971271589398384,
-0.0400259830057621,
0.12204766273498535,
0.039326511323451996,
0.021013403311371803,
-0.0648568645119667,
0.08251743018627167,
-0.02115466259419918,
-0.17048144340515137,
-0.002365675289183855,
0.014742737635970116,
0.012092333287000656,
-0.0362180694937706,
-0.03975226357579231,
-0.12854725122451782,
0.0013694885419681668,
0.1508413404226303,
-0.08570656180381775,
0.07209396362304688,
-0.021052835509181023,
0.03757272660732269,
-0.0712432935833931,
0.1862633228302002,
-0.02531667985022068,
0.01199100911617279,
0.014259738847613335,
0.08283204585313797,
0.053004778921604156,
-0.03595377132296562,
-0.11852851510047913,
0.03560039773583412,
0.11873334646224976,
0.01798361726105213,
-0.026281235739588737,
0.05671486258506775,
-0.04950146749615669,
-0.033470869064331055,
0.06428056955337524,
-0.0985192358493805,
0.026263179257512093,
-0.014385460875928402,
-0.06137974187731743,
-0.016946153715252876,
0.014305219054222107,
0.010318907909095287,
-0.03901536390185356,
0.0866943746805191,
-0.09499428421258926,
0.01481225248426199,
-0.07909353822469711,
-0.13312721252441406,
0.032194834202528,
-0.08567824959754944,
0.004750316496938467,
-0.08903170377016068,
-0.15488608181476593,
-0.010155879892408848,
0.059911105781793594,
-0.04203338921070099,
-0.060760945081710815,
-0.04690389707684517,
-0.08462844043970108,
0.05228358507156372,
-0.017197903245687485,
0.09407021850347519,
-0.06994345039129257,
0.09313524514436722,
0.04367922991514206,
0.0699872076511383,
-0.03805668279528618,
0.046048570424318314,
-0.09221658110618591,
0.046713024377822876,
-0.20324106514453888,
0.05810253322124481,
-0.043135568499565125,
0.08136440813541412,
-0.10855108499526978,
-0.0986681804060936,
0.03332415595650673,
-0.0287183690816164,
0.10098884999752045,
0.10032692551612854,
-0.1712362915277481,
-0.0714666098356247,
0.19620312750339508,
-0.08605452626943588,
-0.13960541784763336,
0.13542011380195618,
-0.04866115376353264,
0.015205718576908112,
0.05108737573027611,
0.2256919890642166,
0.06009798124432564,
-0.09466048330068588,
-0.014539822936058044,
-0.043245840817689896,
0.0728016197681427,
-0.07243970781564713,
0.08524001389741898,
0.005715298466384411,
0.06257703900337219,
0.007864327169954777,
0.006257898174226284,
0.033685676753520966,
-0.08180579543113708,
-0.08341662585735321,
-0.05262620374560356,
-0.07464150339365005,
0.018977873027324677,
0.043067850172519684,
0.06436768174171448,
-0.12520940601825714,
-0.10918889939785004,
0.046763889491558075,
0.08006852120161057,
-0.08530648052692413,
0.051584817469120026,
-0.09699808061122894,
0.1122535690665245,
-0.08021575212478638,
-0.00659157196059823,
-0.18090251088142395,
-0.024141639471054077,
0.03383353725075722,
0.0010654424550011754,
0.013559523038566113,
-0.04356803372502327,
0.06793972849845886,
0.07252488285303116,
-0.04066728800535202,
-0.03788644075393677,
-0.02443225122988224,
-0.001314831548370421,
-0.11854644119739532,
-0.1943080872297287,
-0.048003263771533966,
-0.03863260895013809,
0.10233265906572342,
-0.15729813277721405,
0.03872581943869591,
0.056178245693445206,
0.10972263664007187,
0.042117565870285034,
-0.031345024704933167,
-0.003357804147526622,
0.06802110373973846,
-0.04786115139722824,
-0.06688948720693588,
0.06229365989565849,
0.026483003050088882,
-0.09326101094484329,
0.010820088908076286,
-0.158180832862854,
0.1622932404279709,
0.1346219778060913,
0.004879570100456476,
-0.0541812963783741,
-0.019613012671470642,
-0.05237652361392975,
-0.02854287065565586,
-0.028418002650141716,
0.016172830015420914,
0.1588008552789688,
0.025985952466726303,
0.15797199308872223,
-0.10222776979207993,
-0.05566391348838806,
0.0499078594148159,
-0.03026086650788784,
-0.011345218867063522,
0.11625569313764572,
0.039288707077503204,
-0.1284365952014923,
0.14239320158958435,
0.13931645452976227,
-0.042383112013339996,
0.13660340011119843,
-0.06493972986936569,
-0.07175718992948532,
-0.04036037623882294,
-0.013231067918241024,
0.03097875416278839,
0.10596074908971786,
-0.1082935631275177,
-0.015777455642819405,
0.04346192255616188,
0.031063292175531387,
0.0076416125521063805,
-0.19079066812992096,
-0.006651147734373808,
0.041639961302280426,
-0.04832824692130089,
-0.05512874573469162,
-0.00769960880279541,
0.004110692068934441,
0.09769129753112793,
0.014726448804140091,
-0.053582966327667236,
0.032937273383140564,
0.011940195225179195,
-0.07119324058294296,
0.18894925713539124,
-0.09939312934875488,
-0.16942797601222992,
-0.12950117886066437,
-0.10203353315591812,
-0.05805894732475281,
-0.005371432285755873,
0.07649076730012894,
-0.08027179539203644,
-0.04708235338330269,
-0.10186446458101273,
-0.027952581644058228,
-0.008628503419458866,
0.020371589809656143,
0.04548691213130951,
-0.020337754860520363,
0.06656253337860107,
-0.11115580052137375,
-0.02909284643828869,
-0.014477659948170185,
0.023933380842208862,
0.0658642128109932,
0.009455622173845768,
0.11307885497808456,
0.13176853954792023,
-0.016606198623776436,
0.04228051379323006,
-0.038424860686063766,
0.24758224189281464,
-0.07065451145172119,
-0.013800577260553837,
0.13955159485340118,
-0.018120357766747475,
0.09214428067207336,
0.12466620653867722,
0.05316920951008797,
-0.08791717886924744,
-0.0027621444314718246,
-0.001604850753210485,
-0.04535123333334923,
-0.21868807077407837,
-0.015303969383239746,
-0.05466558784246445,
0.001744882669299841,
0.10508858412504196,
0.021423574537038803,
0.03096095658838749,
0.05431736260652542,
0.012180328369140625,
0.05940742418169975,
-0.021445728838443756,
0.10620433837175369,
0.12473221868276596,
0.060851361602544785,
0.14198057353496552,
-0.05322448909282684,
-0.029268380254507065,
0.044931281358003616,
0.021764623001217842,
0.2137097716331482,
-0.008698340505361557,
0.21417513489723206,
0.04102635011076927,
0.15865211188793182,
0.02834424562752247,
0.07572963833808899,
-0.024131296202540398,
-0.008849830366671085,
-0.017770491540431976,
-0.04835456237196922,
-0.04220596328377724,
0.017860157415270805,
-0.05530538037419319,
0.03240496665239334,
-0.11530701071023941,
0.019909173250198364,
0.04589752107858658,
0.29963046312332153,
0.04798771068453789,
-0.3718622028827667,
-0.11109720170497894,
0.0094932671636343,
-0.04341591149568558,
-0.04128250107169151,
0.0008628775249235332,
0.09455103427171707,
-0.08159216493368149,
0.06703955680131912,
-0.08436376601457596,
0.11149004846811295,
-0.06510557234287262,
0.03551100939512253,
0.04669375345110893,
0.08505182713270187,
-0.012717513367533684,
0.05641089752316475,
-0.28100720047950745,
0.2738114297389984,
0.026290807873010635,
0.06374993175268173,
-0.07667809724807739,
0.012611317448318005,
0.006969583220779896,
0.03766192868351936,
0.05853402242064476,
-0.005911902990192175,
-0.10967565327882767,
-0.16153457760810852,
-0.1041492447257042,
0.011242196895182133,
0.07969959080219269,
0.0035379109904170036,
0.12099135667085648,
-0.016085704788565636,
-0.0014859960647299886,
0.04691782966256142,
-0.006847220938652754,
-0.033275824040174484,
-0.11474140733480453,
0.029838385060429573,
0.03969845548272133,
-0.032073017209768295,
-0.07454714924097061,
-0.10639364272356033,
-0.054155658930540085,
0.15956473350524902,
0.02727699838578701,
-0.07064219564199448,
-0.1298806518316269,
0.04128815978765488,
0.08371434360742569,
-0.09511010348796844,
0.023329902440309525,
-0.013915752992033958,
0.12196256965398788,
-0.007698898669332266,
-0.07569239288568497,
0.11352983862161636,
-0.05473337322473526,
-0.16862133145332336,
-0.05112864449620247,
0.12031980603933334,
0.008478439413011074,
0.06110162287950516,
-0.010009197518229485,
0.04102516546845436,
-0.03768019378185272,
-0.06940557807683945,
0.034696925431489944,
-0.004047933965921402,
0.09349345415830612,
-0.044153280556201935,
-0.003014791291207075,
0.028921516612172127,
-0.07135061919689178,
-0.03159482404589653,
0.1824987530708313,
0.2620856761932373,
-0.08438511192798615,
0.06301905959844589,
0.04395594820380211,
-0.05072613060474396,
-0.15080669522285461,
0.008553997613489628,
0.0609603077173233,
0.005049607250839472,
0.0060432893224060535,
-0.17499805986881256,
0.03296128660440445,
0.08296515792608261,
-0.01535357628017664,
0.07664306461811066,
-0.3164437413215637,
-0.12653249502182007,
0.08367202430963516,
0.12841640412807465,
0.09094066917896271,
-0.15086670219898224,
-0.0497749038040638,
-0.028958914801478386,
-0.15875592827796936,
0.137518048286438,
-0.0920708104968071,
0.11437015235424042,
-0.03256872668862343,
0.1096072793006897,
0.012829836457967758,
-0.0646963119506836,
0.11883208155632019,
-0.005842664744704962,
0.07460785657167435,
-0.06354458630084991,
0.023073723539710045,
0.09836554527282715,
-0.08192788809537888,
0.04517729952931404,
-0.10450837016105652,
0.038500525057315826,
-0.1263103038072586,
-0.016559315845370293,
-0.06823816150426865,
0.0007465110975317657,
-0.03655963018536568,
-0.038469184190034866,
-0.03676806017756462,
0.010679833590984344,
0.07058195024728775,
-0.0253180842846632,
0.1829911172389984,
0.02113182656466961,
0.14056538045406342,
0.15983447432518005,
0.09267670661211014,
-0.11937794834375381,
-0.06294651329517365,
-0.0052627376280725,
-0.03338058292865753,
0.043614864349365234,
-0.16437797248363495,
0.03173253685235977,
0.1364196389913559,
0.007336901500821114,
0.12597544491291046,
0.06369921565055847,
-0.0650341808795929,
0.023672858253121376,
0.05299321189522743,
-0.1716981828212738,
-0.10725577175617218,
-0.0063964324072003365,
0.04852115735411644,
-0.1315886229276657,
0.033887263387441635,
0.12701596319675446,
-0.05631670355796814,
-0.023221826180815697,
0.004589783493429422,
0.01895914413034916,
-0.014277997426688671,
0.1820504516363144,
0.035505592823028564,
0.06981107592582703,
-0.1099390760064125,
0.07826583087444305,
0.061465419828891754,
-0.10313957929611206,
0.050818558782339096,
0.10245969891548157,
-0.0971931740641594,
-0.028247613459825516,
0.036512572318315506,
0.17040444910526276,
-0.06097490340471268,
-0.04745258390903473,
-0.15727758407592773,
-0.1188703179359436,
0.0949232429265976,
0.17090941965579987,
0.06709177792072296,
0.008168856613337994,
-0.039086952805519104,
-0.015391355380415916,
-0.12216324359178543,
0.1019618809223175,
0.05355535075068474,
0.0792258083820343,
-0.12606219947338104,
0.11375851184129715,
-0.012528096325695515,
0.04522998631000519,
-0.006996932905167341,
0.015794357284903526,
-0.10961584746837616,
0.005936841946095228,
-0.13510942459106445,
0.004436931572854519,
-0.047851987183094025,
-0.0004405204963404685,
-0.0263049453496933,
-0.040917716920375824,
-0.057577889412641525,
0.017344828695058823,
-0.11474660038948059,
-0.03428823500871658,
0.013691357336938381,
0.02553664892911911,
-0.1210864931344986,
-0.022359712049365044,
0.018432727083563805,
-0.08651149272918701,
0.08237779140472412,
0.04350263625383377,
-0.004744785837829113,
0.02384069934487343,
-0.024593310430645943,
0.0023693498224020004,
0.04206053540110588,
0.008600871078670025,
0.0749109536409378,
-0.1182522103190422,
-0.01614619605243206,
0.006687935907393694,
0.018271127715706825,
0.027700547128915787,
0.11897240579128265,
-0.11851169168949127,
-0.003569161519408226,
0.002949648769572377,
-0.06171497330069542,
-0.07331183552742004,
0.06945384293794632,
0.09609014540910721,
0.019843291491270065,
0.18782134354114532,
-0.0701877698302269,
0.03631741181015968,
-0.2041826695203781,
-0.0004824092611670494,
0.011069185100495815,
-0.1444322168827057,
-0.0686158686876297,
-0.04079968109726906,
0.06533107161521912,
-0.07331144064664841,
0.10990822315216064,
0.00029397057369351387,
0.029118673875927925,
0.041680943220853806,
-0.024232879281044006,
-0.023682642728090286,
0.009833776392042637,
0.17776265740394592,
0.02087424322962761,
-0.03993479162454605,
0.08008939027786255,
0.021948911249637604,
0.08164765685796738,
0.1351768523454666,
0.19295553863048553,
0.12377285957336426,
0.05742652714252472,
0.09562498331069946,
0.0246766097843647,
-0.028312064707279205,
-0.18750935792922974,
0.04608301818370819,
-0.03345264121890068,
0.14469313621520996,
-0.0036162484902888536,
0.19875183701515198,
0.12363523989915848,
-0.1616145372390747,
0.049789298325777054,
-0.042546529322862625,
-0.08770232647657394,
-0.10374753177165985,
-0.10902634263038635,
-0.08727183192968369,
-0.13637308776378632,
-0.006051057018339634,
-0.12545187771320343,
0.04457848146557808,
0.06355591863393784,
0.01382653508335352,
-0.006067642010748386,
0.1218736544251442,
0.03791828826069832,
0.01333939004689455,
0.057010456919670105,
-0.0005080885021016002,
-0.040598683059215546,
-0.04785794019699097,
-0.06846243888139725,
0.01792242005467415,
0.0008599443244747818,
0.05296821519732475,
-0.009698429144918919,
-0.012721187435090542,
0.041002094745635986,
-0.024327702820301056,
-0.12153244018554688,
0.007053912151604891,
0.028403956443071365,
0.07168569415807724,
0.04125894606113434,
0.011225602589547634,
0.006194763816893101,
-0.015729404985904694,
0.20045487582683563,
-0.07241246104240417,
-0.058644987642765045,
-0.11636999249458313,
0.235474094748497,
0.01072970312088728,
-0.049952827394008636,
0.03867815434932709,
-0.06700122356414795,
-0.0035528417211025953,
0.203794464468956,
0.1763950139284134,
-0.04246234521269798,
-0.013220349326729774,
-0.01708497293293476,
-0.008600465953350067,
-0.02101289853453636,
0.10620400309562683,
0.12142284214496613,
0.023890454322099686,
-0.07888230681419373,
-0.026393845677375793,
-0.06630735844373703,
-0.0199846550822258,
-0.04438477009534836,
0.07920759916305542,
0.022992458194494247,
-0.004345722496509552,
-0.035063277930021286,
0.054386038333177567,
-0.053618792444467545,
-0.05333730950951576,
0.005299953278154135,
-0.21374726295471191,
-0.1652282476425171,
0.0011250237002968788,
0.07984796911478043,
-0.010978286154568195,
0.05968651920557022,
-0.005668939091265202,
0.011672243475914001,
0.08514123409986496,
-0.015483490191400051,
-0.07583421468734741,
-0.07210621982812881,
0.10283120721578598,
-0.1565060317516327,
0.179169699549675,
-0.03168606758117676,
0.03672231733798981,
0.14063027501106262,
0.04921741038560867,
-0.11283878982067108,
0.05979303643107414,
0.04866620525717735,
-0.040907539427280426,
0.010315114632248878,
0.12839774787425995,
-0.025993308052420616,
0.07548055052757263,
0.04117988795042038,
-0.11852877587080002,
-0.009816375561058521,
-0.07926999032497406,
-0.01329423114657402,
-0.020900364965200424,
-0.04580537602305412,
-0.04396604001522064,
0.13080313801765442,
0.1953880935907364,
-0.046875834465026855,
-0.010142689570784569,
-0.06230320408940315,
0.014063090085983276,
0.06366412341594696,
-0.011526157148182392,
-0.057635754346847534,
-0.2592150568962097,
0.0015044639585539699,
0.0800754725933075,
-0.0048964377492666245,
-0.2722071707248688,
-0.09105357527732849,
-0.0024159944150596857,
-0.05014897882938385,
-0.10266423225402832,
0.09415236860513687,
0.08769173920154572,
0.047062430530786514,
-0.06798126548528671,
0.0034771731588989496,
-0.07054917514324188,
0.1582612246274948,
-0.1364070177078247,
-0.06375838816165924
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# t5-small-finetuned-en-to-ro-epoch.04375
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the wmt16 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4137
- Bleu: 7.3292
- Gen Len: 18.2541
## Model description
More information needed
## Intended uses & limitations
More information needed
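The original card leaves intended uses unspecified. As a non-authoritative illustration, the snippet below sketches how this checkpoint could be queried for English-to-Romanian translation with 🤗 Transformers; the `translate English to Romanian:` task prefix follows the usual T5 convention and the generation settings are assumptions, not values from the card.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the fine-tuned checkpoint from the Hub (id taken from this card)
model_name = "aretw0/t5-small-finetuned-en-to-ro-epoch.04375"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# T5 models are usually prompted with a task prefix; this prefix is an assumption
text = "translate English to Romanian: The weather is nice today."
inputs = tokenizer(text, return_tensors="pt")

# Beam-search settings are illustrative only
outputs = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```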
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training; a hedged sketch of how they might map to `Seq2SeqTrainingArguments` is shown after the list:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 0.04375
- mixed_precision_training: Native AMP
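A minimal, non-authoritative sketch of how the hyperparameters above might be expressed with the 🤗 `Seq2SeqTrainingArguments` API; the output directory is a placeholder, and anything not listed on the card is left at its default (Adam betas and epsilon already default to the values shown above).

```python
from transformers import Seq2SeqTrainingArguments

# Illustrative mapping of the card's hyperparameters; not the authors' actual script
training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small-finetuned-en-to-ro-epoch.04375",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=0.04375,
    fp16=True,  # "Native AMP" mixed-precision training
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the optimizer defaults
)
```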
### Training results
| Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|
| 0.6211 | 0.04 | 1669 | 1.4137 | 7.3292 | 18.2541 |
### Framework versions
- Transformers 4.12.5
- Pytorch 1.10.0+cu111
- Datasets 1.16.1
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["wmt16"], "metrics": ["bleu"], "model-index": [{"name": "t5-small-finetuned-en-to-ro-epoch.04375", "results": [{"task": {"type": "text2text-generation", "name": "Sequence-to-sequence Language Modeling"}, "dataset": {"name": "wmt16", "type": "wmt16", "args": "ro-en"}, "metrics": [{"type": "bleu", "value": 7.3292, "name": "Bleu"}]}]}]}
|
text2text-generation
|
aretw0/t5-small-finetuned-en-to-ro-epoch.04375
|
[
"transformers",
"pytorch",
"tensorboard",
"t5",
"text2text-generation",
"generated_from_trainer",
"dataset:wmt16",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #t5 #text2text-generation #generated_from_trainer #dataset-wmt16 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
t5-small-finetuned-en-to-ro-epoch.04375
=======================================
This model is a fine-tuned version of t5-small on the wmt16 dataset.
It achieves the following results on the evaluation set:
* Loss: 1.4137
* Bleu: 7.3292
* Gen Len: 18.2541
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 0.04375
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.12.5
* Pytorch 1.10.0+cu111
* Datasets 1.16.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 0.04375\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.16.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #t5 #text2text-generation #generated_from_trainer #dataset-wmt16 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 0.04375\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.16.1\n* Tokenizers 0.10.3"
] |
[
78,
115,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #t5 #text2text-generation #generated_from_trainer #dataset-wmt16 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 0.04375\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.16.1\n* Tokenizers 0.10.3"
] |
[
-0.11355859786272049,
0.11958212405443192,
-0.0023930994793772697,
0.1040293425321579,
0.11815237998962402,
0.010798321105539799,
0.1583251953125,
0.15422649681568146,
-0.08017300814390182,
0.06317495554685593,
0.13834717869758606,
0.1232457160949707,
0.050300292670726776,
0.17082752287387848,
-0.06465116888284683,
-0.24050045013427734,
0.03651871904730797,
0.048516660928726196,
-0.020013654604554176,
0.1320967823266983,
0.09104315936565399,
-0.12213777005672455,
0.09467452764511108,
0.019153105095028877,
-0.18992695212364197,
-0.03160153701901436,
-0.003368685720488429,
-0.07167211920022964,
0.11814610660076141,
0.025275450199842453,
0.09340940415859222,
0.03243903070688248,
0.06150004267692566,
-0.16184045374393463,
0.006510847248136997,
0.04869823530316353,
0.01387608889490366,
0.10837285965681076,
0.05989711731672287,
-0.006729405373334885,
0.06522372364997864,
-0.06819675117731094,
0.06319751590490341,
0.016524381935596466,
-0.1260031759738922,
-0.2310812771320343,
-0.10681989789009094,
0.040963370352983475,
0.07158489525318146,
0.08653455972671509,
-0.008987713605165482,
0.1603829562664032,
-0.02444065362215042,
0.09677935391664505,
0.23465989530086517,
-0.29246383905410767,
-0.05660133436322212,
0.006722402758896351,
0.042621489614248276,
0.0699041411280632,
-0.08816970884799957,
-0.03215586021542549,
0.02979852817952633,
0.04542522132396698,
0.14898699522018433,
-0.010475261136889458,
-0.043272897601127625,
-0.015804782509803772,
-0.13280920684337616,
-0.06628220528364182,
0.17782992124557495,
0.045591387897729874,
-0.03991793096065521,
-0.07616721093654633,
-0.07260129600763321,
-0.18563634157180786,
-0.046174515038728714,
0.013543419539928436,
0.03126811981201172,
-0.03580594062805176,
-0.0915488749742508,
-0.0026069441810250282,
-0.08445821702480316,
-0.04495169222354889,
-0.042746733874082565,
0.10879508405923843,
0.0313115119934082,
0.015266726724803448,
-0.052077945321798325,
0.08842860162258148,
-0.021066593006253242,
-0.1667962521314621,
0.0021105504129081964,
0.015133997425436974,
-0.0012951118405908346,
-0.03777805715799332,
-0.038460202515125275,
-0.11467035859823227,
0.006052929442375898,
0.142485573887825,
-0.05564317852258682,
0.0644029900431633,
-0.023434733971953392,
0.033077407628297806,
-0.06480270624160767,
0.18186743557453156,
-0.04076176509261131,
-0.011809883639216423,
0.01038049440830946,
0.08130677789449692,
0.040586262941360474,
-0.03503059968352318,
-0.11181511729955673,
0.02609829418361187,
0.12543101608753204,
0.017108213156461716,
-0.020487692207098007,
0.059271860867738724,
-0.052163708955049515,
-0.03357844427227974,
0.04944296553730965,
-0.09536384791135788,
0.026181433349847794,
-0.015157248824834824,
-0.06480032950639725,
-0.015978366136550903,
0.0131592508405447,
0.017938872799277306,
-0.02744029276072979,
0.10054382681846619,
-0.10235165059566498,
0.010000696405768394,
-0.07515110820531845,
-0.1285393238067627,
0.03577732667326927,
-0.09070876240730286,
0.0033132461830973625,
-0.08687542378902435,
-0.16116873919963837,
-0.012056716717779636,
0.05472317710518837,
-0.05003323405981064,
-0.06247127428650856,
-0.05492504686117172,
-0.07828879356384277,
0.05564979091286659,
-0.019117387011647224,
0.11163051426410675,
-0.06798958033323288,
0.09549763798713684,
0.035293448716402054,
0.07010388374328613,
-0.04475926607847214,
0.05006000027060509,
-0.08994928002357483,
0.043300725519657135,
-0.17937125265598297,
0.060543086379766464,
-0.04930565506219864,
0.07861688733100891,
-0.11376793682575226,
-0.09156212955713272,
0.02369965985417366,
-0.0350431427359581,
0.09872924536466599,
0.10338041931390762,
-0.17450521886348724,
-0.06287068128585815,
0.1867213249206543,
-0.0720563605427742,
-0.1456490457057953,
0.12943045794963837,
-0.04990687593817711,
0.02415744960308075,
0.05205373466014862,
0.21149466931819916,
0.05087153613567352,
-0.0956619530916214,
-0.02689063921570778,
-0.04323284327983856,
0.07875283062458038,
-0.0717400535941124,
0.0898963063955307,
0.0024840328842401505,
0.04745776578783989,
0.011255783960223198,
-0.004472443833947182,
0.0372447669506073,
-0.08730441331863403,
-0.08348362147808075,
-0.049417462199926376,
-0.07876156270503998,
0.031851425766944885,
0.03984181955456734,
0.061837468296289444,
-0.11128418147563934,
-0.1022668331861496,
0.04979128763079643,
0.08042041212320328,
-0.08251916617155075,
0.04001077637076378,
-0.08567553758621216,
0.10192827880382538,
-0.09212177991867065,
-0.013103016652166843,
-0.18283092975616455,
-0.047361329197883606,
0.034826867282390594,
0.003341177711263299,
0.015306833200156689,
-0.04680808261036873,
0.06808830052614212,
0.06499762088060379,
-0.039170730859041214,
-0.028319137170910835,
-0.025407863780856133,
0.0007770478841848671,
-0.12280026078224182,
-0.19127199053764343,
-0.05346057564020157,
-0.03563142567873001,
0.12225419282913208,
-0.15869565308094025,
0.02976405993103981,
0.042201098054647446,
0.11153491586446762,
0.036737602204084396,
-0.03308021277189255,
-0.005272602662444115,
0.06475970894098282,
-0.04929840564727783,
-0.0684426799416542,
0.059037499129772186,
0.025359049439430237,
-0.08998667448759079,
0.002762942109256983,
-0.14825865626335144,
0.1438782811164856,
0.13584768772125244,
0.0034240458626300097,
-0.052714940160512924,
-0.01959381252527237,
-0.0500941276550293,
-0.036068178713321686,
-0.04011284187436104,
0.005405508913099766,
0.13689717650413513,
0.03052263893187046,
0.1496427208185196,
-0.0945003554224968,
-0.05132037028670311,
0.05130838230252266,
-0.025266965851187706,
-0.011248418129980564,
0.10615134239196777,
0.05566892772912979,
-0.11741115152835846,
0.14262962341308594,
0.12900525331497192,
-0.0404241569340229,
0.12840217351913452,
-0.05328826233744621,
-0.0784698873758316,
-0.04013068228960037,
-0.01672384701669216,
0.02747323550283909,
0.10872180014848709,
-0.11625301092863083,
-0.010264373384416103,
0.04829234629869461,
0.03548213094472885,
0.0092500951141119,
-0.18316613137722015,
-0.00823232252150774,
0.03243944048881531,
-0.0468301996588707,
-0.04823724180459976,
-0.006613355595618486,
0.007521874271333218,
0.09986700862646103,
0.021024109795689583,
-0.05582885816693306,
0.03387114778161049,
0.008931626565754414,
-0.07309086620807648,
0.1909974217414856,
-0.09300456196069717,
-0.17221203446388245,
-0.13131359219551086,
-0.11801553517580032,
-0.06040910258889198,
-0.005481838248670101,
0.0678437203168869,
-0.07215547561645508,
-0.04819080978631973,
-0.09418012201786041,
-0.013871944509446621,
-0.0028785024769604206,
0.028172116726636887,
0.044998250901699066,
-0.01743847317993641,
0.0630635917186737,
-0.10616989433765411,
-0.02115694060921669,
-0.013767201453447342,
0.01726076938211918,
0.060211844742298126,
0.012064685113728046,
0.11723420023918152,
0.13522818684577942,
-0.014232723973691463,
0.039410222321748734,
-0.03332284092903137,
0.24811135232448578,
-0.06922721117734909,
-0.011142836883664131,
0.12225017696619034,
-0.01659628562629223,
0.0840102881193161,
0.13667342066764832,
0.05544067174196243,
-0.09188167750835419,
0.006023782305419445,
0.006875304970890284,
-0.042325686663389206,
-0.21911200881004333,
-0.029189929366111755,
-0.05669541656970978,
0.006195904221385717,
0.11490573734045029,
0.022671131417155266,
0.033251918852329254,
0.06201527267694473,
0.019244100898504257,
0.07461458444595337,
-0.02725265733897686,
0.10007045418024063,
0.11611834168434143,
0.06553144007921219,
0.1372271031141281,
-0.04598550125956535,
-0.030874591320753098,
0.04764347895979881,
0.02086843177676201,
0.22114747762680054,
-0.009621597826480865,
0.21171505749225616,
0.036077920347452164,
0.16897477209568024,
0.02638561837375164,
0.08625417947769165,
-0.01897333562374115,
-0.004323026165366173,
-0.015549756586551666,
-0.04979574680328369,
-0.036689773201942444,
0.016183925792574883,
-0.05049306899309158,
0.03102521412074566,
-0.11072900891304016,
0.017136594280600548,
0.040665946900844574,
0.2922331392765045,
0.05991999804973602,
-0.3796832263469696,
-0.099941186606884,
0.00936948973685503,
-0.034618694335222244,
-0.05121445283293724,
-0.002254410879686475,
0.0919533371925354,
-0.08696664869785309,
0.0625242292881012,
-0.09059104323387146,
0.10705885291099548,
-0.06686428189277649,
0.02653522789478302,
0.06072887033224106,
0.07862139493227005,
-0.01315079815685749,
0.05873566120862961,
-0.2714354693889618,
0.2708241641521454,
0.020589355379343033,
0.061776161193847656,
-0.08137200027704239,
0.005636055022478104,
0.0074731106869876385,
0.01825842820107937,
0.06340247392654419,
-0.006854855455458164,
-0.08368667215108871,
-0.17174586653709412,
-0.12001801282167435,
0.014257790520787239,
0.07617869973182678,
-0.01302404422312975,
0.1239224299788475,
-0.019452815875411034,
-0.005756574682891369,
0.04897213727235794,
-0.009293818846344948,
-0.04036080837249756,
-0.1128280758857727,
0.031674500554800034,
0.05185230076313019,
-0.008346393704414368,
-0.06713912636041641,
-0.11029820889234543,
-0.05445243418216705,
0.1501310020685196,
0.02481834962964058,
-0.06413266062736511,
-0.13171526789665222,
0.05746673420071602,
0.10475429892539978,
-0.09048894047737122,
0.03393774852156639,
-0.009547033347189426,
0.126980260014534,
0.004900746513158083,
-0.07398231327533722,
0.11293869465589523,
-0.052952494472265244,
-0.16821935772895813,
-0.053792208433151245,
0.11976167559623718,
0.012132932431995869,
0.05952225625514984,
-0.009092921391129494,
0.04226871579885483,
-0.039022259414196014,
-0.07217884808778763,
0.031124072149395943,
-0.007862350903451443,
0.0825687050819397,
-0.02663568966090679,
-0.00806246418505907,
0.01888084225356579,
-0.06549306213855743,
-0.03509395942091942,
0.1815294325351715,
0.24905210733413696,
-0.08473746478557587,
0.05192600190639496,
0.050658147782087326,
-0.05163729190826416,
-0.15683613717556,
0.011868045665323734,
0.06621856987476349,
0.010980636812746525,
0.010266393423080444,
-0.17213602364063263,
0.052555616945028305,
0.08760328590869904,
-0.016028014943003654,
0.08157668262720108,
-0.3189579248428345,
-0.12661029398441315,
0.08494748175144196,
0.12316451966762543,
0.08470682799816132,
-0.14481963217258453,
-0.0548611618578434,
-0.029310937970876694,
-0.18377764523029327,
0.14643824100494385,
-0.08600377291440964,
0.11865594238042831,
-0.02965812012553215,
0.11535800993442535,
0.012232834473252296,
-0.0631503164768219,
0.1206752210855484,
0.012882517650723457,
0.06703266501426697,
-0.060016047209501266,
0.007336034439504147,
0.08994569629430771,
-0.0767509937286377,
0.04987044259905815,
-0.08889004588127136,
0.03883007913827896,
-0.11798302084207535,
-0.022114021703600883,
-0.06859361380338669,
0.003855534130707383,
-0.0357355996966362,
-0.04347724840044975,
-0.03269967809319496,
0.016271645203232765,
0.06640343368053436,
-0.0223140399903059,
0.16543981432914734,
0.029443275183439255,
0.1269184648990631,
0.14008978009223938,
0.08957357704639435,
-0.10299289971590042,
-0.06762203574180603,
-0.008774915710091591,
-0.03643949329853058,
0.03733901306986809,
-0.15185758471488953,
0.03276469558477402,
0.14201439917087555,
0.013629766181111336,
0.12801159918308258,
0.06475044041872025,
-0.05828506126999855,
0.02618413046002388,
0.05684224143624306,
-0.16756795346736908,
-0.100833959877491,
-0.0032512047328054905,
0.03227411210536957,
-0.1469936966896057,
0.02874886617064476,
0.12933269143104553,
-0.05742533504962921,
-0.020191915333271027,
0.0033861601259559393,
0.019877366721630096,
-0.01540873758494854,
0.18858546018600464,
0.033898405730724335,
0.06564212590456009,
-0.11204169690608978,
0.08276304602622986,
0.06747778505086899,
-0.08927860856056213,
0.04129114747047424,
0.09228876233100891,
-0.1029621809720993,
-0.023300189524888992,
0.034449253231287,
0.16137585043907166,
-0.07116690278053284,
-0.04056088253855705,
-0.1413366198539734,
-0.1165371760725975,
0.08786714822053909,
0.15373119711875916,
0.06393774598836899,
0.019149206578731537,
-0.039493318647146225,
-0.01977277360856533,
-0.12410688400268555,
0.10395337641239166,
0.06394508481025696,
0.08401879668235779,
-0.1304488480091095,
0.10596706718206406,
-0.017551258206367493,
0.04189940169453621,
-0.010298742912709713,
0.01766875945031643,
-0.09916811436414719,
0.0019209044985473156,
-0.13042236864566803,
0.006089781876653433,
-0.04974374175071716,
-0.005874950438737869,
-0.025454668328166008,
-0.0528869703412056,
-0.058166615664958954,
0.010086584836244583,
-0.10746932774782181,
-0.04232603684067726,
0.009993440471589565,
0.032577190548181534,
-0.12327051162719727,
-0.024928690865635872,
0.02365809865295887,
-0.08685895800590515,
0.08659003674983978,
0.045526232570409775,
0.0014488522429019213,
0.027117125689983368,
-0.03967024013400078,
0.00006699459481751546,
0.041361916810274124,
0.006427070125937462,
0.08191590011119843,
-0.1217736229300499,
-0.018828995525836945,
0.0022792441304773092,
0.030080316588282585,
0.027637023478746414,
0.1110617071390152,
-0.1279003918170929,
-0.010041756555438042,
-0.010892345570027828,
-0.04919762909412384,
-0.06882838159799576,
0.060185596346855164,
0.09945149719715118,
0.02776062861084938,
0.1835012584924698,
-0.06561058014631271,
0.037394750863313675,
-0.20622768998146057,
0.002993351314216852,
-0.0011730830883607268,
-0.133240208029747,
-0.07803074270486832,
-0.03874173387885094,
0.06750990450382233,
-0.0692550539970398,
0.11213170737028122,
-0.0035708786454051733,
0.02791186235845089,
0.03734240680932999,
-0.014628678560256958,
-0.025196511298418045,
0.0145424110814929,
0.19330348074436188,
0.02918301708996296,
-0.039665911346673965,
0.08041960746049881,
0.02896883897483349,
0.08159024268388748,
0.13510730862617493,
0.1954859346151352,
0.12110599875450134,
0.06280635297298431,
0.09211201965808868,
0.03420737013220787,
-0.03940824791789055,
-0.18469306826591492,
0.054757021367549896,
-0.04155857861042023,
0.13909243047237396,
-0.005903243087232113,
0.20773659646511078,
0.09876284003257751,
-0.16491828858852386,
0.040423694998025894,
-0.044926419854164124,
-0.09190638363361359,
-0.09496520459651947,
-0.0935189500451088,
-0.092229925096035,
-0.1307905912399292,
-0.008212622255086899,
-0.12765835225582123,
0.04065374284982681,
0.08634558320045471,
0.015318790450692177,
-0.0037932416889816523,
0.12670078873634338,
0.03117521107196808,
0.012896411120891571,
0.06335438787937164,
0.00456209946423769,
-0.036491718143224716,
-0.05482082813978195,
-0.06939190626144409,
0.007319287396967411,
0.012494721449911594,
0.05522109195590019,
-0.016116151586174965,
-0.01898176595568657,
0.04616585001349449,
-0.016131563112139702,
-0.124698705971241,
0.005807411856949329,
0.023478250950574875,
0.06707320362329483,
0.03788590803742409,
0.014469899237155914,
0.010933966375887394,
-0.01505101565271616,
0.2079307734966278,
-0.06920082867145538,
-0.05443746596574783,
-0.12037938088178635,
0.19105473160743713,
0.005988679360598326,
-0.04752713814377785,
0.04116789996623993,
-0.07221537083387375,
-0.004322097171097994,
0.20141533017158508,
0.16932359337806702,
-0.05159028619527817,
-0.019066764041781425,
-0.007202158682048321,
-0.00985703058540821,
-0.019405130296945572,
0.10643720626831055,
0.1116190031170845,
0.03515028581023216,
-0.08650477230548859,
-0.022858794778585434,
-0.06282459199428558,
-0.023016301915049553,
-0.03636128082871437,
0.06888212263584137,
0.017581112682819366,
-0.0020123652648180723,
-0.04233195632696152,
0.06278015673160553,
-0.05678728595376015,
-0.0615081787109375,
0.0008597575360909104,
-0.21028612554073334,
-0.17672351002693176,
-0.005713292863219976,
0.07036618888378143,
-0.009041359648108482,
0.05422920733690262,
-0.008421286940574646,
0.015030512586236,
0.09325800091028214,
-0.018722768872976303,
-0.07280950248241425,
-0.07238148897886276,
0.09424009174108505,
-0.12002142518758774,
0.19202202558517456,
-0.03247294947504997,
0.04042331501841545,
0.1366526186466217,
0.05252491682767868,
-0.12277553975582123,
0.04703446105122566,
0.052849188446998596,
-0.029776589944958687,
0.02724810130894184,
0.12321457266807556,
-0.02910946123301983,
0.07820749282836914,
0.04058581963181496,
-0.1178467869758606,
-0.009153495542705059,
-0.07047828286886215,
-0.009843683801591396,
-0.032664645463228226,
-0.05128484219312668,
-0.04129713028669357,
0.13865798711776733,
0.19200558960437775,
-0.05519097298383713,
-0.013641832396388054,
-0.05785542353987694,
0.013670593500137329,
0.054490551352500916,
0.008246279321610928,
-0.05697433650493622,
-0.2601149380207062,
-0.0015703224344179034,
0.08015796542167664,
-0.002090942580252886,
-0.2723597288131714,
-0.08915103226900101,
0.0004461705684661865,
-0.0533473826944828,
-0.09769853204488754,
0.10214319080114365,
0.0868101418018341,
0.043332163244485855,
-0.06575698405504227,
0.0034831915982067585,
-0.07298175990581512,
0.15983952581882477,
-0.1447276920080185,
-0.07470156252384186
] |
null | null |
transformers
|
hello
|
{}
|
feature-extraction
|
argv947059/example-based-ner-bert
|
[
"transformers",
"pytorch",
"jax",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #bert #feature-extraction #endpoints_compatible #region-us
|
hello
|
[] |
[
"TAGS\n#transformers #pytorch #jax #bert #feature-extraction #endpoints_compatible #region-us \n"
] |
[
32
] |
[
"passage: TAGS\n#transformers #pytorch #jax #bert #feature-extraction #endpoints_compatible #region-us \n"
] |
[
-0.052963756024837494,
-0.0036861018743366003,
-0.009317236952483654,
0.014060239307582378,
0.1265685260295868,
0.04055891931056976,
0.012158291414380074,
0.09425005316734314,
0.07149101793766022,
-0.016153931617736816,
0.11511608213186264,
0.23898901045322418,
-0.028362810611724854,
0.026604674756526947,
-0.05170471966266632,
-0.2719436585903168,
0.06598348915576935,
0.09840699285268784,
-0.038225941359996796,
0.0970769077539444,
0.0554196834564209,
-0.10502330213785172,
0.06711741536855698,
-0.013328155502676964,
-0.15120553970336914,
0.05215353146195412,
0.03010397218167782,
-0.07699708640575409,
0.10851556807756424,
0.015515191480517387,
0.15853559970855713,
0.011645578779280186,
-0.09041590243577957,
-0.160604789853096,
0.03134223073720932,
-0.017968304455280304,
-0.06257034838199615,
0.03479667380452156,
0.0707094818353653,
-0.09866075217723846,
0.031214237213134766,
0.10099188983440399,
0.020305143669247627,
0.0224761925637722,
-0.16484229266643524,
-0.19813960790634155,
-0.04946230351924896,
0.05015429109334946,
0.02250281721353531,
0.08229392021894455,
0.018286142498254776,
0.13435232639312744,
-0.15313594043254852,
0.08182822167873383,
0.21016328036785126,
-0.2993359863758087,
-0.009734644554555416,
0.0593695230782032,
0.1350070834159851,
0.019440416246652603,
-0.032571692019701004,
0.03773166611790657,
-0.006441458128392696,
0.022168122231960297,
0.023778874427080154,
-0.10345342755317688,
-0.04922936111688614,
0.06903901696205139,
-0.09044501185417175,
-0.07425874471664429,
0.21812093257904053,
-0.018261190503835678,
0.05052957311272621,
0.024539193138480186,
-0.08425062149763107,
-0.06177361682057381,
-0.040308937430381775,
-0.02324654348194599,
-0.014144795015454292,
0.05733082816004753,
0.006142587400972843,
-0.02140374295413494,
-0.10499297082424164,
0.01677618734538555,
-0.17896869778633118,
0.19923429191112518,
0.013208188116550446,
0.08352150768041611,
-0.21237951517105103,
0.0449824184179306,
-0.06849394738674164,
-0.09766314178705215,
0.030032433569431305,
-0.0852053165435791,
0.04761315509676933,
0.004622476641088724,
-0.06118087098002434,
0.012963674031198025,
0.04501301050186157,
0.1278681606054306,
0.014213649556040764,
0.029509782791137695,
0.0212479867041111,
0.10761474072933197,
0.027579398825764656,
0.12123031169176102,
0.02797030657529831,
-0.03111332282423973,
0.025293005630373955,
-0.10117059201002121,
-0.03267378360033035,
-0.05524241179227829,
-0.12406694889068604,
-0.029093708842992783,
0.05186808109283447,
0.07624085992574692,
0.03110049106180668,
0.0058425357565283775,
-0.09648735821247101,
-0.022560784593224525,
0.048836782574653625,
-0.0633535087108612,
0.01324373297393322,
-0.01158793643116951,
0.048493556678295135,
0.16013619303703308,
-0.03490329533815384,
-0.01864147186279297,
-0.022512229159474373,
0.08027350902557373,
-0.08937931060791016,
0.012826178222894669,
-0.052831489592790604,
-0.05256495252251625,
0.036838091909885406,
-0.13647215068340302,
0.06195525452494621,
-0.1411474496126175,
-0.09831726551055908,
0.03724878653883934,
0.06510140001773834,
0.0028400777373462915,
0.011090216226875782,
-0.0028051785193383694,
-0.02199988067150116,
0.008028601296246052,
-0.06167469918727875,
-0.0816744863986969,
-0.059597406536340714,
0.1003911942243576,
-0.005042034666985273,
0.06295882910490036,
-0.09823295474052429,
0.08133041858673096,
-0.09697461873292923,
0.030901752412319183,
-0.15964384377002716,
-0.01432819850742817,
-0.01723836548626423,
0.17763751745224,
-0.00034126266837120056,
-0.0548611618578434,
-0.1117730438709259,
0.05009101331233978,
-0.04932554066181183,
0.1541672796010971,
-0.08066514879465103,
-0.13397589325904846,
0.21098418533802032,
-0.10251977294683456,
-0.1840919852256775,
0.061645857989788055,
-0.00545533886179328,
0.0011149892816320062,
0.07106424123048782,
0.203650563955307,
0.07746961712837219,
-0.060603056102991104,
0.08728695660829544,
0.13636207580566406,
-0.10702621191740036,
-0.12712803483009338,
0.03748749569058418,
-0.02914872020483017,
-0.07186932861804962,
0.04315415397286415,
0.021652160212397575,
0.09422970563173294,
-0.07124552875757217,
-0.038790617138147354,
-0.012978962622582912,
-0.010238992050290108,
0.06346900761127472,
0.06290382891893387,
0.10297422111034393,
-0.046184465289115906,
-0.0026994396466761827,
0.008307021111249924,
-0.009622191078960896,
0.01653190515935421,
0.03997422009706497,
-0.060205742716789246,
0.17742055654525757,
-0.07573379576206207,
0.0052375528030097485,
-0.23334336280822754,
-0.07036417722702026,
0.010943070985376835,
0.06390442699193954,
-0.04700079187750816,
0.1549825817346573,
0.09247895330190659,
-0.07577121257781982,
0.012680678628385067,
-0.03353114426136017,
0.08304725587368011,
0.02012883499264717,
-0.03484402596950531,
-0.0750569999217987,
-0.009526349604129791,
-0.07900413125753403,
-0.08899429440498352,
-0.026323657482862473,
-0.01098543033003807,
0.07709052413702011,
0.09981118887662888,
0.020864687860012054,
0.017583217471837997,
-0.06752119958400726,
0.05020664259791374,
-0.02576756849884987,
0.0177655890583992,
0.0949443057179451,
-0.01739528775215149,
-0.04540245607495308,
0.15520338714122772,
-0.09017010033130646,
0.36493057012557983,
0.19001640379428864,
-0.3118160367012024,
0.0143634844571352,
-0.017740126699209213,
-0.00546584976837039,
0.03116120956838131,
0.09335438162088394,
-0.022324539721012115,
0.0862930566072464,
0.014072950929403305,
0.13336621224880219,
-0.03602796047925949,
-0.04512658715248108,
0.00488016102463007,
-0.030854322016239166,
-0.057892415672540665,
0.06719741225242615,
0.07318726181983948,
-0.15898650884628296,
0.1664324402809143,
0.2892304062843323,
0.024658242240548134,
0.13439251482486725,
-0.050064556300640106,
-0.028527043759822845,
0.01088089868426323,
-0.004020090214908123,
-0.04241277277469635,
0.05857431888580322,
-0.2635689377784729,
-0.06348064541816711,
0.05880957841873169,
0.011391952633857727,
0.07413619756698608,
-0.12690965831279755,
-0.0547051876783371,
0.03168385848402977,
0.03254729136824608,
-0.08504929393529892,
0.06544449925422668,
0.044477373361587524,
0.06054355204105377,
0.014742519706487656,
-0.07364493608474731,
0.09875046461820602,
0.0067468550987541676,
-0.04165760800242424,
0.16546687483787537,
-0.1195327416062355,
-0.26084214448928833,
-0.10503862798213959,
-0.15248729288578033,
0.024808121845126152,
0.007801828905940056,
0.08723878115415573,
-0.0696033239364624,
-0.018955839797854424,
0.07113382965326309,
0.0272858627140522,
-0.15711280703544617,
0.0252254419028759,
-0.07713302969932556,
0.03579463064670563,
-0.07968100905418396,
-0.0684952661395073,
-0.07613936811685562,
-0.06434401124715805,
-0.009333625435829163,
0.08535612374544144,
-0.11190655827522278,
0.09602662175893784,
0.11701052635908127,
0.03214438632130623,
0.08658012002706528,
-0.006250035483390093,
0.18954986333847046,
-0.05687757954001427,
-0.07587884366512299,
0.19472064077854156,
-0.03688417375087738,
0.08600619435310364,
0.09849857538938522,
0.05287953093647957,
-0.06082279980182648,
-0.04936293885111809,
-0.06025570631027222,
-0.09542788565158844,
-0.16916604340076447,
-0.06522456556558609,
-0.1508832424879074,
0.020152684301137924,
0.029854416847229004,
0.037862420082092285,
0.10091149806976318,
0.06081030145287514,
0.05603417381644249,
-0.03332297131419182,
-0.02884560078382492,
0.03904317319393158,
0.18016357719898224,
-0.02138795331120491,
0.09862849861383438,
-0.02871086820960045,
-0.10660931468009949,
0.058421771973371506,
0.032121144235134125,
0.20918923616409302,
0.11231514811515808,
0.037705160677433014,
0.05080387368798256,
0.19513967633247375,
0.14172111451625824,
0.14885953068733215,
-0.027547087520360947,
-0.0424746498465538,
-0.020660579204559326,
-0.006827371194958687,
-0.056708768010139465,
0.022592194378376007,
0.17596647143363953,
-0.09328009188175201,
-0.07799528539180756,
-0.21095149219036102,
0.0602993369102478,
0.0701967179775238,
0.030279070138931274,
-0.19828727841377258,
0.019662387669086456,
0.07570156455039978,
-0.0017542234854772687,
-0.05226362124085426,
0.0724397599697113,
-0.02223113924264908,
-0.12630979716777802,
0.0342094711959362,
-0.06242687255144119,
0.11481078714132309,
-0.007219333667308092,
0.07287821173667908,
-0.023485491052269936,
-0.1197328194975853,
0.06598693877458572,
0.06315088272094727,
-0.2035032957792282,
0.265704482793808,
-0.006578056141734123,
-0.05729619786143303,
-0.03779337555170059,
0.004988834261894226,
0.016682634130120277,
0.15234172344207764,
0.13822756707668304,
0.02424781396985054,
-0.0657852366566658,
-0.14051759243011475,
0.046606600284576416,
0.03090650588274002,
0.12475515902042389,
-0.070212721824646,
-0.015670286491513252,
-0.023392077535390854,
-0.01504120696336031,
-0.021317463368177414,
0.0625319704413414,
0.06776363402605057,
-0.15063032507896423,
0.05838163197040558,
-0.056513711810112,
0.03976921737194061,
-0.010568802244961262,
-0.016193414106965065,
-0.04077047109603882,
0.15405818819999695,
-0.03783063217997551,
-0.03903694450855255,
-0.098985455930233,
-0.11961425840854645,
0.11263576149940491,
-0.08763912320137024,
0.08025027066469193,
-0.06742454320192337,
-0.025876687839627266,
-0.06425515562295914,
-0.19912926852703094,
0.137502059340477,
-0.10783692449331284,
0.054419536143541336,
-0.06186164915561676,
0.17661862075328827,
-0.061372656375169754,
0.01232277974486351,
0.022339118644595146,
0.026846999302506447,
-0.12461834400892258,
-0.07176445424556732,
-0.00021531556558329612,
-0.020146500319242477,
0.03834713250398636,
0.03682788088917732,
-0.05799354612827301,
0.04448740556836128,
-0.004165316000580788,
0.05304564908146858,
0.23484887182712555,
0.16569161415100098,
-0.054744821041822433,
0.12290453910827637,
0.13697096705436707,
-0.0373862199485302,
-0.26884889602661133,
-0.07450263947248459,
-0.12268921732902527,
-0.037436116486787796,
0.015440378338098526,
-0.12973135709762573,
0.12852080166339874,
0.026052117347717285,
-0.017302265390753746,
0.12126334756612778,
-0.24278470873832703,
-0.054412320256233215,
0.1429450511932373,
0.019859960302710533,
0.4446322023868561,
-0.12276434898376465,
-0.08678081631660461,
0.016029899939894676,
-0.24718843400478363,
0.1013478934764862,
0.03782957047224045,
0.05960741266608238,
-0.022589823231101036,
0.031811367720365524,
0.03418872132897377,
-0.06166786700487137,
0.11188843846321106,
0.016796844080090523,
0.043888166546821594,
-0.054036349058151245,
-0.13559308648109436,
0.06865550577640533,
-0.01695988141000271,
-0.02015555649995804,
0.030521400272846222,
0.006575232371687889,
-0.16412822902202606,
-0.020895149558782578,
-0.12829890847206116,
0.06838260591030121,
0.024225331842899323,
-0.0330156646668911,
0.014284659177064896,
-0.025629930198192596,
-0.0020842088852077723,
0.019318141043186188,
0.28556373715400696,
-0.023091532289981842,
0.15915152430534363,
0.0036703497171401978,
0.05212036892771721,
-0.20219330489635468,
-0.17958396673202515,
-0.0687621608376503,
-0.04871116578578949,
0.09693709760904312,
-0.051237091422080994,
0.049753233790397644,
0.13456004858016968,
-0.008942127227783203,
0.030212610960006714,
0.1281672716140747,
0.006242772564291954,
-0.019622227177023888,
0.12672145664691925,
-0.19614574313163757,
-0.02772924117743969,
-0.05233310908079147,
-0.049490299075841904,
0.08346356451511383,
0.05978552997112274,
0.07819730043411255,
0.050845369696617126,
-0.031043145805597305,
-0.02332441322505474,
-0.02988717332482338,
-0.07948853820562363,
0.026934944093227386,
0.03525209054350853,
0.040102314203977585,
-0.1379339098930359,
0.02906956896185875,
-0.010940386913716793,
-0.27159202098846436,
-0.05059085786342621,
0.10220116376876831,
-0.10801422595977783,
-0.10470584034919739,
-0.0646820217370987,
0.13403543829917908,
-0.13819628953933716,
-0.011767823249101639,
-0.0421360582113266,
-0.12104002386331558,
0.05875451862812042,
0.20137184858322144,
0.10027112066745758,
0.10308783501386642,
-0.03936046361923218,
-0.0006528248195536435,
0.01760394312441349,
-0.03376733511686325,
0.009661308489739895,
0.01645725406706333,
-0.11603355407714844,
-0.03374771401286125,
-0.006667631212621927,
0.15351681411266327,
-0.07948599010705948,
-0.06356992572546005,
-0.15109962224960327,
0.06373295187950134,
-0.03385986015200615,
-0.09342114627361298,
-0.1244155541062355,
-0.05954677611589432,
0.020039744675159454,
-0.05939866974949837,
-0.04189888387918472,
-0.028911501169204712,
-0.14287447929382324,
0.03040890209376812,
0.015848716720938683,
0.0038695666007697582,
-0.05576401948928833,
-0.03524982929229736,
0.11663132160902023,
-0.06516677141189575,
0.07124289870262146,
0.1888193041086197,
-0.05122585594654083,
0.13516193628311157,
-0.11158259958028793,
-0.15316110849380493,
0.09748540073633194,
0.02599414438009262,
0.07872241735458374,
0.07292717695236206,
0.02696787379682064,
0.05939493700861931,
0.001954560400918126,
0.03438263386487961,
-0.03973422944545746,
-0.13351686298847198,
-0.023773811757564545,
0.02769111841917038,
-0.18662898242473602,
-0.029900524765253067,
-0.0607491061091423,
0.14990295469760895,
0.014299839735031128,
0.11870045214891434,
0.012899700552225113,
0.10260415077209473,
-0.05637417361140251,
0.001854541595093906,
-0.004377702716737986,
-0.19066840410232544,
0.001040599774569273,
-0.07924193888902664,
0.0054010325111448765,
-0.01029987633228302,
0.23242269456386566,
0.001893453299999237,
0.04246624931693077,
0.021678172051906586,
0.02238643169403076,
0.06013426557183266,
0.03037826158106327,
0.24299314618110657,
0.11352360248565674,
-0.05275481194257736,
-0.07847755402326584,
0.10017896443605423,
0.0099342605099082,
0.013257299549877644,
0.10118678957223892,
0.12967033684253693,
0.04960792139172554,
0.102946437895298,
0.047170236706733704,
0.05124982073903084,
-0.10055673867464066,
-0.2252604216337204,
0.020131826400756836,
0.0763651505112648,
0.02886216901242733,
0.09756013751029968,
0.14686918258666992,
-0.027980821207165718,
0.08679910749197006,
-0.024855410680174828,
-0.025126246735453606,
-0.13922882080078125,
-0.050710372626781464,
-0.05487101525068283,
-0.12127167731523514,
0.0038119384553283453,
-0.07388770580291748,
0.00872061774134636,
0.13778604567050934,
0.019346702843904495,
-0.02416999451816082,
0.10666673630475998,
0.06476649641990662,
-0.06081261858344078,
0.06429800391197205,
-0.016318149864673615,
-0.0035494028124958277,
0.030923180282115936,
0.0068810684606432915,
-0.11447969079017639,
-0.09393544495105743,
-0.05622541904449463,
0.014824923127889633,
-0.0998360812664032,
0.00777388783171773,
-0.1129651591181755,
-0.1317947506904602,
-0.037622228264808655,
0.03216177970170975,
-0.07305324822664261,
0.09837225079536438,
-0.007971052080392838,
0.0029706419445574284,
0.0089488560333848,
0.17413225769996643,
-0.0752522423863411,
0.010291093029081821,
-0.014209666289389133,
0.20998892188072205,
0.08818451315164566,
0.10622511804103851,
-0.00010614687198540196,
0.027271725237369537,
-0.04751461744308472,
0.2810079753398895,
0.25682657957077026,
-0.03268571197986603,
0.04118233174085617,
0.08400323987007141,
0.0352952741086483,
0.07457773387432098,
0.08635538071393967,
0.09502308815717697,
0.3094768822193146,
-0.08893507719039917,
-0.01832873374223709,
-0.045276056975126266,
-0.007693660911172628,
-0.08922053873538971,
0.022008083760738373,
0.090583436191082,
-0.051739878952503204,
-0.059487778693437576,
0.09710941463708878,
-0.1514272689819336,
0.12403640151023865,
0.10022277384996414,
-0.2248581200838089,
-0.05263209715485573,
-0.0753798633813858,
0.17728450894355774,
0.005827738903462887,
0.11856024712324142,
-0.04088432341814041,
-0.1279212087392807,
0.05804216116666794,
0.041939619928598404,
-0.2674042284488678,
-0.10722674429416656,
0.11922934651374817,
0.007283462211489677,
0.011069725267589092,
-0.03636489063501358,
-0.007730226498097181,
0.06680086255073547,
0.08091744780540466,
-0.004777153488248587,
0.003635617671534419,
0.034582268446683884,
-0.08894816040992737,
-0.09114518761634827,
0.02295706979930401,
0.01569727063179016,
-0.06678148359060287,
0.03557940572500229,
-0.15736618638038635,
0.038901135325431824,
-0.06494641304016113,
-0.04504818096756935,
0.00997934676706791,
-0.0023578645195811987,
-0.04645739495754242,
0.03953973203897476,
0.0798637866973877,
0.02510100044310093,
-0.04636509716510773,
-0.0499950535595417,
-0.005103605799376965,
0.09507005661725998,
-0.04521242901682854,
-0.1744234263896942,
-0.0458562895655632,
-0.0892651304602623,
0.0930018275976181,
-0.050208285450935364,
-0.08868695795536041,
-0.049340106546878815,
-0.034710593521595,
0.05042168125510216,
-0.12623053789138794,
0.04210792854428291,
0.05483612045645714,
0.04378687962889671,
0.013884893618524075,
-0.05358581990003586,
0.05241890624165535,
0.07315762341022491,
-0.1212339773774147,
-0.07481653988361359
] |
null | null |
transformers
|
# citizenlab/distilbert-base-multilingual-cased-toxicity
This is a multilingual DistilBERT sequence classification model trained on the [JIGSAW Toxic Comment Classification Challenge](https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge) dataset.
## How to use it
```python
from transformers import pipeline
model_path = "citizenlab/distilbert-base-multilingual-cased-toxicity"
toxicity_classifier = pipeline("text-classification", model=model_path, tokenizer=model_path)
toxicity_classifier("this is a lovely message")
> [{'label': 'not_toxic', 'score': 0.9954179525375366}]
toxicity_classifier("you are an idiot and you and your family should go back to your country")
> [{'label': 'toxic', 'score': 0.9948776960372925}]
```
## Evaluation
### Accuracy
```
Accuracy Score = 0.9425
F1 Score (Micro) = 0.9450549450549449
F1 Score (Macro) = 0.8491432341169309
```
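The card reports the metrics but not the evaluation code. Below is a hedged sketch of how such scores could be reproduced with scikit-learn; the example texts, gold labels, and the two-label scheme (`toxic` / `not_toxic`) are taken from or assumed consistent with this card, and a real evaluation would use the full held-out JIGSAW split rather than two sentences.

```python
from sklearn.metrics import accuracy_score, f1_score
from transformers import pipeline

model_path = "citizenlab/distilbert-base-multilingual-cased-toxicity"
toxicity_classifier = pipeline("text-classification", model=model_path, tokenizer=model_path)

# Hypothetical held-out examples with gold labels; replace with the real test split
texts = [
    "this is a lovely message",
    "you are an idiot and you and your family should go back to your country",
]
gold = ["not_toxic", "toxic"]

preds = [out["label"] for out in toxicity_classifier(texts)]

print("Accuracy Score =", accuracy_score(gold, preds))
print("F1 Score (Micro) =", f1_score(gold, preds, average="micro"))
print("F1 Score (Macro) =", f1_score(gold, preds, average="macro"))
```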
|
{"language": ["en", "nl", "fr", "pt", "it", "es", "de", "da", "pl", "af"], "datasets": ["jigsaw_toxicity_pred"], "metrics": ["F1 Accuracy"], "pipeline_type": "text-classification", "widget": [{"text": "this is a lovely message", "example_title": "Example 1", "multi_class": false}, {"text": "you are an idiot and you and your family should go back to your country", "example_title": "Example 2", "multi_class": false}]}
|
text-classification
|
citizenlab/distilbert-base-multilingual-cased-toxicity
|
[
"transformers",
"pytorch",
"distilbert",
"text-classification",
"en",
"nl",
"fr",
"pt",
"it",
"es",
"de",
"da",
"pl",
"af",
"dataset:jigsaw_toxicity_pred",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en",
"nl",
"fr",
"pt",
"it",
"es",
"de",
"da",
"pl",
"af"
] |
TAGS
#transformers #pytorch #distilbert #text-classification #en #nl #fr #pt #it #es #de #da #pl #af #dataset-jigsaw_toxicity_pred #autotrain_compatible #endpoints_compatible #has_space #region-us
|
# citizenlab/distilbert-base-multilingual-cased-toxicity
This is a multilingual DistilBERT sequence classification model trained on the JIGSAW Toxic Comment Classification Challenge dataset.
## How to use it
## Evaluation
### Accuracy
|
[
"# citizenlab/distilbert-base-multilingual-cased-toxicity\n\nThis is multilingual Distil-Bert model sequence classifier trained based on JIGSAW Toxic Comment Classification Challenge dataset.",
"## How to use it",
"## Evaluation",
"### Accuracy"
] |
[
"TAGS\n#transformers #pytorch #distilbert #text-classification #en #nl #fr #pt #it #es #de #da #pl #af #dataset-jigsaw_toxicity_pred #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"# citizenlab/distilbert-base-multilingual-cased-toxicity\n\nThis is multilingual Distil-Bert model sequence classifier trained based on JIGSAW Toxic Comment Classification Challenge dataset.",
"## How to use it",
"## Evaluation",
"### Accuracy"
] |
[
74,
52,
5,
3,
5
] |
[
"passage: TAGS\n#transformers #pytorch #distilbert #text-classification #en #nl #fr #pt #it #es #de #da #pl #af #dataset-jigsaw_toxicity_pred #autotrain_compatible #endpoints_compatible #has_space #region-us \n# citizenlab/distilbert-base-multilingual-cased-toxicity\n\nThis is multilingual Distil-Bert model sequence classifier trained based on JIGSAW Toxic Comment Classification Challenge dataset.## How to use it## Evaluation### Accuracy"
] |
[
-0.043528445065021515,
0.09250355511903763,
-0.00605350686237216,
0.05706646665930748,
0.11232677102088928,
0.03769741952419281,
0.07030988484621048,
0.127118319272995,
0.0418175533413887,
0.062368765473365784,
0.11451038718223572,
0.19328391551971436,
-0.03375384956598282,
0.13624882698059082,
-0.11040008813142776,
-0.16614916920661926,
0.04011867567896843,
0.03409992903470993,
-0.038486093282699585,
0.1528759002685547,
0.12635745108127594,
-0.07688909769058228,
0.08313240110874176,
0.003144853748381138,
-0.13726887106895447,
0.005049902480095625,
0.049904823303222656,
-0.08187147974967957,
0.1128949299454689,
0.018652549013495445,
0.06994432210922241,
0.04357485473155975,
0.013436508364975452,
-0.05361512303352356,
0.043405428528785706,
0.011019106023013592,
-0.023635709658265114,
0.09615162014961243,
0.0467621386051178,
-0.10125169157981873,
0.04673018306493759,
-0.04147644713521004,
0.00033408869057893753,
0.07327248901128769,
-0.11214591562747955,
-0.12336321920156479,
0.014073840342462063,
0.0608084611594677,
0.09977792203426361,
0.07076668739318848,
-0.06641456484794617,
0.2065630704164505,
-0.10359713435173035,
0.08566208928823471,
0.08372620493173599,
-0.15521949529647827,
-0.0621367022395134,
0.1221812441945076,
0.05138850212097168,
-0.05207397788763046,
-0.11085835099220276,
0.06888309866189957,
0.08328752964735031,
0.009358728304505348,
-0.021597053855657578,
-0.10175877064466476,
0.06958801299333572,
-0.03245548903942108,
-0.11547286063432693,
0.06802341341972351,
0.25042492151260376,
0.012322496622800827,
-0.035422489047050476,
-0.02861137129366398,
-0.07401954382658005,
0.03027915209531784,
-0.005405585281550884,
-0.09461882710456848,
-0.05274384468793869,
-0.0012332192854955792,
0.06055168807506561,
0.05185063183307648,
-0.09503497928380966,
-0.0655379667878151,
-0.08677449822425842,
0.2093360424041748,
-0.006899920757859945,
0.01888774149119854,
-0.08589217811822891,
0.016178270801901817,
0.03585006296634674,
-0.034792132675647736,
0.0075468155555427074,
-0.08925023674964905,
-0.08009564131498337,
-0.04228615388274193,
0.03192906454205513,
-0.04606155306100845,
0.14275795221328735,
0.03741423785686493,
0.017528025433421135,
0.00983482412993908,
-0.07655572891235352,
0.027933547273278236,
0.1072576567530632,
0.02343890070915222,
-0.07723092287778854,
0.003848128020763397,
0.004067258909344673,
0.03406296297907829,
-0.00527500594034791,
0.013997743837535381,
-0.10429544746875763,
-0.006819150410592556,
0.07767654210329056,
0.07854840904474258,
0.03452686220407486,
0.09619973599910736,
-0.10972489416599274,
-0.040787287056446075,
0.04970012605190277,
-0.019288692623376846,
0.01910029910504818,
0.08239210397005081,
-0.01288505271077156,
0.1332799345254898,
-0.008001443929970264,
-0.02262195199728012,
-0.031994499266147614,
0.10667815804481506,
-0.09586458653211594,
0.048721618950366974,
-0.038148775696754456,
-0.14531384408473969,
0.004273376427590847,
0.011258864775300026,
-0.01412432361394167,
-0.1341790407896042,
-0.04761962592601776,
-0.057334668934345245,
-0.029213229194283485,
-0.02669401280581951,
-0.04267965257167816,
-0.07119866460561752,
-0.008554059080779552,
0.029697230085730553,
-0.008220139890909195,
-0.08489831537008286,
-0.06704182177782059,
0.07652787864208221,
-0.05021810159087181,
0.1100989580154419,
-0.05753537639975548,
0.014910359866917133,
-0.09779756516218185,
0.014913320541381836,
-0.0018933042883872986,
0.040188729763031006,
-0.03264002874493599,
0.06453700363636017,
-0.10530348867177963,
-0.09066721796989441,
-0.05046504735946655,
-0.03748893737792969,
-0.025511937215924263,
0.2786920368671417,
-0.1890868991613388,
-0.057560332119464874,
0.10200924426317215,
-0.09120836108922958,
-0.07233189046382904,
0.12215718626976013,
-0.009852448478341103,
0.0742698535323143,
0.09478498250246048,
0.09065379947423935,
0.015256843529641628,
-0.12539619207382202,
-0.0274106003344059,
0.07341151684522629,
-0.12016990780830383,
0.09901254624128342,
0.07445652037858963,
-0.021006720140576363,
-0.1335662603378296,
-0.024039635434746742,
0.016092417761683464,
0.05792074650526047,
-0.05518263950943947,
-0.07857408374547958,
0.008253088220953941,
-0.02814675122499466,
0.1719864457845688,
-0.013909018598496914,
0.013278343714773655,
-0.06869872659444809,
-0.044749483466148376,
-0.07886406779289246,
0.11067693680524826,
0.006796380039304495,
0.012276173569262028,
-0.11331183463335037,
0.05974673479795456,
-0.000489444995764643,
0.013423214666545391,
-0.16109532117843628,
0.05920200049877167,
-0.008125567808747292,
0.04908845201134682,
-0.014184241183102131,
0.014508298598229885,
0.02815546654164791,
-0.03794277831912041,
0.00046372247743420303,
-0.02998293749988079,
0.08504340052604675,
0.014745897613465786,
-0.057163987308740616,
-0.22088027000427246,
0.07960247993469238,
-0.053775276988744736,
0.11355217546224594,
-0.14150801301002502,
0.011471141129732132,
0.14525778591632843,
0.012348685413599014,
-0.0504927895963192,
0.04795350134372711,
0.0873265340924263,
-0.029413461685180664,
-0.07156390696763992,
-0.02613399177789688,
0.07703577727079391,
-0.032013989984989166,
-0.13178563117980957,
0.012052539736032486,
-0.04509010165929794,
0.10607855021953583,
0.13394120335578918,
-0.11543754488229752,
-0.05093778669834137,
0.0010581074748188257,
-0.03898351266980171,
0.01395238097757101,
-0.041574474424123764,
-0.03446828946471214,
0.054959557950496674,
0.008257296867668629,
0.04765591770410538,
-0.011405437253415585,
0.0009824116714298725,
-0.019102975726127625,
-0.05428929254412651,
-0.08495309948921204,
0.0923810601234436,
-0.09495782852172852,
-0.2304447740316391,
0.13243073225021362,
0.14591825008392334,
-0.030627144500613213,
0.03998911753296852,
0.07160717248916626,
0.018240321427583694,
-0.034474536776542664,
0.02091037482023239,
-0.0047934390604496,
0.0835907906293869,
-0.19641759991645813,
0.021069863811135292,
0.052891943603754044,
-0.019748806953430176,
-0.03577010706067085,
-0.06087464839220047,
-0.02368423342704773,
-0.06476098299026489,
-0.0006382266874425113,
-0.05799869820475578,
0.0672387182712555,
0.044279489666223526,
0.1631140261888504,
0.026762207970023155,
-0.06187523156404495,
0.05628681182861328,
-0.0011692643165588379,
-0.12125638127326965,
0.21530920267105103,
-0.1487833708524704,
-0.46264225244522095,
-0.012051824480295181,
-0.0812527984380722,
-0.043387386947870255,
0.04732518643140793,
0.04401584342122078,
-0.2027997225522995,
-0.039681628346443176,
0.06756913661956787,
0.14199502766132355,
-0.05060163512825966,
0.02921256795525551,
0.008387435227632523,
0.02129986882209778,
0.027291584759950638,
-0.081594318151474,
-0.014541532844305038,
-0.05738738551735878,
0.012007498182356358,
0.15476413071155548,
-0.10083118081092834,
0.08186998218297958,
0.1539110541343689,
0.0035090541932731867,
0.018622752279043198,
-0.038251638412475586,
0.24245613813400269,
-0.10273264348506927,
0.022640759125351906,
0.1511968970298767,
-0.08704277873039246,
0.02065870724618435,
0.06021866574883461,
0.0336318165063858,
-0.05889369919896126,
0.062380749732255936,
0.03491417318582535,
-0.009794958867132664,
-0.18209682404994965,
-0.14897850155830383,
-0.02265326865017414,
0.022390564903616905,
0.06837394088506699,
0.08659330010414124,
0.03117714636027813,
0.09835194051265717,
0.013879839330911636,
-0.0123372133821249,
0.029397746548056602,
0.06689531356096268,
0.06722009181976318,
-0.07614520937204361,
0.12435559183359146,
-0.011380602605640888,
-0.15145975351333618,
0.10994479805231094,
-0.02121341973543167,
0.02900088205933571,
0.09648209810256958,
0.002369083696976304,
0.0915786400437355,
0.06093719229102135,
0.11061582714319229,
0.09348423033952713,
0.004756214562803507,
-0.03538636490702629,
-0.00004600609463523142,
-0.07727092504501343,
-0.02225426584482193,
0.027086008340120316,
0.09447696805000305,
0.05924151837825775,
0.03379352390766144,
-0.01850433275103569,
0.19159343838691711,
0.03263777121901512,
0.06279769539833069,
-0.17468489706516266,
-0.031116021797060966,
0.06382198631763458,
0.08884226530790329,
-0.08658260107040405,
0.06385013461112976,
0.0012282505631446838,
-0.11560903489589691,
0.0938519835472107,
-0.08995132893323898,
0.12291985005140305,
-0.04216917231678963,
0.06902191787958145,
-0.05967095121741295,
-0.08430895209312439,
-0.031274132430553436,
0.1299951672554016,
-0.42963525652885437,
0.12778645753860474,
0.026451485231518745,
-0.0430547334253788,
-0.10742390900850296,
-0.013782340101897717,
0.07786539196968079,
0.2211105078458786,
0.25387996435165405,
0.03675992414355278,
0.13307052850723267,
-0.09539797157049179,
-0.10871532559394836,
-0.025205904617905617,
0.06354884803295135,
0.03267926350235939,
0.0311077069491148,
0.010307109914720058,
0.013241667300462723,
-0.02222348004579544,
-0.0012151665287092328,
-0.16307903826236725,
-0.05445285141468048,
0.09739396721124649,
0.0415516123175621,
0.05157420039176941,
-0.02740294672548771,
-0.11592774838209152,
-0.08509065210819244,
0.2300994098186493,
-0.17416152358055115,
-0.058672111481428146,
-0.1428634226322174,
0.08596132695674896,
0.04430980235338211,
-0.0708949863910675,
-0.08176197856664658,
-0.040907710790634155,
0.07507048547267914,
-0.058886680752038956,
-0.08823218196630478,
0.13315711915493011,
-0.1096397340297699,
-0.08726296573877335,
-0.04724337160587311,
0.14509589970111847,
0.07447236776351929,
0.09342743456363678,
0.05193211883306503,
-0.009350506588816643,
-0.08223876357078552,
-0.1299595683813095,
0.037782084196805954,
-0.08047953248023987,
-0.00172094174195081,
0.12595145404338837,
-0.10174320638179779,
-0.23301908373832703,
-0.10309113562107086,
0.03632010519504547,
0.19585350155830383,
0.1717841923236847,
-0.046953070908784866,
0.06408100575208664,
0.07991191744804382,
-0.0785544216632843,
-0.29737982153892517,
0.0749068409204483,
-0.02379334717988968,
0.018902983516454697,
-0.09031957387924194,
-0.10811646282672882,
0.07207205146551132,
0.06888636946678162,
-0.04303641617298126,
0.027794741094112396,
-0.20055614411830902,
-0.06950575113296509,
0.24881291389465332,
0.02047686092555523,
0.22736771404743195,
-0.13523359596729279,
-0.04878922179341316,
-0.0920863002538681,
-0.032086070626974106,
0.18153592944145203,
-0.09849514067173004,
0.05238146707415581,
-0.010919596068561077,
-0.02091616950929165,
0.06975754350423813,
-0.01624867133796215,
0.17558491230010986,
0.054385438561439514,
0.05891459435224533,
-0.021521344780921936,
-0.24752670526504517,
0.11680556833744049,
0.048827048391103745,
0.07258763909339905,
0.0849921703338623,
0.026413224637508392,
-0.07750071585178375,
-0.02493695728480816,
-0.09602823853492737,
0.10788794606924057,
-0.03170889616012573,
-0.09800568968057632,
-0.10026037693023682,
0.06741385161876678,
0.002669372595846653,
-0.0063644880428910255,
0.18854524195194244,
-0.07531179487705231,
0.08301125466823578,
0.036559928208589554,
0.17425505816936493,
0.0365273542702198,
-0.074544258415699,
0.06800805032253265,
-0.03068501502275467,
0.07783935219049454,
-0.055430974811315536,
0.0805586501955986,
0.12938132882118225,
0.01159287802875042,
0.1327192187309265,
0.08833202719688416,
-0.04867517575621605,
-0.00493353046476841,
0.11609277129173279,
-0.18206097185611725,
-0.020502354949712753,
-0.050831131637096405,
-0.09963265806436539,
-0.05153648555278778,
0.04451889544725418,
0.11827479302883148,
0.0068493615835905075,
-0.02012653648853302,
0.022953765466809273,
0.029105596244335175,
-0.05130840092897415,
0.15877018868923187,
0.06095835939049721,
0.060197003185749054,
-0.12086566537618637,
0.047615908086299896,
0.07346811890602112,
-0.05927183851599693,
-0.025731544941663742,
0.03512268513441086,
-0.12090619653463364,
-0.07911115139722824,
-0.057099759578704834,
0.1508275717496872,
0.03656313568353653,
-0.04710344970226288,
-0.10098935663700104,
-0.17067550122737885,
0.03375108540058136,
0.09988303482532501,
0.13045579195022583,
0.09024626761674881,
-0.0674382895231247,
-0.07378754019737244,
-0.05311398208141327,
0.06746315211057663,
0.002184771467000246,
-0.0006019394495524466,
-0.10825330764055252,
0.1415976583957672,
-0.007799665909260511,
0.18281322717666626,
-0.106662318110466,
-0.05386640131473541,
-0.1896752119064331,
0.004072589799761772,
-0.032953083515167236,
0.03116178885102272,
-0.10430464148521423,
-0.030047684907913208,
0.004410986322909594,
-0.04119578003883362,
-0.009221401065587997,
0.00511989975348115,
-0.059765227138996124,
0.040852755308151245,
0.029939450323581696,
0.05903373658657074,
-0.08739764988422394,
-0.07407001405954361,
0.062721848487854,
-0.008793327957391739,
0.1399203985929489,
0.0883619636297226,
-0.0999896451830864,
0.008838468231260777,
-0.15228645503520966,
-0.003978121094405651,
0.15210884809494019,
0.029275663197040558,
0.02650274708867073,
-0.1768425852060318,
0.025017080828547478,
0.019473407417535782,
0.061545927077531815,
0.08872876316308975,
0.1423572599887848,
-0.03766702115535736,
0.013558068312704563,
-0.09680065512657166,
-0.07961385697126389,
0.00008756426541367546,
-0.01167035847902298,
0.04515831917524338,
0.028426233679056168,
0.15715579688549042,
-0.046462271362543106,
0.0487116277217865,
-0.06596537679433823,
0.030973054468631744,
-0.034930188208818436,
-0.10550550371408463,
-0.15538813173770905,
-0.015075634233653545,
0.055406179279088974,
-0.028251584619283676,
0.15632006525993347,
0.009013648144900799,
-0.040815286338329315,
0.05763040482997894,
0.10689164698123932,
-0.01699090376496315,
-0.019698308780789375,
0.07194598764181137,
0.009368807077407837,
-0.047426193952560425,
0.01292494684457779,
0.016381047666072845,
0.02769475057721138,
0.023260854184627533,
0.07351010292768478,
0.05991201847791672,
0.08597829937934875,
0.03305250406265259,
-0.028759555891156197,
-0.05092475563287735,
-0.00947593990713358,
-0.05121524631977081,
-0.03955696150660515,
-0.029337313026189804,
0.014126411639153957,
0.06882098317146301,
0.10301748663187027,
-0.07133636623620987,
0.04105246439576149,
-0.05392911657691002,
-0.10404029488563538,
-0.18844859302043915,
-0.10471644997596741,
-0.08531004935503006,
-0.03930225223302841,
-0.020445197820663452,
-0.13488835096359253,
-0.03164878860116005,
0.0029973951168358326,
0.05570230260491371,
-0.049137458205223083,
0.14265038073062897,
-0.12103026360273361,
-0.08822795003652573,
0.13841329514980316,
-0.014521660283207893,
0.03608177229762077,
-0.049249209463596344,
0.047408539801836014,
0.002850794931873679,
-0.03004796989262104,
0.0354013592004776,
-0.007149302400648594,
0.04909814894199371,
0.0031833986286073923,
-0.14088302850723267,
-0.10730977356433868,
-0.016800709068775177,
0.043905891478061676,
0.03624577075242996,
0.18820852041244507,
0.03867929056286812,
-0.01569940336048603,
-0.0038877849001437426,
0.1525755524635315,
-0.038312580436468124,
0.016025515273213387,
-0.09096596390008926,
0.14798910915851593,
-0.0435212105512619,
0.06613113731145859,
0.01907716877758503,
-0.014906700700521469,
-0.00426823366433382,
0.23204940557479858,
0.1993902027606964,
-0.10466143488883972,
0.04365159943699837,
-0.029214050620794296,
0.03870566934347153,
0.09939407557249069,
0.027534743770956993,
0.10314741730690002,
0.0560845285654068,
-0.08708135783672333,
0.02280750498175621,
-0.06593271344900131,
-0.009381345473229885,
-0.047609783709049225,
0.007752171717584133,
0.08181726187467575,
0.0051834541372954845,
-0.08003370463848114,
0.11223414540290833,
-0.11249341070652008,
0.05678668990731239,
-0.02677987329661846,
-0.11696913838386536,
-0.1571592390537262,
-0.0020071077160537243,
-0.08061223477125168,
0.04333208501338959,
0.059667110443115234,
-0.009219865314662457,
0.0010097979102283716,
0.06698373705148697,
0.02161877602338791,
-0.15748056769371033,
-0.0900769904255867,
0.08507439494132996,
-0.04806409776210785,
0.08569414168596268,
0.02899637445807457,
0.08862114697694778,
0.09070495516061783,
-0.07303782552480698,
-0.08842470496892929,
0.07885873317718506,
-0.04394180700182915,
-0.034270770847797394,
0.0450163297355175,
0.01649663597345352,
0.05289585888385773,
0.08738591521978378,
0.0969754159450531,
-0.0961088091135025,
0.011518843472003937,
0.016107643023133278,
-0.07174818217754364,
-0.11932305246591568,
0.1478569358587265,
-0.10100726783275604,
0.04872005432844162,
0.1340559720993042,
-0.04361606761813164,
-0.03082970529794693,
-0.033828288316726685,
0.030741354450583458,
-0.006400391925126314,
0.007876347750425339,
-0.04291346296668053,
-0.1812306046485901,
-0.005871286615729332,
-0.09502407163381577,
0.028228752315044403,
-0.16893887519836426,
-0.046060748398303986,
-0.11822514981031418,
0.009457445703446865,
-0.06335680186748505,
0.07760293781757355,
-0.10029575973749161,
0.016541579738259315,
-0.014985884539783001,
-0.12199696898460388,
0.03732510283589363,
0.08904744684696198,
-0.11331432312726974,
-0.07170100510120392
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-clinc
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the clinc_oos dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7751
- Accuracy: 0.9113
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
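These values map onto 🤗 `TrainingArguments`; below is a hedged sketch of an equivalent configuration (the output directory and evaluation strategy are assumptions, the remaining values are taken from the list above):

```python
from transformers import TrainingArguments

# Reconstruction of the hyperparameters above; Adam betas/epsilon are left at the library defaults.
training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-clinc",
    learning_rate=2e-5,
    per_device_train_batch_size=48,
    per_device_eval_batch_size=48,
    num_train_epochs=5,
    seed=42,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",  # assumption: one validation pass per epoch, as in the results table
)
```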
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 4.315 | 1.0 | 318 | 3.3087 | 0.74 |
| 2.6371 | 2.0 | 636 | 1.8833 | 0.8381 |
| 1.5388 | 3.0 | 954 | 1.1547 | 0.8929 |
| 1.0076 | 4.0 | 1272 | 0.8590 | 0.9071 |
| 0.79 | 5.0 | 1590 | 0.7751 | 0.9113 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.7.1
- Datasets 1.16.1
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["clinc_oos"], "metrics": ["accuracy"], "model-index": [{"name": "distilbert-base-uncased-finetuned-clinc", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "clinc_oos", "type": "clinc_oos", "args": "plus"}, "metrics": [{"type": "accuracy", "value": 0.9112903225806451, "name": "Accuracy"}]}]}]}
|
text-classification
|
arianpasquali/distilbert-base-uncased-finetuned-clinc
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:clinc_oos",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-clinc_oos #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
distilbert-base-uncased-finetuned-clinc
=======================================
This model is a fine-tuned version of distilbert-base-uncased on the clinc\_oos dataset.
It achieves the following results on the evaluation set:
* Loss: 0.7751
* Accuracy: 0.9113
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 48
* eval\_batch\_size: 48
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.11.3
* Pytorch 1.7.1
* Datasets 1.16.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 48\n* eval\\_batch\\_size: 48\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.7.1\n* Datasets 1.16.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-clinc_oos #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 48\n* eval\\_batch\\_size: 48\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.7.1\n* Datasets 1.16.1\n* Tokenizers 0.10.3"
] |
[
70,
98,
4,
30
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-clinc_oos #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 48\n* eval\\_batch\\_size: 48\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.7.1\n* Datasets 1.16.1\n* Tokenizers 0.10.3"
] |
[
-0.09138041734695435,
0.0805753543972969,
-0.0028635701164603233,
0.12939096987247467,
0.15003494918346405,
0.0272093303501606,
0.13568271696567535,
0.11968731135129929,
-0.07655467092990875,
0.01936168596148491,
0.1025620698928833,
0.15764380991458893,
0.03176047280430794,
0.11379195004701614,
-0.07844956964254379,
-0.24371878802776337,
-0.006638345308601856,
0.040245186537504196,
-0.0705924779176712,
0.1290901154279709,
0.09970040619373322,
-0.11111009120941162,
0.09125328809022903,
0.0019949653651565313,
-0.152124285697937,
0.007769706193357706,
-0.00004554802944767289,
-0.06793411076068878,
0.12266729027032852,
0.025695746764540672,
0.10015777498483658,
0.01285749301314354,
0.07834247499704361,
-0.19562390446662903,
0.00940155889838934,
0.04212658479809761,
-0.01596742868423462,
0.07860515266656876,
0.03740369528532028,
0.0042329044081270695,
0.13370776176452637,
-0.08553262054920197,
0.05662371218204498,
0.017439553514122963,
-0.11930418759584427,
-0.21764163672924042,
-0.06304466724395752,
0.0259014293551445,
0.08345761895179749,
0.11432687938213348,
-0.004964624997228384,
0.13548564910888672,
-0.10576966404914856,
0.09410818666219711,
0.2126169055700302,
-0.24987336993217468,
-0.06655324995517731,
0.03920959308743477,
0.024015409871935844,
0.0813831090927124,
-0.10797538608312607,
-0.04847992584109306,
0.03484122082591057,
0.0430210679769516,
0.11798095703125,
-0.03878393769264221,
-0.06779082119464874,
0.018200578168034554,
-0.13906684517860413,
-0.03387003391981125,
0.19263555109500885,
0.0645078793168068,
-0.033311259001493454,
-0.0308873001486063,
-0.060808878391981125,
-0.14734148979187012,
-0.02731049805879593,
-0.005395612213760614,
0.06829450279474258,
-0.02376134693622589,
-0.04111864045262337,
0.00013071736611891538,
-0.10866199433803558,
-0.04122856631875038,
-0.08249568939208984,
0.13417157530784607,
0.025773191824555397,
0.009029910899698734,
-0.026454882696270943,
0.1027301549911499,
0.003652178682386875,
-0.1236182451248169,
0.0032138805836439133,
0.03993876650929451,
0.029039761051535606,
-0.03377452865242958,
-0.060852110385894775,
-0.03831104934215546,
0.033112458884716034,
0.11120054870843887,
-0.03858617693185806,
0.035291511565446854,
0.026304353028535843,
0.04535882920026779,
-0.07453900575637817,
0.1838371455669403,
-0.0042487322352826595,
-0.01624676212668419,
0.012468709610402584,
0.06413116306066513,
0.013484232127666473,
-0.018481070175766945,
-0.11837615072727203,
0.03205317258834839,
0.09370193630456924,
-0.0037362920120358467,
-0.05781980976462364,
0.0539930984377861,
-0.07557294517755508,
-0.03265449404716492,
-0.019680969417095184,
-0.10540199279785156,
0.044291868805885315,
0.004516812041401863,
-0.09010756015777588,
-0.02156106010079384,
0.027906043455004692,
0.0379977710545063,
-0.03698251396417618,
0.09952842444181442,
-0.08508068323135376,
0.033168815076351166,
-0.08488542586565018,
-0.08539974689483643,
0.008650947362184525,
-0.10112956166267395,
0.03673205524682999,
-0.09297113120555878,
-0.17763182520866394,
-0.035069093108177185,
0.06399715691804886,
-0.00939642358571291,
-0.0764317438006401,
-0.08146112412214279,
-0.06763873249292374,
0.005417556036263704,
-0.005328122526407242,
0.10832824558019638,
-0.0670398399233818,
0.10051046311855316,
0.03287584334611893,
0.045526280999183655,
-0.07190293073654175,
0.05836682766675949,
-0.1300978809595108,
0.013440989889204502,
-0.11953581869602203,
0.03460210561752319,
-0.02912457473576069,
0.06803678721189499,
-0.06042545288801193,
-0.1024242639541626,
0.01708337478339672,
-0.0018071929225698113,
0.0458897203207016,
0.08322227746248245,
-0.15967488288879395,
-0.07766646146774292,
0.12280578911304474,
-0.06160468980669975,
-0.12048491090536118,
0.1212378665804863,
-0.059183068573474884,
0.03640870004892349,
0.05840636417269707,
0.18058381974697113,
0.06337533891201019,
-0.06723523139953613,
0.01873442903161049,
-0.010065575130283833,
0.0754593014717102,
-0.05319252982735634,
0.09928726404905319,
0.009331961162388325,
0.014551841653883457,
0.029731445014476776,
-0.03699113056063652,
0.02831087075173855,
-0.08045557141304016,
-0.10642652213573456,
-0.04232000187039375,
-0.08492571115493774,
0.024975188076496124,
0.07237349450588226,
0.06601578742265701,
-0.10883358120918274,
-0.06983937323093414,
0.026106707751750946,
0.08973378688097,
-0.05606304854154587,
0.02011694945394993,
-0.06786616891622543,
0.08158467710018158,
-0.030806565657258034,
-0.01417283434420824,
-0.1684342920780182,
-0.010276706889271736,
0.01476363092660904,
0.008755004033446312,
0.029413266107439995,
0.03907253220677376,
0.06269369274377823,
0.0693918764591217,
-0.03699866682291031,
-0.024323882535099983,
-0.043219443410634995,
-0.002666828688234091,
-0.11158555001020432,
-0.1934543251991272,
-0.024845773354172707,
-0.020113319158554077,
0.1601647436618805,
-0.2229793518781662,
0.05032581835985184,
-0.0034734560176730156,
0.08104207366704941,
0.014351666904985905,
-0.00861679669469595,
-0.055948350578546524,
0.0783609002828598,
-0.04562331736087799,
-0.05057571455836296,
0.07197672873735428,
0.014401257038116455,
-0.09887449443340302,
-0.08540119230747223,
-0.10181254148483276,
0.20228588581085205,
0.13523894548416138,
-0.10563614219427109,
-0.045815709978342056,
-0.01556231826543808,
-0.07541853934526443,
-0.02810012362897396,
-0.05129619687795639,
0.04018574208021164,
0.20527848601341248,
-0.021164244040846825,
0.13223430514335632,
-0.07652727514505386,
-0.03223755955696106,
0.023059172555804253,
-0.05025073140859604,
0.008856070227921009,
0.13914839923381805,
0.11014259606599808,
-0.09420570731163025,
0.15698878467082977,
0.1691674441099167,
-0.0755954384803772,
0.12973907589912415,
-0.04742017388343811,
-0.05505235493183136,
-0.029616190120577812,
-0.026151981204748154,
-0.01248571090400219,
0.08783125132322311,
-0.16944481432437897,
0.012471129186451435,
0.018955066800117493,
0.015332058072090149,
0.01617797091603279,
-0.22060835361480713,
-0.03633292019367218,
0.04989897087216377,
-0.030726471915841103,
-0.031014766544103622,
-0.023952294141054153,
0.005629970226436853,
0.09882442653179169,
-0.012041806243360043,
-0.1110035851597786,
0.05716637521982193,
0.0019096829928457737,
-0.06994529813528061,
0.20532062649726868,
-0.07991626858711243,
-0.1669243574142456,
-0.12246154993772507,
-0.062291450798511505,
-0.07827311754226685,
0.013519013300538063,
0.07287314534187317,
-0.07165572792291641,
-0.031024837866425514,
-0.08897244930267334,
0.022916574031114578,
0.003586068982258439,
0.025875844061374664,
0.03193225339055061,
0.013513456098735332,
0.06959434598684311,
-0.1006019338965416,
-0.03548674285411835,
-0.043558910489082336,
-0.06811831146478653,
0.04083035886287689,
0.02907492220401764,
0.11593692749738693,
0.12282074242830276,
-0.017757505178451538,
0.0052284179255366325,
-0.006445193197578192,
0.2188936173915863,
-0.06538786739110947,
-0.03767562657594681,
0.13506849110126495,
-0.008490603417158127,
0.049359746277332306,
0.10544552654027939,
0.06410159915685654,
-0.08354206383228302,
0.002667500637471676,
0.03824783116579056,
-0.031214412301778793,
-0.22852708399295807,
-0.03807375580072403,
-0.061770547181367874,
-0.0026551245246082544,
0.09277717024087906,
0.03765948861837387,
0.05425923317670822,
0.0678682029247284,
0.04744759574532509,
0.09855607897043228,
-0.02257356606423855,
0.05111444368958473,
0.1256592720746994,
0.05198384448885918,
0.1076817587018013,
-0.02002089098095894,
-0.06468695402145386,
0.04761088639497757,
-0.013859082944691181,
0.20548518002033234,
0.019321952015161514,
0.12158339470624924,
0.04487380385398865,
0.16067519783973694,
-0.025608684867620468,
0.06624076515436172,
0.003261699341237545,
-0.01850913278758526,
-0.022449037060141563,
-0.03203939273953438,
-0.03517113998532295,
0.032895442098379135,
-0.04401647299528122,
0.07510165125131607,
-0.13952386379241943,
0.020221827551722527,
0.05564289540052414,
0.24364475905895233,
0.005812858231365681,
-0.3302593231201172,
-0.07775919139385223,
0.01652088761329651,
-0.04157378524541855,
-0.023758435621857643,
0.044077400118112564,
0.07778344303369522,
-0.08811014145612717,
0.02548827789723873,
-0.048633966594934464,
0.10228454321622849,
-0.056084033101797104,
0.05172562226653099,
0.07232960313558578,
0.08953598141670227,
0.013801504857838154,
0.09524087607860565,
-0.3063105344772339,
0.257615864276886,
-0.005452238954603672,
0.06926773488521576,
-0.08372007310390472,
0.004010161384940147,
0.02735704556107521,
0.06631467491388321,
0.07542572915554047,
-0.010026910342276096,
-0.01949060522019863,
-0.18119101226329803,
-0.06964803487062454,
0.03670208528637886,
0.0672854334115982,
-0.08252174407243729,
0.08539514243602753,
-0.034309305250644684,
0.011084061115980148,
0.057016003876924515,
-0.0009222693624906242,
-0.04478457570075989,
-0.09057004749774933,
0.0047212871722877026,
0.0563017912209034,
-0.02832445502281189,
-0.06813856959342957,
-0.10386946052312851,
-0.10620477795600891,
0.15005309879779816,
-0.014608322642743587,
-0.028952764347195625,
-0.1040809229016304,
0.08074650913476944,
0.06834498792886734,
-0.07890157401561737,
0.005493413656949997,
0.012706833891570568,
0.06603964418172836,
0.03564617782831192,
-0.0701473206281662,
0.11375871300697327,
-0.0633968710899353,
-0.16838675737380981,
-0.06253721565008163,
0.11978419870138168,
0.03195018321275711,
0.07264921069145203,
-0.01310057658702135,
0.006398954428732395,
-0.04820280522108078,
-0.07766451686620712,
0.026482023298740387,
-0.002049738774076104,
0.07207487523555756,
0.0338967964053154,
-0.04401716962456703,
0.006689238827675581,
-0.07655426859855652,
-0.047813981771469116,
0.1702144294977188,
0.23477478325366974,
-0.06755532324314117,
0.02530568465590477,
0.03148984536528587,
-0.08048218488693237,
-0.14990460872650146,
0.01977430284023285,
0.03818182274699211,
0.014516953378915787,
0.035591769963502884,
-0.16418346762657166,
0.1259060502052307,
0.11164569854736328,
-0.008119086734950542,
0.11676328629255295,
-0.33048009872436523,
-0.1156536340713501,
0.13647739589214325,
0.13495180010795593,
0.15046323835849762,
-0.14858724176883698,
-0.008772564120590687,
-0.02127898670732975,
-0.13238732516765594,
0.12384499609470367,
-0.09748832136392593,
0.11656691879034042,
-0.03729773312807083,
0.08544103801250458,
0.012981198728084564,
-0.04858189821243286,
0.13387809693813324,
0.027929922565817833,
0.09668140858411789,
-0.0891759991645813,
-0.03658219054341316,
0.03610789403319359,
-0.03315548226237297,
0.011328941211104393,
-0.10117870569229126,
0.02600053697824478,
-0.12073266506195068,
-0.03016655519604683,
-0.06185861676931381,
0.03402210399508476,
-0.042123906314373016,
-0.058065228164196014,
-0.026115383952856064,
0.03271622583270073,
0.0689224898815155,
0.0009181797504425049,
0.16149505972862244,
0.033018309623003006,
0.14010249078273773,
0.10872470587491989,
0.0747959315776825,
-0.07064488530158997,
-0.07420412451028824,
-0.03313163295388222,
0.0009204995003528893,
0.05026875436306,
-0.11466356366872787,
0.022794274613261223,
0.16031284630298615,
0.011153800413012505,
0.15814706683158875,
0.09086782485246658,
-0.00456122774630785,
0.004688995890319347,
0.047746866941452026,
-0.17170289158821106,
-0.07839952409267426,
-0.028609467670321465,
-0.04824419692158699,
-0.11799491941928864,
0.052338071167469025,
0.10723426192998886,
-0.07279065251350403,
-0.00502513162791729,
-0.0075942096300423145,
0.04029371216893196,
-0.07215652614831924,
0.17291119694709778,
0.04173655062913895,
0.04940951243042946,
-0.09325510263442993,
0.07401158660650253,
0.07456694543361664,
-0.07701466977596283,
0.0008211841923184693,
0.06657659262418747,
-0.06998362392187119,
-0.046833720058202744,
0.07540878653526306,
0.1959739774465561,
-0.04829135164618492,
-0.06125994771718979,
-0.16086845099925995,
-0.13316765427589417,
0.0826062336564064,
0.12619538605213165,
0.11337549239397049,
0.01468501053750515,
-0.06088395044207573,
-0.011933774687349796,
-0.11358354240655899,
0.08061899989843369,
0.05107147991657257,
0.06027902662754059,
-0.14503636956214905,
0.10368268936872482,
-0.00803628284484148,
0.03713759034872055,
-0.011185670271515846,
0.015229119919240475,
-0.11059186607599258,
0.009911098517477512,
-0.08136764913797379,
-0.01440772321075201,
-0.019227415323257446,
0.024735944345593452,
0.008570403791964054,
-0.07486236840486526,
-0.05627976730465889,
0.021135935559868813,
-0.11222108453512192,
-0.02972632832825184,
0.037513185292482376,
0.06982113420963287,
-0.10221957415342331,
-0.056716278195381165,
0.024479243904352188,
-0.06676405668258667,
0.06275201588869095,
0.061960022896528244,
0.002516824286431074,
0.025019239634275436,
-0.15362516045570374,
0.03095293790102005,
0.05329974368214607,
0.03270817920565605,
0.06129838898777962,
-0.09503205865621567,
-0.00919845886528492,
0.021575745195150375,
0.0269145630300045,
0.01561509259045124,
0.08325794339179993,
-0.14407804608345032,
-0.016161618754267693,
-0.022633908316493034,
-0.113322414457798,
-0.05917361378669739,
0.013924959115684032,
0.09666340053081512,
0.023074597120285034,
0.21381042897701263,
-0.058116670697927475,
0.05853710323572159,
-0.2076190859079361,
0.006651587318629026,
0.010066579096019268,
-0.10631588101387024,
-0.10321011394262314,
-0.07968917489051819,
0.058745965361595154,
-0.05610688403248787,
0.13649579882621765,
0.04805351421236992,
0.06526324898004532,
0.020273346453905106,
-0.021262837573885918,
0.029343055561184883,
0.015521637164056301,
0.18684589862823486,
0.03724370896816254,
-0.040960561484098434,
0.07919855415821075,
0.016703173518180847,
0.11161557585000992,
0.10727938264608383,
0.19505074620246887,
0.13095954060554504,
0.005727855954319239,
0.09860370308160782,
0.041322723031044006,
-0.04261166974902153,
-0.16598188877105713,
0.04708090424537659,
-0.010842010378837585,
0.1053398922085762,
-0.03109952248632908,
0.18934711813926697,
0.05366265028715134,
-0.16455040872097015,
0.03181226924061775,
-0.0609547384083271,
-0.0822053924202919,
-0.1061236634850502,
-0.06070522218942642,
-0.09377767890691757,
-0.13932274281978607,
-0.0013744333991780877,
-0.11516912281513214,
0.01813335344195366,
0.09314875304698944,
0.003703972790390253,
-0.02999083139002323,
0.1436864584684372,
-0.0007016521412879229,
0.030763693153858185,
0.06311448663473129,
-0.013636979274451733,
-0.04229075834155083,
-0.11183398962020874,
-0.08885687589645386,
-0.028294241055846214,
-0.02697620913386345,
0.02423437498509884,
-0.06069353222846985,
-0.03208727762103081,
0.02641758881509304,
-0.03216124698519707,
-0.09518595039844513,
0.007488463073968887,
-0.00716990465298295,
0.05117865279316902,
0.050729911774396896,
0.0176183320581913,
0.02120930328965187,
0.008617808111011982,
0.21589164435863495,
-0.06918572634458542,
-0.06362909078598022,
-0.0979401022195816,
0.19417332112789154,
0.03967155143618584,
-0.03929399326443672,
0.046784624457359314,
-0.07154209166765213,
-0.004279704764485359,
0.21986497938632965,
0.19108986854553223,
-0.07771185040473938,
-0.009312240406870842,
0.010671033523976803,
-0.005507312715053558,
-0.014733809977769852,
0.09155414253473282,
0.1304141730070114,
0.03928708657622337,
-0.09262542426586151,
-0.042104434221982956,
-0.061453741043806076,
0.002514971187338233,
-0.035834673792123795,
0.054342735558748245,
0.04158032685518265,
0.01402977854013443,
-0.02849416248500347,
0.044021181762218475,
-0.06971363723278046,
-0.09259422868490219,
0.06720462441444397,
-0.214630126953125,
-0.1521640568971634,
-0.032324858009815216,
0.11940078437328339,
0.004842227324843407,
0.06404096633195877,
-0.02574492059648037,
-0.01525439228862524,
0.07242799550294876,
-0.01510231476277113,
-0.09351157397031784,
-0.06724538654088974,
0.09838158637285233,
-0.10707185417413712,
0.2149675339460373,
-0.04865894094109535,
0.08043201267719269,
0.11384329944849014,
0.0669390857219696,
-0.05725216493010521,
0.0632365271449089,
0.041711077094078064,
-0.03977005183696747,
0.03177797049283981,
0.07317148894071579,
-0.03564782813191414,
0.08167179673910141,
0.054301630705595016,
-0.11165003478527069,
0.008863674476742744,
-0.05003153532743454,
-0.04483479633927345,
-0.0249751228839159,
-0.029451614245772362,
-0.07548936456441879,
0.12181548774242401,
0.20842774212360382,
-0.03103221021592617,
-0.020288696512579918,
-0.07178337126970291,
0.03904557600617409,
0.05135827884078026,
-0.0031759419944137335,
-0.056065041571855545,
-0.20147526264190674,
0.0005199115839786828,
0.0507674440741539,
-0.016872117295861244,
-0.23222646117210388,
-0.09912046045064926,
-0.003989826887845993,
-0.07950137555599213,
-0.10785648971796036,
0.04982390254735947,
0.08939054608345032,
0.0371047779917717,
-0.07409662008285522,
-0.051582928746938705,
-0.07553485780954361,
0.1490384340286255,
-0.137285515666008,
-0.07560700178146362
] |
null | null |
transformers
|
# citizenlab/twitter-xlm-roberta-base-sentiment-finetunned
This is a multilingual XLM-RoBERTa sequence classifier fine-tuned from the [Cardiff NLP Group](cardiffnlp/twitter-roberta-base-sentiment) sentiment classification model.
## How to use it
```python
from transformers import pipeline
model_path = "citizenlab/twitter-xlm-roberta-base-sentiment-finetunned"
sentiment_classifier = pipeline("text-classification", model=model_path, tokenizer=model_path)
sentiment_classifier("this is a lovely message")
> [{'label': 'Positive', 'score': 0.9918450713157654}]
sentiment_classifier("you are an idiot and you and your family should go back to your country")
> [{'label': 'Negative', 'score': 0.9849833846092224}]
```
## Evaluation
```
              precision    recall  f1-score   support

    Negative       0.57      0.14      0.23        28
     Neutral       0.78      0.94      0.86       132
    Positive       0.89      0.80      0.85        51

    accuracy                           0.80       211
   macro avg       0.75      0.63      0.64       211
weighted avg       0.78      0.80      0.77       211
```
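The figures above have the shape of scikit-learn's `classification_report`; a sketch of how such a report could be regenerated on a labelled evaluation set (the texts and gold labels below are placeholders, not the actual 211-example set):

```python
from sklearn.metrics import classification_report
from transformers import pipeline

model_path = "citizenlab/twitter-xlm-roberta-base-sentiment-finetunned"
sentiment_classifier = pipeline("text-classification", model=model_path, tokenizer=model_path)

# Placeholder evaluation data with Negative/Neutral/Positive gold labels.
eval_texts = ["this is a lovely message", "you are an idiot and you and your family should go back to your country"]
eval_labels = ["Positive", "Negative"]

predictions = [output["label"] for output in sentiment_classifier(eval_texts)]
print(classification_report(eval_labels, predictions))
```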
|
{"language": ["en", "nl", "fr", "pt", "it", "es", "de", "da", "pl", "af"], "datasets": ["jigsaw_toxicity_pred"], "metrics": ["F1 Accuracy"], "pipeline_type": "text-classification", "widget": [{"text": "this is a lovely message", "example_title": "Example 1", "multi_class": false}, {"text": "you are an idiot and you and your family should go back to your country", "example_title": "Example 2", "multi_class": false}]}
|
text-classification
|
citizenlab/twitter-xlm-roberta-base-sentiment-finetunned
|
[
"transformers",
"pytorch",
"xlm-roberta",
"text-classification",
"en",
"nl",
"fr",
"pt",
"it",
"es",
"de",
"da",
"pl",
"af",
"dataset:jigsaw_toxicity_pred",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en",
"nl",
"fr",
"pt",
"it",
"es",
"de",
"da",
"pl",
"af"
] |
TAGS
#transformers #pytorch #xlm-roberta #text-classification #en #nl #fr #pt #it #es #de #da #pl #af #dataset-jigsaw_toxicity_pred #autotrain_compatible #endpoints_compatible #has_space #region-us
|
# citizenlab/twitter-xlm-roberta-base-sentiment-finetunned
This is a multilingual XLM-RoBERTa sequence classifier fine-tuned from the Cardiff NLP Group sentiment classification model.
## How to use it
## Evaluation
|
[
"# citizenlab/twitter-xlm-roberta-base-sentiment-finetunned\n\nThis is multilingual XLM-Roberta model sequence classifier fine tunned and based on Cardiff NLP Group sentiment classification model.",
"## How to use it",
"## Evaluation"
] |
[
"TAGS\n#transformers #pytorch #xlm-roberta #text-classification #en #nl #fr #pt #it #es #de #da #pl #af #dataset-jigsaw_toxicity_pred #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"# citizenlab/twitter-xlm-roberta-base-sentiment-finetunned\n\nThis is multilingual XLM-Roberta model sequence classifier fine tunned and based on Cardiff NLP Group sentiment classification model.",
"## How to use it",
"## Evaluation"
] |
[
76,
52,
5,
3
] |
[
"passage: TAGS\n#transformers #pytorch #xlm-roberta #text-classification #en #nl #fr #pt #it #es #de #da #pl #af #dataset-jigsaw_toxicity_pred #autotrain_compatible #endpoints_compatible #has_space #region-us \n# citizenlab/twitter-xlm-roberta-base-sentiment-finetunned\n\nThis is multilingual XLM-Roberta model sequence classifier fine tunned and based on Cardiff NLP Group sentiment classification model.## How to use it## Evaluation"
] |
[
-0.07355473190546036,
0.09789261966943741,
-0.004880743566900492,
0.07343786954879761,
0.10009933263063431,
0.03665481135249138,
0.04076831415295601,
0.1267467588186264,
0.06555552035570145,
0.08058282732963562,
0.09387864917516708,
0.15701965987682343,
-0.027439160272479057,
0.07825560867786407,
-0.11642365902662277,
-0.22030822932720184,
0.022944042459130287,
0.03503154218196869,
-0.04024698585271835,
0.15522290766239166,
0.11863904446363449,
-0.0791923999786377,
0.1223057359457016,
-0.007335760165005922,
-0.1125413030385971,
0.026401596143841743,
0.0327923484146595,
-0.06605958193540573,
0.1251624971628189,
0.06991368532180786,
0.05415892228484154,
0.06645414233207703,
0.020458433777093887,
-0.09224328398704529,
0.05780697613954544,
-0.004318479914218187,
-0.061447594314813614,
0.09381797164678574,
0.06395004689693451,
-0.13898883759975433,
0.15945976972579956,
-0.023767024278640747,
0.003960215486586094,
0.0507182702422142,
-0.1143740564584732,
-0.10106083005666733,
0.0002490808838047087,
0.07461346685886383,
0.0380137674510479,
0.05550893396139145,
-0.06624121963977814,
0.21726752817630768,
-0.1628420352935791,
0.07070889323949814,
0.11833659559488297,
-0.10107364505529404,
-0.04601924121379852,
0.11475519090890884,
0.07134070992469788,
0.00868235807865858,
-0.08944462239742279,
0.08838943392038345,
0.0542939193546772,
0.0016719111008569598,
-0.02855333499610424,
-0.08998166024684906,
0.032947659492492676,
0.009881802834570408,
-0.10673356056213379,
0.0508287288248539,
0.27736717462539673,
0.04509453848004341,
-0.013084739446640015,
-0.011446642689406872,
-0.06921160966157913,
0.008814836852252483,
0.023925909772515297,
-0.09516818076372147,
-0.03900548443198204,
0.007003914099186659,
0.017411526292562485,
0.06881660223007202,
-0.07549148052930832,
0.01089942641556263,
-0.17318706214427948,
0.2565314471721649,
-0.02490047924220562,
0.02654421702027321,
-0.06429409235715866,
0.04445617273449898,
-0.0070789884775877,
-0.018797099590301514,
-0.0036567721981555223,
-0.06628189235925674,
-0.027482567355036736,
-0.03175743669271469,
0.008934089913964272,
0.03682093322277069,
0.08865317702293396,
0.01429720688611269,
0.02783251740038395,
-0.0011965497396886349,
-0.015992892906069756,
0.03709576651453972,
0.17880934476852417,
0.059782177209854126,
-0.09990211576223373,
-0.05456594005227089,
0.010656501166522503,
0.016748277470469475,
-0.009630405344069004,
0.010709799826145172,
-0.12284601479768753,
-0.008919542655348778,
0.021971948444843292,
0.06804971396923065,
0.03403453156352043,
0.1405096799135208,
-0.09842011332511902,
-0.05225122347474098,
0.01046613696962595,
-0.017236370593309402,
0.07546259462833405,
0.06902982294559479,
-0.05410664901137352,
0.17366532981395721,
-0.025433573871850967,
-0.021395642310380936,
-0.014933931641280651,
0.04434815049171448,
-0.05358649790287018,
0.011249052360653877,
-0.0347776934504509,
-0.13252606987953186,
0.030050039291381836,
-0.0015209594275802374,
0.005530071910470724,
-0.15625810623168945,
-0.1062157154083252,
-0.05896591767668724,
-0.03338911756873131,
-0.05644087865948677,
-0.05899394303560257,
-0.10165071487426758,
0.01215664204210043,
0.03705652058124542,
-0.016138633713126183,
-0.0790756344795227,
-0.057403698563575745,
0.07028412818908691,
-0.014944193884730339,
0.10176635533571243,
-0.09795556217432022,
0.04038158431649208,
-0.13387030363082886,
0.00836128368973732,
0.004057518672198057,
0.04608410224318504,
-0.007337486371397972,
0.06652597337961197,
-0.057078663259744644,
-0.06849053502082825,
-0.037341151386499405,
0.02076598070561886,
-0.07242433726787567,
0.18148642778396606,
-0.13770543038845062,
-0.09331386536359787,
0.012110639363527298,
-0.07758685201406479,
-0.04301004856824875,
0.11883034557104111,
-0.023997146636247635,
0.09913644194602966,
0.09286338090896606,
0.07117757946252823,
0.06470644474029541,
-0.09901223331689835,
-0.05475170537829399,
0.06349717080593109,
-0.1467125117778778,
0.02480502612888813,
0.04859127849340439,
0.054039616137742996,
-0.11762861162424088,
0.02271655574440956,
-0.05522095039486885,
0.07584881782531738,
-0.05064533278346062,
-0.06292858719825745,
0.005260436329990625,
-0.008161996491253376,
0.1141897514462471,
0.02389952912926674,
0.008270573802292347,
-0.084539495408535,
-0.0548550970852375,
-0.1120811402797699,
0.12204272300004959,
-0.003713166108354926,
0.009877029806375504,
-0.10143649578094482,
0.124942347407341,
-0.005304789170622826,
0.004812205675989389,
-0.1545138657093048,
0.0462774820625782,
-0.020136501640081406,
0.07172729820013046,
0.02464182674884796,
0.11848737299442291,
-0.0007192945922724903,
-0.06751653552055359,
-0.02344108559191227,
-0.018262427300214767,
0.10761834681034088,
-0.015239033848047256,
-0.036306533962488174,
-0.25340282917022705,
0.07194072753190994,
-0.0666041150689125,
0.06324002146720886,
-0.16628381609916687,
0.004566605668514967,
0.151316300034523,
0.03568893298506737,
-0.06619523465633392,
0.09117943048477173,
0.028626035898923874,
0.018588054925203323,
-0.11097630858421326,
0.00879274494946003,
0.10042901337146759,
-0.05139584466814995,
-0.09898063540458679,
0.06784085184335709,
-0.10450784862041473,
0.09314601868391037,
0.16275587677955627,
-0.12580564618110657,
-0.04163386672735214,
0.018018407747149467,
-0.0349917896091938,
0.056095611304044724,
-0.07536032795906067,
0.021531667560338974,
0.09710856527090073,
-0.04156454652547836,
0.07406596839427948,
-0.038653429597616196,
-0.02546161413192749,
-0.01586642675101757,
-0.03857618570327759,
-0.09740667045116425,
0.13208889961242676,
0.03736297786235809,
-0.18234467506408691,
0.15076914429664612,
0.12338146567344666,
-0.00583491800352931,
0.06707996129989624,
0.07362239807844162,
0.0487096831202507,
-0.05980098247528076,
-0.030050626024603844,
-0.03685876354575157,
0.07541104406118393,
-0.12029492110013962,
-0.007860536687076092,
0.03768705204129219,
-0.025301726534962654,
-0.02588345855474472,
-0.08894701302051544,
-0.0364471934735775,
-0.014183253049850464,
-0.008780230768024921,
-0.060107432305812836,
0.0705183893442154,
0.04266003146767616,
0.15105539560317993,
-0.011468876153230667,
-0.09490357339382172,
0.0546356700360775,
0.005468455608934164,
-0.12470093369483948,
0.1629210114479065,
-0.12270425260066986,
-0.3988935947418213,
-0.0019868649542331696,
-0.08959189057350159,
-0.05576683580875397,
0.0198503490537405,
0.009747808799147606,
-0.1415432095527649,
-0.0434085875749588,
0.0020824335515499115,
0.15718260407447815,
-0.09316366165876389,
0.025176797062158585,
-0.06248662248253822,
0.03623546287417412,
-0.010408061556518078,
-0.0721147432923317,
-0.006394932512193918,
-0.06754341721534729,
0.02667437121272087,
0.11078871786594391,
-0.10208254307508469,
0.08483728766441345,
0.18061767518520355,
-0.013087787665426731,
0.03163199871778488,
-0.052201271057128906,
0.1949097067117691,
-0.10812436044216156,
0.016643362119793892,
0.04591584578156471,
-0.020301764830946922,
0.012840821407735348,
0.13593606650829315,
0.02698153629899025,
-0.06430580466985703,
0.019249293953180313,
0.020489389076828957,
-0.04340844601392746,
-0.1564473807811737,
-0.16104042530059814,
-0.03861279785633087,
0.0637635812163353,
0.05481071397662163,
0.10756722837686539,
0.05609078332781792,
0.09809548407793045,
0.025066383183002472,
-0.061653461307287216,
-0.0755973607301712,
0.052474360913038254,
0.14666511118412018,
-0.061533331871032715,
0.10801738500595093,
-0.031113160774111748,
-0.11809401214122772,
0.12672150135040283,
-0.01816261000931263,
0.015597722493112087,
0.05002490431070328,
-0.01999969407916069,
0.050477784126996994,
0.1207447350025177,
0.13111665844917297,
0.04202566668391228,
-0.0031003993935883045,
-0.04259849712252617,
-0.029309310019016266,
-0.03207335248589516,
0.0003610870335251093,
0.022527102380990982,
0.13863638043403625,
0.06872311979532242,
-0.03884119912981987,
-0.09859420359134674,
0.14818422496318817,
0.010589439421892166,
0.027223657816648483,
-0.19725143909454346,
0.0038279027212411165,
0.06455715745687485,
0.04945920407772064,
-0.025801051408052444,
0.03741633892059326,
-0.000017205593394464813,
-0.09805861115455627,
0.09402047097682953,
-0.030582968145608902,
0.10814865678548813,
-0.06804399192333221,
0.04653623327612877,
-0.04956313595175743,
-0.009669827297329903,
-0.01905916817486286,
0.09673884510993958,
-0.2604445219039917,
0.21382196247577667,
0.022958287969231606,
-0.05292888358235359,
-0.06766956299543381,
-0.049291256815195084,
0.056770067662000656,
0.26977017521858215,
0.165667325258255,
0.04768870025873184,
-0.039878420531749725,
-0.11700894683599472,
-0.04704606533050537,
0.030254559591412544,
0.0523039773106575,
0.009765112772583961,
0.06493844836950302,
0.006204685661941767,
0.020107116550207138,
0.010288308374583721,
-0.006283747032284737,
-0.06509356200695038,
-0.07059929519891739,
0.06520508974790573,
0.09216282516717911,
0.004165992606431246,
0.01713692396879196,
-0.1149783730506897,
-0.14282694458961487,
0.21079187095165253,
-0.13443855941295624,
-0.0015691830776631832,
-0.15171070396900177,
0.11300501972436905,
-0.019088583067059517,
-0.07552608102560043,
-0.13132119178771973,
-0.029208136722445488,
0.033477991819381714,
-0.048582956194877625,
-0.13645531237125397,
0.12047374248504639,
-0.08607643097639084,
-0.1138215959072113,
-0.06127497926354408,
0.13422968983650208,
0.08774668723344803,
0.07123999297618866,
0.021803203970193863,
0.012590802274644375,
-0.07622789591550827,
-0.115386962890625,
0.054687898606061935,
0.02261560969054699,
-0.08352608978748322,
0.11198017746210098,
-0.03351380676031113,
-0.14176081120967865,
-0.04168866574764252,
-0.008531567640602589,
0.1617411971092224,
0.16692890226840973,
-0.07255853712558746,
0.0859956368803978,
0.08823705464601517,
-0.09417928755283356,
-0.28535667061805725,
0.07409226894378662,
-0.0640290305018425,
0.04138707369565964,
-0.040066905319690704,
-0.07854340225458145,
0.10213004052639008,
0.07603662461042404,
-0.008595321327447891,
-0.03001234494149685,
-0.1930365115404129,
-0.05413983762264252,
0.21434977650642395,
0.004187411163002253,
0.2941633462905884,
-0.0971251130104065,
-0.02980509027838707,
-0.0450822114944458,
0.018994243815541267,
0.19458627700805664,
-0.07367004454135895,
0.07390318810939789,
-0.043165747076272964,
0.10187133401632309,
0.05109141767024994,
0.023368852213025093,
0.19097380340099335,
0.03008381463587284,
0.06639508157968521,
-0.05670465528964996,
-0.2288731336593628,
0.10354635864496231,
0.03898244351148605,
0.0431082621216774,
-0.02074149064719677,
-0.010722873732447624,
-0.08382531255483627,
-0.037667080760002136,
-0.1222972720861435,
0.10112233459949493,
-0.01321469061076641,
-0.061211418360471725,
-0.16755016148090363,
0.07256904244422913,
0.03758528456091881,
-0.04667286574840546,
0.10116482526063919,
-0.10489724576473236,
0.04304271936416626,
0.07388606667518616,
0.20131029188632965,
0.049140769988298416,
-0.07435419410467148,
0.05950272083282471,
-0.04548732936382294,
0.08613614737987518,
-0.10334978252649307,
0.03395470231771469,
0.13976828753948212,
0.005701761692762375,
0.12592573463916779,
0.0892087072134018,
-0.03861873969435692,
-0.0007554561598226428,
0.12021889537572861,
-0.13819947838783264,
-0.08419858664274216,
-0.052348632365465164,
-0.14339463412761688,
-0.06924214214086533,
-0.007758093532174826,
0.11189687997102737,
-0.014372402802109718,
-0.013683692552149296,
0.030340194702148438,
-0.011326359584927559,
-0.022015666589140892,
0.11499980092048645,
0.052847471088171005,
0.00873649027198553,
-0.1130627915263176,
0.02713322453200817,
0.06118357926607132,
-0.11265899240970612,
-0.01913224160671234,
0.08255436271429062,
-0.09830677509307861,
-0.09226690232753754,
-0.03758983686566353,
0.18394654989242554,
-0.09426058828830719,
-0.06435871869325638,
-0.1107681468129158,
-0.19471018016338348,
0.07194848358631134,
0.15258020162582397,
0.1036747470498085,
0.06693826615810394,
-0.09504200518131256,
-0.0517951101064682,
-0.06772882491350174,
0.04732099547982216,
0.03605465590953827,
-0.016093982383608818,
-0.12396653741598129,
0.09930691123008728,
0.00014846383419353515,
0.13382408022880554,
-0.09659866243600845,
-0.08423008024692535,
-0.1432492882013321,
-0.007024284452199936,
-0.038689859211444855,
-0.0035649826750159264,
-0.11180653423070908,
-0.02248840592801571,
0.02893766202032566,
-0.05110087990760803,
-0.03848489001393318,
-0.02763495035469532,
-0.07234108448028564,
0.05199798196554184,
0.03422526270151138,
0.056919582188129425,
-0.046810299158096313,
-0.0626421794295311,
0.06038922443985939,
0.024036483839154243,
0.11590965837240219,
0.07008764892816544,
-0.08130921423435211,
-0.010246471501886845,
-0.24107705056667328,
-0.0037307185120880604,
0.14264695346355438,
0.023912038654088974,
0.06671419739723206,
-0.14048326015472412,
0.014556527137756348,
0.05107332020998001,
0.06909632682800293,
0.09084457904100418,
0.10148601979017258,
-0.04754927381873131,
0.11420674622058868,
-0.020133089274168015,
-0.10685563087463379,
-0.04398595914244652,
0.005135164130479097,
0.0073863971047103405,
0.03676755726337433,
0.15029634535312653,
-0.0735020637512207,
0.022129463031888008,
-0.07786760479211807,
0.03925400972366333,
-0.030761826783418655,
-0.10865642130374908,
-0.1777285486459732,
-0.029239902272820473,
0.04332805052399635,
-0.029156019911170006,
0.2148451805114746,
0.08230935037136078,
-0.10811838507652283,
0.04147826135158539,
0.14191769063472748,
0.012197999283671379,
-0.02109965868294239,
0.0665895864367485,
0.03114617057144642,
-0.0415240116417408,
0.0315546989440918,
0.05364081636071205,
0.02695104479789734,
0.01386828813701868,
0.09806391596794128,
0.034411270171403885,
0.18255719542503357,
0.0466749370098114,
-0.000722017080988735,
-0.03773844614624977,
0.04240093007683754,
-0.04733574390411377,
-0.0825078934431076,
0.0013499214546754956,
-0.0061407433822751045,
0.09443031251430511,
0.1078026294708252,
-0.03494451195001602,
0.059826306998729706,
-0.036642447113990784,
-0.06999993324279785,
-0.18911780416965485,
-0.22660097479820251,
-0.10347491502761841,
-0.11520174890756607,
-0.02471086196601391,
-0.13091804087162018,
-0.031870435923337936,
-0.06778175383806229,
0.06254561245441437,
-0.06683332473039627,
-0.017265021800994873,
-0.09180936962366104,
-0.08923530578613281,
0.11526410281658173,
-0.017915841192007065,
0.03985939919948578,
-0.09651991724967957,
0.05837034806609154,
-0.044451288878917694,
-0.015991276130080223,
0.01711033657193184,
0.008115936070680618,
0.02011827379465103,
0.0056763640604913235,
-0.15633966028690338,
-0.08976205438375473,
-0.025197237730026245,
0.032131992280483246,
0.023455535992980003,
0.1292855441570282,
0.028075838461518288,
-0.01833927072584629,
0.0012358390958979726,
0.1515517234802246,
-0.021603934466838837,
0.047748345881700516,
-0.06880171597003937,
0.13873493671417236,
-0.0656609982252121,
0.05179920047521591,
0.017087219282984734,
-0.023172438144683838,
-0.013643622398376465,
0.24995717406272888,
0.30255842208862305,
-0.07670634984970093,
0.048360832035541534,
-0.08468369394540787,
0.03988352417945862,
0.10284733027219772,
-0.01194666512310505,
0.1017809510231018,
0.1006510928273201,
-0.09215588122606277,
-0.006847034674137831,
-0.07011069357395172,
0.015511100180447102,
0.012984472326934338,
-0.0038671090733259916,
0.09742942452430725,
-0.016347547993063927,
-0.10898389667272568,
0.0714397132396698,
-0.1079372763633728,
0.10655979067087173,
0.05662048980593681,
-0.13064537942409515,
-0.1264023780822754,
-0.01450398564338684,
-0.05916629731655121,
0.09690476208925247,
0.0571432039141655,
-0.046208109706640244,
-0.03422797843813896,
0.07426980882883072,
0.013810195028781891,
-0.2681475877761841,
-0.07483840733766556,
0.13028095662593842,
-0.06015380844473839,
0.03962317481637001,
-0.025568155571818352,
0.03813006356358528,
0.1204998642206192,
-0.04540976881980896,
-0.07427392899990082,
0.06492545455694199,
-0.022379707545042038,
-0.044194623827934265,
0.047902002930641174,
0.022677643224596977,
0.0738506019115448,
0.10667458176612854,
0.0999726876616478,
-0.08540104329586029,
0.05482148379087448,
-0.00712290033698082,
-0.07815404236316681,
-0.12743525207042694,
0.11601484566926956,
-0.06594635546207428,
0.062281783670186996,
0.12409310787916183,
-0.009104955941438675,
-0.03628288581967354,
-0.050842609256505966,
-0.004416741896420717,
-0.03463469818234444,
0.013115550391376019,
0.010044099763035774,
-0.1316518634557724,
-0.034468043595552444,
0.015875408425927162,
0.02473510429263115,
-0.17521928250789642,
-0.06451153755187988,
-0.12345300614833832,
-0.025054823607206345,
-0.018124623224139214,
0.0854664146900177,
-0.030607707798480988,
0.03598397225141525,
-0.015312272123992443,
-0.13997547328472137,
0.05644383281469345,
0.10097890347242355,
-0.09110696613788605,
-0.05480612441897392
] |
null | null |
transformers
|
# Rick DialoGPT Model
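The card carries only a title; a minimal conversational sketch using the standard DialoGPT generation recipe (the prompt and sampling settings are illustrative assumptions, not documented by the author):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "arifbhrn/DialogGPT-small-Rickk"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Encode one user turn followed by the end-of-sequence token, as in the usual DialoGPT chat loop.
input_ids = tokenizer.encode("Hi Rick, how are you?" + tokenizer.eos_token, return_tensors="pt")

reply_ids = model.generate(
    input_ids,
    max_length=200,
    pad_token_id=tokenizer.eos_token_id,
    do_sample=True,
    top_k=50,
    top_p=0.95,
)
print(tokenizer.decode(reply_ids[0, input_ids.shape[-1]:], skip_special_tokens=True))
```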
|
{"tags": ["conversational"]}
|
text-generation
|
arifbhrn/DialogGPT-small-Rickk
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Rick DialoGPT Model
|
[
"# Rick DialoGPT Model"
] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Rick DialoGPT Model"
] |
[
51,
7
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Rick DialoGPT Model"
] |
[
-0.027243174612522125,
0.09208611398935318,
-0.005486058536916971,
0.01197603065520525,
0.13312271237373352,
-0.0006643096567131579,
0.14875547587871552,
0.13561291992664337,
-0.012389403767883778,
-0.048079900443553925,
0.13848258554935455,
0.20838283002376556,
-0.007769247982650995,
0.06212212145328522,
-0.07722679525613785,
-0.3253750503063202,
0.05440690368413925,
0.05986349284648895,
-0.02559526450932026,
0.11941008269786835,
0.10155656188726425,
-0.034638021141290665,
0.07502283155918121,
0.008745936676859856,
-0.1460564285516739,
0.011253442615270615,
0.020986590534448624,
-0.11265120655298233,
0.11301227658987045,
0.0699501633644104,
0.03311868757009506,
0.044131726026535034,
-0.04560676962137222,
-0.12763948738574982,
0.04502782225608826,
0.00030866602901369333,
-0.04332113638520241,
0.05997459217905998,
0.016281595453619957,
-0.09000954777002335,
0.11693226546049118,
0.12603440880775452,
-0.01263172086328268,
0.041781701147556305,
-0.1548357903957367,
-0.004369331523776054,
-0.01233562733978033,
0.06789606809616089,
0.06087101250886917,
0.10755407065153122,
-0.04065045714378357,
0.11729123443365097,
-0.06241777911782265,
0.11526333540678024,
0.1129850223660469,
-0.291816771030426,
-0.016308816149830818,
0.14326390624046326,
0.043570004403591156,
0.04201141744852066,
-0.04241296648979187,
0.09895236790180206,
0.01734745316207409,
-0.009189855307340622,
-0.04667704179883003,
-0.07920589298009872,
-0.0809992179274559,
0.022899743169546127,
-0.08393258601427078,
-0.009693359956145287,
0.24909301102161407,
-0.033697742968797684,
0.07867740839719772,
-0.07909003645181656,
-0.08747624605894089,
-0.011933685280382633,
-0.03604159876704216,
-0.03430533409118652,
-0.10349667817354202,
0.07883962988853455,
-0.03785189241170883,
-0.09532928466796875,
-0.11454451829195023,
-0.029063701629638672,
-0.16551746428012848,
0.1769428551197052,
0.028738701716065407,
0.03337583318352699,
-0.22648879885673523,
0.09508261829614639,
-0.012410550378262997,
-0.09879330545663834,
0.018604513257741928,
-0.08811058849096298,
0.012304049916565418,
0.017966609448194504,
-0.025972042232751846,
-0.002111254259943962,
0.08367783576250076,
0.11593183130025864,
0.01627914048731327,
0.018418017774820328,
-0.01303142961114645,
0.05024925619363785,
0.039101485162973404,
0.07016518712043762,
-0.018131986260414124,
-0.026958800852298737,
0.025394905358552933,
-0.09519384801387787,
-0.01311302836984396,
-0.06533002108335495,
-0.19878731667995453,
-0.008748088963329792,
0.05362382158637047,
0.059645626693964005,
0.040223345160484314,
0.1349429488182068,
0.005914759822189808,
-0.04811347648501396,
0.041568055748939514,
-0.017372997477650642,
-0.016568226739764214,
0.013325352221727371,
0.004558354616165161,
0.14832930266857147,
0.012210249900817871,
0.05107790604233742,
-0.11448643356561661,
0.0074756252579391,
-0.04443434625864029,
-0.019875049591064453,
-0.033431850373744965,
-0.05190093815326691,
-0.010580608621239662,
-0.024629589170217514,
0.015543424524366856,
-0.1382266879081726,
-0.1671048104763031,
-0.0113193579018116,
-0.006982414051890373,
-0.04376089945435524,
-0.11932645738124847,
-0.1048901304602623,
-0.03145192563533783,
0.04379252344369888,
-0.060927584767341614,
-0.0003760824038181454,
-0.04660411551594734,
0.09378229826688766,
-0.03543102741241455,
0.07682112604379654,
-0.10023638606071472,
0.0828537717461586,
-0.07001189142465591,
-0.04422231763601303,
-0.0734889879822731,
0.13164658844470978,
0.014363138936460018,
0.05487450957298279,
-0.031934577971696854,
-0.01827416382730007,
-0.10224048048257828,
0.07911752909421921,
-0.04339373856782913,
0.23623128235340118,
-0.09449771791696548,
-0.10362883657217026,
0.26979705691337585,
-0.053989510983228683,
-0.1375254988670349,
0.10795111209154129,
-0.015854641795158386,
0.11475867033004761,
0.12686948478221893,
0.18240338563919067,
0.06434911489486694,
0.007867260836064816,
0.07431085407733917,
0.11333738267421722,
-0.0774611383676529,
-0.018117602914571762,
0.014873803593218327,
-0.020292608067393303,
-0.07848027348518372,
0.023533256724476814,
0.07671299576759338,
0.05307117849588394,
-0.05429181456565857,
-0.015286878682672977,
0.00432937266305089,
0.004517627414315939,
0.05698307976126671,
-0.02530503273010254,
0.12313884496688843,
-0.029461434110999107,
-0.07295558601617813,
-0.029503753408789635,
0.027530280873179436,
-0.05828499048948288,
0.03278997913002968,
-0.08230485767126083,
0.03637091815471649,
-0.014406797476112843,
0.07024850696325302,
-0.16572508215904236,
-0.09323301911354065,
-0.05250932276248932,
0.1899155229330063,
0.06807822734117508,
0.11413464695215225,
0.05567482113838196,
-0.06841246038675308,
-0.0038719952572137117,
0.018287649378180504,
0.1991138458251953,
-0.01677977479994297,
-0.07748494297266006,
-0.09769339859485626,
0.10122697055339813,
-0.07130109518766403,
0.06141059845685959,
-0.050490207970142365,
0.017946461215615273,
0.020556224510073662,
0.1050461083650589,
-0.03456922993063927,
0.039413414895534515,
0.011159577406942844,
-0.034563858062028885,
-0.06218598783016205,
-0.004433273337781429,
0.09716981649398804,
0.0021626276429742575,
-0.10631977766752243,
0.24286337196826935,
-0.19168923795223236,
0.12176351994276047,
0.17641966044902802,
-0.19923987984657288,
-0.0002552573860157281,
-0.11963175982236862,
-0.026344671845436096,
0.011637656949460506,
0.037626978009939194,
-0.042151857167482376,
0.24314165115356445,
-0.00910688005387783,
0.16631373763084412,
-0.03389734402298927,
-0.04332707077264786,
-0.041059546172618866,
-0.046011339873075485,
0.010055569931864738,
0.11430004984140396,
0.1047205775976181,
-0.17159950733184814,
0.17967921495437622,
0.05867021903395653,
0.05177219957113266,
0.16841758787631989,
0.018001655116677284,
0.021052619442343712,
0.06948674470186234,
-0.003431870136409998,
-0.03584783151745796,
-0.07413756102323532,
-0.2106374204158783,
-0.023212855681777,
0.0793403834104538,
0.048357341438531876,
0.1068209707736969,
-0.1037900522351265,
-0.03368109092116356,
-0.010547412559390068,
-0.021230356767773628,
0.03035620041191578,
0.14086326956748962,
0.013085569255053997,
0.1286563277244568,
-0.024180158972740173,
-0.06866493821144104,
0.06965550780296326,
0.014881031587719917,
-0.08571527898311615,
0.19352088868618011,
-0.10702410340309143,
-0.34334462881088257,
-0.10363983362913132,
-0.18596062064170837,
-0.056601256132125854,
0.04553624242544174,
0.11461924016475677,
-0.14119702577590942,
-0.020731983706355095,
0.006813736632466316,
0.06912991404533386,
-0.11165751516819,
0.01017086487263441,
-0.03630850836634636,
-0.017619650810956955,
-0.13406261801719666,
-0.1034051924943924,
-0.05356309190392494,
-0.044913630932569504,
-0.05510649085044861,
0.12040390819311142,
-0.15435875952243805,
0.020806124433875084,
0.23555229604244232,
0.06075655668973923,
0.07018083333969116,
-0.03907359018921852,
0.17685799300670624,
-0.1052674949169159,
0.011976814828813076,
0.2128676474094391,
-0.03831172361969948,
0.06525631994009018,
0.11611197143793106,
-0.01394710224121809,
-0.0662488266825676,
0.036592915654182434,
-0.009823341853916645,
-0.07247381657361984,
-0.21345274150371552,
-0.1158827692270279,
-0.1087421104311943,
0.054685093462467194,
0.04713849350810051,
0.050020426511764526,
0.1613347977399826,
0.07427749037742615,
-0.04962149262428284,
-0.0022197163198143244,
0.06106492131948471,
0.0832381621003151,
0.2504972517490387,
-0.06253999471664429,
0.1427627056837082,
-0.025090228766202927,
-0.16789253056049347,
0.06259234994649887,
0.0661388710141182,
0.09291604906320572,
0.06118352338671684,
0.10224727541208267,
0.005179570056498051,
0.009344357997179031,
0.12825439870357513,
0.07115643471479416,
0.008030776865780354,
-0.03595518320798874,
-0.039997417479753494,
-0.03642706945538521,
-0.013250070624053478,
0.032193150371313095,
0.046790316700935364,
-0.16567666828632355,
-0.021018991246819496,
0.009807335212826729,
0.05824935808777809,
0.02185324765741825,
0.08615364134311676,
-0.18498282134532928,
-0.016169089823961258,
0.06576614826917648,
-0.011832303367555141,
-0.11644340306520462,
0.08480028808116913,
0.0007836486911401153,
-0.1121063381433487,
0.03723234683275223,
-0.027525627985596657,
0.13150714337825775,
-0.08457524329423904,
0.0741792693734169,
-0.12022519111633301,
-0.0374552421271801,
-0.010245736688375473,
0.12193918228149414,
-0.29501426219940186,
0.19123348593711853,
-0.009575535543262959,
-0.04439779743552208,
-0.1071409061551094,
-0.015645509585738182,
0.02963484264910221,
0.10361164063215256,
0.11110331863164902,
-0.020523378625512123,
-0.02764100395143032,
0.06007368490099907,
-0.07205203175544739,
0.0399978905916214,
0.09906689822673798,
-0.06730470806360245,
-0.013155711814761162,
-0.052545808255672455,
0.00039069546619430184,
0.010376452468335629,
-0.10966821759939194,
0.022783124819397926,
-0.19194799661636353,
0.08703918755054474,
0.08162695169448853,
0.09630028903484344,
0.037212129682302475,
-0.029887177050113678,
-0.07769683748483658,
0.2589099109172821,
0.009560960344970226,
-0.10013746470212936,
-0.10953836888074875,
0.008171502500772476,
0.04785030707716942,
-0.07699282467365265,
-0.016966527327895164,
-0.0694924145936966,
0.04450516775250435,
-0.06552471220493317,
-0.18611730635166168,
0.11722762882709503,
-0.09691806137561798,
-0.03250948712229729,
-0.036249466240406036,
0.21333028376102448,
-0.03155504912137985,
0.017869247123599052,
0.04537748545408249,
-0.00578570831567049,
-0.11741422116756439,
-0.10654788464307785,
0.0012778750387951732,
-0.004119161982089281,
0.016931969672441483,
0.023226622492074966,
-0.03199922665953636,
-0.009455137886106968,
-0.06797713041305542,
-0.014383019879460335,
0.3228513300418854,
0.12615877389907837,
-0.042267147451639175,
0.15242800116539001,
0.09877358376979828,
-0.06251336634159088,
-0.2941497564315796,
-0.11165541410446167,
-0.07421603053808212,
-0.05438753217458725,
-0.09733224660158157,
-0.18137554824352264,
0.08739634603261948,
-0.05383281409740448,
-0.013516134582459927,
0.09413999319076538,
-0.25194358825683594,
-0.10185287892818451,
0.2005643993616104,
-0.03753361105918884,
0.4304826855659485,
-0.11250142753124237,
-0.07815388590097427,
-0.04850279167294502,
-0.14005880057811737,
0.19035954773426056,
0.004324326757341623,
0.10461755096912384,
-0.0006430890643969178,
0.19764995574951172,
0.05591731518507004,
-0.0006032987730577588,
0.07056128233671188,
0.01866593211889267,
-0.057801030576229095,
-0.09095179289579391,
-0.0913778692483902,
-0.0337459035217762,
0.010270410217344761,
0.0292131919413805,
-0.07448325306177139,
0.04388400912284851,
-0.13094636797904968,
-0.05198022723197937,
-0.08626694977283478,
0.038746368139982224,
0.027130719274282455,
-0.06653520464897156,
-0.0030553280375897884,
-0.04914497584104538,
0.0004573945188894868,
0.007742773275822401,
0.21047258377075195,
-0.10902713984251022,
0.1467881053686142,
0.028732312843203545,
0.1500566452741623,
-0.09794784337282181,
-0.04768699035048485,
-0.06421241164207458,
-0.05478411167860031,
0.07145597785711288,
-0.12202182412147522,
0.03240978345274925,
0.1044924184679985,
-0.026888413354754448,
0.08732181787490845,
0.1105954647064209,
-0.010995322838425636,
0.005803761538118124,
0.08983830362558365,
-0.241703063249588,
-0.06713853776454926,
-0.08410414308309555,
0.05373041704297066,
0.05893997475504875,
0.10275863856077194,
0.20927143096923828,
0.007167487405240536,
-0.031165437772870064,
0.021489497274160385,
0.027375908568501472,
-0.017840299755334854,
0.05977841466665268,
0.010519524104893208,
0.030491052195429802,
-0.14741286635398865,
0.043485816568136215,
-0.013757874257862568,
-0.09077676385641098,
0.02600322663784027,
0.14754873514175415,
-0.10901660472154617,
-0.12182232737541199,
-0.03921690955758095,
0.13600249588489532,
-0.14775370061397552,
-0.009947444312274456,
-0.0477454848587513,
-0.12692049145698547,
0.06857728958129883,
0.1067143976688385,
0.0457911379635334,
0.04121949151158333,
-0.09239879250526428,
-0.027268609032034874,
-0.0535728819668293,
0.00003198942795279436,
0.028995376080274582,
-0.0204177163541317,
-0.05248761177062988,
0.040780652314424515,
-0.03588524088263512,
0.12051229178905487,
-0.08552545309066772,
-0.10064204037189484,
-0.16698434948921204,
0.03528384119272232,
-0.07174701243638992,
-0.08977310359477997,
-0.0871967226266861,
-0.03724304214119911,
0.006766482722014189,
-0.0405125692486763,
-0.02825779654085636,
-0.03461418300867081,
-0.1126255914568901,
0.03079685941338539,
-0.04579872637987137,
0.003088617930188775,
-0.07116411626338959,
0.029772473499178886,
0.0525958277285099,
-0.029091687873005867,
0.149556964635849,
0.14025014638900757,
-0.11192594468593597,
0.09547203034162521,
-0.1507159322500229,
-0.07066365331411362,
0.09605675935745239,
0.018403515219688416,
0.04981891065835953,
0.05175008252263069,
0.009065150283277035,
0.051755502820014954,
0.06169715151190758,
0.04307684674859047,
0.0153890922665596,
-0.07590135186910629,
0.06697173416614532,
-0.06090308725833893,
-0.10307016223669052,
-0.05066140368580818,
-0.003966273739933968,
0.015159476548433304,
0.07283487915992737,
0.10097057372331619,
-0.056661296635866165,
0.09506311267614365,
-0.05649305135011673,
0.04625694453716278,
0.024318000301718712,
-0.17797043919563293,
0.03397766128182411,
-0.08718447387218475,
0.05030312016606331,
0.010050542652606964,
0.1727033108472824,
0.02054430916905403,
-0.019508427008986473,
0.02473587542772293,
0.0719463899731636,
0.04261681064963341,
-0.013226886279881,
0.19012948870658875,
0.10657399147748947,
-0.03943915665149689,
-0.0805516242980957,
0.09759991616010666,
0.04438556358218193,
0.04173632711172104,
0.14543114602565765,
-0.05563090741634369,
-0.03441290557384491,
0.081944540143013,
-0.0026839920319616795,
0.010976077988743782,
-0.09896437078714371,
-0.13543705642223358,
-0.026787811890244484,
0.036508288234472275,
-0.03667739778757095,
0.10571453720331192,
0.15851758420467377,
-0.005720720160752535,
0.01726081222295761,
-0.01855739764869213,
-0.05729815363883972,
-0.1993623524904251,
-0.19528920948505402,
-0.083323635160923,
-0.13647840917110443,
0.0050200955010950565,
-0.13574683666229248,
0.04266147315502167,
0.026296362280845642,
0.09698255360126495,
-0.04634363576769829,
0.050944969058036804,
0.03791060671210289,
-0.11099781841039658,
0.058360110968351364,
-0.043620482087135315,
0.09173028916120529,
-0.03267880156636238,
0.014702340587973595,
-0.060175783932209015,
0.035412851721048355,
0.016039982438087463,
0.041373249143362045,
-0.02921622060239315,
0.019025372341275215,
-0.12458328902721405,
-0.08709227293729782,
-0.06697598844766617,
0.06596853584051132,
0.006195025984197855,
0.16954803466796875,
0.019531596451997757,
-0.027915386483073235,
0.028833186253905296,
0.23899038136005402,
-0.07318265736103058,
-0.09635625779628754,
-0.06982157379388809,
0.21012257039546967,
-0.009315763600170612,
0.08784335851669312,
-0.03747710958123207,
0.009438461624085903,
-0.08562079071998596,
0.3506644368171692,
0.29213622212409973,
-0.09391074627637863,
0.010968702845275402,
-0.0027621579356491566,
0.04181644320487976,
0.12788556516170502,
0.09239348024129868,
0.10824161767959595,
0.29070642590522766,
-0.06708572804927826,
-0.03647898510098457,
-0.006994254421442747,
-0.0254643727093935,
-0.055716969072818756,
0.0551714263856411,
0.05315792188048363,
-0.06511329114437103,
-0.01592782698571682,
0.11738577485084534,
-0.2489209920167923,
0.0614120177924633,
-0.15840938687324524,
-0.16190756857395172,
-0.07126864790916443,
-0.0001230158086400479,
0.0958227664232254,
0.01604771800339222,
0.09578458964824677,
-0.011418631300330162,
-0.06834693253040314,
0.04414822906255722,
0.020037546753883362,
-0.20774760842323303,
0.009963343851268291,
0.06968449801206589,
-0.051950447261333466,
-0.05526239052414894,
-0.017540784552693367,
0.07181108742952347,
0.0862373560667038,
0.031932324171066284,
-0.021655123680830002,
0.04088883846998215,
-0.011214682832360268,
-0.07533704489469528,
0.03916772082448006,
0.027806051075458527,
0.005651058629155159,
-0.08518505096435547,
0.07656224071979523,
-0.16369622945785522,
0.03412613272666931,
-0.0035786160733550787,
-0.048953261226415634,
-0.014727948233485222,
0.030175231397151947,
-0.061420172452926636,
0.08509553223848343,
0.0839199498295784,
-0.0171944722533226,
-0.016525855287909508,
-0.0222842525690794,
-0.012990890070796013,
-0.020874707028269768,
-0.0818524956703186,
-0.09698375314474106,
-0.15574125945568085,
-0.1261346936225891,
0.08575325459241867,
-0.00355695397593081,
-0.19997835159301758,
0.028783639892935753,
-0.12125882506370544,
0.04249454662203789,
-0.12142720073461533,
0.09701541811227798,
0.0825105607509613,
0.02303435280919075,
-0.0030652873683720827,
0.006164520047605038,
0.03737448528409004,
0.07968182861804962,
-0.13731823861598969,
-0.08554888516664505
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-Bengali
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Bengali using a subset of 40,000 utterances from the [Bengali ASR training data set containing ~196K utterances](https://www.openslr.org/53/). WER was tested on ~4200 utterances held out from training.
When using this model, make sure that your speech input is sampled at 16kHz.
The training script can be found at: train.py
Data prep notebook: https://colab.research.google.com/drive/1JMlZPU-DrezXjZ2t7sOVqn7CJjZhdK2q?usp=sharing
Inference notebook: https://colab.research.google.com/drive/1uKC2cK9JfUPDTUHbrNdOYqKtNozhxqgZ?usp=sharing
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

processor = Wav2Vec2Processor.from_pretrained("arijitx/wav2vec2-large-xlsr-bengali")
model = Wav2Vec2ForCTC.from_pretrained("arijitx/wav2vec2-large-xlsr-bengali")
# model = model.to("cuda")

def speech_file_to_array_fn(path):
    # Load the audio and resample it to the 16 kHz rate the model expects.
    speech_array, sampling_rate = torchaudio.load(path)
    resampler = torchaudio.transforms.Resample(sampling_rate, 16_000)
    return resampler(speech_array).squeeze().numpy()

speech_array = speech_file_to_array_fn("test_file.wav")
inputs = processor(speech_array, sampling_rate=16_000, return_tensors="pt", padding=True)

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: most likely token per frame, then collapse repeats/blanks.
predicted_ids = torch.argmax(logits, dim=-1)
preds = processor.batch_decode(predicted_ids)[0]
print(preds.replace("[PAD]", ""))
```
**Test Result**: WER on ~4200 held-out utterances: 32.45 %
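A sketch of how this figure could be reproduced on a held-out set, reusing `processor`, `model` and `speech_file_to_array_fn` from the snippet above (the file list, reference transcripts and the `jiwer` dependency are assumptions, not files shipped with this model):
```python
import jiwer  # pip install jiwer

# Hypothetical held-out set: pairs of (audio path, reference transcript).
eval_pairs = [("eval/0001.wav", "reference transcript 1"),
              ("eval/0002.wav", "reference transcript 2")]

references, hypotheses = [], []
for path, ref in eval_pairs:
    speech = speech_file_to_array_fn(path)  # helper defined in the usage snippet
    inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    pred = processor.batch_decode(torch.argmax(logits, dim=-1))[0].replace("[PAD]", "")
    references.append(ref)
    hypotheses.append(pred)

print(f"WER: {jiwer.wer(references, hypotheses):.4f}")
```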
|
{"language": "Bengali", "license": "cc-by-sa-4.0", "tags": ["bn", "audio", "automatic-speech-recognition", "speech"], "datasets": ["OpenSLR"], "metrics": ["wer"], "model-index": [{"name": "XLSR Wav2Vec2 Bengali by Arijit", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "OpenSLR", "type": "OpenSLR", "args": "ben"}, "metrics": [{"type": "wer", "value": 32.45, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
arijitx/wav2vec2-large-xlsr-bengali
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"bn",
"audio",
"speech",
"dataset:OpenSLR",
"license:cc-by-sa-4.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"Bengali"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #bn #audio #speech #dataset-OpenSLR #license-cc-by-sa-4.0 #model-index #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-Large-XLSR-Bengali
Fine-tuned facebook/wav2vec2-large-xlsr-53 Bengali using a subset of 40,000 utterances from Bengali ASR training data set containing ~196K utterances. Tested WER using ~4200 held out from training.
When using this model, make sure that your speech input is sampled at 16kHz.
Train Script can be Found at : URL
Data Prep Notebook : URL
Inference Notebook : URL
## Usage
The model can be used directly (without a language model) as follows:
Test Result: WER on ~4200 utterance : 32.45 %
|
[
"# Wav2Vec2-Large-XLSR-Bengali\nFine-tuned facebook/wav2vec2-large-xlsr-53 Bengali using a subset of 40,000 utterances from Bengali ASR training data set containing ~196K utterances. Tested WER using ~4200 held out from training.\nWhen using this model, make sure that your speech input is sampled at 16kHz.\nTrain Script can be Found at : URL \n\n Data Prep Notebook : URL\n Inference Notebook : URL",
"## Usage\n\nThe model can be used directly (without a language model) as follows:\n\nTest Result: WER on ~4200 utterance : 32.45 %"
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #bn #audio #speech #dataset-OpenSLR #license-cc-by-sa-4.0 #model-index #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-Large-XLSR-Bengali\nFine-tuned facebook/wav2vec2-large-xlsr-53 Bengali using a subset of 40,000 utterances from Bengali ASR training data set containing ~196K utterances. Tested WER using ~4200 held out from training.\nWhen using this model, make sure that your speech input is sampled at 16kHz.\nTrain Script can be Found at : URL \n\n Data Prep Notebook : URL\n Inference Notebook : URL",
"## Usage\n\nThe model can be used directly (without a language model) as follows:\n\nTest Result: WER on ~4200 utterance : 32.45 %"
] |
[
74,
114,
36
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #bn #audio #speech #dataset-OpenSLR #license-cc-by-sa-4.0 #model-index #endpoints_compatible #has_space #region-us \n# Wav2Vec2-Large-XLSR-Bengali\nFine-tuned facebook/wav2vec2-large-xlsr-53 Bengali using a subset of 40,000 utterances from Bengali ASR training data set containing ~196K utterances. Tested WER using ~4200 held out from training.\nWhen using this model, make sure that your speech input is sampled at 16kHz.\nTrain Script can be Found at : URL \n\n Data Prep Notebook : URL\n Inference Notebook : URL## Usage\n\nThe model can be used directly (without a language model) as follows:\n\nTest Result: WER on ~4200 utterance : 32.45 %"
] |
[
-0.13055333495140076,
0.007073352579027414,
-0.0013982966775074601,
-0.02121923677623272,
0.050784774124622345,
-0.11456571519374847,
0.03947533294558525,
0.10132066905498505,
0.027107207104563713,
0.045785240828990936,
0.060035862028598785,
-0.01502678357064724,
0.057756680995225906,
0.04654005542397499,
-0.01670774817466736,
-0.15445354580879211,
0.022169774398207664,
-0.0362653024494648,
0.024901343509554863,
0.12792639434337616,
0.10071717202663422,
-0.0704227015376091,
0.024289090186357498,
0.07979766279459,
-0.18329927325248718,
0.08688563853502274,
0.005829208064824343,
-0.09211906045675278,
0.02658487856388092,
0.050007786601781845,
0.07941678911447525,
0.02095150388777256,
0.03080977313220501,
-0.1520494818687439,
0.020597387105226517,
0.004211168736219406,
0.06108492612838745,
-0.019157864153385162,
-0.006803283002227545,
-0.02317039668560028,
0.11589298397302628,
0.04302570968866348,
-0.053202465176582336,
0.0713476687669754,
-0.06767775863409042,
-0.16531427204608917,
0.002225018572062254,
0.06410042941570282,
-0.00732810702174902,
0.08218129724264145,
-0.08000411093235016,
0.08332618325948715,
-0.1294124275445938,
0.09977378696203232,
0.09145910292863846,
-0.18899452686309814,
0.0063917553052306175,
-0.029747823253273964,
0.10038083791732788,
0.1479785442352295,
-0.119044229388237,
-0.02452012710273266,
0.0992557629942894,
-0.003231079550459981,
-0.022492341697216034,
-0.05344761908054352,
-0.12554241716861725,
0.03332212567329407,
-0.11203319579362869,
0.050662897527217865,
0.20886075496673584,
-0.008380332961678505,
-0.08630846440792084,
-0.05295401066541672,
0.006972361356019974,
-0.06757885962724686,
0.03527620807290077,
-0.0016016406007111073,
-0.05116117745637894,
0.03487410023808479,
-0.04278095066547394,
0.016864441335201263,
-0.1251630038022995,
-0.1145927831530571,
0.02702920138835907,
0.1425989419221878,
0.031096309423446655,
0.04638185724616051,
-0.10642673075199127,
-0.01006176508963108,
-0.06502196937799454,
-0.09818258881568909,
-0.03645304590463638,
-0.05321641266345978,
-0.06400469690561295,
0.06209913641214371,
-0.07948160916566849,
-0.13313283026218414,
0.03729763999581337,
0.0009899704018607736,
-0.047241151332855225,
0.026490669697523117,
0.013889329507946968,
0.040624912828207016,
0.03126496076583862,
0.14225943386554718,
-0.07393516600131989,
-0.03677629679441452,
-0.01603713631629944,
0.08035090565681458,
0.004692383576184511,
-0.022820452228188515,
-0.006979936733841896,
-0.04687732458114624,
-0.00030151387909427285,
0.00897607859224081,
-0.12632043659687042,
-0.0031677153892815113,
-0.036389365792274475,
-0.017814602702856064,
0.009588055312633514,
-0.13809847831726074,
-0.0012756312498822808,
0.05591430142521858,
0.007686915807425976,
0.16085219383239746,
0.052134349942207336,
-0.0107421288266778,
-0.09620073437690735,
-0.1348605453968048,
-0.01982410065829754,
0.025082236155867577,
-0.06161472946405411,
-0.11144597828388214,
-0.00408019358292222,
-0.0003062434552703053,
-0.026749784126877785,
-0.11266539990901947,
-0.10548175871372223,
-0.05232661962509155,
-0.032040808349847794,
-0.04838890582323074,
0.053245093673467636,
-0.07939745485782623,
-0.024771125987172127,
0.026791289448738098,
-0.06599900871515274,
-0.05080186203122139,
-0.023072030395269394,
0.12324411422014236,
0.12401216477155685,
0.1280389130115509,
-0.013116836547851562,
0.09832794219255447,
-0.03991275653243065,
-0.030207637697458267,
0.016467910259962082,
0.15035156905651093,
-0.08443886041641235,
0.005344409495592117,
-0.09821967780590057,
-0.08560681343078613,
-0.09727604687213898,
0.07177650183439255,
0.08939766138792038,
0.04500994458794594,
-0.1824435442686081,
-0.07514358311891556,
0.24563707411289215,
-0.12215587496757507,
0.04093258082866669,
0.23575972020626068,
0.008213726803660393,
0.05234629660844803,
0.10930216312408447,
0.2687015235424042,
0.07807142287492752,
-0.15769441425800323,
0.043851133435964584,
0.04393591731786728,
0.011495580896735191,
-0.031157977879047394,
0.10018840432167053,
-0.06570147722959518,
-0.012320135720074177,
0.005189007148146629,
0.12187566608190536,
0.0619644932448864,
-0.035180963575839996,
-0.06207067891955376,
-0.019320698454976082,
-0.08137321472167969,
0.028203416615724564,
0.019897842779755592,
0.017459673807024956,
-0.09897582978010178,
-0.07363517582416534,
0.05237564072012901,
0.15382806956768036,
-0.15141792595386505,
0.02944433130323887,
-0.17148993909358978,
0.10634161531925201,
-0.11632420867681503,
0.00770803727209568,
-0.12702307105064392,
0.2869667708873749,
0.01901083253324032,
0.020530516281723976,
0.09790269285440445,
0.11601575464010239,
0.06788196414709091,
0.006605502683669329,
0.030223743990063667,
0.019952578470110893,
0.05783436447381973,
0.000641770544461906,
-0.049925245344638824,
-0.08926251530647278,
0.005928428377956152,
-0.09348150342702866,
-0.010175415314733982,
-0.19746312499046326,
-0.015685640275478363,
0.08057320863008499,
-0.026614664122462273,
0.004752026405185461,
-0.03796694800257683,
0.09719988703727722,
0.07238293439149857,
0.03309791162610054,
0.032971229404211044,
0.012375347316265106,
0.03188223019242287,
-0.03908485174179077,
0.16303156316280365,
-0.08462658524513245,
0.0300175528973341,
0.08711227029561996,
-0.03946933522820473,
-0.01090624462813139,
-0.0018050771905109286,
0.005039606709033251,
-0.04104994237422943,
-0.004198242910206318,
-0.09494458884000778,
0.2147415578365326,
-0.009262396953999996,
0.14105536043643951,
-0.10447531193494797,
0.0101970499381423,
0.030036471784114838,
-0.09719470143318176,
0.08142620325088501,
0.11366363614797592,
-0.12105081975460052,
-0.09866368025541306,
0.038337722420692444,
0.0031014862470328808,
-0.14877106249332428,
0.2359522581100464,
-0.044854916632175446,
-0.11008966714143753,
0.013984547927975655,
0.03311780095100403,
-0.07984103262424469,
0.06438526511192322,
-0.2307763248682022,
-0.10913147777318954,
0.023365309461951256,
-0.008585252799093723,
0.039063699543476105,
-0.1280614137649536,
0.04470209777355194,
0.019470319151878357,
-0.09279102087020874,
-0.18739290535449982,
0.07730555534362793,
-0.06689825654029846,
0.010822011157870293,
-0.09784622490406036,
-0.041889287531375885,
-0.01655963435769081,
-0.05540395900607109,
-0.13997569680213928,
0.09909745305776596,
-0.0640689954161644,
-0.18964163959026337,
-0.1705697774887085,
-0.02270573191344738,
0.06397820264101028,
0.007923428900539875,
0.0805756151676178,
-0.18090510368347168,
-0.04762769863009453,
-0.026725096628069878,
0.02675861492753029,
0.014683282002806664,
-0.025025544688105583,
0.041710030287504196,
-0.05536887049674988,
0.03526589646935463,
-0.1566346287727356,
0.010346938855946064,
0.0003076246357522905,
0.02968059852719307,
0.0663246363401413,
-0.05395671725273132,
0.03347428888082504,
0.16732662916183472,
-0.04387934133410454,
0.009917843155562878,
0.021895812824368477,
0.13296541571617126,
-0.02588581293821335,
-0.00750328041613102,
0.12084294110536575,
-0.01980363205075264,
0.02080165408551693,
0.0698947086930275,
0.0082088066264987,
-0.01658022403717041,
0.022603055462241173,
-0.04101700335741043,
-0.011476093903183937,
-0.22533971071243286,
-0.01941094733774662,
-0.05498505383729935,
-0.05514007806777954,
-0.010382085107266903,
-0.009200435131788254,
0.007674564141780138,
0.01578470878303051,
0.01707768253982067,
-0.008677017875015736,
-0.013759779743850231,
0.012690211646258831,
0.06071482226252556,
-0.030416997149586678,
0.09760110080242157,
-0.09646051377058029,
-0.04574549198150635,
0.04635204002261162,
0.03184191510081291,
0.1277046799659729,
0.07250722497701645,
0.1794360727071762,
0.05480215325951576,
0.11099208146333694,
0.15216857194900513,
0.04938170686364174,
-0.09414555877447128,
-0.0009330805623903871,
-0.011186128482222557,
-0.0879707857966423,
-0.07462973892688751,
0.09598108381032944,
0.15703053772449493,
0.01042288076132536,
-0.016701171174645424,
-0.07165928184986115,
0.055594515055418015,
0.15644380450248718,
0.09311120212078094,
-0.20225472748279572,
-0.08057089895009995,
0.008290857076644897,
-0.08975318819284439,
-0.01841546595096588,
0.07443369925022125,
0.20844766497612,
-0.010130632668733597,
-0.026604054495692253,
0.006822698283940554,
0.07605220377445221,
0.04850305989384651,
0.04625009000301361,
-0.09483836591243744,
0.09639576077461243,
-0.030664538964629173,
0.06539827585220337,
-0.203178271651268,
0.15527260303497314,
0.027264123782515526,
0.08592724800109863,
-0.014535779133439064,
-0.04265972226858139,
0.001737551181577146,
-0.09438639879226685,
0.06319260597229004,
0.01606925204396248,
0.026169579476118088,
-0.09996089339256287,
-0.08340509235858917,
0.044224951416254044,
0.06952785700559616,
0.19379635155200958,
0.07496996223926544,
-0.011791149154305458,
-0.004331006668508053,
0.023674223572015762,
-0.08713705092668533,
-0.12223615497350693,
-0.035986993461847305,
-0.007533743046224117,
0.192122220993042,
0.06851057708263397,
-0.02887275442481041,
-0.06338638067245483,
-0.10737158358097076,
0.10218966007232666,
-0.13801833987236023,
-0.09687474370002747,
-0.00589448818936944,
-0.031532224267721176,
0.09806773066520691,
-0.03137080743908882,
-0.016365420073270798,
0.053983256220817566,
0.02228899486362934,
0.023375339806079865,
0.008338251151144505,
0.013825210742652416,
-0.03236531466245651,
-0.039312694221735,
-0.011199667118489742,
0.13115380704402924,
0.05024024471640587,
0.035239558666944504,
0.02069541998207569,
-0.061530787497758865,
-0.05611444637179375,
-0.09147078543901443,
0.008450769819319248,
0.08985384553670883,
-0.10096316784620285,
0.04911128804087639,
-0.0335063561797142,
-0.15546023845672607,
-0.09523187577724457,
-0.05369080603122711,
0.12148815393447876,
0.07430122047662735,
-0.04018402472138405,
0.11383546143770218,
0.3740083873271942,
-0.03539353236556053,
-0.17060384154319763,
-0.08234773576259613,
0.034470923244953156,
0.16920220851898193,
-0.1423746794462204,
-0.11962607502937317,
0.03325563669204712,
0.007386837620288134,
-0.01580703817307949,
-0.024992095306515694,
-0.19553853571414948,
-0.12035942822694778,
0.1718517243862152,
-0.026292158290743828,
0.23840685188770294,
-0.1365167200565338,
-0.09358299523591995,
-0.03254100680351257,
-0.010938367806375027,
-0.008052583783864975,
-0.14062079787254333,
0.11892593652009964,
-0.02397291734814644,
0.10062413662672043,
0.04011665657162666,
-0.029041515663266182,
0.08594164997339249,
0.03246527537703514,
-0.012196102179586887,
-0.011847855523228645,
0.06289689987897873,
0.006765355821698904,
0.046706557273864746,
0.16228750348091125,
-0.16193555295467377,
0.04590967670083046,
-0.13723009824752808,
-0.07597796618938446,
-0.0690196081995964,
0.00041787020745687187,
0.06600011140108109,
-0.022701304405927658,
0.013531563803553581,
-0.030849266797304153,
-0.003433533711358905,
0.015138416551053524,
-0.11157035827636719,
-0.13766823709011078,
-0.00781718548387289,
0.18141284584999084,
0.16054703295230865,
-0.00638445233926177,
-0.0327192097902298,
-0.014342129230499268,
-0.038264691829681396,
0.1462247222661972,
-0.15548688173294067,
-0.0032856417819857597,
0.047455836087465286,
0.09244810044765472,
0.1398247629404068,
-0.01097012497484684,
-0.13889852166175842,
0.09258691221475601,
0.09444737434387207,
0.011576642282307148,
0.02449207939207554,
-0.022615429013967514,
0.016102660447359085,
-0.021398497745394707,
0.02959485538303852,
0.07302311807870865,
-0.12070721387863159,
-0.005050648469477892,
-0.05250994488596916,
0.007008288986980915,
-0.14358088374137878,
0.23722833395004272,
0.12949199974536896,
0.0653754472732544,
-0.08944298326969147,
0.0934905931353569,
0.005114423111081123,
-0.035136811435222626,
0.07637173682451248,
0.0747893899679184,
-0.054603997617959976,
-0.05651555582880974,
-0.10185268521308899,
0.003240307793021202,
0.014912014827132225,
-0.1160680279135704,
0.018147198483347893,
-0.06124996021389961,
-0.011394914239645004,
0.1684918999671936,
0.015509888529777527,
0.026273652911186218,
-0.09930039942264557,
-0.008500275202095509,
-0.06835994869470596,
0.08805574476718903,
0.110697902739048,
-0.03770192712545395,
-0.0836978331208229,
0.06768365204334259,
0.014214209280908108,
0.02954007498919964,
-0.0565183199942112,
-0.06656300276517868,
-0.08041687309741974,
0.0648370012640953,
-0.2050246298313141,
0.032884012907743454,
-0.027350397780537605,
-0.006781571079045534,
-0.0039940280839800835,
-0.04724512994289398,
0.01018618792295456,
0.062402013689279556,
-0.06535401195287704,
0.07588090747594833,
0.0075710490345954895,
0.0387716218829155,
-0.09805873036384583,
0.01440817303955555,
0.06130940467119217,
-0.03990856185555458,
0.03248697146773338,
0.13719165325164795,
-0.17547453939914703,
0.05963972210884094,
-0.14351628720760345,
-0.020128609612584114,
0.07637619227170944,
0.043471623212099075,
-0.0001664224109845236,
-0.08246961236000061,
0.044047001749277115,
0.08802998065948486,
0.10501578450202942,
-0.031712133437395096,
0.12189395725727081,
-0.03872758895158768,
0.01667020656168461,
-0.1285654902458191,
0.004054216202348471,
-0.07895160466432571,
0.030369652435183525,
0.04854551702737808,
0.14091232419013977,
0.11933649331331253,
-0.11707004904747009,
-0.01106481347233057,
-0.0713529959321022,
0.03379819169640541,
-0.014506946317851543,
-0.026864388957619667,
-0.10082688927650452,
-0.037432216107845306,
0.05932968109846115,
-0.07223400473594666,
0.12107744812965393,
-0.07524332404136658,
-0.11436726152896881,
-0.049087394028902054,
-0.2145567685365677,
-0.07225365936756134,
-0.02464636042714119,
0.28282639384269714,
0.07248634099960327,
-0.0059495531022548676,
-0.10641714930534363,
-0.03700340539216995,
0.04566456377506256,
0.11817953735589981,
-0.0624258890748024,
0.25509369373321533,
0.01711452193558216,
0.12489845603704453,
0.05041556805372238,
-0.0763980969786644,
0.008263858035206795,
0.03140720725059509,
-0.10226284712553024,
0.018849344924092293,
-0.08281389623880386,
0.19316315650939941,
0.2461908757686615,
-0.03908562660217285,
0.040209703147411346,
-0.014523123390972614,
-0.09847134351730347,
-0.09750060737133026,
-0.056224942207336426,
-0.07105879485607147,
-0.10403648018836975,
0.036262840032577515,
-0.09421051293611526,
0.0899273231625557,
-0.05562978982925415,
0.05493079870939255,
-0.04129841551184654,
0.1605985313653946,
0.038178008049726486,
-0.11218330264091492,
0.12332702428102493,
-0.07862182706594467,
-0.0026387202087789774,
0.038741979748010635,
0.041713327169418335,
0.21382185816764832,
0.011725623160600662,
0.07726075500249863,
0.08279953896999359,
-0.03244337439537048,
0.026087472215294838,
-0.07865850627422333,
-0.04533649981021881,
-0.03284793347120285,
0.02450554445385933,
0.06808129698038101,
0.1324738711118698,
0.1328907459974289,
-0.10543806105852127,
-0.006324473302811384,
-0.0073750196024775505,
-0.08148160576820374,
-0.13162140548229218,
-0.11820206046104431,
0.24044111371040344,
0.022181788459420204,
0.06297709792852402,
-0.03885611519217491,
-0.09073101729154587,
0.017934609204530716,
0.21918298304080963,
0.09402187168598175,
-0.009122442454099655,
0.023967402055859566,
-0.052393630146980286,
-0.0003545843355823308,
-0.11999520659446716,
0.08635647594928741,
0.0988045409321785,
0.18511638045310974,
0.02503124438226223,
0.029374955222010612,
-0.04479930177330971,
-0.10050389915704727,
-0.06457966566085815,
0.02013995312154293,
-0.04561297595500946,
-0.0773373395204544,
-0.02696421556174755,
0.07395418733358383,
-0.08158378303050995,
-0.07102212309837341,
-0.11739035695791245,
0.01534106396138668,
-0.061245501041412354,
-0.003662173170596361,
-0.053350143134593964,
0.1094171479344368,
-0.010846798308193684,
-0.07171638309955597,
0.0331854522228241,
0.055309802293777466,
0.0009321566321887076,
-0.03553782403469086,
0.027262259274721146,
0.05778944492340088,
-0.04309516400098801,
-0.11209995299577713,
0.033166591078042984,
0.22892828285694122,
0.03340970352292061,
0.09515056014060974,
0.005218392703682184,
0.20116554200649261,
-0.03672130033373833,
-0.08358576148748398,
-0.02745101787149906,
0.19362911581993103,
0.02627662941813469,
0.11305182427167892,
0.008895526640117168,
0.0468628965318203,
-0.02081253007054329,
-0.08620655536651611,
0.01058973465114832,
-0.16271016001701355,
-0.017498157918453217,
-0.017217392101883888,
0.09682147204875946,
0.07930146157741547,
-0.05257477983832359,
-0.04024692252278328,
-0.0435635931789875,
0.06278590857982635,
0.011826368980109692,
-0.09563130140304565,
-0.057876065373420715,
-0.18962974846363068,
-0.007914414629340172,
-0.01137780211865902,
0.06721680611371994,
-0.2761361300945282,
0.008369345217943192,
-0.025685971602797508,
0.01943403109908104,
0.03650734946131706,
0.09340441226959229,
0.0750681683421135,
0.10214073956012726,
0.018840238451957703,
-0.1203235387802124,
0.03973044827580452,
0.09946739673614502,
-0.19165164232254028,
-0.11492186784744263
] |
null | null |
transformers
|
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the OPENSLR_SLR53 - bengali dataset.
It achieves the following results on the evaluation set.
Without a language model:
- WER: 0.21726385291857586
- CER: 0.04725010353701041
With a 5-gram language model trained on 30M sentences randomly chosen from the [AI4Bharat IndicCorp](https://indicnlp.ai4bharat.org/corpora/) dataset:
- WER: 0.15322879016421437
- CER: 0.03413696666806267
Note: 5% of the data (10935 samples) was used for evaluation; this evaluation set was not part of training (training was done on the first 95% and evaluation on the last 5%). Training was stopped after 180k steps. Output predictions are available under the files section.
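A minimal sketch of how such LM-boosted decoding can be wired up with `pyctcdecode` (the KenLM binary path, the test clip and the label handling are assumptions; the 5-gram model itself is not part of this repository):
```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
from pyctcdecode import build_ctcdecoder  # pip install pyctcdecode kenlm

processor = Wav2Vec2Processor.from_pretrained("arijitx/wav2vec2-xls-r-300m-bengali")
model = Wav2Vec2ForCTC.from_pretrained("arijitx/wav2vec2-xls-r-300m-bengali")

# Order the vocabulary by token id so the labels line up with the logit columns.
vocab = processor.tokenizer.get_vocab()
sorted_tokens = [tok for tok, _ in sorted(vocab.items(), key=lambda kv: kv[1])]
# pyctcdecode expects "" for the CTC blank and " " as the word separator.
labels = ["" if t == processor.tokenizer.pad_token
          else " " if t == processor.tokenizer.word_delimiter_token
          else t
          for t in sorted_tokens]

decoder = build_ctcdecoder(labels, kenlm_model_path="bn_5gram.bin")  # hypothetical LM path

speech, sr = torchaudio.load("sample.wav")  # hypothetical test clip
speech = torchaudio.functional.resample(speech, sr, 16_000).squeeze().numpy()
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits[0].cpu().numpy()

print(decoder.decode(logits))  # beam search with shallow LM fusion
```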
### Training hyperparameters
The following hyperparameters were used during training (a sketch of how they map onto `TrainingArguments` and the Wav2Vec2 config is shown after the list):
- dataset_name="openslr"
- model_name_or_path="facebook/wav2vec2-xls-r-300m"
- dataset_config_name="SLR53"
- output_dir="./wav2vec2-xls-r-300m-bengali"
- overwrite_output_dir
- num_train_epochs="50"
- per_device_train_batch_size="32"
- per_device_eval_batch_size="32"
- gradient_accumulation_steps="1"
- learning_rate="7.5e-5"
- warmup_steps="2000"
- length_column_name="input_length"
- evaluation_strategy="steps"
- text_column_name="sentence"
- chars_to_ignore , ? . ! \- \; \: \" “ % ‘ ” � — ’ … –
- save_steps="2000"
- eval_steps="3000"
- logging_steps="100"
- layerdrop="0.0"
- activation_dropout="0.1"
- save_total_limit="3"
- freeze_feature_encoder
- feat_proj_dropout="0.0"
- mask_time_prob="0.75"
- mask_time_length="10"
- mask_feature_prob="0.25"
- mask_feature_length="64"
- preprocessing_num_workers 32
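A rough Python rendering of how these flags split between `TrainingArguments` and the model config, assuming the modified robust-speech-event CTC script wires them up in the usual way; data loading, tokenizer construction and the `Trainer` itself are omitted:
```python
from transformers import TrainingArguments, Wav2Vec2ForCTC

# Trainer-level flags from the list above (dataset_name, dataset_config_name,
# text_column_name, chars_to_ignore and preprocessing_num_workers are consumed
# by the script's data-loading step and are not TrainingArguments).
training_args = TrainingArguments(
    output_dir="./wav2vec2-xls-r-300m-bengali",
    overwrite_output_dir=True,
    num_train_epochs=50,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=1,
    learning_rate=7.5e-5,
    warmup_steps=2000,
    evaluation_strategy="steps",
    length_column_name="input_length",
    save_steps=2000,
    eval_steps=3000,
    logging_steps=100,
    save_total_limit=3,
)

# Regularisation / masking flags go into the Wav2Vec2 config; the vocabulary size
# and pad token would come from the Bengali tokenizer built by the script (not shown).
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-xls-r-300m",
    layerdrop=0.0,
    activation_dropout=0.1,
    feat_proj_dropout=0.0,
    mask_time_prob=0.75,
    mask_time_length=10,
    mask_feature_prob=0.25,
    mask_feature_length=64,
)
model.freeze_feature_encoder()  # --freeze_feature_encoder
```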
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.1.dev0
- Tokenizers 0.11.0
Notes
- Training and eval code modified from: https://github.com/huggingface/transformers/tree/master/examples/research_projects/robust-speech-event.
- Bengali speech data was not available from Common Voice or the multilingual LibriSpeech datasets, so OpenSLR53 has been used.
- A minimum audio duration of 0.5s was used to filter the training data, which excluded maybe 10-20 samples.
- OpenSLR53 transcripts are *not* part of the training data of the LM used for evaluation.
|
{"language": ["bn"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "bn", "hf-asr-leaderboard", "openslr_SLR53", "robust-speech-event"], "datasets": ["openslr", "SLR53", "AI4Bharat/IndicCorp"], "metrics": ["wer", "cer"], "model-index": [{"name": "arijitx/wav2vec2-xls-r-300m-bengali", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Open SLR", "type": "openslr", "args": "SLR53"}, "metrics": [{"type": "wer", "value": 0.21726385291857586, "name": "Test WER"}, {"type": "cer", "value": 0.04725010353701041, "name": "Test CER"}, {"type": "wer", "value": 0.15322879016421437, "name": "Test WER with lm"}, {"type": "cer", "value": 0.03413696666806267, "name": "Test CER with lm"}]}]}]}
|
automatic-speech-recognition
|
arijitx/wav2vec2-xls-r-300m-bengali
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"bn",
"hf-asr-leaderboard",
"openslr_SLR53",
"robust-speech-event",
"dataset:openslr",
"dataset:SLR53",
"dataset:AI4Bharat/IndicCorp",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"bn"
] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #bn #hf-asr-leaderboard #openslr_SLR53 #robust-speech-event #dataset-openslr #dataset-SLR53 #dataset-AI4Bharat/IndicCorp #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us
|
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the OPENSLR_SLR53 - bengali dataset.
It achieves the following results on the evaluation set.
Without language model :
- WER: 0.21726385291857586
- CER: 0.04725010353701041
With 5 gram language model trained on 30M sentences randomly chosen from AI4Bharat IndicCorp dataset :
- WER: 0.15322879016421437
- CER: 0.03413696666806267
Note : 5% of a total 10935 samples have been used for evaluation. Evaluation set has 10935 examples which was not part of training training was done on first 95% and eval was done on last 5%. Training was stopped after 180k steps. Output predictions are available under files section.
### Training hyperparameters
The following hyperparameters were used during training:
- dataset_name="openslr"
- model_name_or_path="facebook/wav2vec2-xls-r-300m"
- dataset_config_name="SLR53"
- output_dir="./wav2vec2-xls-r-300m-bengali"
- overwrite_output_dir
- num_train_epochs="50"
- per_device_train_batch_size="32"
- per_device_eval_batch_size="32"
- gradient_accumulation_steps="1"
- learning_rate="7.5e-5"
- warmup_steps="2000"
- length_column_name="input_length"
- evaluation_strategy="steps"
- text_column_name="sentence"
- chars_to_ignore , ? . ! \- \; \: \" “ % ‘ ” � — ’ … –
- save_steps="2000"
- eval_steps="3000"
- logging_steps="100"
- layerdrop="0.0"
- activation_dropout="0.1"
- save_total_limit="3"
- freeze_feature_encoder
- feat_proj_dropout="0.0"
- mask_time_prob="0.75"
- mask_time_length="10"
- mask_feature_prob="0.25"
- mask_feature_length="64"
- preprocessing_num_workers 32
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.1.dev0
- Tokenizers 0.11.0
Notes
- Training and eval code modified from : URL
- Bengali speech data was not available from common voice or librispeech multilingual datasets, so OpenSLR53 has been used.
- Minimum audio duration of 0.5s has been used to filter the training data which excluded may be 10-20 samples.
- OpenSLR53 transcripts are *not* part of LM training and LM used to evaluate.
|
[
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n\n- dataset_name=\"openslr\" \t\n- model_name_or_path=\"facebook/wav2vec2-xls-r-300m\" \t\n- dataset_config_name=\"SLR53\" \t\n- output_dir=\"./wav2vec2-xls-r-300m-bengali\" \t\n- overwrite_output_dir \t\n- num_train_epochs=\"50\" \t\n- per_device_train_batch_size=\"32\" \t\n- per_device_eval_batch_size=\"32\" \t\n- gradient_accumulation_steps=\"1\" \t\n- learning_rate=\"7.5e-5\" \t\n- warmup_steps=\"2000\" \t\n- length_column_name=\"input_length\" \t\n- evaluation_strategy=\"steps\" \t\n- text_column_name=\"sentence\" \t\n- chars_to_ignore , ? . ! \\- \\; \\: \\\" “ % ‘ ” � — ’ … – \t\n- save_steps=\"2000\" \t\n- eval_steps=\"3000\" \t\n- logging_steps=\"100\" \t\n- layerdrop=\"0.0\" \t\n- activation_dropout=\"0.1\" \t\n- save_total_limit=\"3\" \t\n- freeze_feature_encoder \t\n- feat_proj_dropout=\"0.0\" \t\n- mask_time_prob=\"0.75\" \t\n- mask_time_length=\"10\" \t\n- mask_feature_prob=\"0.25\" \t\n- mask_feature_length=\"64\" \n- preprocessing_num_workers 32",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.17.1.dev0\n- Tokenizers 0.11.0\n\nNotes\n- Training and eval code modified from : URL \n- Bengali speech data was not available from common voice or librispeech multilingual datasets, so OpenSLR53 has been used.\n- Minimum audio duration of 0.5s has been used to filter the training data which excluded may be 10-20 samples.\n- OpenSLR53 transcripts are *not* part of LM training and LM used to evaluate."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #bn #hf-asr-leaderboard #openslr_SLR53 #robust-speech-event #dataset-openslr #dataset-SLR53 #dataset-AI4Bharat/IndicCorp #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n\n- dataset_name=\"openslr\" \t\n- model_name_or_path=\"facebook/wav2vec2-xls-r-300m\" \t\n- dataset_config_name=\"SLR53\" \t\n- output_dir=\"./wav2vec2-xls-r-300m-bengali\" \t\n- overwrite_output_dir \t\n- num_train_epochs=\"50\" \t\n- per_device_train_batch_size=\"32\" \t\n- per_device_eval_batch_size=\"32\" \t\n- gradient_accumulation_steps=\"1\" \t\n- learning_rate=\"7.5e-5\" \t\n- warmup_steps=\"2000\" \t\n- length_column_name=\"input_length\" \t\n- evaluation_strategy=\"steps\" \t\n- text_column_name=\"sentence\" \t\n- chars_to_ignore , ? . ! \\- \\; \\: \\\" “ % ‘ ” � — ’ … – \t\n- save_steps=\"2000\" \t\n- eval_steps=\"3000\" \t\n- logging_steps=\"100\" \t\n- layerdrop=\"0.0\" \t\n- activation_dropout=\"0.1\" \t\n- save_total_limit=\"3\" \t\n- freeze_feature_encoder \t\n- feat_proj_dropout=\"0.0\" \t\n- mask_time_prob=\"0.75\" \t\n- mask_time_length=\"10\" \t\n- mask_feature_prob=\"0.25\" \t\n- mask_feature_length=\"64\" \n- preprocessing_num_workers 32",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.17.1.dev0\n- Tokenizers 0.11.0\n\nNotes\n- Training and eval code modified from : URL \n- Bengali speech data was not available from common voice or librispeech multilingual datasets, so OpenSLR53 has been used.\n- Minimum audio duration of 0.5s has been used to filter the training data which excluded may be 10-20 samples.\n- OpenSLR53 transcripts are *not* part of LM training and LM used to evaluate."
] |
[
109,
363,
134
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #bn #hf-asr-leaderboard #openslr_SLR53 #robust-speech-event #dataset-openslr #dataset-SLR53 #dataset-AI4Bharat/IndicCorp #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n### Training hyperparameters\n\nThe following hyperparameters were used during training:\n\n- dataset_name=\"openslr\" \t\n- model_name_or_path=\"facebook/wav2vec2-xls-r-300m\" \t\n- dataset_config_name=\"SLR53\" \t\n- output_dir=\"./wav2vec2-xls-r-300m-bengali\" \t\n- overwrite_output_dir \t\n- num_train_epochs=\"50\" \t\n- per_device_train_batch_size=\"32\" \t\n- per_device_eval_batch_size=\"32\" \t\n- gradient_accumulation_steps=\"1\" \t\n- learning_rate=\"7.5e-5\" \t\n- warmup_steps=\"2000\" \t\n- length_column_name=\"input_length\" \t\n- evaluation_strategy=\"steps\" \t\n- text_column_name=\"sentence\" \t\n- chars_to_ignore , ? . ! \\- \\; \\: \\\" “ % ‘ ” � — ’ … – \t\n- save_steps=\"2000\" \t\n- eval_steps=\"3000\" \t\n- logging_steps=\"100\" \t\n- layerdrop=\"0.0\" \t\n- activation_dropout=\"0.1\" \t\n- save_total_limit=\"3\" \t\n- freeze_feature_encoder \t\n- feat_proj_dropout=\"0.0\" \t\n- mask_time_prob=\"0.75\" \t\n- mask_time_length=\"10\" \t\n- mask_feature_prob=\"0.25\" \t\n- mask_feature_length=\"64\" \n- preprocessing_num_workers 32"
] |
[
-0.0884750634431839,
0.11381330341100693,
-0.007034501992166042,
0.0594245120882988,
0.06869327276945114,
0.05305963754653931,
0.08970417827367783,
0.12424473464488983,
-0.011083393357694149,
0.17234788835048676,
0.12613993883132935,
0.08813050389289856,
0.06922250986099243,
0.19656985998153687,
-0.041648633778095245,
-0.15450261533260345,
0.0460488498210907,
-0.10292384773492813,
-0.020532432943582535,
0.09577188640832901,
0.049025487154722214,
-0.07489459961652756,
0.04925985261797905,
0.014914370141923428,
-0.028064411133527756,
-0.0364886038005352,
-0.04740716516971588,
-0.07413487136363983,
0.05332391336560249,
0.01633555255830288,
0.022438911721110344,
0.022387918084859848,
0.0570991113781929,
-0.2260928452014923,
-0.0075847995467484,
0.10702041536569595,
0.019527919590473175,
0.08859451860189438,
0.13037574291229248,
-0.034128911793231964,
0.09556510299444199,
-0.14604035019874573,
0.07363606244325638,
0.045323245227336884,
-0.103227898478508,
-0.22402580082416534,
-0.09636000543832779,
0.084141306579113,
0.09627122431993484,
0.09510460495948792,
-0.026055457070469856,
0.07196256518363953,
-0.11303193867206573,
0.060495276004076004,
0.15337519347667694,
-0.18509776890277863,
-0.03878747671842575,
-0.051738034933805466,
0.024128548800945282,
-0.03846385329961777,
-0.07952301949262619,
-0.05402924120426178,
-0.021543122828006744,
-0.0017646915512159467,
0.055579643696546555,
0.006746532861143351,
0.07418567687273026,
-0.011184817180037498,
-0.10743866115808487,
-0.0346083864569664,
0.06761875748634338,
0.0869782418012619,
-0.0159193966537714,
-0.15312744677066803,
-0.0414712056517601,
-0.11967039853334427,
-0.0006921858876012266,
-0.0012530520325526595,
-0.0076974378898739815,
-0.0535832978785038,
-0.005829489324241877,
0.06687623262405396,
-0.018273917958140373,
-0.07283646613359451,
0.031873274594545364,
0.09327492862939835,
0.022568153217434883,
-0.005245709791779518,
0.008279831148684025,
0.06749531626701355,
0.09046807885169983,
-0.14889872074127197,
-0.025875888764858246,
-0.013311668299138546,
-0.17124849557876587,
0.01226100791245699,
0.0058416929095983505,
0.0026339394971728325,
0.09085001051425934,
0.168653666973114,
0.0066648549400269985,
0.11990220099687576,
0.0183720625936985,
4.6381430252040445e-7,
-0.016563158482313156,
0.10167711973190308,
-0.11539354920387268,
-0.15108586847782135,
-0.05138713866472244,
0.09867659956216812,
0.016344353556632996,
-0.023190835490822792,
0.0008937682723626494,
-0.01740211620926857,
0.04607689753174782,
0.020832527428865433,
0.06288919597864151,
0.06915763765573502,
-0.1003924161195755,
-0.01638108119368553,
0.053478024899959564,
-0.16601328551769257,
0.03357812389731407,
0.08959757536649704,
-0.1040496975183487,
0.0087785879150033,
0.03877582773566246,
-0.05579404905438423,
-0.09466290473937988,
0.15251365303993225,
-0.04315166175365448,
-0.0374218188226223,
-0.06689231097698212,
-0.09288860112428665,
0.010926258750259876,
-0.06538297981023788,
-0.055654045194387436,
-0.015295901335775852,
-0.09728758782148361,
-0.08957333117723465,
0.0950302705168724,
-0.08386575430631638,
0.005014475435018539,
-0.05894162505865097,
-0.09849045425653458,
0.08925125747919083,
-0.03496282175183296,
0.08115537464618683,
-0.08379798382520676,
0.0371168851852417,
-0.012081163935363293,
0.07960616052150726,
0.12725043296813965,
0.03142864257097244,
-0.016067950055003166,
0.05622225999832153,
-0.14973482489585876,
[… raw float values from the `embeddings` column truncated …] |