| sha (null) | last_modified (null) | library_name (string, 154 classes) | text (string, 1–900k chars) | metadata (string, 2–348k chars) | pipeline_tag (string, 45 classes) | id (string, 5–122 chars) | tags (list, 1–1.84k items) | created_at (string, 25 chars) | arxiv (list, 0–201 items) | languages (list, 0–1.83k items) | tags_str (string, 17–9.34k chars) | text_str (string, 0–389k chars) | text_lists (list, 0–722 items) | processed_texts (list, 1–723 items) | tokens_length (list, 1–723 items) | input_texts (list, 1–61 items) | embeddings (list, 768 items) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
null | null |
transformers
|
# Wav2Vec2-Base-960h
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/)
The base model, pretrained and fine-tuned on 960 hours of Librispeech 16 kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16 kHz.
[Paper](https://arxiv.org/abs/2006.11477)
Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli
**Abstract**
We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
To transcribe audio files, the model can be used as a standalone acoustic model as follows:
```python
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torch
# load model and tokenizer
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-960h")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-960h")
# load dummy dataset and read soundfiles
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
# preprocess: normalize the raw waveform into model input values
input_values = processor(ds[0]["audio"]["array"], sampling_rate=16_000, return_tensors="pt", padding="longest").input_values  # Batch size 1
# retrieve logits
logits = model(input_values).logits
# take argmax and decode
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
```
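The dummy LibriSpeech clips above are already sampled at 16 kHz. If your own recordings use a different rate, they need to be resampled before being passed to the processor; below is a minimal sketch using the `Audio` casting feature of the `datasets` library (the `audiofolder` builder and the `path/to/clips` directory are placeholder assumptions; any dataset with an `audio` column works the same way):

```python
from datasets import load_dataset, Audio

# hypothetical local dataset of audio clips at an arbitrary sampling rate
my_ds = load_dataset("audiofolder", data_dir="path/to/clips", split="train")

# re-decode every clip at 16 kHz, the rate the model was trained on
my_ds = my_ds.cast_column("audio", Audio(sampling_rate=16_000))

input_values = processor(my_ds[0]["audio"]["array"], sampling_rate=16_000, return_tensors="pt").input_values
```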
## Evaluation
This code snippet shows how to evaluate **facebook/wav2vec2-base-960h** on LibriSpeech's "clean" and "other" test data.
```python
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import torch
from jiwer import wer
librispeech_eval = load_dataset("librispeech_asr", "clean", split="test")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-960h").to("cuda")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-960h")
def map_to_pred(batch):
    # with batched=True, batch["audio"] is a list of decoded audio dicts
    input_values = processor(
        [audio["array"] for audio in batch["audio"]],
        sampling_rate=16_000,
        return_tensors="pt",
        padding="longest",
    ).input_values
    with torch.no_grad():
        logits = model(input_values.to("cuda")).logits
    predicted_ids = torch.argmax(logits, dim=-1)
    batch["transcription"] = processor.batch_decode(predicted_ids)
    return batch
result = librispeech_eval.map(map_to_pred, batched=True, batch_size=1, remove_columns=["audio"])
print("WER:", wer(result["text"], result["transcription"]))
```
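The snippet above scores the "clean" test split; the "other" figure in the table below can presumably be reproduced by loading that configuration instead:

```python
librispeech_eval = load_dataset("librispeech_asr", "other", split="test")
```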
*Result (WER)*:
| "clean" | "other" |
|---|---|
| 3.4 | 8.6 |
|
{"language": "en", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "hf-asr-leaderboard"], "datasets": ["librispeech_asr"], "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}], "model-index": [{"name": "wav2vec2-base-960h", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "LibriSpeech (clean)", "type": "librispeech_asr", "config": "clean", "split": "test", "args": {"language": "en"}}, "metrics": [{"type": "wer", "value": 3.4, "name": "Test WER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "LibriSpeech (other)", "type": "librispeech_asr", "config": "other", "split": "test", "args": {"language": "en"}}, "metrics": [{"type": "wer", "value": 8.6, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-960h
|
[
"transformers",
"pytorch",
"tf",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"hf-asr-leaderboard",
"en",
"dataset:librispeech_asr",
"arxiv:2006.11477",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2006.11477"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #safetensors #wav2vec2 #automatic-speech-recognition #audio #hf-asr-leaderboard #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us
|
Wav2Vec2-Base-960h
==================
Facebook's Wav2Vec2
The base model pretrained and fine-tuned on 960 hours of Librispeech on 16kHz sampled speech audio. When using the model
make sure that your speech input is also sampled at 16Khz.
Paper
Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli
Abstract
We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.
The original model can be found under URL
Usage
=====
To transcribe audio files the model can be used as a standalone acoustic model as follows:
Evaluation
----------
This code snippet shows how to evaluate facebook/wav2vec2-base-960h on LibriSpeech's "clean" and "other" test data.
*Result (WER)*:
|
[] |
[
"TAGS\n#transformers #pytorch #tf #safetensors #wav2vec2 #automatic-speech-recognition #audio #hf-asr-leaderboard #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n"
] |
[
95
] |
[
"passage: TAGS\n#transformers #pytorch #tf #safetensors #wav2vec2 #automatic-speech-recognition #audio #hf-asr-leaderboard #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n"
] |
[
-0.14019905030727386,
0.08823640644550323,
-0.003466170746833086,
0.01857742667198181,
0.056414294987916946,
-0.036358267068862915,
0.09946588426828384,
0.07907144725322723,
0.0390666127204895,
-0.012946401722729206,
0.07277064770460129,
0.13242219388484955,
0.006045450456440449,
0.05041378736495972,
-0.0656772255897522,
-0.10601859539747238,
0.07872609049081802,
-0.019578471779823303,
0.007556425407528877,
0.07087083160877228,
0.11159656941890717,
-0.038483161479234695,
0.05095687881112099,
0.031306710094213486,
-0.012030384503304958,
0.031033014878630638,
0.06134192645549774,
-0.12454487383365631,
0.1087823361158371,
0.037797052413225174,
0.015216412954032421,
0.07150987535715103,
0.050368472933769226,
-0.09755824506282806,
0.030997611582279205,
0.02874802052974701,
-0.03621968626976013,
0.04854695871472359,
0.02653655596077442,
-0.023531118407845497,
0.0004948929417878389,
0.04327893629670143,
-0.04643133282661438,
0.08083764463663101,
-0.051328144967556,
-0.26074519753456116,
-0.07417257875204086,
0.11867661774158478,
0.014435213059186935,
0.07231763750314713,
0.003096114844083786,
0.1491265445947647,
-0.0928104966878891,
0.09802038222551346,
0.12040398269891739,
-0.26236987113952637,
0.048667795956134796,
-0.001575665664859116,
0.011819493025541306,
0.03144853562116623,
-0.03247227147221565,
0.03336413577198982,
0.03476765751838684,
0.006739158183336258,
0.050853680819272995,
-0.055047012865543365,
-0.23085154592990875,
0.0008219059091061354,
-0.11089765280485153,
-0.05535053834319115,
0.24785448610782623,
0.03696621209383011,
0.010506273247301579,
-0.030137471854686737,
-0.07911893725395203,
0.050299424678087234,
-0.01149536669254303,
0.0420268215239048,
-0.013545668683946133,
0.033437687903642654,
0.06523185968399048,
-0.006106784101575613,
-0.10922547429800034,
-0.08289860188961029,
-0.13490718603134155,
0.1126059889793396,
-0.02559206634759903,
0.06571492552757263,
-0.1232207641005516,
0.025801118463277817,
0.013668911531567574,
-0.128122016787529,
-0.000930260808672756,
-0.013585595414042473,
0.013288808986544609,
0.06158033385872841,
-0.013216312043368816,
0.011193854734301567,
0.1531182825565338,
0.0564303919672966,
-0.0016716414829716086,
-0.005607756786048412,
-0.056160975247621536,
0.08953778445720673,
-0.05167176574468613,
0.0767662301659584,
-0.08779400587081909,
-0.008655629120767117,
0.08443988114595413,
0.04543641582131386,
0.08131184428930283,
-0.037512652575969696,
-0.08665480464696884,
-0.029706647619605064,
0.042407725006341934,
0.04420921206474304,
0.06624743342399597,
0.025492016226053238,
-0.02597409300506115,
0.029117906466126442,
0.1316630095243454,
-0.14754542708396912,
-0.006476190406829119,
0.0595460906624794,
0.07006886601448059,
0.023586826398968697,
0.06987244635820389,
0.02214449644088745,
-0.06081707403063774,
-0.003119420725852251,
-0.012803708203136921,
-0.016442997381091118,
0.05104053020477295,
-0.040690306574106216,
0.06412268429994583,
-0.050827283412218094,
0.027283649891614914,
-0.16572795808315277,
-0.06456132233142853,
0.01159487571567297,
0.0026186692994087934,
0.030637672170996666,
-0.06244434416294098,
0.038353949785232544,
-0.07088928669691086,
0.05341636389493942,
-0.11219366639852524,
0.04431086778640747,
-0.08661288768053055,
0.06544718891382217,
0.0245992299169302,
0.08875284343957901,
-0.14635387063026428,
0.07122142612934113,
-0.07447691261768341,
-0.00963122770190239,
-0.028117505833506584,
0.04734376072883606,
-0.12199778854846954,
0.08037631958723068,
-0.06475789099931717,
-0.01409110240638256,
-0.11875434964895248,
0.03668297454714775,
-0.01767978072166443,
0.06653343141078949,
-0.15466180443763733,
-0.09218709170818329,
0.14938609302043915,
-0.12941506505012512,
-0.15541736781597137,
0.11831460148096085,
0.053396254777908325,
-0.01987934671342373,
0.049914196133613586,
0.2519778311252594,
0.03262060508131981,
-0.12898480892181396,
-0.028212355449795723,
0.10493644326925278,
-0.08165744692087173,
-0.12197131663560867,
0.06376899033784866,
-0.0557810477912426,
-0.0024717780761420727,
0.0028423680923879147,
-0.03164108470082283,
0.09036272019147873,
0.03145886957645416,
-0.09336747974157333,
-0.07378026843070984,
-0.11124937236309052,
-0.012807906605303288,
-0.008948166854679585,
-0.002068441826850176,
-0.018055807799100876,
-0.02563568390905857,
-0.06779281795024872,
0.06405431777238846,
-0.009126810356974602,
0.03656129539012909,
-0.09938640892505646,
0.11497580260038376,
-0.05254950746893883,
0.023766279220581055,
-0.14535601437091827,
0.1099332645535469,
-0.06716758757829666,
-0.014101742766797543,
0.04297797381877899,
0.05543016642332077,
0.07895753532648087,
-0.04590562731027603,
-0.017041541635990143,
-0.050415486097335815,
0.10753938555717468,
0.08388318866491318,
0.032981984317302704,
-0.2036411166191101,
0.02974899299442768,
-0.05815321207046509,
0.0964268296957016,
-0.03745659813284874,
-0.020768126472830772,
0.10879115760326385,
0.0915476456284523,
-0.0055186026729643345,
0.05642690509557724,
0.04899085313081741,
-0.025097286328673363,
0.00615399656817317,
-0.014635978266596794,
0.04181522876024246,
0.007446381729096174,
-0.054428763687610626,
0.21198873221874237,
-0.17294752597808838,
0.25927144289016724,
0.22315658628940582,
-0.056028660386800766,
0.06588253378868103,
0.13127508759498596,
-0.007047026418149471,
-0.010272148996591568,
0.04051542282104492,
-0.059644561260938644,
0.07531019300222397,
-0.031197257339954376,
0.12447647750377655,
-0.07981956005096436,
-0.0026075232308357954,
0.021993981674313545,
-0.0416276678442955,
0.009712010622024536,
0.0921788141131401,
-0.0321941003203392,
-0.14332900941371918,
0.11599285155534744,
0.22277218103408813,
-0.07629242539405823,
0.15821091830730438,
-0.07392741739749908,
-0.07011669129133224,
0.08010745793581009,
-0.0132266441360116,
-0.022560223937034607,
0.1366383582353592,
-0.1513824760913849,
-0.014133546501398087,
0.09282223880290985,
-0.011567745357751846,
0.030687430873513222,
-0.16151486337184906,
0.006934289820492268,
-0.015080736018717289,
-0.0714118629693985,
-0.16561190783977509,
0.08776617795228958,
-0.018627263605594635,
0.10173351317644119,
-0.09227720648050308,
-0.21508191525936127,
0.05678906664252281,
-0.03379787132143974,
-0.1075756773352623,
0.07014153152704239,
-0.08252701908349991,
-0.233488529920578,
-0.09569749981164932,
-0.03906705975532532,
-0.0376841202378273,
0.011524935252964497,
0.11416520178318024,
-0.07887022197246552,
-0.03492965176701546,
-0.05589018389582634,
-0.043863680213689804,
0.045908018946647644,
0.008081632666289806,
0.08938104659318924,
0.023908773437142372,
0.10640958696603775,
-0.14958493411540985,
-0.009178685024380684,
-0.023137135431170464,
0.03422253206372261,
0.056104063987731934,
0.05280023068189621,
0.07754034548997879,
0.14555805921554565,
0.050770267844200134,
0.017675278708338737,
0.007934884168207645,
0.15059055387973785,
-0.0825953558087349,
0.008334649726748466,
0.17835374176502228,
-0.0504286028444767,
0.017825545743107796,
0.17409725487232208,
0.038874056190252304,
-0.010325118899345398,
-0.028918510302901268,
-0.04088935628533363,
-0.08109907805919647,
-0.19512446224689484,
-0.11044841259717941,
-0.11150912195444107,
-0.032644979655742645,
-0.0013175108470022678,
0.08746795356273651,
0.043324895203113556,
0.00135748868342489,
-0.011782605201005936,
-0.06426740437746048,
0.026184389367699623,
-0.034607306122779846,
0.18417838215827942,
-0.030363552272319794,
0.0976235643029213,
-0.11403923481702805,
-0.04840165376663208,
0.0755864828824997,
0.09398612380027771,
0.027432167902588844,
0.08527448773384094,
0.009049023501574993,
0.045048851519823074,
0.15875907242298126,
0.07044633477926254,
0.13037407398223877,
0.0032004364766180515,
-0.023834113031625748,
0.00825861468911171,
-0.09319721907377243,
-0.00801240373402834,
0.08193832635879517,
0.050538551062345505,
-0.046993013471364975,
0.011395200155675411,
-0.05506487563252449,
0.047440286725759506,
0.17580920457839966,
0.09763815253973007,
-0.18973809480667114,
0.006711551453918219,
0.0337991863489151,
-0.045061737298965454,
-0.00916915014386177,
0.06497208774089813,
0.025014011189341545,
0.008245960809290409,
0.1036495566368103,
0.051214054226875305,
0.06842564791440964,
0.010520786046981812,
0.03898253291845322,
-0.08096018433570862,
-0.028879543766379356,
0.036689095199108124,
0.04628251865506172,
-0.2323175072669983,
0.24873340129852295,
0.036209892481565475,
0.046948716044425964,
-0.010931882075965405,
-0.0005871380562894046,
0.09745761752128601,
0.11731398105621338,
0.14643190801143646,
0.03178953379392624,
-0.04661093279719353,
-0.04269851744174957,
-0.08717244863510132,
0.06418267637491226,
0.003221773076802492,
0.10205104202032089,
-0.05252077430486679,
-0.033016350120306015,
-0.04359353333711624,
0.05449211597442627,
-0.0006323498673737049,
-0.14139722287654877,
-0.078850656747818,
0.03453100100159645,
0.2916620969772339,
0.04868017137050629,
-0.05267465114593506,
-0.06356770545244217,
-0.1900843232870102,
-0.018424587324261665,
-0.13163843750953674,
-0.0018616055604070425,
-0.07050243765115738,
-0.16947928071022034,
0.09912029653787613,
-0.020717274397611618,
0.04178934544324875,
-0.022034475579857826,
-0.006831753067672253,
-0.02445073239505291,
-0.1476871818304062,
0.10776430368423462,
-0.129242405295372,
-0.0776461660861969,
0.005601936485618353,
0.19424358010292053,
-0.04668959230184555,
0.055758822709321976,
0.03849119693040848,
0.040216825902462006,
-0.0668775886297226,
-0.063800148665905,
0.1226796880364418,
0.051133498549461365,
-0.06071294844150543,
0.02774941362440586,
-0.033928826451301575,
-0.21935704350471497,
0.02141675166785717,
-0.017011767253279686,
0.17652525007724762,
0.1697796732187271,
-0.0644197165966034,
0.1770894080400467,
0.23494793474674225,
-0.005089063663035631,
-0.31167975068092346,
-0.14725522696971893,
-0.08910327404737473,
-0.008248805068433285,
-0.0018972313264384866,
-0.07195399701595306,
0.12305280566215515,
-0.0695529580116272,
-0.11342960596084595,
0.07289837300777435,
-0.12114692479372025,
-0.0945102795958519,
0.3055291473865509,
-0.10193789750337601,
0.21934297680854797,
-0.12150853872299194,
-0.053110480308532715,
-0.0809190571308136,
-0.17752809822559357,
0.06603074818849564,
-0.2247803658246994,
0.07873599976301193,
0.003205522196367383,
0.023929061368107796,
-0.017812099307775497,
-0.044358447194099426,
0.09346576035022736,
0.05134580656886101,
-0.013532193377614021,
-0.039332613348960876,
0.05241282284259796,
0.08897308260202408,
0.001473572221584618,
0.12969957292079926,
-0.122565858066082,
0.0495140366256237,
-0.05303399637341499,
-0.004372310824692249,
-0.1095280721783638,
0.0970233678817749,
0.061414048075675964,
-0.004440706223249435,
0.02766077034175396,
-0.058038923889398575,
0.010235981084406376,
-0.012517057359218597,
0.13890889286994934,
-0.0735759362578392,
0.023425502702593803,
0.20431669056415558,
0.15245641767978668,
-0.22505146265029907,
-0.06541041284799576,
-0.03559476509690285,
-0.06618500500917435,
0.10774264484643936,
-0.10095009952783585,
0.1334838718175888,
0.023988215252757072,
0.047614000737667084,
0.04667166620492935,
0.07255958020687103,
-0.051814787089824677,
-0.03685656562447548,
0.12202223390340805,
-0.12448175996541977,
-0.08551391214132309,
0.004760940093547106,
0.057396117597818375,
0.02389032021164894,
0.10917820781469345,
0.14277176558971405,
-0.03164058178663254,
0.011583046056330204,
0.00634760269895196,
0.03550403192639351,
-0.13513445854187012,
0.11290737986564636,
0.1291426718235016,
0.05460872873663902,
-0.15644097328186035,
0.10129109025001526,
-0.03240901604294777,
-0.06015715003013611,
0.027705322951078415,
0.002136991824954748,
-0.09017565101385117,
-0.12992148101329803,
-0.06993185728788376,
0.018155116587877274,
-0.02581728808581829,
-0.14551064372062683,
-0.03116091899573803,
-0.12275300174951553,
0.014591998420655727,
0.13991235196590424,
0.053713925182819366,
0.05608602985739708,
-0.02533796615898609,
-0.0925578773021698,
0.016843734309077263,
0.03837394714355469,
-0.05919329449534416,
0.013290745206177235,
-0.18565881252288818,
-0.05818909406661987,
-0.004512756131589413,
0.0513082891702652,
-0.07362329959869385,
0.002174974186345935,
-0.0552467480301857,
0.045086443424224854,
-0.09350670874118805,
-0.012688184157013893,
-0.05516257882118225,
0.03204509615898132,
0.021767903119325638,
-0.09828221052885056,
-0.026477113366127014,
0.07968199998140335,
-0.10591194033622742,
-0.008341755717992783,
0.014615008607506752,
0.08105853945016861,
-0.15880513191223145,
-0.018501244485378265,
0.009077243506908417,
-0.02027415670454502,
0.10574422776699066,
0.1245032250881195,
-0.15720754861831665,
0.08351004123687744,
-0.2688552141189575,
-0.19615384936332703,
0.1391790509223938,
0.047519128769636154,
0.02412872202694416,
-0.07209242880344391,
-0.02357456460595131,
0.13004140555858612,
0.035370513796806335,
0.016498113051056862,
0.11243166029453278,
-0.04959467053413391,
-0.009042415767908096,
-0.1333898901939392,
-0.033920902758836746,
-0.007637788541615009,
-0.02901701256632805,
0.18175508081912994,
0.06910382211208344,
0.16270263493061066,
-0.040312763303518295,
-0.037036873400211334,
-0.10879722982645035,
0.037058018147945404,
-0.04644548147916794,
-0.15651935338974,
-0.16722548007965088,
0.009132307022809982,
0.02401420846581459,
-0.04984186589717865,
0.20394951105117798,
-0.01655355840921402,
-0.09464345127344131,
0.045886121690273285,
0.011084884405136108,
-0.019333496689796448,
0.004918643739074469,
0.2720911502838135,
0.019641203805804253,
-0.004610123112797737,
-0.025245824828743935,
-0.04208189621567726,
0.03363965079188347,
0.08092338591814041,
0.011685016565024853,
0.14688917994499207,
0.039959460496902466,
0.0969562903046608,
0.14318643510341644,
-0.09498801082372665,
-0.04587161913514137,
-0.009752066805958748,
-0.06815849989652634,
0.09952834248542786,
-0.05290902405977249,
0.07800545543432236,
0.1854114532470703,
0.031343430280685425,
0.05051742494106293,
-0.0894639790058136,
-0.0030398033559322357,
-0.16689714789390564,
-0.07412783056497574,
-0.06577390432357788,
-0.1499749720096588,
0.0023929874878376722,
-0.021295588463544846,
0.01351268868893385,
0.09835948050022125,
0.017387473955750465,
0.012840569950640202,
0.07702645659446716,
0.008713815361261368,
-0.0141759580001235,
0.05953492224216461,
-0.0324915312230587,
-0.04482387751340866,
-0.023818137124180794,
-0.01134231686592102,
0.10613935440778732,
-0.003911645617336035,
-0.03079327940940857,
-0.007788407150655985,
-0.08633267134428024,
0.08707385510206223,
-0.10968257486820221,
-0.07107260823249817,
-0.022333722561597824,
0.013797972351312637,
0.039191026240587234,
0.10982857644557953,
0.08131781220436096,
-0.021963009610772133,
0.06838806718587875,
0.1593082845211029,
-0.0812973901629448,
-0.1846536546945572,
-0.06716939061880112,
0.10331398248672485,
-0.04070046544075012,
0.054501619189977646,
-0.032552946358919144,
-0.060754116624593735,
-0.013080018572509289,
0.18812508881092072,
0.24745860695838928,
-0.06131580099463463,
0.07128166407346725,
-0.10078106820583344,
0.030486468225717545,
-0.06015457585453987,
0.004086195956915617,
0.17532390356063843,
0.21120697259902954,
-0.017733855172991753,
-0.051787637174129486,
-0.05512569099664688,
-0.041577037423849106,
-0.05666060000658035,
0.0739278718829155,
-0.029238274320960045,
-0.10006628185510635,
-0.014059768989682198,
0.08609231561422348,
-0.05201055854558945,
-0.04544348642230034,
-0.13358120620250702,
-0.09305989742279053,
-0.034924644976854324,
-0.0221288800239563,
0.1543291062116623,
0.10581326484680176,
-0.01872946321964264,
-0.07780298590660095,
-0.012783038429915905,
0.02464575320482254,
-0.012257560156285763,
-0.172241672873497,
0.022319253534078598,
0.007292294874787331,
-0.12282342463731766,
0.06639974564313889,
-0.0063707041554152966,
0.11291185021400452,
0.04148019477725029,
0.08466613292694092,
-0.06650616228580475,
0.14758317172527313,
-0.0014212351525202394,
-0.13384795188903809,
0.004731451626867056,
0.06570639461278915,
0.008801095187664032,
-0.014903816394507885,
0.030416296795010567,
-0.05924321711063385,
0.04495156928896904,
0.0008613734971731901,
-0.0855143666267395,
-0.061654552817344666,
-0.0026802788488566875,
-0.03376402705907822,
0.0651540532708168,
-0.02246820367872715,
-0.04582398757338524,
-0.013252063654363155,
-0.06344684958457947,
0.01664154976606369,
0.04853703826665878,
-0.18889155983924866,
-0.05841784551739693,
-0.05985235422849655,
0.0150965740904212,
-0.08745373785495758,
-0.015063509345054626,
-0.11101590842008591,
-0.04418070614337921,
-0.10690978914499283,
-0.027067964896559715,
-0.07146140187978745,
-0.002890702337026596,
0.11111016571521759,
0.02473466843366623,
0.006415280979126692,
-0.004504145588725805,
0.06440769135951996,
0.06715293973684311,
-0.12908290326595306,
-0.09594689309597015
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only on **bg** (Bulgarian), using **17.6k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16 kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16 kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **bg**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
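As a rough sketch of the starting point described above, the checkpoint can be loaded together with a freshly created CTC tokenizer before fine-tuning; the `vocab.json` file, the special tokens, and the keyword arguments below are illustrative assumptions following the linked fine-tuning blog's general recipe, not part of this card:

```python
from transformers import (
    Wav2Vec2CTCTokenizer,
    Wav2Vec2FeatureExtractor,
    Wav2Vec2Processor,
    Wav2Vec2ForCTC,
)

# hypothetical vocab.json built from the characters of a labeled bg corpus
tokenizer = Wav2Vec2CTCTokenizer(
    "vocab.json", unk_token="[UNK]", pad_token="[PAD]", word_delimiter_token="|"
)
feature_extractor = Wav2Vec2FeatureExtractor(
    feature_size=1, sampling_rate=16_000, padding_value=0.0, do_normalize=True
)
processor = Wav2Vec2Processor(feature_extractor=feature_extractor, tokenizer=tokenizer)

# load the pretrained checkpoint with a CTC head sized to the new vocabulary
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-base-bg-voxpopuli-v2",
    ctc_loss_reduction="mean",
    pad_token_id=processor.tokenizer.pad_token_id,
    vocab_size=len(processor.tokenizer),
)
model.freeze_feature_encoder()  # the convolutional feature encoder is usually kept frozen
```

From there, the usual CTC fine-tuning loop (e.g. with `Trainer`) applies, as described in the blog post.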
|
{"language": "bg", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-bg-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"bg",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"bg"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #bg #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only in bg on 17.6k unlabeled datat of the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in bg. Check out this blog for a more in-detail explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in bg on 17.6k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in bg. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #bg #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in bg on 17.6k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in bg. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
253
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #bg #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in bg on 17.6k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in bg. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07509108632802963,
0.11017512530088425,
-0.0030557040590792894,
0.011806764639914036,
0.07095559686422348,
-0.057480525225400925,
0.12747657299041748,
0.04444558173418045,
0.012957352213561535,
0.09712294489145279,
-0.018153540790081024,
-0.05070485174655914,
0.06442093849182129,
0.15111549198627472,
0.05999103561043739,
-0.25117027759552,
0.02506123296916485,
-0.06198801472783089,
0.050216373056173325,
0.05409872159361839,
0.11571323126554489,
-0.08383864164352417,
0.023163383826613426,
0.04423242434859276,
-0.042587220668792725,
0.03767280653119087,
-0.04635098576545715,
-0.08080384135246277,
0.053293000906705856,
0.044530078768730164,
-0.03785207122564316,
0.029007824137806892,
0.07573941349983215,
-0.17871229350566864,
0.03817221522331238,
0.030166426673531532,
0.02261492796242237,
0.007261655759066343,
0.09863777458667755,
0.018807969987392426,
0.17245694994926453,
-0.025320861488580704,
-0.007114351727068424,
0.0813116729259491,
-0.0643121749162674,
-0.10633894056081772,
-0.05914587900042534,
0.1743447184562683,
0.0845341756939888,
0.09945714473724365,
-0.08450470119714737,
0.07595829665660858,
-0.02553449384868145,
0.05053356662392616,
0.09085891395807266,
-0.1846846640110016,
-0.04661198705434799,
0.06673651933670044,
0.11038671433925629,
0.027848195284605026,
-0.09363728761672974,
0.07395030558109283,
0.039054736495018005,
-0.016652829945087433,
-0.05537727102637291,
-0.03581862896680832,
0.1227564737200737,
-0.09464912861585617,
-0.11354507505893707,
0.0014361909125000238,
0.17816725373268127,
0.05265016108751297,
-0.06838198751211166,
-0.1451476365327835,
0.012533373199403286,
0.1989794820547104,
-0.05032854899764061,
-0.08955632895231247,
0.008005548268556595,
0.022118432447314262,
0.030317755416035652,
-0.06327839195728302,
-0.06525088101625443,
-0.0037534446455538273,
0.020440518856048584,
0.10062175244092941,
0.006870859768241644,
-0.022756949067115784,
-0.0758112221956253,
0.009472591802477837,
-0.09382116049528122,
-0.12589770555496216,
-0.020446568727493286,
-0.07305822521448135,
-0.06755508482456207,
-0.033242080360651016,
0.0002754654851742089,
-0.08567079901695251,
0.03144819289445877,
0.10136707872152328,
0.06707208603620529,
0.046076126396656036,
-0.06789081543684006,
-0.03430796414613724,
0.11998036503791809,
0.062116675078868866,
-0.11563561111688614,
-0.029247397556900978,
0.03209676221013069,
-0.023835355415940285,
0.01266883872449398,
-0.02919749543070793,
-0.03727772831916809,
0.009496603161096573,
-0.019506379961967468,
0.05632606893777847,
0.0440034382045269,
-0.03283464163541794,
-0.043640706688165665,
-0.10075554996728897,
0.10702003538608551,
-0.08083263039588928,
0.02153044193983078,
0.04943780228495598,
-0.008242118172347546,
0.0676942840218544,
-0.05286438763141632,
0.06272206455469131,
-0.10835645347833633,
0.009926680475473404,
-0.028412021696567535,
-0.015827592462301254,
0.030320314690470695,
-0.020286336541175842,
0.03525406867265701,
0.0019553671590983868,
0.014395484700798988,
-0.11611045897006989,
-0.011910926550626755,
-0.0826832726597786,
-0.0329161137342453,
-0.07990504801273346,
-0.04145507141947746,
-0.04456697404384613,
0.01961684226989746,
-0.009342951700091362,
-0.011829257011413574,
0.010571187362074852,
-0.02079441398382187,
0.005848057102411985,
0.018320366740226746,
0.05416501313447952,
0.06130010262131691,
0.0790405124425888,
-0.03167088329792023,
-0.01643688790500164,
-0.09732651710510254,
0.1227845773100853,
-0.08165115118026733,
-0.027062993496656418,
-0.13651664555072784,
-0.02776983007788658,
-0.037454232573509216,
0.027809016406536102,
0.01500774547457695,
0.12542854249477386,
-0.16488637030124664,
-0.08340544253587723,
0.14440716803073883,
-0.11125637590885162,
-0.0002561220608185977,
0.17192739248275757,
-0.0038662711158394814,
0.07196474820375443,
0.1049451008439064,
0.21090763807296753,
0.012712717987596989,
-0.1581762433052063,
-0.02621738612651825,
-0.058492064476013184,
0.039417337626218796,
0.12854637205600739,
0.060862064361572266,
-0.05219833552837372,
0.07249944657087326,
-0.014007735066115856,
-0.024996710941195488,
-0.06428693979978561,
0.006517548114061356,
-0.045860324054956436,
0.01100639346987009,
-0.04465482756495476,
0.0395369827747345,
-0.0097946273162961,
-0.01609908603131771,
-0.014954814687371254,
-0.09589335322380066,
-0.05278230831027031,
0.127802774310112,
-0.06056582182645798,
0.01968226209282875,
-0.09173928201198578,
0.06830055266618729,
0.044481467455625534,
0.012880470603704453,
-0.12439554929733276,
0.11119531840085983,
0.034873075783252716,
-0.05098036676645279,
0.1469862014055252,
0.076316699385643,
-0.024772707372903824,
0.00040479988092556596,
-0.02587324194610119,
0.0191881712526083,
-0.022983161732554436,
0.005669769365340471,
-0.020756324753165245,
-0.10588251054286957,
0.005338371731340885,
-0.06946913152933121,
0.10720761120319366,
-0.14437901973724365,
-0.01672767847776413,
0.060934074223041534,
0.11864333599805832,
-0.006577009800821543,
-0.04476546496152878,
0.09851720929145813,
0.02876463532447815,
0.02654324285686016,
-0.01739044301211834,
0.01930829882621765,
-0.025213079527020454,
-0.007470522075891495,
0.054241202771663666,
-0.13783252239227295,
-0.1553819626569748,
0.10188550502061844,
0.04023626074194908,
-0.010207238607108593,
0.05186128616333008,
0.023318296298384666,
-0.023091519251465797,
-0.03483070060610771,
0.007457422092556953,
0.21968182921409607,
-0.009528200142085552,
0.06756291538476944,
-0.09310504794120789,
-0.02964744158089161,
0.011486946605145931,
-0.03821812942624092,
-0.07911494374275208,
0.08339958637952805,
-0.013816825114190578,
-0.10222805291414261,
-0.022683728486299515,
0.05962471663951874,
0.07918625324964523,
0.18859995901584625,
0.00813991017639637,
-0.09063378721475601,
-0.025658924132585526,
-0.05640976130962372,
-0.015383691526949406,
0.047381442040205,
-0.15282118320465088,
-0.022353166714310646,
0.029333166778087616,
-0.0018990356475114822,
0.04347783699631691,
-0.013336473144590855,
0.03379146754741669,
0.004132253583520651,
-0.05357865244150162,
-0.07096182554960251,
0.05044092983007431,
-0.03159121796488762,
0.035380661487579346,
-0.001975449500605464,
0.022635191679000854,
-0.037306442856788635,
-0.05581071972846985,
-0.13388799130916595,
0.08358078449964523,
-0.06831233203411102,
-0.31854745745658875,
-0.08536428958177567,
-0.062383972108364105,
-0.04319782555103302,
0.013225346803665161,
0.05432915687561035,
-0.11646132916212082,
-0.10667723417282104,
-0.06137002632021904,
0.12629859149456024,
-0.009627370163798332,
-0.06519008427858353,
0.11135387420654297,
-0.003642455441877246,
0.022474754601716995,
-0.1058289110660553,
0.023901795968413353,
-0.03364327549934387,
-0.031123625114560127,
-0.029931694269180298,
0.021869981661438942,
0.050338294357061386,
0.12824279069900513,
0.029035845771431923,
-0.007223037071526051,
0.013197371736168861,
0.21832804381847382,
-0.14448243379592896,
0.07516162097454071,
0.2174246609210968,
-0.057921528816223145,
-0.011369193904101849,
0.14699259400367737,
-0.00620661024004221,
-0.05389707535505295,
0.05171132832765579,
0.004205277189612389,
-0.015584457665681839,
-0.23347580432891846,
-0.12408451735973358,
-0.047520119696855545,
-0.027236051857471466,
0.0441318079829216,
0.03419266641139984,
-0.0038034662138670683,
0.012551414780318737,
-0.09224507957696915,
-0.03899281099438667,
0.056969210505485535,
0.04341192916035652,
0.14139172434806824,
0.012327014468610287,
0.05577470734715462,
-0.039118196815252304,
-0.014919684268534184,
0.11003077775239944,
-0.025223538279533386,
0.053035326302051544,
0.06091785803437233,
0.1032734289765358,
0.058876778930425644,
0.024437012150883675,
0.06199978291988373,
-0.028947310522198677,
-0.027137616649270058,
-0.005086097400635481,
-0.026494767516851425,
-0.06999877095222473,
0.02208464778959751,
0.041024547070264816,
0.12932036817073822,
-0.11891209334135056,
-0.1017308309674263,
0.0014142588479444385,
0.02484758198261261,
0.10733482986688614,
0.1002642884850502,
-0.01466958224773407,
-0.11480798572301865,
0.0443648062646389,
-0.10108072310686111,
-0.032103344798088074,
0.04204666242003441,
0.08586788922548294,
-0.16049368679523468,
0.08487281203269958,
0.07513744384050369,
0.08504456281661987,
-0.03630330413579941,
0.02739538438618183,
-0.06489494442939758,
0.05952237546443939,
-0.0013527284609153867,
0.06877578049898148,
-0.1725686937570572,
0.1060226708650589,
0.02082914672791958,
0.07909315079450607,
-0.06303233653306961,
0.020042868331074715,
0.04794556647539139,
0.00046811331412754953,
0.12802788615226746,
-0.006862352602183819,
-0.07880280166864395,
0.0021747706923633814,
-0.10520915687084198,
0.014887906610965729,
0.05630026012659073,
-0.06899617612361908,
0.052144743502140045,
0.002948024310171604,
-0.014779259450733662,
-0.03658460080623627,
-0.017347730696201324,
-0.2370515912771225,
-0.1374320685863495,
0.04758874699473381,
-0.008120867423713207,
0.06014443188905716,
-0.0398569218814373,
-0.07565871626138687,
-0.107023686170578,
0.11299946159124374,
-0.013212437741458416,
-0.027642464265227318,
-0.0659249797463417,
0.03287694975733757,
0.09854186326265335,
-0.06537985801696777,
0.029907315969467163,
0.040593214333057404,
0.13797658681869507,
-0.06380581855773926,
-0.050076816231012344,
0.02873147651553154,
-0.09350167959928513,
-0.13492843508720398,
0.007916910573840141,
0.18884848058223724,
0.12360508739948273,
0.05753529444336891,
0.08180316537618637,
0.015093524940311909,
0.0025766808539628983,
-0.08980000019073486,
0.014114114455878735,
0.030086880549788475,
-0.07813922315835953,
0.04620129615068436,
-0.010779534466564655,
-0.2643747925758362,
-0.14785240590572357,
-0.07899319380521774,
0.07925666868686676,
0.19981801509857178,
-0.02564997784793377,
0.18619976937770844,
0.27942338585853577,
-0.08434288948774338,
-0.21239176392555237,
-0.04491715133190155,
-0.002752574859187007,
0.025856653228402138,
0.0577780082821846,
-0.2108488529920578,
0.10603117942810059,
-0.0004895458114333451,
0.013656130991876125,
-0.033287834376096725,
-0.20986983180046082,
-0.1415545791387558,
0.16707640886306763,
-0.03155425190925598,
0.05321848392486572,
-0.03354456648230553,
-0.07683642953634262,
-0.027461828663945198,
-0.047065235674381256,
0.017424624413251877,
-0.08203800767660141,
0.08850996196269989,
0.039863962680101395,
0.019934343174099922,
0.023595426231622696,
0.01673346944153309,
0.1019517183303833,
0.07786866277456284,
-0.022517690435051918,
-0.088071808218956,
0.024447305127978325,
0.01707903854548931,
-0.013141704723238945,
0.08915545046329498,
0.030999403446912766,
0.01653018221259117,
-0.05081097409129143,
-0.08963717520236969,
-0.0773928090929985,
0.0640232264995575,
-0.06573421508073807,
-0.025318467989563942,
-0.06456567347049713,
0.0982476994395256,
0.011883983388543129,
0.00013003233470954,
-0.06555769592523575,
-0.10028226673603058,
-0.020688582211732864,
0.11635079234838486,
0.21646280586719513,
-0.04258173331618309,
0.006717876065522432,
-0.04503816366195679,
-0.04481518268585205,
0.04855112358927727,
-0.032647754997015,
0.05097559839487076,
0.048221465200185776,
0.03173639252781868,
0.08137687295675278,
-0.033480554819107056,
-0.1410405933856964,
0.03167804330587387,
0.04419172555208206,
-0.0804990604519844,
-0.16696149110794067,
-0.05097796767950058,
-0.0096884211525321,
-0.0153061393648386,
-0.02905961312353611,
0.19690650701522827,
-0.012791985645890236,
-0.06698738038539886,
0.012636320665478706,
0.057725679129362106,
-0.013105524703860283,
0.12302513420581818,
0.04712659493088722,
0.0402100645005703,
-0.08918710052967072,
0.052652642130851746,
0.11989985406398773,
-0.029157035052776337,
0.047483790665864944,
0.11156992614269257,
-0.04820948839187622,
-0.05521773546934128,
-0.10390947759151459,
0.00047497020568698645,
0.05718306452035904,
-0.03500916436314583,
0.007738869171589613,
-0.08922015130519867,
0.012966293841600418,
0.037907637655735016,
0.009907564148306847,
-0.04415104538202286,
-0.04427450895309448,
0.005318128503859043,
-0.08797662705183029,
0.08447997272014618,
0.10510557889938354,
-0.025091396644711494,
-0.11276423931121826,
0.08336400985717773,
0.010802130214869976,
0.07607424259185791,
-0.04157528653740883,
-0.06784180551767349,
-0.10218094289302826,
-0.007351825945079327,
-0.09940407425165176,
0.029374491423368454,
-0.14337463676929474,
-0.006055303383618593,
-0.0575115941464901,
-0.032648321241140366,
-0.008116981945931911,
0.06888611614704132,
-0.033612530678510666,
0.007150554563850164,
-0.023621033877134323,
0.08679487556219101,
-0.13299980759620667,
0.06378162652254105,
0.055890701711177826,
-0.04394763335585594,
0.12653352320194244,
0.021967604756355286,
-0.043791018426418304,
0.043048541992902756,
-0.22530122101306915,
-0.057036906480789185,
-0.02642916515469551,
0.04134746268391609,
-0.013464262709021568,
-0.1711338311433792,
0.007633678615093231,
0.016176773235201836,
0.017937537282705307,
-0.01795630343258381,
0.05907151475548744,
-0.0347217358648777,
-0.0326794758439064,
-0.06858908385038376,
-0.052009887993335724,
-0.042242083698511124,
0.06630953401327133,
0.06871660053730011,
0.009265003725886345,
0.09172271192073822,
-0.09276338666677475,
0.06469642370939255,
-0.07728438824415207,
0.019952161237597466,
-0.0291049312800169,
0.006762735079973936,
-0.06581311672925949,
-0.06769803166389465,
0.06806427240371704,
-0.015574897639453411,
0.08872295916080475,
0.011333348229527473,
-0.04004262387752533,
0.0445706769824028,
-0.05691581219434738,
-0.05817599967122078,
0.031417228281497955,
0.15995891392230988,
0.048719435930252075,
0.014771407470107079,
-0.006447033490985632,
-0.034963496029376984,
-0.00032561836997047067,
0.14501352608203888,
0.1305842250585556,
0.16712206602096558,
0.09115133434534073,
0.02298322506248951,
0.0760263055562973,
-0.04926062375307083,
-0.08357710391283035,
0.07746508717536926,
-0.07256738841533661,
0.03720899671316147,
-0.05599701777100563,
-0.04506240412592888,
0.07527823746204376,
-0.13760481774806976,
0.07325651496648788,
-0.01731698401272297,
-0.0800420269370079,
-0.09691430628299713,
-0.15083855390548706,
-0.060298189520835876,
-0.0536571741104126,
0.00892591755837202,
-0.11008358001708984,
0.03847109153866768,
0.001950837322510779,
0.03872128576040268,
-0.10361813753843307,
0.11622919142246246,
-0.09502887725830078,
-0.1314297914505005,
0.15616652369499207,
-0.03381100296974182,
-0.022558417171239853,
0.004750330466777086,
0.05498672276735306,
0.0194815956056118,
0.09141229093074799,
0.03811754658818245,
0.04723957180976868,
0.018243663012981415,
0.00892613735049963,
-0.10534758120775223,
-0.06525550037622452,
0.030599895864725113,
0.003176385536789894,
0.09355489164590836,
0.19992119073867798,
0.08220518380403519,
-0.0824434831738472,
0.014633845537900925,
0.15346617996692657,
0.027315979823470116,
-0.10969105362892151,
-0.13759079575538635,
0.03443986549973488,
-0.03873535990715027,
-0.0050139534287154675,
-0.0017937885131686926,
-0.09989726543426514,
0.017107438296079636,
0.20450501143932343,
0.16306108236312866,
-0.03246216103434563,
0.029137397184967995,
-0.008952013216912746,
0.012751935981214046,
0.01644331030547619,
0.07820746302604675,
0.08332614600658417,
0.1871892660856247,
-0.0037734664510935545,
0.045737072825431824,
-0.00667769368737936,
-0.09365643560886383,
-0.10376279056072235,
0.09996306151151657,
0.002281918190419674,
-0.04316258057951927,
-0.014895476400852203,
0.16950131952762604,
-0.12059921026229858,
-0.2124476581811905,
-0.11863663047552109,
-0.0500517375767231,
-0.11672881245613098,
0.013317827135324478,
-0.05823352187871933,
0.13472269475460052,
0.033950332552194595,
0.006345350295305252,
0.009222854860126972,
0.17742864787578583,
0.040794335305690765,
0.021691743284463882,
-0.034188445657491684,
0.11406038701534271,
-0.09599635750055313,
0.10680653899908066,
-0.008133177645504475,
0.047013863921165466,
0.038164108991622925,
0.03878585994243622,
-0.07060530036687851,
0.024258466437458992,
0.038994189351797104,
-0.018841225653886795,
0.04587177932262421,
0.19612066447734833,
-0.005399990826845169,
0.09847357124090195,
0.10691116750240326,
-0.05310983955860138,
0.023147892206907272,
-0.0003923587210010737,
0.008720156736671925,
-0.07200377434492111,
0.15718171000480652,
-0.14813566207885742,
0.13027334213256836,
0.0979391410946846,
-0.07358608394861221,
-0.0452575609087944,
-0.02375226840376854,
0.06029048562049866,
-0.05042305588722229,
0.0971071794629097,
-0.005833001807332039,
-0.16563132405281067,
0.03454530984163284,
-0.09144110232591629,
0.06564311683177948,
-0.2751699388027191,
-0.03800269216299057,
-0.04592585191130638,
-0.01274710800498724,
-0.001176829799078405,
0.13057564198970795,
0.08741692453622818,
-0.0420585460960865,
-0.006378280930221081,
-0.05662249028682709,
0.006464900448918343,
0.08057883381843567,
-0.09951788932085037,
-0.01985354721546173
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only on **cs** (Czech), using **18.7k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16 kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16 kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **cs**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
|
{"language": "cs", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-cs-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"cs",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"cs"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #cs #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only in cs on 18.7k unlabeled datat of the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in cs. Check out this blog for a more in-detail explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in cs on 18.7k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in cs. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #cs #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in cs on 18.7k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in cs. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
251
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #cs #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in cs on 18.7k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in cs. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.08272025734186172,
0.11085419356822968,
-0.002602613065391779,
0.003012516535818577,
0.07910037040710449,
-0.04796503111720085,
0.13602404296398163,
0.042111124843358994,
0.00839604064822197,
0.10294385999441147,
-0.01203513890504837,
-0.053884997963905334,
0.07542839646339417,
0.1283852458000183,
0.057505492120981216,
-0.2486129254102707,
0.04132203012704849,
-0.06253746151924133,
0.04672688618302345,
0.05027545243501663,
0.1199645847082138,
-0.08685532212257385,
0.029718047007918358,
0.05305543541908264,
-0.03812430426478386,
0.023869618773460388,
-0.045734867453575134,
-0.08156967163085938,
0.05231603607535362,
0.046592485159635544,
-0.02800014615058899,
0.03241826966404915,
0.0984034314751625,
-0.1882116198539734,
0.038075562566518784,
0.03780267760157585,
0.02759827859699726,
0.012383039109408855,
0.09757918119430542,
0.019654955714941025,
0.16426680982112885,
-0.021282507106661797,
-0.0015799483517184854,
0.08136483281850815,
-0.05418293923139572,
-0.08202426135540009,
-0.06616730988025665,
0.1576264351606369,
0.1018369272351265,
0.10794584453105927,
-0.0774964839220047,
0.07529987394809723,
-0.02585195191204548,
0.04428192228078842,
0.07477878034114838,
-0.1802303045988083,
-0.05439186096191406,
0.06261162459850311,
0.10911454260349274,
0.021206803619861603,
-0.08535972982645035,
0.0733238160610199,
0.05068744719028473,
-0.0163970198482275,
-0.06103788688778877,
-0.03582240641117096,
0.12134169042110443,
-0.104152612388134,
-0.11556637287139893,
-0.0035153361968696117,
0.17906221747398376,
0.061120398342609406,
-0.07283243536949158,
-0.15210850536823273,
0.01207164116203785,
0.19641809165477753,
-0.059187453240156174,
-0.0920514240860939,
0.008408252149820328,
0.01970260590314865,
0.050092924386262894,
-0.06837723404169083,
-0.0736239105463028,
-0.008934170939028263,
0.028199467808008194,
0.10042041540145874,
0.015539399348199368,
-0.019453471526503563,
-0.06764905899763107,
-0.0010117227211594582,
-0.08905313909053802,
-0.11817099153995514,
-0.0064434632658958435,
-0.06466612964868546,
-0.07628442347049713,
-0.03928566724061966,
-0.0008151130168698728,
-0.09895817935466766,
0.02852490544319153,
0.09988240897655487,
0.07245203852653503,
0.054083529859781265,
-0.05587116256356239,
-0.03364020958542824,
0.11984094232320786,
0.07663123309612274,
-0.1234469786286354,
-0.010341336019337177,
0.01766950637102127,
-0.015788504853844643,
0.004585571587085724,
-0.04252167418599129,
-0.041868843138217926,
0.014089473523199558,
-0.01934421993792057,
0.051894884556531906,
0.052555426955223083,
-0.034938063472509384,
-0.03750362992286682,
-0.09204611927270889,
0.11206807941198349,
-0.07755976915359497,
0.026581738144159317,
0.05190347135066986,
-0.0016245113220065832,
0.09152761101722717,
-0.06170322373509407,
0.08081021159887314,
-0.10614494234323502,
0.0014265794306993484,
-0.028187237679958344,
-0.004616789985448122,
0.021218610927462578,
-0.023536069318652153,
0.0330745168030262,
-0.0013597073266282678,
0.002706122351810336,
-0.11371874809265137,
0.007225616369396448,
-0.10430014878511429,
-0.030074095353484154,
-0.07881785184144974,
-0.04448793828487396,
-0.047710027545690536,
0.011011329479515553,
-0.0037006624042987823,
-0.004277708474546671,
0.006538541987538338,
-0.01929651014506817,
-0.009218602441251278,
0.004338770639151335,
0.042510516941547394,
0.05700398236513138,
0.07804099470376968,
-0.01907966658473015,
-0.01550743542611599,
-0.10541771352291107,
0.11620132625102997,
-0.0768466591835022,
-0.018986476585268974,
-0.13206321001052856,
-0.03654893487691879,
-0.037085097283124924,
0.03074907511472702,
0.010200985707342625,
0.1293535828590393,
-0.18230381608009338,
-0.06628352403640747,
0.12105955928564072,
-0.11743755638599396,
0.00790616124868393,
0.17702941596508026,
-0.0007301779696717858,
0.06418248265981674,
0.09886452555656433,
0.210826575756073,
0.018887469545006752,
-0.17379458248615265,
-0.01119332667440176,
-0.05055821314454079,
0.04274836555123329,
0.12648814916610718,
0.06423375010490417,
-0.06859945505857468,
0.06470991671085358,
-0.0174348633736372,
-0.03177523985505104,
-0.08247745037078857,
-0.005712720099836588,
-0.046880338340997696,
0.017501814290881157,
-0.0495678074657917,
0.019315259531140327,
-0.0039042658172547817,
-0.026057852432131767,
-0.014492779970169067,
-0.08927740901708603,
-0.08003377169370651,
0.12040155380964279,
-0.06951876729726791,
0.02490907534956932,
-0.10112335532903671,
0.06295473873615265,
0.06403497606515884,
0.004470401909202337,
-0.12940606474876404,
0.10835175961256027,
0.036680467426776886,
-0.03383801132440567,
0.14251846075057983,
0.07595107704401016,
-0.036197006702423096,
0.007772945333272219,
-0.012862171977758408,
0.020310768857598305,
-0.028462501242756844,
0.013812349177896976,
-0.02670116536319256,
-0.1025356575846672,
-0.009141020476818085,
-0.06537696719169617,
0.12552742660045624,
-0.13776308298110962,
-0.010857990011572838,
0.051993489265441895,
0.10871637612581253,
-0.015914522111415863,
-0.04137193039059639,
0.09127553552389145,
0.044533293694257736,
0.03150163218379021,
-0.022244660183787346,
0.018678171560168266,
-0.015574530698359013,
-0.0012133930576965213,
0.04987330362200737,
-0.15216688811779022,
-0.16551469266414642,
0.0960884541273117,
0.02380727417767048,
-0.019517088308930397,
0.06545943766832352,
0.025044342502951622,
-0.015735503286123276,
-0.04640427976846695,
0.0040863980539143085,
0.23456309735774994,
-0.012097850441932678,
0.06264251470565796,
-0.08311939984560013,
-0.00856322143226862,
0.018176142126321793,
-0.046020179986953735,
-0.08560357987880707,
0.07793634384870529,
0.010581405833363533,
-0.07382992655038834,
-0.0419735312461853,
0.04375820606946945,
0.07295797020196915,
0.14574651420116425,
0.007353346794843674,
-0.08874782174825668,
-0.030516592785716057,
-0.06167405843734741,
-0.01162630319595337,
0.04812213033437729,
-0.1356937140226364,
-0.0263375137001276,
0.02520531415939331,
0.004599832929670811,
0.04797346889972687,
-0.026951296254992485,
0.044977281242609024,
0.010019304230809212,
-0.05443327873945236,
-0.06814276427030563,
0.035925690084695816,
-0.03126728534698486,
0.03880157694220543,
-0.01013998407870531,
-0.0021228843834251165,
-0.04686404764652252,
-0.05895741656422615,
-0.14561495184898376,
0.0894630029797554,
-0.06695150583982468,
-0.30849650502204895,
-0.08705578744411469,
-0.04325789213180542,
-0.03182043880224228,
0.018959013745188713,
0.04558515176177025,
-0.10767471790313721,
-0.11042296886444092,
-0.07346640527248383,
0.11111819744110107,
-0.024719424545764923,
-0.06390227377414703,
0.11860854923725128,
-0.007759646978229284,
0.023232582956552505,
-0.09558981657028198,
0.016604896634817123,
-0.03772001713514328,
-0.03232870623469353,
-0.031342796981334686,
0.01975695788860321,
0.06762922555208206,
0.13685855269432068,
0.023012222722172737,
-0.005380676593631506,
0.008812978863716125,
0.2139960527420044,
-0.13830870389938354,
0.07930097728967667,
0.23524773120880127,
-0.05838147923350334,
-0.012744330801069736,
0.1426166146993637,
-0.007355217356234789,
-0.05424090102314949,
0.04284447804093361,
-0.0035953919868916273,
-0.024355025961995125,
-0.22084306180477142,
-0.12683789432048798,
-0.046378787606954575,
-0.02724449709057808,
0.04139787703752518,
0.01785990782082081,
0.0022326945327222347,
0.01651259884238243,
-0.0896461084485054,
-0.04547302797436714,
0.06144522875547409,
0.033111684024333954,
0.13644254207611084,
0.012540369294583797,
0.054014626890420914,
-0.04309548810124397,
-0.019677890464663506,
0.1013898029923439,
-0.03185276687145233,
0.036012329161167145,
0.07362937182188034,
0.11141172051429749,
0.061987534165382385,
0.04382329806685448,
0.05892941355705261,
-0.01962532289326191,
-0.015526287257671356,
-0.0038902745582163334,
-0.025494713336229324,
-0.06258371472358704,
0.012171049602329731,
0.0410383976995945,
0.1488836258649826,
-0.13424943387508392,
-0.12338298559188843,
0.02635146491229534,
0.014731742441654205,
0.12996405363082886,
0.09861490875482559,
-0.034814752638339996,
-0.09484545886516571,
0.034939516335725784,
-0.0841803327202797,
-0.035180043429136276,
0.05320343002676964,
0.08827146142721176,
-0.16215939819812775,
0.08616683632135391,
0.07901489734649658,
0.08857323229312897,
-0.0453108549118042,
0.029868854209780693,
-0.04930570349097252,
0.0504196472465992,
0.003111248603090644,
0.06986944377422333,
-0.1721964031457901,
0.1006883978843689,
0.017784174531698227,
0.08506249636411667,
-0.05673644691705704,
0.02455153502523899,
0.041070710867643356,
0.016692057251930237,
0.1272684782743454,
-0.013866398483514786,
-0.10003139823675156,
-0.008151939138770103,
-0.11948318034410477,
0.017873134464025497,
0.05936930701136589,
-0.06717321276664734,
0.05743078142404556,
-0.00020503567066043615,
-0.011214779689908028,
-0.034597791731357574,
-0.009167740121483803,
-0.2550412714481354,
-0.14303737878799438,
0.04899189993739128,
0.006052117794752121,
0.05659281834959984,
-0.039797425270080566,
-0.0724063515663147,
-0.13544198870658875,
0.10909867286682129,
-0.011515771970152855,
-0.013464352115988731,
-0.07493001967668533,
0.01416467223316431,
0.10274974256753922,
-0.05817611142992973,
0.018388502299785614,
0.04653496295213699,
0.152827650308609,
-0.06378652155399323,
-0.03776215389370918,
0.021750060841441154,
-0.10156270116567612,
-0.13454173505306244,
0.012831810861825943,
0.183308944106102,
0.11124975234270096,
0.06592915952205658,
0.0941644012928009,
0.019432658329606056,
-0.0014236742863431573,
-0.09848855435848236,
0.017150862142443657,
0.0276330579072237,
-0.06487578898668289,
0.04729599133133888,
0.0008925188449211419,
-0.27176806330680847,
-0.14686787128448486,
-0.059953171759843826,
0.07817479968070984,
0.18753163516521454,
-0.030654726549983025,
0.16642627120018005,
0.2693306803703308,
-0.08629854023456573,
-0.22292685508728027,
-0.03840453922748566,
-0.004287374671548605,
0.032331421971321106,
0.04651348292827606,
-0.19906021654605865,
0.09617164731025696,
-0.00534169003367424,
0.009094541892409325,
-0.06545831263065338,
-0.21774518489837646,
-0.13618344068527222,
0.169536292552948,
-0.029860207810997963,
0.03240662068128586,
-0.029363462701439857,
-0.06810013949871063,
-0.037089403718709946,
-0.05418866127729416,
0.019338451325893402,
-0.09441210329532623,
0.07191473245620728,
0.055282481014728546,
0.008281722664833069,
0.02532370202243328,
0.01316936407238245,
0.11437957733869553,
0.09296616911888123,
-0.02101338468492031,
-0.07684193551540375,
0.023714222013950348,
0.01118328794836998,
-0.013225983828306198,
0.10076954215765,
0.048889946192502975,
0.01732827164232731,
-0.04522310197353363,
-0.08345519006252289,
-0.06296224147081375,
0.05728331580758095,
-0.07273237407207489,
-0.00971894059330225,
-0.060798272490501404,
0.08345595002174377,
0.014755217358469963,
0.0005228191148489714,
-0.07054269313812256,
-0.09993568062782288,
-0.01746041141450405,
0.11682264506816864,
0.22644826769828796,
-0.05645109340548515,
0.004460200201719999,
-0.042172133922576904,
-0.04461231455206871,
0.04553394764661789,
-0.0028804594185203314,
0.04490639641880989,
0.05113482102751732,
0.022242460399866104,
0.08818718791007996,
-0.03201494738459587,
-0.1273026466369629,
0.03011123649775982,
0.03386975824832916,
-0.07085403800010681,
-0.17414490878582,
-0.04550013691186905,
-0.003485203254967928,
-0.023937053978443146,
-0.03499218821525574,
0.19073976576328278,
-0.013468497432768345,
-0.05513047054409981,
0.00433161249384284,
0.05908547341823578,
-0.0009191621211357415,
0.11587885767221451,
0.04674635827541351,
0.041176408529281616,
-0.0926152840256691,
0.05040406063199043,
0.1212458610534668,
-0.04057389125227928,
0.044091686606407166,
0.08916754275560379,
-0.051773011684417725,
-0.05431827902793884,
-0.09373095631599426,
-0.0018143914639949799,
0.06124812737107277,
-0.060999855399131775,
-0.005862249061465263,
-0.10661196708679199,
0.007185604423284531,
0.015580561943352222,
0.013349044136703014,
-0.04268214479088783,
-0.04372973367571831,
-0.004299016669392586,
-0.09992857277393341,
0.06543569266796112,
0.09244890511035919,
-0.029335269704461098,
-0.10767257213592529,
0.10680514574050903,
0.014224217273294926,
0.07986655831336975,
-0.037838976830244064,
-0.06171445921063423,
-0.08452244848012924,
-0.0018176946323364973,
-0.08635016530752182,
0.03609176725149155,
-0.13738054037094116,
-0.01560698077082634,
-0.04545679688453674,
-0.030743226408958435,
-0.010017666965723038,
0.06830073893070221,
-0.029853882268071175,
0.003567042760550976,
-0.03211783245205879,
0.0871771052479744,
-0.12455582618713379,
0.07185018807649612,
0.055925969034433365,
-0.04869775474071503,
0.10963144898414612,
0.017950868234038353,
-0.05226153880357742,
0.03796357661485672,
-0.2182968556880951,
-0.055630698800086975,
-0.03102271631360054,
0.04775363206863403,
-0.013398655690252781,
-0.16641274094581604,
0.00003020187432412058,
0.0213343296200037,
0.009463772177696228,
-0.019050588831305504,
0.04996832460165024,
-0.02374557964503765,
-0.02025887370109558,
-0.06862859427928925,
-0.05552142485976219,
-0.0359351821243763,
0.06540286540985107,
0.07575184106826782,
0.00997476652264595,
0.09523198008537292,
-0.08827097713947296,
0.07806829363107681,
-0.0786130502820015,
0.024336347356438637,
-0.030189819633960724,
0.026541195809841156,
-0.08335985988378525,
-0.0731540098786354,
0.08594673871994019,
-0.01584959775209427,
0.07832708954811096,
0.020402025431394577,
-0.01811007410287857,
0.04106840491294861,
-0.04308116436004639,
-0.048034463077783585,
0.04568661004304886,
0.13759878277778625,
0.056818217039108276,
0.017553912475705147,
0.005955056753009558,
-0.040533747524023056,
0.005360295996069908,
0.14379869401454926,
0.14736422896385193,
0.1565001904964447,
0.10227613896131516,
0.035869840532541275,
0.06690496951341629,
-0.051348451524972916,
-0.07609729468822479,
0.08828438818454742,
-0.07749032229185104,
0.03276129439473152,
-0.04846787080168724,
-0.06617984175682068,
0.07060616463422775,
-0.13896100223064423,
0.07240571826696396,
-0.0264254342764616,
-0.08692970126867294,
-0.1081715002655983,
-0.13756735622882843,
-0.06514754891395569,
-0.04301586374640465,
0.0025078924372792244,
-0.10940916836261749,
0.023633886128664017,
0.015741117298603058,
0.024110956117510796,
-0.09179877489805222,
0.11590857058763504,
-0.11816560477018356,
-0.11994067579507828,
0.15138284862041473,
-0.032507751137018204,
-0.010748454369604588,
-0.0010666650487110019,
0.03920383006334305,
0.02557276003062725,
0.09001057595014572,
0.05564205348491669,
0.047231171280145645,
0.02350616082549095,
0.02911168709397316,
-0.09546229988336563,
-0.06964253634214401,
0.03346046432852745,
-0.01769537851214409,
0.10133164376020432,
0.18756185472011566,
0.08842356503009796,
-0.0807681679725647,
0.013331583701074123,
0.13617856800556183,
0.025663288310170174,
-0.10905113816261292,
-0.1487782746553421,
0.02268054522573948,
-0.03515402600169182,
-0.0014804517850279808,
0.006522562354803085,
-0.09388162195682526,
0.02131030336022377,
0.20319856703281403,
0.16701768338680267,
-0.048479948192834854,
0.023979095742106438,
-0.013500439934432507,
0.007209098897874355,
0.018305696547031403,
0.08203085511922836,
0.08162327855825424,
0.17720071971416473,
-0.007735431659966707,
0.03922943025827408,
-0.01827056333422661,
-0.10001661628484726,
-0.11681035906076431,
0.09404788166284561,
0.00004595171776600182,
-0.03467073664069176,
-0.0032326802611351013,
0.18401917815208435,
-0.10622414201498032,
-0.2126605361700058,
-0.11421989649534225,
-0.04209958761930466,
-0.11257247626781464,
0.02717195823788643,
-0.05367453396320343,
0.13126453757286072,
0.04536421597003937,
-0.007162425201386213,
0.009268810972571373,
0.18093685805797577,
0.03328900411725044,
0.03417758643627167,
-0.02227056585252285,
0.10538645088672638,
-0.09565804153680801,
0.11808422207832336,
-0.004289756994694471,
0.05776878818869591,
0.033420003950595856,
0.040280431509017944,
-0.06799913942813873,
0.0319618359208107,
0.038618531078100204,
-0.0013526100665330887,
0.052076391875743866,
0.1699092537164688,
-0.007703661452978849,
0.09607299417257309,
0.10921841114759445,
-0.06815680861473083,
0.03235974535346031,
-0.03227045387029648,
0.004213227424770594,
-0.06391268223524094,
0.15148091316223145,
-0.14780361950397491,
0.1309177577495575,
0.09475897252559662,
-0.07318547368049622,
-0.04606013745069504,
-0.002217710018157959,
0.04700695723295212,
-0.059758126735687256,
0.10460768640041351,
-0.00826488807797432,
-0.17106805741786957,
0.028080232441425323,
-0.12347465753555298,
0.06963083893060684,
-0.25563687086105347,
-0.041920654475688934,
-0.047001224011182785,
-0.01946650631725788,
0.0048500592820346355,
0.11211131513118744,
0.07598206400871277,
-0.05162927880883217,
-0.01245187222957611,
-0.032826539129018784,
0.010782688856124878,
0.09634364396333694,
-0.09117407351732254,
-0.030551524832844734
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only on Danish (**da**), using **13.6k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer, as it was pretrained on audio alone. To use it for **speech recognition**, a tokenizer must be created and the model fine-tuned on labeled **da** text data. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the [official repository](https://github.com/facebookresearch/voxpopuli/) for more information.
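As a rough illustration of the fine-tuning setup described above, the sketch below wires a freshly built character-level tokenizer to the pretrained Danish checkpoint and attaches a randomly initialized CTC head. It assumes you have already written a `vocab.json` from your own labeled Danish transcripts (a hypothetical file, not shipped with this model); the full training loop, data collator, and WER evaluation follow the linked blog post unchanged.

```python
from transformers import (
    Wav2Vec2CTCTokenizer,
    Wav2Vec2FeatureExtractor,
    Wav2Vec2Processor,
    Wav2Vec2ForCTC,
)

# character-level vocabulary built beforehand from your own Danish transcripts (hypothetical file)
tokenizer = Wav2Vec2CTCTokenizer(
    "vocab.json", unk_token="[UNK]", pad_token="[PAD]", word_delimiter_token="|"
)
feature_extractor = Wav2Vec2FeatureExtractor(
    feature_size=1, sampling_rate=16000, padding_value=0.0, do_normalize=True, return_attention_mask=False
)
processor = Wav2Vec2Processor(feature_extractor=feature_extractor, tokenizer=tokenizer)

# load the audio-only pretrained encoder and add a randomly initialized CTC head
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-base-da-voxpopuli-v2",
    ctc_loss_reduction="mean",
    pad_token_id=processor.tokenizer.pad_token_id,
    vocab_size=len(processor.tokenizer),
)
model.freeze_feature_encoder()  # common choice when labeled data is limited: keep the conv feature encoder frozen
```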
|
{"language": "da", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-da-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"da",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"da"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #da #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only on Danish (da), using 13.6k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer, as it was pretrained on audio alone. To use it for speech recognition, a tokenizer must be created and the model fine-tuned on labeled da text data. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official repository for more information.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in da on 13.6k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in da. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #da #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in da on 13.6k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in da. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
251
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #da #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in da on 13.6k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in da. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07837270200252533,
0.10377992689609528,
-0.002815814455971122,
0.007581332232803106,
0.07927871495485306,
-0.05066947638988495,
0.1349961906671524,
0.04179898649454117,
0.005691127851605415,
0.09908532351255417,
-0.014949479140341282,
-0.050320401787757874,
0.07634294033050537,
0.1272452175617218,
0.05877294763922691,
-0.25832197070121765,
0.038286346942186356,
-0.06803902238607407,
0.040977828204631805,
0.04562554135918617,
0.12300270795822144,
-0.08250435441732407,
0.030495313927531242,
0.05276641994714737,
-0.032588496804237366,
0.029543226584792137,
-0.04870905354619026,
-0.0828050822019577,
0.05372616648674011,
0.04541284590959549,
-0.030466875061392784,
0.028107330203056335,
0.09169524163007736,
-0.18405702710151672,
0.03680650517344475,
0.041881635785102844,
0.032833345234394073,
0.006655407138168812,
0.09608316421508789,
0.024394968524575233,
0.16098572313785553,
-0.019327308982610703,
-0.002774258377030492,
0.0809275209903717,
-0.05759422481060028,
-0.09633651375770569,
-0.06538158655166626,
0.14906367659568787,
0.10321511328220367,
0.10838938504457474,
-0.07836399972438812,
0.0685063973069191,
-0.016585253179073334,
0.03984085097908974,
0.0652865394949913,
-0.17246389389038086,
-0.05454437434673309,
0.06555163115262985,
0.10858415812253952,
0.015110253356397152,
-0.08805716782808304,
0.07150330394506454,
0.0492420494556427,
-0.011688240803778172,
-0.06765865534543991,
-0.03381931409239769,
0.1453537791967392,
-0.10742098838090897,
-0.11464414745569229,
0.003210693597793579,
0.1781800091266632,
0.05670832842588425,
-0.0774238109588623,
-0.1488008052110672,
0.013783416710793972,
0.20933355391025543,
-0.06017699092626572,
-0.101213239133358,
0.0074260905385017395,
0.02274533361196518,
0.05536135286092758,
-0.06569551676511765,
-0.06913100183010101,
-0.008235674351453781,
0.021783871576189995,
0.1093868538737297,
0.019474152475595474,
-0.021532630547881126,
-0.07750291377305984,
-0.0031847418285906315,
-0.09404181689023972,
-0.11851774156093597,
-0.006170996930450201,
-0.0692720040678978,
-0.06664453446865082,
-0.034972671419382095,
0.0017090525943785906,
-0.10688189417123795,
0.031311195343732834,
0.09363239258527756,
0.07184188067913055,
0.0561349019408226,
-0.06439631432294846,
-0.03470614179968834,
0.12171373516321182,
0.06793490797281265,
-0.12564924359321594,
-0.0136885279789567,
0.016427023336291313,
-0.018819333985447884,
0.009590821340680122,
-0.03748495504260063,
-0.03588530048727989,
0.015527671203017235,
-0.01900368183851242,
0.04659494757652283,
0.05676447972655296,
-0.03413533791899681,
-0.03731115907430649,
-0.09562394767999649,
0.10457438230514526,
-0.07912760972976685,
0.024437058717012405,
0.04762031137943268,
-0.004519268870353699,
0.09163441509008408,
-0.06531014293432236,
0.08593880385160446,
-0.10596447438001633,
0.00822809524834156,
-0.02498467266559601,
-0.012268192134797573,
0.022743381559848785,
-0.027053220197558403,
0.035074345767498016,
0.0003436679835431278,
0.008170166984200478,
-0.11527794599533081,
0.0000011989283166258247,
-0.10111743956804276,
-0.023441800847649574,
-0.08248719573020935,
-0.04435019940137863,
-0.04515703395009041,
0.01251941081136465,
-0.004862407688051462,
-0.0000959871758823283,
0.012432240881025791,
-0.015463044866919518,
-0.0062393611297011375,
0.013012615963816643,
0.045318473130464554,
0.0554608553647995,
0.0837537944316864,
-0.01564144901931286,
-0.012023511342704296,
-0.11100044846534729,
0.1194450780749321,
-0.07835201919078827,
-0.02087991125881672,
-0.1386231780052185,
-0.036546751856803894,
-0.04620829224586487,
0.03070284239947796,
0.01717175543308258,
0.12877421081066132,
-0.18368364870548248,
-0.06563477218151093,
0.11653552204370499,
-0.1233973503112793,
0.009911895729601383,
0.17775790393352509,
-0.00040611287113279104,
0.0775630921125412,
0.0986439660191536,
0.21719054877758026,
0.027425164356827736,
-0.17908132076263428,
-0.011606728658080101,
-0.05740209296345711,
0.04220263287425041,
0.13484060764312744,
0.06400318443775177,
-0.06629922240972519,
0.05887673795223236,
-0.01637667417526245,
-0.027324438095092773,
-0.07847955077886581,
-0.0008259953465312719,
-0.04626297578215599,
0.021145017817616463,
-0.045444224029779434,
0.028155256062746048,
-0.00415895925834775,
-0.024336233735084534,
-0.011577735655009747,
-0.0882469117641449,
-0.07825569063425064,
0.1193348839879036,
-0.06118331849575043,
0.02123664692044258,
-0.09995224326848984,
0.05738302692770958,
0.06344053149223328,
0.008137053810060024,
-0.12377642840147018,
0.11432907730340958,
0.033147335052490234,
-0.049307532608509064,
0.13936707377433777,
0.07134676724672318,
-0.033097539097070694,
0.007314466405659914,
-0.013426379300653934,
0.022493941709399223,
-0.03589916229248047,
0.014513682574033737,
-0.02483144775032997,
-0.09716242551803589,
-0.0023724574130028486,
-0.06871875375509262,
0.11551696807146072,
-0.1310943067073822,
-0.013050202280282974,
0.047331325709819794,
0.10766291618347168,
-0.01177714392542839,
-0.038873858749866486,
0.09369863569736481,
0.04357307776808739,
0.03310475870966911,
-0.021817296743392944,
0.019412150606513023,
-0.019350161775946617,
0.002837887266650796,
0.04551908001303673,
-0.1448022872209549,
-0.15949466824531555,
0.09343212842941284,
0.02772950939834118,
-0.013579776510596275,
0.06913108378648758,
0.025422142818570137,
-0.021479519084095955,
-0.047875773161649704,
0.00032423686934635043,
0.23874680697917938,
-0.008899154141545296,
0.05723589286208153,
-0.0797298327088356,
-0.0027215417940169573,
0.020766576752066612,
-0.0452607125043869,
-0.09264194965362549,
0.07644639909267426,
0.00484733609482646,
-0.07510809600353241,
-0.04088744893670082,
0.0496804341673851,
0.07616154849529266,
0.1499382108449936,
0.0045554847456514835,
-0.08407773077487946,
-0.031050028279423714,
-0.0611691027879715,
-0.012484784238040447,
0.040798842906951904,
-0.14106014370918274,
-0.02228384092450142,
0.02241036854684353,
0.010240699164569378,
0.050188228487968445,
-0.02020404115319252,
0.042148295789957047,
0.007991805672645569,
-0.05049268528819084,
-0.07011531293392181,
0.042178310453891754,
-0.03474561125040054,
0.03610255941748619,
-0.004620910156518221,
0.0012327531585469842,
-0.04414309188723564,
-0.056879062205553055,
-0.13907437026500702,
0.08613184839487076,
-0.06824234873056412,
-0.31197842955589294,
-0.08430266380310059,
-0.05042749270796776,
-0.03730081394314766,
0.01584225334227085,
0.04754750803112984,
-0.11015723645687103,
-0.10710851103067398,
-0.06332343816757202,
0.12728659808635712,
-0.03188271448016167,
-0.06510141491889954,
0.12092668563127518,
-0.007594550494104624,
0.025664877146482468,
-0.09678851813077927,
0.014874398708343506,
-0.03470702841877937,
-0.025728628039360046,
-0.02980995737016201,
0.019785616546869278,
0.06132758781313896,
0.12406228482723236,
0.02678098902106285,
-0.007320977747440338,
0.011129346676170826,
0.21626771986484528,
-0.14119000732898712,
0.08520110696554184,
0.23920300602912903,
-0.054766926914453506,
-0.009436869993805885,
0.1494375616312027,
-0.009509016759693623,
-0.058358196169137955,
0.04615100100636482,
0.0038028531707823277,
-0.022826790809631348,
-0.21958087384700775,
-0.12379459291696548,
-0.04686393961310387,
-0.023732269182801247,
0.041064128279685974,
0.018381360918283463,
-0.007936520501971245,
0.015596119686961174,
-0.08363921195268631,
-0.04023240879178047,
0.061740320175886154,
0.031067991629242897,
0.14647537469863892,
0.008671083487570286,
0.05190162733197212,
-0.04144598916172981,
-0.026104165241122246,
0.10108514130115509,
-0.02180538699030876,
0.04044158756732941,
0.07501528412103653,
0.0931900292634964,
0.06390571594238281,
0.036293551325798035,
0.05444083362817764,
-0.019637547433376312,
-0.01708083413541317,
-0.0033530762884765863,
-0.030830558389425278,
-0.06387024372816086,
0.01866309344768524,
0.042138054966926575,
0.13554295897483826,
-0.13669894635677338,
-0.11973126232624054,
0.03333833068609238,
0.010954032652080059,
0.12258449196815491,
0.09905092418193817,
-0.023587821051478386,
-0.09638407826423645,
0.038366954773664474,
-0.08367942273616791,
-0.03710565343499184,
0.05779772624373436,
0.0752507746219635,
-0.15670864284038544,
0.09236636012792587,
0.07221190631389618,
0.08573455363512039,
-0.03964005783200264,
0.0321396104991436,
-0.05518369749188423,
0.055389437824487686,
0.002696453360840678,
0.07096178829669952,
-0.18596574664115906,
0.10078013688325882,
0.016888782382011414,
0.08981918543577194,
-0.053769953548908234,
0.028231317177414894,
0.0424877293407917,
0.012048806063830853,
0.12623338401317596,
-0.011525108478963375,
-0.09948433935642242,
0.0056745377369225025,
-0.11656790971755981,
0.01876983605325222,
0.05686269327998161,
-0.06495997309684753,
0.05322342738509178,
0.00041423679795116186,
-0.005586084444075823,
-0.03528941050171852,
0.0071168928407132626,
-0.25619545578956604,
-0.13939914107322693,
0.05252322554588318,
-0.0018288957653567195,
0.05540782958269119,
-0.04033758118748665,
-0.07002995163202286,
-0.12420807033777237,
0.10148989409208298,
-0.0044849589467048645,
-0.018861250951886177,
-0.07207746803760529,
0.017494015395641327,
0.099420927464962,
-0.058231838047504425,
0.012451761402189732,
0.04654437676072121,
0.14571340382099152,
-0.06822055578231812,
-0.04239243268966675,
0.023790985345840454,
-0.10167988389730453,
-0.12615114450454712,
0.01475600991398096,
0.17512384057044983,
0.11155635118484497,
0.05970687046647072,
0.09195049852132797,
0.01837465539574623,
-0.002035661833360791,
-0.09669627249240875,
0.024980247020721436,
0.027738407254219055,
-0.07138663530349731,
0.03847287967801094,
0.0014166112523525953,
-0.26736804842948914,
-0.14985570311546326,
-0.06430809944868088,
0.07087000459432602,
0.18362504243850708,
-0.024496247991919518,
0.1649264693260193,
0.2759149670600891,
-0.08641690760850906,
-0.2271093875169754,
-0.04192313924431801,
0.000710353022441268,
0.02989538200199604,
0.03982055187225342,
-0.2087073177099228,
0.09865448623895645,
-0.0020017761271446943,
0.008287621662020683,
-0.05780230090022087,
-0.21903306245803833,
-0.13465222716331482,
0.17190948128700256,
-0.029983365908265114,
0.04699019342660904,
-0.030424773693084717,
-0.06688258051872253,
-0.03582081198692322,
-0.05369235947728157,
0.008805050514638424,
-0.09214777499437332,
0.07075782865285873,
0.0537835918366909,
0.017543252557516098,
0.026018239557743073,
0.014149684458971024,
0.11219900846481323,
0.08987390249967575,
-0.021701909601688385,
-0.08017666637897491,
0.017733722925186157,
0.0013129878789186478,
-0.012629210017621517,
0.10325057059526443,
0.0520084872841835,
0.018406949937343597,
-0.04332250356674194,
-0.08533498644828796,
-0.06403416395187378,
0.06160355731844902,
-0.07259754091501236,
-0.01629389449954033,
-0.05966339260339737,
0.09017311036586761,
0.011146550066769123,
-0.0010938538471236825,
-0.07492228597402573,
-0.09271709620952606,
-0.01721440814435482,
0.11955568939447403,
0.2158854752779007,
-0.04651184007525444,
0.005010642111301422,
-0.04220126196742058,
-0.04482894390821457,
0.05022365599870682,
-0.010143949650228024,
0.04457845911383629,
0.051693838089704514,
0.022184547036886215,
0.08763562887907028,
-0.03363814949989319,
-0.1332697570323944,
0.030841851606965065,
0.03421260416507721,
-0.06305824220180511,
-0.19112376868724823,
-0.04932194575667381,
0.002299693413078785,
-0.019930580630898476,
-0.03286579251289368,
0.19139418005943298,
-0.01732831820845604,
-0.05435984954237938,
0.004307386931031942,
0.06371606886386871,
-0.0033922395668923855,
0.12058842182159424,
0.04290919750928879,
0.03956787288188934,
-0.08994555473327637,
0.05993961915373802,
0.11969957500696182,
-0.043178267776966095,
0.04995034635066986,
0.08716127276420593,
-0.04797622561454773,
-0.0544305257499218,
-0.1090904027223587,
0.0016015549190342426,
0.06920208036899567,
-0.057191796600818634,
-0.00036048615584149957,
-0.10588008165359497,
0.007460984401404858,
-0.0013037034077569842,
0.008024885319173336,
-0.04556393250823021,
-0.047205183655023575,
-0.003552151843905449,
-0.09768058359622955,
0.06641345471143723,
0.09643575549125671,
-0.029147855937480927,
-0.112606480717659,
0.09735570847988129,
0.012383434921503067,
0.08040721714496613,
-0.03643075376749039,
-0.06290565431118011,
-0.08497075736522675,
-0.005457955878227949,
-0.09265048056840897,
0.034391652792692184,
-0.1355961114168167,
-0.012776567600667477,
-0.04570671170949936,
-0.03305540606379509,
-0.008411439135670662,
0.06730345636606216,
-0.029428686946630478,
0.0023943311534821987,
-0.03299248218536377,
0.08405064046382904,
-0.12105576694011688,
0.0657045990228653,
0.057808224111795425,
-0.04617619141936302,
0.10800416767597198,
0.020784107968211174,
-0.04935520142316818,
0.03371812775731087,
-0.20678922533988953,
-0.05113014951348305,
-0.030291656032204628,
0.042606279253959656,
-0.01511983573436737,
-0.17770390212535858,
-0.0016641488764435053,
0.015379863791167736,
0.0067702000960707664,
-0.020578667521476746,
0.05338587239384651,
-0.02858050726354122,
-0.01584620214998722,
-0.0689845010638237,
-0.05611950531601906,
-0.034824687987565994,
0.06480664014816284,
0.07534527778625488,
0.010259263217449188,
0.0949614867568016,
-0.08983280509710312,
0.07762546092271805,
-0.07370667159557343,
0.023593731224536896,
-0.032154083251953125,
0.026064472272992134,
-0.0668737143278122,
-0.07414434850215912,
0.08039461076259613,
-0.018552519381046295,
0.06873846799135208,
0.02202996052801609,
-0.026658739894628525,
0.040711767971515656,
-0.05476200953125954,
-0.06295189261436462,
0.04272598773241043,
0.13681505620479584,
0.054347723722457886,
0.01767319068312645,
-0.0038036364130675793,
-0.03868472948670387,
0.003173442790284753,
0.15855130553245544,
0.1364164799451828,
0.16995160281658173,
0.09171362966299057,
0.028282731771469116,
0.06914814561605453,
-0.04888041317462921,
-0.08698918670415878,
0.09300320595502853,
-0.07177740335464478,
0.03341410681605339,
-0.04754376411437988,
-0.0740143284201622,
0.0717824324965477,
-0.13599756360054016,
0.07547865808010101,
-0.02943013794720173,
-0.08390560746192932,
-0.10491009056568146,
-0.13542330265045166,
-0.06410714238882065,
-0.04152875393629074,
0.003868657164275646,
-0.10744111984968185,
0.024652063846588135,
0.0035828202962875366,
0.02710035629570484,
-0.09606573730707169,
0.12115795910358429,
-0.11907359212636948,
-0.12712720036506653,
0.15470534563064575,
-0.0344049409031868,
-0.017369795590639114,
0.0073120081797242165,
0.042471397668123245,
0.021875470876693726,
0.09398273378610611,
0.050387121737003326,
0.04825873300433159,
0.01856199838221073,
0.026984071359038353,
-0.10037487000226974,
-0.06546331197023392,
0.03175824135541916,
-0.015158934518694878,
0.10046211630105972,
0.18783296644687653,
0.08776745945215225,
-0.08462855964899063,
0.014102577231824398,
0.13870255649089813,
0.027067881077528,
-0.1099158376455307,
-0.1471392959356308,
0.03130251169204712,
-0.03476094454526901,
-0.00024154134734999388,
0.0023842172231525183,
-0.09566552937030792,
0.01767859235405922,
0.20260022580623627,
0.1706356555223465,
-0.03917275369167328,
0.020608479157090187,
-0.010770691558718681,
0.008298388682305813,
0.021583961322903633,
0.0834541842341423,
0.08543453365564346,
0.17799320816993713,
-0.007767855189740658,
0.04804406315088272,
-0.021256981417536736,
-0.098024383187294,
-0.11755861341953278,
0.10490339249372482,
0.006123469676822424,
-0.03625612333416939,
-0.004621946252882481,
0.18837390840053558,
-0.10563303530216217,
-0.22089122235774994,
-0.12347783148288727,
-0.04253683611750603,
-0.11587657034397125,
0.02198558859527111,
-0.061160579323768616,
0.1339745968580246,
0.05097157135605812,
-0.0047296443954110146,
0.013320154510438442,
0.17453822493553162,
0.03693598508834839,
0.03662766143679619,
-0.019590171054005623,
0.10648339986801147,
-0.07914602011442184,
0.10721192508935928,
-0.0019162277458235621,
0.04648678004741669,
0.0351889431476593,
0.03765999153256416,
-0.06923677027225494,
0.036089904606342316,
0.034542910754680634,
0.008683289401233196,
0.04904690012335777,
0.16818493604660034,
-0.007283580489456654,
0.08664767444133759,
0.11060067266225815,
-0.06449023634195328,
0.023706620559096336,
-0.01807890273630619,
0.002456006594002247,
-0.0626971498131752,
0.1636238694190979,
-0.15097494423389435,
0.12828227877616882,
0.09880315512418747,
-0.06993738561868668,
-0.04370851814746857,
-0.009146736934781075,
0.052462562918663025,
-0.059168729931116104,
0.09968171268701553,
-0.007873576134443283,
-0.16747383773326874,
0.025470640510320663,
-0.12066712230443954,
0.06810760498046875,
-0.2637476921081543,
-0.03868330642580986,
-0.04169659689068794,
-0.017118649557232857,
0.009483433328568935,
0.11027264595031738,
0.07828448712825775,
-0.05303208529949188,
-0.013900436460971832,
-0.03905376419425011,
0.0076143802143633366,
0.09473130106925964,
-0.08570192009210587,
-0.026404550299048424
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only on German (**de**), using **23.2k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer, as it was pretrained on audio alone. To use it for **speech recognition**, a tokenizer must be created and the model fine-tuned on labeled **de** text data. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the [official repository](https://github.com/facebookresearch/voxpopuli/) for more information.
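Independent of fine-tuning, the audio-only checkpoint can also serve as a frozen feature extractor for downstream German speech tasks. The following is a minimal sketch, assuming a 16kHz mono waveform (stubbed here with one second of silence); it pulls the encoder's last hidden states without needing any tokenizer.

```python
import numpy as np
import torch
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

# build the feature extractor by hand with the standard base-model settings
feature_extractor = Wav2Vec2FeatureExtractor(
    feature_size=1, sampling_rate=16000, padding_value=0.0, do_normalize=True
)
model = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-base-de-voxpopuli-v2")

# stand-in for a real 16kHz mono recording
speech = np.zeros(16000, dtype=np.float32)

inputs = feature_extractor(speech, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    hidden_states = model(inputs.input_values).last_hidden_state  # shape: (batch, frames, 768)
```

Loading a pretraining-only checkpoint into `Wav2Vec2Model` drops the quantization head, so a warning about unused weights is expected and harmless here.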
|
{"language": "de", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-de-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"de",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"de"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #de #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only on German (de), using 23.2k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer, as it was pretrained on audio alone. To use it for speech recognition, a tokenizer must be created and the model fine-tuned on labeled de text data. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official repository for more information.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in de on 23.2k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in de. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #de #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in de on 23.2k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in de. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
251
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #de #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in de on 23.2k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in de. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07432331144809723,
0.11460832506418228,
-0.002898714505136013,
0.004671144299209118,
0.07641338557004929,
-0.05014248192310333,
0.1399010866880417,
0.042742494493722916,
-0.00031130650313571095,
0.09627819061279297,
-0.014212964102625847,
-0.046039193868637085,
0.07324690371751785,
0.12984034419059753,
0.05981087312102318,
-0.25651997327804565,
0.03980821371078491,
-0.0640173926949501,
0.05173688009381294,
0.046456221491098404,
0.1231812983751297,
-0.08188850432634354,
0.02778134122490883,
0.053244516253471375,
-0.034812502562999725,
0.03559284284710884,
-0.04732098802924156,
-0.08094249665737152,
0.05516679212450981,
0.04673711210489273,
-0.029687242582440376,
0.028585093095898628,
0.09466755390167236,
-0.19059738516807556,
0.036107346415519714,
0.042628757655620575,
0.027317432686686516,
0.007599369622766972,
0.09874166548252106,
0.018440423533320427,
0.15971675515174866,
-0.026756713166832924,
-0.003054704051464796,
0.07899080961942673,
-0.05671058967709541,
-0.09407325088977814,
-0.0621933676302433,
0.15543419122695923,
0.10079164803028107,
0.10724949836730957,
-0.07890735566616058,
0.07332701236009598,
-0.01942134089767933,
0.042720526456832886,
0.07515919208526611,
-0.17599280178546906,
-0.052951861172914505,
0.054272789508104324,
0.10919027030467987,
0.01857324130833149,
-0.0845612958073616,
0.0718274936079979,
0.05173821002244949,
-0.011725900694727898,
-0.06553065776824951,
-0.0352952741086483,
0.14103418588638306,
-0.10612858831882477,
-0.11514109373092651,
0.0007028580876067281,
0.1708339899778366,
0.057047516107559204,
-0.07606611400842667,
-0.1495877206325531,
0.011542700231075287,
0.20969843864440918,
-0.0551467165350914,
-0.10027123987674713,
0.011287720873951912,
0.020167192444205284,
0.04847664386034012,
-0.06681840866804123,
-0.0700998455286026,
-0.008127516135573387,
0.022651853039860725,
0.1122046634554863,
0.0207112617790699,
-0.020052136853337288,
-0.07494937628507614,
0.0004783137992490083,
-0.09138698130846024,
-0.11696488410234451,
-0.005382957868278027,
-0.06573755294084549,
-0.06768111884593964,
-0.0348270982503891,
-0.001441681757569313,
-0.10110273957252502,
0.030536729842424393,
0.10165572166442871,
0.06620379537343979,
0.05326804146170616,
-0.05699566751718521,
-0.03404860198497772,
0.12899526953697205,
0.06828631460666656,
-0.12318252772092819,
-0.02260977029800415,
0.016322394832968712,
-0.01907304860651493,
0.008368303067982197,
-0.03646235913038254,
-0.03761553391814232,
0.014287360943853855,
-0.014151854440569878,
0.048758797347545624,
0.05779852718114853,
-0.03920520469546318,
-0.03735853359103203,
-0.09686503559350967,
0.10048091411590576,
-0.08045118302106857,
0.025053350254893303,
0.04770741984248161,
-0.005161503329873085,
0.09407256543636322,
-0.06639373302459717,
0.08268395811319351,
-0.11207883059978485,
0.001253289170563221,
-0.025035740807652473,
-0.008727102540433407,
0.020921943709254265,
-0.029071884229779243,
0.032014112919569016,
-0.0027795068453997374,
0.008147516287863255,
-0.11624732613563538,
0.0019365004263818264,
-0.10291960090398788,
-0.024497419595718384,
-0.0827777311205864,
-0.045320603996515274,
-0.049776606261730194,
0.017036613076925278,
-0.006637485232204199,
-0.005793056916445494,
0.006822895258665085,
-0.0187684353441,
-0.008501779288053513,
0.009160034358501434,
0.042297087609767914,
0.053551629185676575,
0.08264798671007156,
-0.017645852640271187,
-0.016181737184524536,
-0.0983610451221466,
0.12027860432863235,
-0.07592915743589401,
-0.022867301478981972,
-0.1371716558933258,
-0.03829202428460121,
-0.03707895800471306,
0.034354131668806076,
0.015411920845508575,
0.122876837849617,
-0.18013837933540344,
-0.07115451246500015,
0.11711685359477997,
-0.12113408744335175,
0.016319075599312782,
0.17837008833885193,
-0.0027087335474789143,
0.0750710740685463,
0.10251738876104355,
0.218318372964859,
0.020084211602807045,
-0.16975432634353638,
-0.008451119065284729,
-0.051618557423353195,
0.037792038172483444,
0.1341402530670166,
0.061944250017404556,
-0.06535912305116653,
0.0637819766998291,
-0.018492214381694794,
-0.029931604862213135,
-0.08027292788028717,
-0.0017644533654674888,
-0.04509122669696808,
0.021359078586101532,
-0.04618809372186661,
0.022432640194892883,
-0.007053190842270851,
-0.024807851761579514,
-0.014819018542766571,
-0.09132840484380722,
-0.05772610381245613,
0.1186172291636467,
-0.06561210751533508,
0.02348162792623043,
-0.09872572124004364,
0.05644099414348602,
0.06616786122322083,
0.004198313690721989,
-0.1232445240020752,
0.11735085397958755,
0.03227696940302849,
-0.05018724128603935,
0.142363503575325,
0.08539813756942749,
-0.03499238193035126,
0.008668401278555393,
-0.013012426905333996,
0.019554631784558296,
-0.034976422786712646,
0.015713326632976532,
-0.023921936750411987,
-0.10275290906429291,
-0.0036022812128067017,
-0.06779401004314423,
0.10834752023220062,
-0.12691788375377655,
-0.012957689352333546,
0.044306378811597824,
0.10905672609806061,
-0.01512828841805458,
-0.03893817216157913,
0.08801069110631943,
0.043757230043411255,
0.030796149745583534,
-0.02208958938717842,
0.020093008875846863,
-0.018049102276563644,
0.0019969993736594915,
0.046795736998319626,
-0.1511279195547104,
-0.15537628531455994,
0.09518264979124069,
0.018957095220685005,
-0.014896210283041,
0.07070772349834442,
0.022240985184907913,
-0.020262321457266808,
-0.04591851681470871,
-0.000004845657258556457,
0.2398463934659958,
-0.01216091588139534,
0.05995917692780495,
-0.081139475107193,
-0.009438570588827133,
0.01419945526868105,
-0.04732045903801918,
-0.09064028412103653,
0.07602372020483017,
0.0020274741109460592,
-0.0821286290884018,
-0.041070085018873215,
0.047393396496772766,
0.07203646004199982,
0.15124820172786713,
0.010155928321182728,
-0.08680455386638641,
-0.03230779618024826,
-0.06362909823656082,
-0.014649960212409496,
0.044743914157152176,
-0.142062708735466,
-0.026357268914580345,
0.022032521665096283,
0.009207747876644135,
0.0533909946680069,
-0.02555101364850998,
0.04484817385673523,
0.011201204732060432,
-0.051353856921195984,
-0.07539715617895126,
0.034353237599134445,
-0.03362724557518959,
0.035961054265499115,
-0.006434065289795399,
-0.0006049246294423938,
-0.0435912124812603,
-0.059881456196308136,
-0.14214588701725006,
0.08764854818582535,
-0.06554169952869415,
-0.3158978223800659,
-0.08518341183662415,
-0.04936937987804413,
-0.033767037093639374,
0.016792312264442444,
0.05094505473971367,
-0.10970793664455414,
-0.10963689535856247,
-0.06653702259063721,
0.12763437628746033,
-0.03333232179284096,
-0.06462593376636505,
0.11742929369211197,
-0.003629767568781972,
0.027600370347499847,
-0.10116224735975266,
0.015605518594384193,
-0.0363643616437912,
-0.029700007289648056,
-0.03291498124599457,
0.02053777128458023,
0.05954112485051155,
0.12523841857910156,
0.023221736773848534,
-0.00679873488843441,
0.009245607070624828,
0.21902185678482056,
-0.13677874207496643,
0.07787055522203445,
0.23565594851970673,
-0.05816971883177757,
-0.008820157498121262,
0.14345648884773254,
-0.008677800185978413,
-0.05546463653445244,
0.045907676219940186,
0.0064040133729577065,
-0.019014770165085793,
-0.22283782064914703,
-0.11950024217367172,
-0.039990633726119995,
-0.02660336345434189,
0.042892396450042725,
0.017396891489624977,
-0.00036369843292050064,
0.012759788893163204,
-0.08458315581083298,
-0.03903419151902199,
0.06266908347606659,
0.03313405066728592,
0.1456691324710846,
0.007651051040738821,
0.05520050600171089,
-0.04070539027452469,
-0.02612757310271263,
0.10241296887397766,
-0.02826208621263504,
0.036439668387174606,
0.07589594274759293,
0.09876935184001923,
0.06432066112756729,
0.03331572562456131,
0.05302447825670242,
-0.01905614137649536,
-0.016054682433605194,
-0.00038027786649763584,
-0.030499648302793503,
-0.06204879283905029,
0.018322831019759178,
0.04313061758875847,
0.1435438096523285,
-0.1358136236667633,
-0.11894549429416656,
0.02955791726708412,
0.014744854532182217,
0.12338760495185852,
0.10223621875047684,
-0.027195259928703308,
-0.09394913166761398,
0.04055871441960335,
-0.09338421374559402,
-0.03507434204220772,
0.05544266849756241,
0.08015037328004837,
-0.1591559797525406,
0.0885525718331337,
0.07871735095977783,
0.08856562525033951,
-0.04829855635762215,
0.03385717421770096,
-0.051588017493486404,
0.05581360682845116,
0.003914753440767527,
0.06967251747846603,
-0.1766190081834793,
0.10046099871397018,
0.015018955804407597,
0.08709429949522018,
-0.05304683744907379,
0.027591848745942116,
0.041237249970436096,
0.00846373476088047,
0.1262168139219284,
-0.01143945287913084,
-0.09676262736320496,
0.006099124904721975,
-0.11837708950042725,
0.019664641469717026,
0.05711102858185768,
-0.06419739127159119,
0.05567372217774391,
-0.00299274199642241,
-0.004341235384345055,
-0.034083131700754166,
-0.0047463420778512955,
-0.263765424489975,
-0.1400424987077713,
0.050157081335783005,
-0.006548832170665264,
0.05662788450717926,
-0.03971876949071884,
-0.07579074800014496,
-0.1264345794916153,
0.1148911714553833,
-0.010352672077715397,
-0.018979813903570175,
-0.07225372642278671,
0.022728336974978447,
0.09700111299753189,
-0.05826296657323837,
0.013637538067996502,
0.04914837330579758,
0.14536787569522858,
-0.0655365064740181,
-0.04208891838788986,
0.022050881758332253,
-0.09948613494634628,
-0.12399059534072876,
0.013483934104442596,
0.17428098618984222,
0.11551348119974136,
0.06247277185320854,
0.09331457316875458,
0.016632169485092163,
-0.001765342429280281,
-0.09678539633750916,
0.021939152851700783,
0.029257412999868393,
-0.0734715536236763,
0.038552287966012955,
0.0018279971554875374,
-0.273328959941864,
-0.14966881275177002,
-0.06040404364466667,
0.07543449848890305,
0.18343812227249146,
-0.025187430903315544,
0.16668155789375305,
0.2738514542579651,
-0.09047479182481766,
-0.2226414978504181,
-0.04126639664173126,
-0.0012195283779874444,
0.030258478596806526,
0.04321782663464546,
-0.20754556357860565,
0.09660553187131882,
-0.0006893195095472038,
0.011678678914904594,
-0.06285066157579422,
-0.21945816278457642,
-0.13583476841449738,
0.17379318177700043,
-0.02571638859808445,
0.04777992516756058,
-0.026100989431142807,
-0.0678471177816391,
-0.040350452065467834,
-0.05266644433140755,
0.012794223614037037,
-0.09590793401002884,
0.07213533669710159,
0.05523478239774704,
0.014812256209552288,
0.02346004731953144,
0.014598624780774117,
0.1138143539428711,
0.09112327545881271,
-0.020635435357689857,
-0.07924482226371765,
0.02124842256307602,
0.007175043690949678,
-0.012858315370976925,
0.10343363881111145,
0.04965563863515854,
0.01694314368069172,
-0.04436512291431427,
-0.08701399713754654,
-0.06160026788711548,
0.05917089059948921,
-0.07287576794624329,
-0.0161548163741827,
-0.05435517802834511,
0.08792244642972946,
0.014608511701226234,
-0.00045649794628843665,
-0.074286088347435,
-0.0951528251171112,
-0.024200811982154846,
0.11456556618213654,
0.2195814847946167,
-0.04932646080851555,
0.001086497912183404,
-0.04273513704538345,
-0.045923009514808655,
0.047771017998456955,
-0.0022563606034964323,
0.04556727781891823,
0.050854429602622986,
0.021271245554089546,
0.08802208304405212,
-0.031753163784742355,
-0.1315058022737503,
0.030659742653369904,
0.03537851572036743,
-0.06700091063976288,
-0.1870499551296234,
-0.0455855056643486,
-0.0021802133414894342,
-0.017753463238477707,
-0.03119605779647827,
0.19649170339107513,
-0.013817469589412212,
-0.05822937563061714,
0.0034060354810208082,
0.06107514724135399,
-0.007403321098536253,
0.11837222427129745,
0.042413000017404556,
0.03860735520720482,
-0.08981423825025558,
0.0564848855137825,
0.11922965943813324,
-0.037664663046598434,
0.04875979945063591,
0.08698483556509018,
-0.045158546417951584,
-0.05289740487933159,
-0.09904216229915619,
0.0004455660528037697,
0.058050766587257385,
-0.061755113303661346,
-0.007970823906362057,
-0.10501690208911896,
0.008494597859680653,
0.005600060801953077,
0.011928926222026348,
-0.041150402277708054,
-0.04811781644821167,
-0.00033299639471806586,
-0.09316937625408173,
0.06394552439451218,
0.10200185328722,
-0.02917095459997654,
-0.10899996757507324,
0.10698362439870834,
0.015632232651114464,
0.0770333930850029,
-0.03772399201989174,
-0.061594847589731216,
-0.08762428164482117,
-0.005140633322298527,
-0.10360643267631531,
0.03381514921784401,
-0.13593193888664246,
-0.012282203882932663,
-0.04487139731645584,
-0.031070388853549957,
-0.009469653479754925,
0.07020464539527893,
-0.02901136688888073,
0.0017818482592701912,
-0.029402634128928185,
0.08177335560321808,
-0.12647908926010132,
0.07222121953964233,
0.05648757144808769,
-0.04828358441591263,
0.10987569391727448,
0.019136499613523483,
-0.049455150961875916,
0.0354655496776104,
-0.20793798565864563,
-0.055848438292741776,
-0.03287011384963989,
0.042631324380636215,
-0.01333510223776102,
-0.17778755724430084,
0.00007275703683262691,
0.018581362441182137,
0.011029520072042942,
-0.021027998998761177,
0.05126427114009857,
-0.02712858095765114,
-0.014217430725693703,
-0.06783737242221832,
-0.06115974113345146,
-0.03678174689412117,
0.06278759241104126,
0.07251722365617752,
0.00883154571056366,
0.0975169762969017,
-0.08940214663743973,
0.07754931598901749,
-0.07576330006122589,
0.025975171476602554,
-0.028147507458925247,
0.026559479534626007,
-0.07402651011943817,
-0.07415272295475006,
0.08073609322309494,
-0.015310704708099365,
0.07764026522636414,
0.027280250564217567,
-0.02536461129784584,
0.04284591227769852,
-0.056927334517240524,
-0.054315950721502304,
0.04306132718920708,
0.13938689231872559,
0.05148916691541672,
0.01873033493757248,
-0.009987213648855686,
-0.043361399322748184,
0.0046134227886796,
0.14154449105262756,
0.14253771305084229,
0.17084741592407227,
0.09933282434940338,
0.03298678994178772,
0.06913334876298904,
-0.048150066286325455,
-0.0898437574505806,
0.09214227646589279,
-0.06629125773906708,
0.032868366688489914,
-0.047923602163791656,
-0.07252363860607147,
0.07094612717628479,
-0.13660649955272675,
0.07336508482694626,
-0.02496747300028801,
-0.08402620255947113,
-0.10962981730699539,
-0.1364985704421997,
-0.06438815593719482,
-0.042289163917303085,
0.0023778239265084267,
-0.10885657370090485,
0.024572355672717094,
0.003246550215408206,
0.026811925694346428,
-0.09591907262802124,
0.11085202544927597,
-0.11406400799751282,
-0.12625564634799957,
0.15150082111358643,
-0.03290421888232231,
-0.013229932636022568,
0.0026598151307553053,
0.04356740042567253,
0.02188774012029171,
0.09805899858474731,
0.050439998507499695,
0.04608006030321121,
0.019786186516284943,
0.02684723027050495,
-0.09725114703178406,
-0.06530701369047165,
0.031114688143134117,
-0.015441234223544598,
0.09596283733844757,
0.19319407641887665,
0.09065056592226028,
-0.08451607078313828,
0.01454238872975111,
0.13494518399238586,
0.025843068957328796,
-0.11612903326749802,
-0.14660128951072693,
0.039765503257513046,
-0.032594043761491776,
0.004593213554471731,
0.005309825297445059,
-0.09558654576539993,
0.018627364188432693,
0.1980748027563095,
0.1743127852678299,
-0.041385795921087265,
0.020638372749090195,
-0.009101777337491512,
0.008473538793623447,
0.021902630105614662,
0.07889950275421143,
0.08416061848402023,
0.17883025109767914,
-0.008720224723219872,
0.04305873438715935,
-0.020697401836514473,
-0.09582220017910004,
-0.11494994908571243,
0.09499291330575943,
0.00517552625387907,
-0.03356883302330971,
-0.0064553264528512955,
0.18642204999923706,
-0.10561579465866089,
-0.2177341878414154,
-0.11937258392572403,
-0.039111677557229996,
-0.11574125289916992,
0.02752460353076458,
-0.047860037535429,
0.1306830644607544,
0.052195955067873,
-0.004018009174615145,
0.012482251971960068,
0.18526804447174072,
0.03570425137877464,
0.031180866062641144,
-0.022300489246845245,
0.10760275274515152,
-0.08438216149806976,
0.11126773059368134,
-0.003954241517931223,
0.05303569510579109,
0.03441152721643448,
0.038906097412109375,
-0.06600192934274673,
0.03397400304675102,
0.03468809276819229,
0.003530802670866251,
0.04780230298638344,
0.16913220286369324,
-0.006457987241446972,
0.09353676438331604,
0.11035669595003128,
-0.0666324719786644,
0.02453562244772911,
-0.01827494241297245,
-0.0013963795499876142,
-0.06424954533576965,
0.15721356868743896,
-0.15001779794692993,
0.12561027705669403,
0.1001816838979721,
-0.07020774483680725,
-0.04321417212486267,
-0.008128134533762932,
0.05147799849510193,
-0.05610709264874458,
0.09758780896663666,
-0.006555199157446623,
-0.16841092705726624,
0.02823014184832573,
-0.12377360463142395,
0.06860358268022537,
-0.26147472858428955,
-0.043020643293857574,
-0.042732272297143936,
-0.016496721655130386,
0.0065949889831244946,
0.10997042804956436,
0.07611415535211563,
-0.048885244876146317,
-0.013115464709699154,
-0.041399989277124405,
0.007755518890917301,
0.09266702085733414,
-0.08435796201229095,
-0.029398035258054733
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only on **el** using **17.7k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer, as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **el**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
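Since the checkpoint ships without a tokenizer, it can still be used directly as a speech encoder. The snippet below is a minimal sketch of extracting hidden states with the `transformers` library; the default `Wav2Vec2FeatureExtractor` settings and the dummy one-second waveform are illustrative assumptions rather than part of the released checkpoint.
```python
import torch
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

# Default base-model feature-extraction settings (assumption); if the checkpoint
# ships a preprocessor config, Wav2Vec2FeatureExtractor.from_pretrained can be used instead.
feature_extractor = Wav2Vec2FeatureExtractor(
    feature_size=1, sampling_rate=16000, padding_value=0.0, do_normalize=True
)
model = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-base-el-voxpopuli-v2")

# Dummy one-second clip at 16 kHz standing in for real Greek speech.
speech = torch.zeros(16000).numpy()
inputs = feature_extractor(speech, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    hidden_states = model(**inputs).last_hidden_state  # (batch, frames, 768)
```
The resulting frame-level representations can be used for downstream probing, or the encoder can be fine-tuned for ASR once a tokenizer and labeled data are available.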
|
{"language": "el", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-el-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"el",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"el"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #el #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only on el using 17.7k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer, as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in el. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in el on 17.7k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in el. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #el #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in el on 17.7k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in el. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
251
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #el #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in el on 17.7k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in el. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07160304486751556,
0.1060849130153656,
-0.002911142772063613,
0.007007630076259375,
0.07611659914255142,
-0.04821193218231201,
0.13894620537757874,
0.04429014027118683,
0.006903833709657192,
0.09236842393875122,
-0.016418809071183205,
-0.03930599242448807,
0.07191023975610733,
0.12955810129642487,
0.05397651344537735,
-0.258204847574234,
0.03414138779044151,
-0.06313497573137283,
0.06414280831813812,
0.04410530999302864,
0.12393520772457123,
-0.0861048549413681,
0.03102187067270279,
0.05375572293996811,
-0.02523464895784855,
0.03922031819820404,
-0.04920324683189392,
-0.0879049301147461,
0.053623758256435394,
0.0543736107647419,
-0.029500124976038933,
0.022634971886873245,
0.09135674685239792,
-0.19769106805324554,
0.03536522015929222,
0.038854729384183884,
0.025558019056916237,
0.00722844060510397,
0.10134752094745636,
0.015106393955647945,
0.1704505831003189,
-0.025515856221318245,
-0.0009581138729117811,
0.078040711581707,
-0.057246286422014236,
-0.09004810452461243,
-0.061559729278087616,
0.15490639209747314,
0.10490704327821732,
0.10718800872564316,
-0.07722815871238708,
0.07751845568418503,
-0.0167186688631773,
0.048615891486406326,
0.07872705906629562,
-0.18039248883724213,
-0.05224573239684105,
0.056870073080062866,
0.11254047602415085,
0.02136053703725338,
-0.08825468271970749,
0.07498768717050552,
0.055167995393276215,
-0.013451660983264446,
-0.06535463780164719,
-0.03572037070989609,
0.12577053904533386,
-0.10545320063829422,
-0.11551647633314133,
-0.00003029661274922546,
0.17003092169761658,
0.05829716473817825,
-0.07380867749452591,
-0.1456555724143982,
0.013632748275995255,
0.21169611811637878,
-0.051277730613946915,
-0.09417527168989182,
0.011767512187361717,
0.015547345392405987,
0.04910145327448845,
-0.0689767524600029,
-0.06975705921649933,
-0.0008736505988053977,
0.018885865807533264,
0.11753758043050766,
0.022972969338297844,
-0.01898193173110485,
-0.07331814616918564,
-0.005559510551393032,
-0.10028740763664246,
-0.11766008287668228,
-0.007356736809015274,
-0.06470199674367905,
-0.06226949766278267,
-0.03480343148112297,
0.001797555829398334,
-0.09804308414459229,
0.026435794308781624,
0.10230255126953125,
0.06394575536251068,
0.05350412800908089,
-0.05118381604552269,
-0.03171170502901077,
0.1246209368109703,
0.07378944009542465,
-0.1111149787902832,
-0.021308330819010735,
0.01178671419620514,
-0.022838694974780083,
0.010323417373001575,
-0.03920069709420204,
-0.038367677479982376,
0.017646733671426773,
-0.023879138752818108,
0.04571201279759407,
0.05380973592400551,
-0.03880321606993675,
-0.03590509295463562,
-0.09443532675504684,
0.09263074398040771,
-0.08112983405590057,
0.0293522197753191,
0.04348718747496605,
-0.011470003984868526,
0.08537792414426804,
-0.06009663641452789,
0.08683297038078308,
-0.1094004437327385,
0.002434489084407687,
-0.02478885091841221,
-0.00399840297177434,
0.01808488555252552,
-0.031980033963918686,
0.0294182188808918,
-0.004605414811521769,
0.014305436983704567,
-0.12215575575828552,
-0.000031881740142125636,
-0.102884940803051,
-0.02343061938881874,
-0.07785696536302567,
-0.04317988082766533,
-0.050578661262989044,
0.02017359249293804,
-0.004419055767357349,
-0.007163078524172306,
0.008015673607587814,
-0.018662530928850174,
-0.009944235906004906,
0.013505489565432072,
0.042167872190475464,
0.0632302314043045,
0.08145484328269958,
-0.01623823679983616,
-0.020547159016132355,
-0.10224590450525284,
0.11606725305318832,
-0.07529596239328384,
-0.020628154277801514,
-0.1383972465991974,
-0.03740524873137474,
-0.025180887430906296,
0.03822036460042,
0.014532452449202538,
0.11952244490385056,
-0.16807231307029724,
-0.06948472559452057,
0.10235141962766647,
-0.11967179924249649,
0.006538939196616411,
0.17804820835590363,
-0.0009770808974280953,
0.07084342837333679,
0.10219419747591019,
0.22177182137966156,
0.023773599416017532,
-0.18004503846168518,
-0.01791086234152317,
-0.04901586100459099,
0.04223092272877693,
0.1291714310646057,
0.06643428653478622,
-0.05852144584059715,
0.05974249169230461,
-0.021678561344742775,
-0.02140340395271778,
-0.08012602478265762,
-0.006761312484741211,
-0.04445318505167961,
0.020838545635342598,
-0.0480431392788887,
0.02348542958498001,
-0.006805733311921358,
-0.023060103878378868,
-0.017011642456054688,
-0.09108158200979233,
-0.06082211807370186,
0.12163706868886948,
-0.06523793935775757,
0.023817261680960655,
-0.10510169714689255,
0.048070069402456284,
0.061803754419088364,
0.0002628499350976199,
-0.12027249485254288,
0.11531254649162292,
0.029269207268953323,
-0.04030155763030052,
0.14466488361358643,
0.07835227251052856,
-0.035057853907346725,
0.009466585703194141,
-0.014176631346344948,
0.013360083103179932,
-0.03446625545620918,
0.01210777834057808,
-0.03306354582309723,
-0.10149833559989929,
-0.005381576716899872,
-0.06658805161714554,
0.08678364753723145,
-0.13669107854366302,
-0.014433087781071663,
0.02779487334191799,
0.1083747074007988,
-0.015416165813803673,
-0.03310856968164444,
0.08807262033224106,
0.0468163900077343,
0.026906782761216164,
-0.02040993608534336,
0.01788814552128315,
-0.019247179850935936,
-0.006218136753886938,
0.051215969026088715,
-0.14996594190597534,
-0.15266010165214539,
0.0920516774058342,
0.01512401644140482,
-0.0061797332018613815,
0.07081323862075806,
0.02146458998322487,
-0.02158580906689167,
-0.04844047129154205,
0.0026518157683312893,
0.25089141726493835,
-0.016089098528027534,
0.06321024149656296,
-0.0823693573474884,
-0.009760675020515919,
0.012842451222240925,
-0.05339540168642998,
-0.09459341317415237,
0.08563490211963654,
-0.0010817046277225018,
-0.07282356172800064,
-0.041891809552907944,
0.046295251697301865,
0.0672629326581955,
0.15051689743995667,
0.014047252014279366,
-0.08399354666471481,
-0.026172585785388947,
-0.06741461902856827,
-0.014663566835224628,
0.036504972726106644,
-0.13436265289783478,
-0.023540010675787926,
0.024459028616547585,
0.007658816874027252,
0.052706580609083176,
-0.024831237271428108,
0.04346359893679619,
0.014712114818394184,
-0.050271175801754,
-0.07789002358913422,
0.03641872480511665,
-0.03425808623433113,
0.03618708252906799,
-0.015839993953704834,
-0.004980709403753281,
-0.04703330248594284,
-0.05707672983407974,
-0.14026764035224915,
0.08994300663471222,
-0.06598372757434845,
-0.3123987317085266,
-0.08730386197566986,
-0.059850528836250305,
-0.035064324736595154,
0.016383785754442215,
0.059029821306467056,
-0.11196233332157135,
-0.11351530253887177,
-0.06875265389680862,
0.13064439594745636,
-0.03639623522758484,
-0.06408300995826721,
0.12273567169904709,
-0.008497695438563824,
0.02222418412566185,
-0.10395537316799164,
0.01510756928473711,
-0.03740590810775757,
-0.03159095346927643,
-0.0281363558024168,
0.022611521184444427,
0.06056830659508705,
0.12372614443302155,
0.018981022760272026,
-0.008818399161100388,
0.005207397975027561,
0.2232547551393509,
-0.1355338990688324,
0.07499999552965164,
0.24886655807495117,
-0.05729498714208603,
-0.005755002144724131,
0.1402372121810913,
-0.010802689008414745,
-0.055847618728876114,
0.05015783756971359,
0.0074895163998007774,
-0.018053695559501648,
-0.2353275716304779,
-0.11703544855117798,
-0.03898857533931732,
-0.02828335016965866,
0.0471373125910759,
0.01807316578924656,
-0.0039949179627001286,
0.015545492060482502,
-0.08769987523555756,
-0.03376046195626259,
0.05870722234249115,
0.03431500867009163,
0.14103631675243378,
0.009757591411471367,
0.05366777256131172,
-0.04132115840911865,
-0.02441728301346302,
0.1002567857503891,
-0.027188269421458244,
0.041189566254615784,
0.0779302567243576,
0.09493395686149597,
0.0668618232011795,
0.028198685497045517,
0.050977107137441635,
-0.01890133135020733,
-0.02493618056178093,
-0.0022997003979980946,
-0.03041096217930317,
-0.06165101379156113,
0.017883438616991043,
0.04549495875835419,
0.1386399269104004,
-0.1272931545972824,
-0.12342411279678345,
0.03676769137382507,
0.014346893876791,
0.1116255521774292,
0.09389418363571167,
-0.03125574812293053,
-0.09080882370471954,
0.04254405200481415,
-0.09429418295621872,
-0.03502766788005829,
0.05497216805815697,
0.08449909090995789,
-0.15329553186893463,
0.09393775463104248,
0.08099787682294846,
0.09353702515363693,
-0.04359816387295723,
0.03154375031590462,
-0.058560315519571304,
0.05695253610610962,
0.0019386210478842258,
0.07322809845209122,
-0.18582868576049805,
0.10108543932437897,
0.01724870875477791,
0.09297119826078415,
-0.0506545789539814,
0.02791512943804264,
0.04114014655351639,
0.011845359578728676,
0.12550806999206543,
-0.011892136186361313,
-0.10602441430091858,
-0.0027250335551798344,
-0.11782742291688919,
0.020909147337079048,
0.05480669438838959,
-0.0661870539188385,
0.0565132237970829,
-0.00019546605471987277,
-0.003086596028879285,
-0.03571908921003342,
-0.006591917015612125,
-0.26894834637641907,
-0.13597239553928375,
0.04268022999167442,
0.001429807161912322,
0.05652778223156929,
-0.03492608666419983,
-0.07781554013490677,
-0.13526076078414917,
0.10540711134672165,
-0.008620679378509521,
-0.02124013751745224,
-0.07333128899335861,
0.019255271181464195,
0.09623853862285614,
-0.06520458310842514,
0.011502152308821678,
0.05573713779449463,
0.14504770934581757,
-0.07053908705711365,
-0.04105867072939873,
0.019030427560210228,
-0.10585512965917587,
-0.12257597595453262,
0.014989146962761879,
0.1682734191417694,
0.1123039647936821,
0.06407497823238373,
0.09447377920150757,
0.01435901504009962,
-0.007987315766513348,
-0.0966285988688469,
0.019931714981794357,
0.030490323901176453,
-0.062012359499931335,
0.04227267578244209,
0.002569513162598014,
-0.264322429895401,
-0.1548137664794922,
-0.06766889244318008,
0.073118656873703,
0.18735061585903168,
-0.019834019243717194,
0.15898357331752777,
0.2724880576133728,
-0.09633539617061615,
-0.21411076188087463,
-0.049002744257450104,
0.003601566655561328,
0.029903830960392952,
0.05145642161369324,
-0.20254451036453247,
0.09832984954118729,
-0.003979979548603296,
0.01306065171957016,
-0.06423347443342209,
-0.22453340888023376,
-0.13771750032901764,
0.17435435950756073,
-0.021735815331339836,
0.05001569166779518,
-0.026281634345650673,
-0.065815269947052,
-0.03860212862491608,
-0.048067886382341385,
0.0115476343780756,
-0.0888872966170311,
0.07424487173557281,
0.052724119275808334,
0.018388450145721436,
0.024510646238923073,
0.019216319546103477,
0.11511806398630142,
0.09282784909009933,
-0.01614641584455967,
-0.079236701130867,
0.01866060122847557,
-0.008977771736681461,
-0.019381152465939522,
0.10581963509321213,
0.04254290089011192,
0.015734249725937843,
-0.06402715295553207,
-0.08699313551187515,
-0.05485669896006584,
0.06157456710934639,
-0.07137534022331238,
-0.014334472827613354,
-0.05519973859190941,
0.0814276784658432,
0.015142939053475857,
0.0009599793120287359,
-0.05046522617340088,
-0.09441452473402023,
-0.018141884356737137,
0.11188304424285889,
0.21704690158367157,
-0.05183691158890724,
-0.007203971967101097,
-0.04452234506607056,
-0.04189283400774002,
0.050511304289102554,
-0.01166265457868576,
0.04468654841184616,
0.05495451018214226,
0.01726866513490677,
0.09261387586593628,
-0.032694295048713684,
-0.12967684864997864,
0.033274538815021515,
0.036873556673526764,
-0.06212608516216278,
-0.17800089716911316,
-0.049946900457143784,
0.006742045748978853,
-0.015532199293375015,
-0.03362056985497475,
0.1947515904903412,
-0.017901889979839325,
-0.05729508399963379,
-0.000006929085429874249,
0.060574762523174286,
-0.0010746826883405447,
0.1192609891295433,
0.04479563236236572,
0.039832886308431625,
-0.08770723640918732,
0.05698057636618614,
0.12173987179994583,
-0.04297654703259468,
0.0475691594183445,
0.08891744166612625,
-0.03731953352689743,
-0.053944915533065796,
-0.0913044661283493,
0.01327295321971178,
0.049306631088256836,
-0.06475380808115005,
-0.006776002701371908,
-0.10065790265798569,
0.010423185303807259,
0.012639623135328293,
0.012134484946727753,
-0.04727933928370476,
-0.04585844278335571,
0.00010220063995802775,
-0.08980737626552582,
0.0649169310927391,
0.0998789444565773,
-0.029317375272512436,
-0.11756984144449234,
0.10674797743558884,
0.01963072456419468,
0.08357708901166916,
-0.03529917448759079,
-0.05943150073289871,
-0.09151878952980042,
-0.0017346404492855072,
-0.08122821897268295,
0.045835573226213455,
-0.1314598172903061,
-0.006896546110510826,
-0.044533681124448776,
-0.037973906844854355,
-0.013529856689274311,
0.07364556193351746,
-0.028883561491966248,
0.0036094728857278824,
-0.024732161313295364,
0.09042666107416153,
-0.12435049563646317,
0.0649527981877327,
0.05317935720086098,
-0.04430830851197243,
0.10473977774381638,
0.013256395235657692,
-0.05058275908231735,
0.03558408096432686,
-0.21472154557704926,
-0.04818987473845482,
-0.027891524136066437,
0.04377327486872673,
-0.011197086423635483,
-0.1735181361436844,
0.0016191243194043636,
0.015215316787362099,
0.00955372303724289,
-0.023576917126774788,
0.03933945298194885,
-0.03372671455144882,
-0.02100047469139099,
-0.06526673585176468,
-0.06440507620573044,
-0.0355047881603241,
0.06354273110628128,
0.07066671550273895,
0.007237202487885952,
0.10259965807199478,
-0.08887648582458496,
0.0760091170668602,
-0.07608496397733688,
0.028632335364818573,
-0.024818575009703636,
0.022256558761000633,
-0.07023285329341888,
-0.0762893408536911,
0.0797208622097969,
-0.018668675795197487,
0.06758072972297668,
0.036611899733543396,
-0.018880868330597878,
0.03833140805363655,
-0.05290539562702179,
-0.06354137510061264,
0.03660454601049423,
0.139863058924675,
0.04423687607049942,
0.024872709065675735,
-0.0005652401596307755,
-0.04273436218500137,
0.015353331342339516,
0.14394164085388184,
0.14165371656417847,
0.16388078033924103,
0.10606998950242996,
0.03764202818274498,
0.06616612523794174,
-0.047922637313604355,
-0.09675253927707672,
0.07402552664279938,
-0.05869884788990021,
0.031081940978765488,
-0.04610690474510193,
-0.08254662901163101,
0.0816635712981224,
-0.13693484663963318,
0.07469627261161804,
-0.022448919713497162,
-0.08074292540550232,
-0.11355376988649368,
-0.13624824583530426,
-0.06589896231889725,
-0.04345687851309776,
0.005846843589097261,
-0.11389832943677902,
0.029179945588111877,
0.009399214759469032,
0.028378384187817574,
-0.09031049907207489,
0.1026119738817215,
-0.11743710190057755,
-0.1250706911087036,
0.1440882384777069,
-0.03635172173380852,
-0.007541893515735865,
-0.0018344614654779434,
0.04335693269968033,
0.0251027699559927,
0.09409274160861969,
0.05025192350149155,
0.04555952548980713,
0.023081811144948006,
0.03078584559261799,
-0.09570222347974777,
-0.06677602976560593,
0.02866514027118683,
-0.016404232010245323,
0.10527096688747406,
0.18900366127490997,
0.08944959938526154,
-0.08039146661758423,
0.01279875822365284,
0.15120719373226166,
0.02529933489859104,
-0.11396367102861404,
-0.14101514220237732,
0.040838904678821564,
-0.025731930509209633,
0.003256847383454442,
0.0012093851109966636,
-0.09574386477470398,
0.013426419347524643,
0.20257551968097687,
0.16103288531303406,
-0.032925236970186234,
0.018773769959807396,
-0.016888370737433434,
0.009660359472036362,
0.026943830773234367,
0.07571958005428314,
0.0884537473320961,
0.1740088015794754,
-0.010287068784236908,
0.04575599730014801,
-0.023245956748723984,
-0.08763978630304337,
-0.11388401687145233,
0.0985412448644638,
0.0038850486744195223,
-0.032999105751514435,
-0.0012915959814563394,
0.18407803773880005,
-0.10573837906122208,
-0.2133660763502121,
-0.12089721858501434,
-0.03972142934799194,
-0.11080216616392136,
0.027736229822039604,
-0.02893572486937046,
0.13468512892723083,
0.05172143876552582,
-0.0023122173734009266,
0.0047996388748288155,
0.16863378882408142,
0.037361372262239456,
0.028238648548722267,
-0.02746988646686077,
0.10341695696115494,
-0.0880483016371727,
0.1194101944565773,
-0.001571333035826683,
0.055547840893268585,
0.03204706311225891,
0.03536392003297806,
-0.0647212415933609,
0.03616830334067345,
0.03261221945285797,
0.005052839871495962,
0.05684736371040344,
0.16959364712238312,
-0.007792148273438215,
0.09992407262325287,
0.10699460655450821,
-0.07056495547294617,
0.022175002843141556,
-0.016062363982200623,
0.0032410358544439077,
-0.05669049546122551,
0.1571359485387802,
-0.1513122022151947,
0.12657198309898376,
0.09747226536273956,
-0.07038095593452454,
-0.04689861088991165,
-0.0053137135691940784,
0.05304893106222153,
-0.06103523075580597,
0.0988011509180069,
-0.00511699914932251,
-0.16658568382263184,
0.02713533490896225,
-0.12293693423271179,
0.07002703100442886,
-0.2555895149707794,
-0.04031134769320488,
-0.04639986902475357,
-0.021037481725215912,
-0.00004537462882581167,
0.10500302910804749,
0.08604307472705841,
-0.052063822746276855,
-0.014046740718185902,
-0.058259569108486176,
0.009510388597846031,
0.09333861619234085,
-0.0841636210680008,
-0.030453728511929512
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only on **en** using **24.1k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer, as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **en**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
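As a rough sketch of the fine-tuning setup described above, the snippet below builds a toy character-level CTC tokenizer and attaches a freshly initialized CTC head to the pretrained encoder. The vocabulary and config values shown are illustrative assumptions; in practice the vocabulary would be derived from the labeled transcripts, as the linked blog explains.
```python
import json
from transformers import Wav2Vec2CTCTokenizer, Wav2Vec2ForCTC

# Toy character vocabulary (assumption); build it from your transcripts in practice.
chars = ["<pad>", "<unk>", "|"] + list("abcdefghijklmnopqrstuvwxyz'")
with open("vocab.json", "w") as f:
    json.dump({c: i for i, c in enumerate(chars)}, f)

tokenizer = Wav2Vec2CTCTokenizer(
    "vocab.json", unk_token="<unk>", pad_token="<pad>", word_delimiter_token="|"
)

# The CTC head (lm_head) is newly initialized and must be trained on labeled en speech.
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-base-en-voxpopuli-v2",
    vocab_size=len(tokenizer),
    pad_token_id=tokenizer.pad_token_id,
    ctc_loss_reduction="mean",
)
```
From here, training proceeds as in any wav2vec 2.0 CTC fine-tuning run: feed 16kHz audio through a feature extractor, tokenize the transcripts as labels, and optimize the CTC loss.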
|
{"language": "en", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-en-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"en",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"en"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #en #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only on en using 24.1k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer, as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in en. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in en on 24.1k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in en. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #en #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in en on 24.1k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in en. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
251
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #en #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in en on 24.1k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in en. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07524585723876953,
0.10978034138679504,
-0.0029073297046124935,
0.007135975174605846,
0.07632200419902802,
-0.04943352937698364,
0.13773536682128906,
0.04188951849937439,
-0.0011877079959958792,
0.09648466855287552,
-0.013109155930578709,
-0.04789901152253151,
0.07209809124469757,
0.13011164963245392,
0.059967342764139175,
-0.25438785552978516,
0.037503134459257126,
-0.064515620470047,
0.05153120681643486,
0.04737352952361107,
0.12118767201900482,
-0.08367480337619781,
0.028023501858115196,
0.056266214698553085,
-0.03467731922864914,
0.034858811646699905,
-0.04935804754495621,
-0.08279534429311752,
0.056128744035959244,
0.048541367053985596,
-0.02895556204020977,
0.025332771241664886,
0.0937308743596077,
-0.19031833112239838,
0.036279406398534775,
0.04041323810815811,
0.02765408717095852,
0.008417810313403606,
0.09687063097953796,
0.019501768052577972,
0.15292349457740784,
-0.02456062287092209,
-0.0034317488316446543,
0.07854587584733963,
-0.054243408143520355,
-0.0949745699763298,
-0.05827759578824043,
0.15208706259727478,
0.09883453696966171,
0.11025052517652512,
-0.08085285872220993,
0.07742944359779358,
-0.018466731533408165,
0.04424932599067688,
0.07631728053092957,
-0.18014319241046906,
-0.05316944792866707,
0.04763806238770485,
0.10702203214168549,
0.01948300190269947,
-0.087374247610569,
0.0697009488940239,
0.05462576448917389,
-0.011789814569056034,
-0.06539086997509003,
-0.03524605929851532,
0.13907063007354736,
-0.1086120754480362,
-0.11566642671823502,
0.0030864612199366093,
0.1736038327217102,
0.0601256787776947,
-0.07586246728897095,
-0.14848075807094574,
0.011674907989799976,
0.21280117332935333,
-0.05285260081291199,
-0.09242317080497742,
0.009214435704052448,
0.017092136666178703,
0.04627445712685585,
-0.0683358684182167,
-0.07218761742115021,
-0.0038291008677333593,
0.021955909207463264,
0.1157999187707901,
0.02336813323199749,
-0.018541716039180756,
-0.07370402663946152,
-0.003302094293758273,
-0.0879841074347496,
-0.1172199621796608,
-0.004900448024272919,
-0.06504826247692108,
-0.06880269944667816,
-0.033991001546382904,
-0.0001756849669618532,
-0.09757357835769653,
0.02966982312500477,
0.09624407440423965,
0.05696070194244385,
0.054407406598329544,
-0.056188859045505524,
-0.03158339112997055,
0.12828896939754486,
0.0656200498342514,
-0.1195363700389862,
-0.01832967810332775,
0.012524491176009178,
-0.021105526015162468,
0.00804587360471487,
-0.03723634034395218,
-0.040178749710321426,
0.018370231613516808,
-0.02060575783252716,
0.04547582194209099,
0.0554753802716732,
-0.038725972175598145,
-0.03605623543262482,
-0.09498792886734009,
0.09100104123353958,
-0.0802718922495842,
0.026314271613955498,
0.048640359193086624,
-0.0070756589993834496,
0.08862870186567307,
-0.06647612899541855,
0.08058232814073563,
-0.11358893662691116,
0.0016848042141646147,
-0.026893505826592445,
-0.006549803074449301,
0.022794071584939957,
-0.029568130150437355,
0.032710421830415726,
-0.005784300155937672,
0.00973542407155037,
-0.11703089624643326,
0.008512942120432854,
-0.10146544128656387,
-0.02278056927025318,
-0.08174993097782135,
-0.04645239934325218,
-0.051503464579582214,
0.0171351358294487,
-0.00594373932108283,
-0.005510400049388409,
0.00887256395071745,
-0.018777476623654366,
-0.010255186818540096,
0.009523722343146801,
0.04491551220417023,
0.05651632696390152,
0.08137594908475876,
-0.01918957009911537,
-0.018836643546819687,
-0.090746209025383,
0.11695175617933273,
-0.07394643127918243,
-0.021866489201784134,
-0.13620483875274658,
-0.043094050139188766,
-0.038870032876729965,
0.03413018211722374,
0.014810830354690552,
0.12266603857278824,
-0.17211180925369263,
-0.0725569799542427,
0.11897440254688263,
-0.11974145472049713,
0.011256307363510132,
0.18050165474414825,
-0.001776009565219283,
0.07483319938182831,
0.10005161166191101,
0.21776291728019714,
0.014068706892430782,
-0.173676535487175,
-0.013725250028073788,
-0.052769120782613754,
0.03884485363960266,
0.13019979000091553,
0.06286686658859253,
-0.06325394660234451,
0.06120825931429863,
-0.019427182152867317,
-0.022091781720519066,
-0.0793338194489479,
-0.004352451767772436,
-0.04588637128472328,
0.02160809375345707,
-0.04541690647602081,
0.02281654253602028,
-0.007090109866112471,
-0.023377563804388046,
-0.015942299738526344,
-0.09187895804643631,
-0.06424741446971893,
0.11753558367490768,
-0.06662420928478241,
0.02559378370642662,
-0.10188335180282593,
0.057259202003479004,
0.06396902352571487,
0.004878062289208174,
-0.12387891858816147,
0.1158420667052269,
0.03125785291194916,
-0.050330858677625656,
0.14414231479167938,
0.0824742466211319,
-0.03570019453763962,
0.012061436660587788,
-0.013349439948797226,
0.016243407502770424,
-0.03302605822682381,
0.013589972630143166,
-0.023699305951595306,
-0.10683047026395798,
-0.003336695022881031,
-0.06696809828281403,
0.1128922775387764,
-0.1308480054140091,
-0.01460980623960495,
0.04006745293736458,
0.1045370101928711,
-0.014738941565155983,
-0.03779112547636032,
0.08883347362279892,
0.04662283882498741,
0.029537184163928032,
-0.019359666854143143,
0.020471543073654175,
-0.016642941161990166,
-0.0017454179469496012,
0.04851161316037178,
-0.14831991493701935,
-0.15898968279361725,
0.0946357324719429,
0.013396376743912697,
-0.014660505577921867,
0.0692533627152443,
0.020545747131109238,
-0.02121870033442974,
-0.05077502503991127,
0.0046089510433375835,
0.24381306767463684,
-0.01229090616106987,
0.06359308212995529,
-0.0820566937327385,
-0.008539346046745777,
0.01715666800737381,
-0.05250193551182747,
-0.09089702367782593,
0.08241964876651764,
0.0030672834254801273,
-0.08271902054548264,
-0.042969346046447754,
0.04709816724061966,
0.0678132101893425,
0.1538439840078354,
0.011866030283272266,
-0.08670978993177414,
-0.03198901191353798,
-0.06498408317565918,
-0.014511696062982082,
0.042951226234436035,
-0.13967756927013397,
-0.02419602870941162,
0.0224323607981205,
0.008581054396927357,
0.05279069021344185,
-0.025008099153637886,
0.0441923514008522,
0.011448773555457592,
-0.04993546009063721,
-0.0776849314570427,
0.03301091119647026,
-0.032131630927324295,
0.03496016189455986,
-0.008637926541268826,
0.0012109656818211079,
-0.044791143387556076,
-0.05959074944257736,
-0.14168213307857513,
0.08846790343523026,
-0.06593049317598343,
-0.3190077841281891,
-0.08889766037464142,
-0.05883350595831871,
-0.032683201134204865,
0.012692187912762165,
0.0538669154047966,
-0.11375688761472702,
-0.10785376280546188,
-0.06818453222513199,
0.1253427416086197,
-0.032801464200019836,
-0.06296832859516144,
0.1207268163561821,
-0.003090905025601387,
0.02857312001287937,
-0.09840024262666702,
0.015876220539212227,
-0.04040020331740379,
-0.029991598799824715,
-0.028519466519355774,
0.021589452400803566,
0.06108003854751587,
0.12366782128810883,
0.022777333855628967,
-0.005814714357256889,
0.009639770723879337,
0.22304469347000122,
-0.13570620119571686,
0.07694441825151443,
0.231391042470932,
-0.05911128222942352,
-0.006609967909753323,
0.1441173255443573,
-0.00732400780543685,
-0.05282019451260567,
0.04671145975589752,
0.005736181046813726,
-0.019537057727575302,
-0.22572648525238037,
-0.12133989483118057,
-0.04178433120250702,
-0.02843266725540161,
0.04810931533575058,
0.017773441970348358,
-0.001679125940427184,
0.011747115291655064,
-0.08302715420722961,
-0.04218614846467972,
0.05838443338871002,
0.03382793068885803,
0.14610815048217773,
0.010652284137904644,
0.053043756633996964,
-0.04093615710735321,
-0.02661709114909172,
0.10204807668924332,
-0.027607372030615807,
0.04589993879199028,
0.07497327029705048,
0.10118676722049713,
0.06707292795181274,
0.03651982173323631,
0.054944198578596115,
-0.019735997542738914,
-0.019392792135477066,
-0.0011459346860647202,
-0.029586559161543846,
-0.06241780146956444,
0.02180720679461956,
0.04467594996094704,
0.14179763197898865,
-0.1316222846508026,
-0.12031025439500809,
0.03361804410815239,
0.01393230352550745,
0.1219015121459961,
0.10220115631818771,
-0.027662407606840134,
-0.09413386881351471,
0.03780379146337509,
-0.09215656667947769,
-0.03215281665325165,
0.05503179132938385,
0.08644787967205048,
-0.15630793571472168,
0.08925379067659378,
0.07884284108877182,
0.09008663147687912,
-0.04565737023949623,
0.033315882086753845,
-0.0527438260614872,
0.05692783370614052,
0.002562337787821889,
0.07092247158288956,
-0.16790181398391724,
0.10920833051204681,
0.014483815990388393,
0.08904654532670975,
-0.05106104537844658,
0.027005014941096306,
0.04016627371311188,
0.009537814185023308,
0.12766635417938232,
-0.009861108846962452,
-0.10051088035106659,
-0.0005930177285335958,
-0.11709829419851303,
0.018877072259783745,
0.056055132299661636,
-0.057190362364053726,
0.05759105458855629,
-0.0005558777484111488,
-0.0055167111568152905,
-0.03451353311538696,
-0.006530571263283491,
-0.25871697068214417,
-0.14194750785827637,
0.04556779935956001,
0.001168140908703208,
0.050667986273765564,
-0.03823129087686539,
-0.07808198779821396,
-0.12415661662817001,
0.10904427617788315,
-0.007584515027701855,
-0.018163328990340233,
-0.07157458364963531,
0.025155531242489815,
0.09808582812547684,
-0.061803899705410004,
0.01297888346016407,
0.048954904079437256,
0.14611728489398956,
-0.06647540628910065,
-0.037794675678014755,
0.01912497542798519,
-0.10255923122167587,
-0.12090790271759033,
0.01376087311655283,
0.17279304563999176,
0.11560574173927307,
0.06260793656110764,
0.09320847690105438,
0.017112446948885918,
-0.007783613633364439,
-0.09757761657238007,
0.02556486241519451,
0.020255733281373978,
-0.07241416722536087,
0.040027420967817307,
-0.0003522685437928885,
-0.2747938930988312,
-0.1517411768436432,
-0.06276613473892212,
0.07917818427085876,
0.1858150213956833,
-0.025093305855989456,
0.16212758421897888,
0.2815300226211548,
-0.09103494137525558,
-0.21754679083824158,
-0.045130278915166855,
0.00045805241097696126,
0.029388150200247765,
0.04279942810535431,
-0.21094954013824463,
0.10019729286432266,
-0.0038737310096621513,
0.013296060264110565,
-0.06650881469249725,
-0.22129154205322266,
-0.13566718995571136,
0.17099730670452118,
-0.025428922846913338,
0.0515734888613224,
-0.022429347038269043,
-0.06803087145090103,
-0.03610998019576073,
-0.04662732779979706,
0.01083910558372736,
-0.09208749979734421,
0.0746225044131279,
0.05578090623021126,
0.014998755417764187,
0.021685993298888206,
0.01431166660040617,
0.11057040840387344,
0.09090730547904968,
-0.025060052052140236,
-0.07885566353797913,
0.023062674328684807,
0.00013544071407523006,
-0.011356021277606487,
0.10419569164514542,
0.04866711050271988,
0.016084834933280945,
-0.0455123670399189,
-0.0867648497223854,
-0.06211153045296669,
0.06070132926106453,
-0.07094687223434448,
-0.01454326044768095,
-0.053126297891139984,
0.08902252465486526,
0.017060529440641403,
0.000672169029712677,
-0.07441382855176926,
-0.09362136572599411,
-0.020348310470581055,
0.11517016589641571,
0.21602576971054077,
-0.05478329956531525,
-0.004177316557615995,
-0.042296092957258224,
-0.045394089072942734,
0.047933705151081085,
-0.0020994555670768023,
0.04342586174607277,
0.0518791601061821,
0.020706888288259506,
0.08980441093444824,
-0.03045116551220417,
-0.13005061447620392,
0.03224451094865799,
0.03755843639373779,
-0.06900240480899811,
-0.18702781200408936,
-0.046868156641721725,
-0.002071887021884322,
-0.01765701361000538,
-0.03328455239534378,
0.19572097063064575,
-0.01501396019011736,
-0.05435212701559067,
0.0021679976489394903,
0.06123558059334755,
-0.004147892817854881,
0.11807085573673248,
0.045705635100603104,
0.03887651488184929,
-0.08934041857719421,
0.053864773362874985,
0.12228832393884659,
-0.03934153541922569,
0.0505654513835907,
0.09112387895584106,
-0.045725539326667786,
-0.05329551920294762,
-0.10043203085660934,
-0.0017174073727801442,
0.06086186319589615,
-0.06497318297624588,
-0.007763942237943411,
-0.10240456461906433,
0.010228476487100124,
0.0018759805243462324,
0.012135859578847885,
-0.04653168469667435,
-0.04868731275200844,
-0.001960815628990531,
-0.09490514546632767,
0.06624483317136765,
0.09978779405355453,
-0.030073650181293488,
-0.10812367498874664,
0.10239102691411972,
0.019698645919561386,
0.08224254101514816,
-0.036493174731731415,
-0.06422408670186996,
-0.08701232075691223,
-0.003264738479629159,
-0.09193611145019531,
0.03819043189287186,
-0.1348242163658142,
-0.01282608974725008,
-0.04584377259016037,
-0.03183011710643768,
-0.012828596867620945,
0.07221659272909164,
-0.028569180518388748,
0.002199051668867469,
-0.026348458603024483,
0.08515554666519165,
-0.1266544759273529,
0.07275289297103882,
0.05727669969201088,
-0.0468139611184597,
0.10964508354663849,
0.018214518204331398,
-0.054651908576488495,
0.03549150004982948,
-0.20786772668361664,
-0.05538385733962059,
-0.031874869018793106,
0.04477521777153015,
-0.012506049126386642,
-0.17958861589431763,
0.0007873409194871783,
0.018620245158672333,
0.012027936987578869,
-0.022420356050133705,
0.0486157163977623,
-0.028257159516215324,
-0.014444568194448948,
-0.06704505532979965,
-0.06506718695163727,
-0.037291184067726135,
0.06339960545301437,
0.07079467177391052,
0.005305255763232708,
0.1019267663359642,
-0.0893387645483017,
0.0783378854393959,
-0.07448437809944153,
0.026532942429184914,
-0.025631029158830643,
0.025099672377109528,
-0.0722111165523529,
-0.07671672105789185,
0.0790577307343483,
-0.0180363729596138,
0.07394066452980042,
0.025052931159734726,
-0.02619875967502594,
0.04148653522133827,
-0.05394608899950981,
-0.0650937557220459,
0.041756272315979004,
0.13732920587062836,
0.05043691396713257,
0.017518872395157814,
-0.005740041844546795,
-0.046383246779441833,
0.006076418329030275,
0.15045857429504395,
0.14172112941741943,
0.1663973480463028,
0.11126045137643814,
0.03318600356578827,
0.07157540321350098,
-0.04563935101032257,
-0.09008710831403732,
0.09394260495901108,
-0.06767500936985016,
0.03629998862743378,
-0.04406997933983803,
-0.06863284111022949,
0.07052772492170334,
-0.13157734274864197,
0.07321203500032425,
-0.024366118013858795,
-0.0842682495713234,
-0.11005876213312149,
-0.13534723222255707,
-0.06596639007329941,
-0.04179450497031212,
0.0043016052804887295,
-0.11052943021059036,
0.024746764451265335,
0.004901668056845665,
0.03004574216902256,
-0.09282800555229187,
0.11348243802785873,
-0.11084450036287308,
-0.12512536346912384,
0.15062883496284485,
-0.034593962132930756,
-0.012024171650409698,
0.0005201519816182554,
0.04520783945918083,
0.02341456338763237,
0.09771560877561569,
0.0506439208984375,
0.04780833050608635,
0.020137421786785126,
0.02892748825252056,
-0.09802024811506271,
-0.06523450464010239,
0.03016010858118534,
-0.015589013695716858,
0.09930023550987244,
0.18804627656936646,
0.08986600488424301,
-0.08228646963834763,
0.01264166459441185,
0.13873212039470673,
0.02603217028081417,
-0.11841780692338943,
-0.14732183516025543,
0.030135057866573334,
-0.032008133828639984,
0.001746801775880158,
0.004482854623347521,
-0.09442280977964401,
0.017911572009325027,
0.20358262956142426,
0.17538142204284668,
-0.038967180997133255,
0.02036435343325138,
-0.008057065308094025,
0.008860187605023384,
0.02472546510398388,
0.07951253652572632,
0.08669968694448471,
0.1799306720495224,
-0.009099060669541359,
0.04476573318243027,
-0.025304965674877167,
-0.09333329647779465,
-0.11472054570913315,
0.09466671943664551,
0.007090693339705467,
-0.033432964235544205,
-0.007660529110580683,
0.18650327622890472,
-0.10588894784450531,
-0.21893921494483948,
-0.12077803164720535,
-0.04070734977722168,
-0.11717019975185394,
0.02772718295454979,
-0.03697725012898445,
0.13324867188930511,
0.05333583801984787,
-0.004619500134140253,
0.009607477113604546,
0.1794896423816681,
0.03570224344730377,
0.030414557084441185,
-0.026098543778061867,
0.10745938867330551,
-0.09268511086702347,
0.11476337164640427,
-0.0027095230761915445,
0.05331633612513542,
0.034223273396492004,
0.036550939083099365,
-0.06510787457227707,
0.033843088895082474,
0.033860623836517334,
0.004387563094496727,
0.04933500289916992,
0.17052988708019257,
-0.005643139593303204,
0.09348728507757187,
0.10960821807384491,
-0.06555711477994919,
0.02422042191028595,
-0.018306970596313477,
0.001400493667460978,
-0.0603041797876358,
0.1563732922077179,
-0.149607852101326,
0.12543465197086334,
0.10225579142570496,
-0.0702933743596077,
-0.04647080600261688,
-0.0065346150659024715,
0.050748053938150406,
-0.05688410624861717,
0.09516775608062744,
-0.005436416249722242,
-0.17206403613090515,
0.027653394266963005,
-0.1262504905462265,
0.07127604633569717,
-0.2633621394634247,
-0.04369226098060608,
-0.04244716465473175,
-0.016261912882328033,
0.004435976035892963,
0.11000749468803406,
0.08124434947967529,
-0.04872380569577217,
-0.012649071402847767,
-0.03826135769486427,
0.00921208132058382,
0.09003494679927826,
-0.0832391306757927,
-0.030244095250964165
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only on **es** using **21.4k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer, as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **es**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
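Because the encoder expects 16kHz input, audio recorded at other sampling rates has to be resampled before it is fed to the model. Below is a minimal resampling sketch using torchaudio; the file name is a hypothetical placeholder.
```python
import torchaudio
import torchaudio.functional as F

# Hypothetical input file; any local speech recording will do.
waveform, sr = torchaudio.load("example_es.wav")

# Resample to the 16 kHz rate the model was pretrained on.
if sr != 16_000:
    waveform = F.resample(waveform, orig_freq=sr, new_freq=16_000)

speech = waveform.mean(dim=0)  # collapse channels to mono, shape (num_samples,)
```
The resulting `speech` tensor can then be passed through a `Wav2Vec2FeatureExtractor` and the pretrained encoder exactly as with any other 16kHz input.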
|
{"language": "es", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-es-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"es",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"es"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #es #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only on es using 21.4k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer, as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in es. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in es on 21.4k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in es. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #es #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in es on 21.4k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in es. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
251
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #es #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in es on 21.4k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in es. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07606666535139084,
0.10414794087409973,
-0.0028572010342031717,
0.0057523492723703384,
0.07923198491334915,
-0.047180771827697754,
0.13819535076618195,
0.04258883371949196,
0.010138978250324726,
0.09592173248529434,
-0.013525627553462982,
-0.047179143875837326,
0.0725967213511467,
0.1344117373228073,
0.06026139110326767,
-0.2578192949295044,
0.03973821550607681,
-0.06334128975868225,
0.051124654710292816,
0.047257211059331894,
0.12435312569141388,
-0.08340685069561005,
0.029277533292770386,
0.053245894610881805,
-0.031106427311897278,
0.03662608936429024,
-0.04890575632452965,
-0.08464393019676208,
0.05506279692053795,
0.0489494726061821,
-0.029074832797050476,
0.026577001437544823,
0.09505215287208557,
-0.19099953770637512,
0.03527288883924484,
0.03872215002775192,
0.027223875746130943,
0.008491965010762215,
0.09958811104297638,
0.01821970008313656,
0.15339508652687073,
-0.025546954944729805,
-0.003286645980551839,
0.07647741585969925,
-0.05418216437101364,
-0.09712429344654083,
-0.05882885679602623,
0.1507083922624588,
0.1031741350889206,
0.11023681610822678,
-0.07952523976564407,
0.07119975239038467,
-0.01792161911725998,
0.04357696697115898,
0.08140739798545837,
-0.17652542889118195,
-0.050644002854824066,
0.04797949641942978,
0.10802839696407318,
0.02488233707845211,
-0.08326715230941772,
0.07093644142150879,
0.05453092232346535,
-0.010916686616837978,
-0.06447537243366241,
-0.03375614807009697,
0.1363285779953003,
-0.1049349457025528,
-0.11335340142250061,
0.0008244984783232212,
0.1786699742078781,
0.058302201330661774,
-0.075629323720932,
-0.14890393614768982,
0.01047095749527216,
0.2095251977443695,
-0.048893384635448456,
-0.10046043992042542,
0.01456338819116354,
0.018032554537057877,
0.05490536987781525,
-0.06637611985206604,
-0.07364671677350998,
-0.005778048187494278,
0.022914860397577286,
0.10988534986972809,
0.0214251559227705,
-0.019192738458514214,
-0.0721636563539505,
-0.0018581494223326445,
-0.08700040727853775,
-0.11730408668518066,
-0.0066542611457407475,
-0.06243428587913513,
-0.06886137276887894,
-0.03377191349864006,
0.0033000081311911345,
-0.09849856048822403,
0.025283431634306908,
0.09598037600517273,
0.05697406828403473,
0.05484062433242798,
-0.045698072761297226,
-0.03197995573282242,
0.1311054825782776,
0.06782831996679306,
-0.1164064109325409,
-0.021527022123336792,
0.0099952956661582,
-0.023166822269558907,
0.007654991466552019,
-0.03928861767053604,
-0.03946397826075554,
0.016873283311724663,
-0.017158828675746918,
0.04411553964018822,
0.0530281662940979,
-0.04092197120189667,
-0.03762361407279968,
-0.09472691267728806,
0.0982944592833519,
-0.08211614936590195,
0.026810569688677788,
0.04381652921438217,
-0.006242664530873299,
0.09880515933036804,
-0.0645206943154335,
0.08029315620660782,
-0.11191331595182419,
0.006994720082730055,
-0.02785326912999153,
-0.006575346924364567,
0.019528597593307495,
-0.02720862254500389,
0.030925875529646873,
-0.001622611191123724,
0.008397064171731472,
-0.11843134462833405,
-0.0030484935268759727,
-0.10476651787757874,
-0.02305152639746666,
-0.08078914880752563,
-0.04566759616136551,
-0.04916590824723244,
0.01804191805422306,
-0.004449278581887484,
-0.009630703367292881,
0.007107659708708525,
-0.019778771325945854,
-0.012286658398807049,
0.010192117653787136,
0.0434405580163002,
0.05757142975926399,
0.08271481841802597,
-0.018816763535141945,
-0.019409334287047386,
-0.10456646233797073,
0.11814278364181519,
-0.07557792216539383,
-0.024312863126397133,
-0.13482768833637238,
-0.03817564249038696,
-0.0352863073348999,
0.038596853613853455,
0.010234817862510681,
0.12468957155942917,
-0.1770615428686142,
-0.07114791870117188,
0.113766148686409,
-0.12411724030971527,
0.011037159711122513,
0.17983345687389374,
0.00043218713835813105,
0.07370581477880478,
0.10258178412914276,
0.22090940177440643,
0.007916180416941643,
-0.1675245314836502,
-0.015682706609368324,
-0.050257984548807144,
0.0431814007461071,
0.1304892748594284,
0.065451480448246,
-0.06543805450201035,
0.06682170182466507,
-0.019313715398311615,
-0.02941937744617462,
-0.0837213471531868,
-0.003572163637727499,
-0.04200773686170578,
0.02089877240359783,
-0.04497669264674187,
0.020809408277273178,
-0.008944083005189896,
-0.021488236263394356,
-0.016597947105765343,
-0.09529882669448853,
-0.05902022495865822,
0.1192028596997261,
-0.06622012704610825,
0.02525903843343258,
-0.10197754204273224,
0.05799034982919693,
0.06832825392484665,
0.0022294512018561363,
-0.1254936307668686,
0.11345464736223221,
0.03257107734680176,
-0.04974708333611488,
0.14199495315551758,
0.08262909948825836,
-0.03661167994141579,
0.011405807919800282,
-0.013466744683682919,
0.015526686795055866,
-0.03313212841749191,
0.013761231675744057,
-0.02742798812687397,
-0.10454923659563065,
-0.005159035325050354,
-0.0672588050365448,
0.10348813980817795,
-0.1320076584815979,
-0.014858078211545944,
0.04003501683473587,
0.10676465183496475,
-0.015279478393495083,
-0.03886264190077782,
0.08323273062705994,
0.04367615655064583,
0.02778937853872776,
-0.02083495818078518,
0.016056835651397705,
-0.017580872401595116,
0.002061180304735899,
0.04651998355984688,
-0.1505911648273468,
-0.15377254784107208,
0.0939604863524437,
0.014845835976302624,
-0.014917507767677307,
0.06985458731651306,
0.02381610870361328,
-0.019507868215441704,
-0.05142384395003319,
-0.0017750540282577276,
0.24585089087486267,
-0.014400657266378403,
0.0611836314201355,
-0.0810413509607315,
-0.008968504145741463,
0.012460154481232166,
-0.052186161279678345,
-0.09061481058597565,
0.0785098448395729,
0.001749154063872993,
-0.07989221811294556,
-0.03767172619700432,
0.046690575778484344,
0.07141963392496109,
0.153080552816391,
0.011375623755156994,
-0.0870446041226387,
-0.03114446811378002,
-0.06546293199062347,
-0.01797916181385517,
0.04379216954112053,
-0.1391076296567917,
-0.024429386481642723,
0.022493645548820496,
0.010800105519592762,
0.0544012151658535,
-0.02510106936097145,
0.04392694681882858,
0.013919647783041,
-0.05117147043347359,
-0.0756877213716507,
0.03034328855574131,
-0.037153393030166626,
0.03529169037938118,
-0.012393563985824585,
0.0033610910177230835,
-0.04583187773823738,
-0.05870744585990906,
-0.1446114182472229,
0.08984924107789993,
-0.06809184700250626,
-0.31523385643959045,
-0.08718061447143555,
-0.04652939736843109,
-0.02793213538825512,
0.01748393289744854,
0.05604688078165054,
-0.11293603479862213,
-0.11105941981077194,
-0.06866057217121124,
0.12689019739627838,
-0.03184739500284195,
-0.06466764956712723,
0.1218186616897583,
-0.008143390528857708,
0.023760106414556503,
-0.09905092418193817,
0.01364185567945242,
-0.038638170808553696,
-0.030237769708037376,
-0.031461238861083984,
0.02103666588664055,
0.05927538499236107,
0.1268633008003235,
0.022264134138822556,
-0.008634516969323158,
0.0070883710868656635,
0.21869857609272003,
-0.13476009666919708,
0.07906763255596161,
0.23848988115787506,
-0.05720583721995354,
-0.007780935615301132,
0.1380261331796646,
-0.009788905270397663,
-0.054432399570941925,
0.04494704306125641,
0.003320671385154128,
-0.020839311182498932,
-0.22998760640621185,
-0.11935282498598099,
-0.038865793496370316,
-0.029202451929450035,
0.041292622685432434,
0.019376007840037346,
-0.001331459148786962,
0.016536105424165726,
-0.08528386056423187,
-0.040736258029937744,
0.057863593101501465,
0.03155818581581116,
0.1385473906993866,
0.008838766254484653,
0.054430216550827026,
-0.04095642268657684,
-0.021903568878769875,
0.10235992074012756,
-0.027455676347017288,
0.044147416949272156,
0.07601454854011536,
0.10032667964696884,
0.06427550315856934,
0.03251659870147705,
0.05272919684648514,
-0.01644706539809704,
-0.01783805899322033,
-0.0007153882761485875,
-0.033171724528074265,
-0.059418659657239914,
0.017906343564391136,
0.04780924320220947,
0.13974393904209137,
-0.1312916874885559,
-0.12145361304283142,
0.03392625227570534,
0.01452289056032896,
0.12003608793020248,
0.09595179557800293,
-0.029560398310422897,
-0.09403441101312637,
0.04326966032385826,
-0.09237487614154816,
-0.034302689135074615,
0.054602183401584625,
0.08683651685714722,
-0.15841983258724213,
0.09085849672555923,
0.0806758925318718,
0.08840840309858322,
-0.04540622606873512,
0.03301892429590225,
-0.04971423000097275,
0.05561388283967972,
0.00276502245105803,
0.071614570915699,
-0.17447449266910553,
0.10537733882665634,
0.013205785304307938,
0.09052539616823196,
-0.052109405398368835,
0.03048804961144924,
0.039041779935359955,
0.009899115189909935,
0.12998691201210022,
-0.010288698598742485,
-0.11279886215925217,
0.004055617842823267,
-0.11379089951515198,
0.020844383165240288,
0.05529558286070824,
-0.06596291810274124,
0.05559397488832474,
-0.0004978655488230288,
-0.0023993358481675386,
-0.033377617597579956,
-0.0027773603796958923,
-0.26137974858283997,
-0.14303886890411377,
0.04585939645767212,
-0.0022399388253688812,
0.05343924090266228,
-0.03865087032318115,
-0.0764387995004654,
-0.12817171216011047,
0.10785974562168121,
-0.010829444043338299,
-0.022237099707126617,
-0.07250628620386124,
0.018240878358483315,
0.09613592177629471,
-0.06306808441877365,
0.012934502214193344,
0.05407599359750748,
0.14735311269760132,
-0.06551560759544373,
-0.03742414712905884,
0.024300143122673035,
-0.10455583781003952,
-0.12358790636062622,
0.01248008944094181,
0.16932761669158936,
0.11719202995300293,
0.0631580576300621,
0.09230940043926239,
0.017369339242577553,
-0.003290525171905756,
-0.09729794412851334,
0.02512112818658352,
0.03169328719377518,
-0.07289977371692657,
0.044611357152462006,
-0.0009735078783705831,
-0.2697094678878784,
-0.15004248917102814,
-0.0665183961391449,
0.07434599101543427,
0.1850065290927887,
-0.02477203495800495,
0.16472399234771729,
0.27838295698165894,
-0.09475331008434296,
-0.21840807795524597,
-0.04371190443634987,
0.0029588835313916206,
0.02951211668550968,
0.051031481474637985,
-0.21017839014530182,
0.10078466683626175,
0.0006998926983214915,
0.011704321950674057,
-0.06548108905553818,
-0.21464885771274567,
-0.1365620344877243,
0.17116566002368927,
-0.026389554142951965,
0.045965783298015594,
-0.028940411284565926,
-0.06795424968004227,
-0.03971891477704048,
-0.04569747671484947,
0.011745646595954895,
-0.08509796112775803,
0.07161926478147507,
0.05724399536848068,
0.0165802501142025,
0.02127145044505596,
0.016875969246029854,
0.11694009602069855,
0.09494026750326157,
-0.021599886938929558,
-0.07917085289955139,
0.022010866552591324,
-0.002417033538222313,
-0.013223640620708466,
0.1046784520149231,
0.048305485397577286,
0.013111566193401814,
-0.051546044647693634,
-0.08701449632644653,
-0.05934536084532738,
0.0592174232006073,
-0.06994467973709106,
-0.015253045596182346,
-0.054075825959444046,
0.08518577367067337,
0.01587182842195034,
-0.0002852703910320997,
-0.06667324155569077,
-0.09462277591228485,
-0.01890353113412857,
0.11608710885047913,
0.21724091470241547,
-0.05705741420388222,
-0.010130832903087139,
-0.042713332921266556,
-0.04426657035946846,
0.04781994968652725,
-0.0037782322615385056,
0.043350186198949814,
0.05203363671898842,
0.018663151189684868,
0.09114249050617218,
-0.029818447306752205,
-0.13093556463718414,
0.03345654159784317,
0.03634338453412056,
-0.06493507325649261,
-0.18806959688663483,
-0.046373747289180756,
-0.002783210715278983,
-0.01646745577454567,
-0.03341316059231758,
0.19320905208587646,
-0.013319870457053185,
-0.056542351841926575,
-0.00022471862030215561,
0.06362247467041016,
-0.002668660366907716,
0.1235022321343422,
0.04068500176072121,
0.039432454854249954,
-0.0894106924533844,
0.05933443084359169,
0.11995202302932739,
-0.041849154978990555,
0.050004154443740845,
0.08973291516304016,
-0.043433062732219696,
-0.05113906040787697,
-0.09951002895832062,
-0.0011239615269005299,
0.0551023930311203,
-0.06571204960346222,
-0.008498633280396461,
-0.10633759945631027,
0.008633670397102833,
0.00003974882565671578,
0.013166665099561214,
-0.04630661383271217,
-0.04628439247608185,
0.0027977132704108953,
-0.09231119602918625,
0.06247671693563461,
0.10508311539888382,
-0.026306357234716415,
-0.11355786770582199,
0.10396528989076614,
0.01914960891008377,
0.07932675629854202,
-0.03630214184522629,
-0.059946123510599136,
-0.08545606583356857,
-0.00429243128746748,
-0.09112442284822464,
0.03897798806428909,
-0.13424964249134064,
-0.016373150050640106,
-0.04177401587367058,
-0.033120427280664444,
-0.011704030446708202,
0.072984978556633,
-0.02952396869659424,
0.0035526289138942957,
-0.025645915418863297,
0.08588908612728119,
-0.12460041046142578,
0.06858082860708237,
0.05497174710035324,
-0.04814106971025467,
0.10783416777849197,
0.02065056748688221,
-0.05270978435873985,
0.036097202450037,
-0.21883709728717804,
-0.05304521322250366,
-0.02955261804163456,
0.04539348930120468,
-0.012595439329743385,
-0.1764153242111206,
0.0003191278374288231,
0.01803968846797943,
0.010360829532146454,
-0.024372681975364685,
0.0459284670650959,
-0.027282921597361565,
-0.014284025877714157,
-0.06714218854904175,
-0.0633796751499176,
-0.0361563116312027,
0.0614674873650074,
0.0754285603761673,
0.0047393254935741425,
0.10036617517471313,
-0.08849167078733444,
0.07678672671318054,
-0.07870068401098251,
0.027923420071601868,
-0.025793462991714478,
0.028790684416890144,
-0.07446200400590897,
-0.0745774507522583,
0.07978994399309158,
-0.01828693598508835,
0.0770876482129097,
0.026739217340946198,
-0.02334810420870781,
0.04255378618836403,
-0.05365297198295593,
-0.052968233823776245,
0.04172957316040993,
0.1332966536283493,
0.047259364277124405,
0.020315464586019516,
-0.00803039688616991,
-0.04214905947446823,
0.00821712240576744,
0.1433905065059662,
0.14415797591209412,
0.16444499790668488,
0.0976196900010109,
0.03887316957116127,
0.06517141312360764,
-0.048946402966976166,
-0.09263519942760468,
0.08420824259519577,
-0.06311018019914627,
0.03178970515727997,
-0.043498098850250244,
-0.0718185156583786,
0.07208392769098282,
-0.13791318237781525,
0.07414581626653671,
-0.02778506465256214,
-0.08440644294023514,
-0.1122380793094635,
-0.1387290507555008,
-0.06362362205982208,
-0.045340295881032944,
0.0013157575158402324,
-0.11244422942399979,
0.027405686676502228,
0.014223601669073105,
0.02723727747797966,
-0.0906209722161293,
0.10941962152719498,
-0.11679182946681976,
-0.12635833024978638,
0.14735858142375946,
-0.033188410103321075,
-0.0100559638813138,
-0.0015935878036543727,
0.04353194311261177,
0.023431284353137016,
0.09769322723150253,
0.049451857805252075,
0.04653345048427582,
0.01916677877306938,
0.028345294296741486,
-0.09417487680912018,
-0.0635673776268959,
0.03003862127661705,
-0.016227833926677704,
0.09606137871742249,
0.18823188543319702,
0.0885566771030426,
-0.08477168530225754,
0.014065469615161419,
0.14305870234966278,
0.022781966254115105,
-0.11030781269073486,
-0.1438000351190567,
0.029890546575188637,
-0.0293976329267025,
0.006716554053127766,
0.006250655744224787,
-0.09517272561788559,
0.016505979001522064,
0.1960717737674713,
0.170951709151268,
-0.03837412968277931,
0.017892910167574883,
-0.01505242194980383,
0.009266617707908154,
0.022485103458166122,
0.07838903367519379,
0.08452944457530975,
0.18744352459907532,
-0.00935045164078474,
0.050037045031785965,
-0.022837042808532715,
-0.09070347249507904,
-0.11575471609830856,
0.09815112501382828,
0.004263237584382296,
-0.02981926128268242,
-0.004784092307090759,
0.1837647259235382,
-0.10272075980901718,
-0.2203121781349182,
-0.12013594806194305,
-0.040451474487781525,
-0.11416701972484589,
0.03037416562438011,
-0.03504510596394539,
0.13324567675590515,
0.05529436841607094,
-0.0030389309395104647,
0.008151562884449959,
0.18243557214736938,
0.03850696608424187,
0.026326457038521767,
-0.022616606205701828,
0.10654525458812714,
-0.09492037445306778,
0.11812295019626617,
-0.0024434938095510006,
0.05564803257584572,
0.03198005259037018,
0.03785380348563194,
-0.06490225344896317,
0.0328989215195179,
0.03263131156563759,
0.003780132858082652,
0.0542881116271019,
0.16735662519931793,
-0.007062640972435474,
0.09493063390254974,
0.10998039692640305,
-0.0688248723745346,
0.025519898161292076,
-0.012206654995679855,
-0.0016676303930580616,
-0.059193965047597885,
0.1539059430360794,
-0.15285535156726837,
0.1266956329345703,
0.09574536979198456,
-0.0708872601389885,
-0.04592623561620712,
-0.005500779487192631,
0.05360458046197891,
-0.057008400559425354,
0.10130607336759567,
-0.0017276550643146038,
-0.17326757311820984,
0.026982109993696213,
-0.12117864191532135,
0.0716005265712738,
-0.25510722398757935,
-0.04211721196770668,
-0.044678498059511185,
-0.02006763406097889,
0.0032338034361600876,
0.10644836723804474,
0.08216322958469391,
-0.04879464581608772,
-0.014448253437876701,
-0.03902481496334076,
0.007473082281649113,
0.09496711939573288,
-0.08124331384897232,
-0.030621996149420738
] |
null | null |
transformers
|
# Wav2Vec2-Base-VoxPopuli
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the es unlabeled subset of [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/)
# Fine-Tuning
Please refer to [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) on how to fine-tune this model on a specific language. Note that you should replace `"facebook/wav2vec2-large-xlsr-53"` with this checkpoint for fine-tuning.
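As a rough illustration of that substitution, the sketch below loads this checkpoint with a freshly initialized CTC head. It assumes a character-level `vocab.json` has already been built from your own labeled Spanish transcripts as described in the blog; the file name and special tokens are placeholders, not part of this repository.
```python
from transformers import (
    Wav2Vec2CTCTokenizer,
    Wav2Vec2FeatureExtractor,
    Wav2Vec2Processor,
    Wav2Vec2ForCTC,
)

# Character-level tokenizer built from your own labeled Spanish data
# ("./vocab.json" is a placeholder path).
tokenizer = Wav2Vec2CTCTokenizer(
    "./vocab.json", unk_token="[UNK]", pad_token="[PAD]", word_delimiter_token="|"
)
feature_extractor = Wav2Vec2FeatureExtractor(
    feature_size=1, sampling_rate=16000, padding_value=0.0, do_normalize=True
)
processor = Wav2Vec2Processor(feature_extractor=feature_extractor, tokenizer=tokenizer)

# Use this VoxPopuli checkpoint instead of "facebook/wav2vec2-large-xlsr-53";
# the CTC head is initialized from scratch and trained during fine-tuning.
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-base-es-voxpopuli",
    ctc_loss_reduction="mean",
    pad_token_id=processor.tokenizer.pad_token_id,
    vocab_size=len(processor.tokenizer),
)
model.freeze_feature_encoder()  # keep the convolutional feature encoder frozen
```
From here, training proceeds exactly as in the linked blog (data collator, `Trainer`, WER metric); only the checkpoint name differs.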
|
{"language": "es", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-es-voxpopuli
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli",
"es",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"es"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #es #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
|
# Wav2Vec2-Base-VoxPopuli
Facebook's Wav2Vec2 base model pretrained on the es unlabeled subset of VoxPopuli corpus.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the official website for more information, here
# Fine-Tuning
Please refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '"facebook/wav2vec2-large-xlsr-53"' with this checkpoint for fine-tuning.
|
[
"# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the es unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #es #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n",
"# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the es unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
69,
132,
57
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #es #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the es unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
-0.06132237985730171,
0.03788529708981514,
-0.004406030755490065,
-0.006515822373330593,
0.12124720215797424,
-0.03508797660470009,
0.07438238710165024,
0.0015673661837354302,
0.04274729639291763,
0.01289646327495575,
0.01243497896939516,
0.02729753963649273,
0.08873029798269272,
0.12489082664251328,
-0.015384494327008724,
-0.2853838801383972,
0.07195086032152176,
0.016083160415291786,
0.07149793952703476,
0.046414464712142944,
0.11077480763196945,
-0.0747208222746849,
0.04850305989384651,
0.052633773535490036,
-0.08186231553554535,
0.02268109656870365,
0.02255052514374256,
-0.08766105771064758,
0.1265319138765335,
0.11189577728509903,
0.08544232696294785,
0.0581880584359169,
0.031060749664902687,
-0.16270512342453003,
0.0329279899597168,
0.0639236643910408,
-0.05936994031071663,
-0.015180354937911034,
0.1200297549366951,
-0.024217788130044937,
0.21041515469551086,
-0.029483003541827202,
-0.03547292575240135,
0.08692552149295807,
-0.13073644042015076,
-0.13200637698173523,
-0.06904909759759903,
0.12840141355991364,
0.1405540108680725,
0.06879293918609619,
-0.0797971561551094,
0.034344859421253204,
-0.034235287457704544,
0.06677378714084625,
0.07502730935811996,
-0.289094477891922,
-0.03438284620642662,
0.10778471827507019,
0.07588484138250351,
-0.027869058772921562,
-0.1001429408788681,
0.09652090072631836,
0.027947252616286278,
-0.009232265874743462,
-0.016832998022437096,
-0.09205763041973114,
0.03335806727409363,
-0.08285962790250778,
-0.11452864855527878,
0.011727644130587578,
0.18339048326015472,
0.03919649124145508,
-0.06165293976664543,
-0.06656337529420853,
-0.031212380155920982,
0.19087857007980347,
-0.0578022226691246,
-0.17087635397911072,
0.011148842051625252,
0.039497725665569305,
0.08015213161706924,
-0.17031021416187286,
-0.06530202925205231,
-0.0023069947492331266,
-0.056195586919784546,
0.09685458242893219,
0.02759495936334133,
-0.020259136334061623,
-0.06383024156093597,
0.010573459789156914,
-0.09438750147819519,
-0.07030139863491058,
0.004141397774219513,
-0.10347609221935272,
-0.08842901885509491,
-0.02731335163116455,
-0.0815497413277626,
-0.0753728523850441,
-0.030122576281428337,
0.08668484538793564,
0.010886481031775475,
0.06210684776306152,
-0.08598780632019043,
0.035860490053892136,
0.005180263426154852,
0.09383232891559601,
-0.1276363730430603,
-0.018948646262288094,
0.006830520462244749,
-0.04534708708524704,
-0.0020958951208740473,
-0.03590385615825653,
-0.0793527364730835,
-0.07034087181091309,
-0.02644084393978119,
0.06327390670776367,
0.00425997469574213,
0.023169346153736115,
-0.0654090866446495,
-0.09232740849256516,
0.040827278047800064,
-0.07372267544269562,
0.0208265520632267,
0.028339790180325508,
0.009861779399216175,
0.20494385063648224,
0.0031492840498685837,
0.06506428122520447,
-0.1443319022655487,
0.017870021983981133,
-0.02164742909371853,
0.0027005523443222046,
-0.010227377526462078,
-0.04580763354897499,
0.03165549784898758,
-0.038229797035455704,
-0.006717842072248459,
-0.1335781365633011,
-0.09023352712392807,
-0.08356867730617523,
0.0008243839256465435,
-0.03687280789017677,
-0.06567452102899551,
-0.041440363973379135,
-0.00005056379086454399,
-0.020081985741853714,
-0.028185781091451645,
-0.025715604424476624,
-0.01980511099100113,
0.0019144549733027816,
-0.02327563241124153,
0.0783441886305809,
-0.0547202005982399,
0.08152090013027191,
0.009573602117598057,
-0.021009040996432304,
-0.14487485587596893,
0.1281205117702484,
-0.0790996178984642,
-0.05688278004527092,
-0.14869539439678192,
-0.035956572741270065,
-0.06689688563346863,
0.06739094108343124,
-0.009731069207191467,
0.16074468195438385,
-0.19955892860889435,
-0.11687593162059784,
0.2493124157190323,
-0.11818588525056839,
0.015037349425256252,
0.17559148371219635,
0.01895150914788246,
0.03767317906022072,
0.1682078093290329,
0.14094172418117523,
0.028162790462374687,
-0.11397654563188553,
0.04354492947459221,
-0.04450244456529617,
-0.014850574545562267,
0.058384306728839874,
0.050967469811439514,
-0.013162339106202126,
-0.0013153335312381387,
-0.010743594728410244,
-0.07020694762468338,
-0.05395558476448059,
-0.00018631649436429143,
-0.059039875864982605,
0.03517037257552147,
-0.02197234518826008,
0.08533287793397903,
-0.0092476112768054,
-0.008158358745276928,
0.023526284843683243,
-0.09281301498413086,
-0.02362019196152687,
0.067043237388134,
-0.05473939701914787,
0.07627186924219131,
-0.10848762840032578,
0.05209464579820633,
0.11650409549474716,
0.06206590309739113,
-0.13611580431461334,
0.052823230624198914,
-0.022140171378850937,
0.08419019728899002,
0.09383206814527512,
0.18527397513389587,
-0.028135957196354866,
-0.04521667957305908,
-0.08609182387590408,
0.024527927860617638,
-0.025803588330745697,
-0.04180498048663139,
-0.023417940363287926,
-0.09981810301542282,
-0.027845369651913643,
-0.043167635798454285,
0.061833541840314865,
-0.17456220090389252,
0.008671419695019722,
0.07547306269407272,
0.07605484873056412,
0.016204122453927994,
0.017714211717247963,
0.003198655555024743,
0.09730175882577896,
0.03549804911017418,
0.00620865635573864,
0.07099373638629913,
-0.009197120554745197,
-0.05955604091286659,
0.11883717775344849,
-0.06734836846590042,
0.023879192769527435,
0.13891518115997314,
-0.10872317105531693,
-0.006259716581553221,
0.006572567392140627,
0.030635302886366844,
0.0018032192019745708,
0.007603161036968231,
-0.019773295149207115,
0.21580760180950165,
0.025892052799463272,
0.08036941289901733,
-0.08337775617837906,
0.02289525978267193,
-0.016602635383605957,
-0.03335283324122429,
-0.06557803601026535,
0.06453729420900345,
0.03673747554421425,
-0.09321661293506622,
0.014536320231854916,
0.12165577709674835,
0.0017555446829646826,
0.14822779595851898,
0.02080000936985016,
-0.01935938000679016,
0.013904127292335033,
-0.06135682016611099,
-0.024647720158100128,
-0.010288623161613941,
-0.16512514650821686,
-0.01756894215941429,
0.02288213185966015,
0.014191553927958012,
0.06915609538555145,
-0.053842782974243164,
-0.0010802990291267633,
0.011524006724357605,
-0.07545642554759979,
-0.042360685765743256,
0.04257216677069664,
-0.011184205301105976,
0.07088487595319748,
-0.049156565219163895,
-0.0038627295289188623,
-0.014951669611036777,
-0.029594600200653076,
-0.10411708801984787,
0.11960925906896591,
-0.06141271814703941,
-0.36568063497543335,
-0.09791601449251175,
-0.10116352885961533,
-0.08753994852304459,
0.04391175135970116,
0.04634276032447815,
-0.10508151352405548,
-0.07538533955812454,
0.0031065559014678,
0.17097841203212738,
-0.030731631442904472,
-0.07974468916654587,
0.05109969899058342,
0.00290492782369256,
-0.010465680621564388,
-0.09002947807312012,
0.0044751414097845554,
-0.03414859622716904,
-0.12973976135253906,
-0.012812763452529907,
-0.027269182726740837,
0.03781900554895401,
0.1426451951265335,
0.036201443523168564,
-0.025395020842552185,
-0.02620779350399971,
0.20244187116622925,
-0.13065439462661743,
0.08410361409187317,
0.26221781969070435,
-0.023388074710965157,
0.020337073132395744,
0.14200568199157715,
-0.015386505052447319,
-0.08173488080501556,
-0.005235705990344286,
0.07039293646812439,
-0.0038269900251179934,
-0.26691460609436035,
-0.12740802764892578,
-0.061055369675159454,
-0.030694805085659027,
0.026482028886675835,
-0.00990793202072382,
0.018460338935256004,
0.04395130276679993,
-0.10679686814546585,
-0.018919195979833603,
0.053585972636938095,
0.02627181075513363,
0.2144990861415863,
-0.044333577156066895,
0.14198830723762512,
-0.023628797382116318,
-0.026178691536188126,
0.06925014406442642,
0.03244489058852196,
0.08355318754911423,
0.11895796656608582,
0.05968708172440529,
0.08662807941436768,
0.01651323214173317,
0.02313382923603058,
-0.0006605401868000627,
0.0010621523251757026,
-0.028514504432678223,
-0.05455297231674194,
-0.0206422358751297,
-0.044112008064985275,
0.02566627599298954,
0.10575275868177414,
-0.16947095096111298,
-0.12383798509836197,
0.01615607552230358,
0.034626562148332596,
0.13744552433490753,
0.055386386811733246,
-0.08115655928850174,
-0.036582838743925095,
0.05238309130072594,
-0.08454912900924683,
-0.04902157932519913,
0.06500491499900818,
0.0782066136598587,
-0.1478845179080963,
0.15112601220607758,
0.0326392762362957,
0.09762117266654968,
-0.019212355837225914,
0.057815779000520706,
-0.1608664095401764,
-0.02100953459739685,
0.03275506943464279,
0.06849499046802521,
-0.24906018376350403,
0.21467992663383484,
0.02035992033779621,
0.06449536979198456,
-0.07045698165893555,
0.006181024014949799,
0.04816604033112526,
0.14274872839450836,
0.13329660892486572,
-0.005899224895983934,
-0.06934534013271332,
0.017616882920265198,
-0.01147246640175581,
0.039332110434770584,
0.03519671782851219,
-0.023494578897953033,
0.04761141166090965,
-0.0032802794594317675,
0.015773069113492966,
-0.010045741684734821,
0.11980313062667847,
-0.2304663360118866,
-0.15250180661678314,
0.025939147919416428,
0.012365823611617088,
0.11710137873888016,
-0.005594082176685333,
-0.06737440079450607,
-0.11073749512434006,
0.10523553937673569,
-0.005269431043416262,
-0.0199443232268095,
-0.10383863002061844,
0.030162528157234192,
0.026119980961084366,
-0.10004781186580658,
0.034625567495822906,
0.05863282456994057,
0.13148444890975952,
-0.10314736515283585,
-0.06213346868753433,
0.049719054251909256,
-0.09306085854768753,
-0.05930892750620842,
0.04542512819170952,
0.174928680062294,
0.1036604642868042,
0.03220982104539871,
0.11521673202514648,
-0.04061978682875633,
0.04892084375023842,
-0.11199773848056793,
0.07086020708084106,
0.013637784868478775,
-0.017953947186470032,
0.02387840300798416,
-0.05797629803419113,
-0.25327467918395996,
-0.10961343348026276,
-0.017011087387800217,
0.16860468685626984,
0.18875594437122345,
0.022405575960874557,
0.15038913488388062,
0.24703289568424225,
-0.09939652681350708,
-0.2510254681110382,
-0.04634822532534599,
-0.02324332669377327,
0.038453105837106705,
0.02310493029654026,
-0.2699624300003052,
0.05937030538916588,
0.059509143233299255,
0.007021116092801094,
-0.0813862755894661,
-0.2056545615196228,
-0.12952299416065216,
0.1985333263874054,
0.019500557333230972,
0.15165899693965912,
-0.0941932275891304,
-0.04747519642114639,
-0.09206850081682205,
-0.073610819876194,
0.07906893640756607,
-0.12925580143928528,
0.08377742767333984,
0.05681612715125084,
-0.005164116621017456,
0.005494642071425915,
0.05766402184963226,
0.12289004772901535,
0.06883849948644638,
0.014613019302487373,
-0.02793615125119686,
0.03045339323580265,
0.031565360724925995,
0.027404919266700745,
0.027934765443205833,
0.02586177922785282,
-0.034161124378442764,
-0.056498464196920395,
-0.11106191575527191,
-0.10026974976062775,
0.09571763873100281,
-0.059695642441511154,
-0.004302890971302986,
-0.01903625950217247,
0.10159574449062347,
0.017825908958911896,
0.014501449652016163,
-0.05475608631968498,
-0.14634515345096588,
0.035225141793489456,
0.12608689069747925,
0.23339878022670746,
-0.13564954698085785,
0.006321758031845093,
-0.037876155227422714,
-0.04495515674352646,
0.08923973888158798,
-0.00224783131852746,
0.06926611810922623,
0.037126053124666214,
-0.01393022108823061,
0.09404400736093521,
0.019700782373547554,
-0.07357435673475266,
0.008301849476993084,
0.0451011061668396,
-0.06322778016328812,
-0.24431106448173523,
-0.06508220732212067,
-0.005565222818404436,
0.02103734202682972,
0.014315803535282612,
0.18892212212085724,
-0.008790770545601845,
-0.06010415777564049,
-0.019044548273086548,
0.044679563492536545,
-0.04004669561982155,
0.05135217308998108,
0.029396699741482735,
0.032383326441049576,
-0.11884419620037079,
0.06548858433961868,
0.08869316428899765,
-0.15015044808387756,
0.06711173802614212,
0.03823615983128548,
-0.047466620802879333,
-0.09420137107372284,
-0.13346485793590546,
0.03890572488307953,
-0.017813950777053833,
-0.08503808826208115,
0.028826192021369934,
-0.16412289440631866,
0.03770729899406433,
0.12275142222642899,
0.03197905421257019,
-0.015746060758829117,
-0.057637039572000504,
-0.04862815886735916,
-0.017840437591075897,
-0.0012976126745343208,
0.12117787450551987,
-0.061729591339826584,
-0.13890759646892548,
0.14312341809272766,
0.012643405236303806,
0.07314753532409668,
-0.046594250947237015,
-0.03850717097520828,
-0.14064337313175201,
0.010256703943014145,
-0.1616012305021286,
0.018048113211989403,
-0.12244029343128204,
0.00017647678032517433,
-0.043000802397727966,
-0.018946025520563126,
-0.04627066105604172,
0.02448696456849575,
-0.10135885328054428,
0.017129862681031227,
-0.004036398138850927,
0.08267185837030411,
-0.1078885942697525,
0.07657241821289062,
0.08119937032461166,
-0.024501211941242218,
0.0691782683134079,
0.016025975346565247,
-0.03178290277719498,
0.10871491581201553,
-0.1687856763601303,
-0.05103516951203346,
0.04175671562552452,
0.03259343281388283,
-0.00504562770947814,
-0.14696408808231354,
0.012404599227011204,
0.029085105285048485,
0.038298655301332474,
-0.005287728272378445,
0.07739349454641342,
-0.05484538897871971,
-0.008663441054522991,
-0.04254349321126938,
-0.09657754004001617,
-0.00692899152636528,
0.07546093314886093,
0.10783912241458893,
0.0006485123303718865,
0.11461664736270905,
-0.0811559334397316,
0.047276418656110764,
-0.10175595432519913,
0.06706061959266663,
-0.040961142629384995,
-0.024465830996632576,
0.058752045035362244,
-0.14287222921848297,
0.05409255623817444,
0.0018448489718139172,
0.09946487098932266,
-0.007467167917639017,
0.01781085878610611,
0.022852526977658272,
-0.10041900724172592,
-0.11367934197187424,
0.04059262201189995,
0.13091330230236053,
0.08134536445140839,
-0.007397628389298916,
0.020836668089032173,
0.0008659055456519127,
0.035376034677028656,
0.2029009461402893,
0.20863112807273865,
0.19764217734336853,
0.0316678062081337,
0.09777643531560898,
0.017612576484680176,
-0.05735351890325546,
-0.029909107834100723,
0.008088282309472561,
-0.086646169424057,
0.024685971438884735,
-0.053350500762462616,
-0.05268390476703644,
0.08755175024271011,
-0.14795638620853424,
0.12320483475923538,
0.02361880987882614,
-0.08366382122039795,
-0.16045968234539032,
-0.16596679389476776,
-0.06568731367588043,
-0.0963723286986351,
-0.014714204706251621,
-0.13025446236133575,
-0.016114039346575737,
0.0175306499004364,
0.021621432155370712,
-0.11815638840198517,
0.09768778830766678,
-0.143490269780159,
-0.16742399334907532,
0.1796099692583084,
-0.03515126183629036,
0.007617873139679432,
-0.009849736467003822,
0.00429940689355135,
0.004851487465202808,
0.08918780833482742,
0.01530435774475336,
0.038496993482112885,
-0.0145879490301013,
0.05457339435815811,
-0.07602077722549438,
-0.04981798678636551,
-0.000522415793966502,
0.02602536603808403,
0.11607204377651215,
0.19557514786720276,
0.03678184375166893,
-0.05836709216237068,
0.008859462104737759,
0.14629407227039337,
0.024706721305847168,
-0.08870453387498856,
-0.1439661830663681,
0.06118026003241539,
0.03788185119628906,
0.03169132024049759,
-0.0407363697886467,
-0.06571339815855026,
0.008342611603438854,
0.2849138379096985,
0.15815141797065735,
-0.05818142369389534,
0.02638392336666584,
0.002250625053420663,
0.03941318392753601,
0.08083225041627884,
0.10240282118320465,
0.08458168804645538,
0.15632067620754242,
-0.033847350627183914,
-0.01226737629622221,
0.012518282979726791,
-0.06534484028816223,
-0.07790403813123703,
0.15796427428722382,
0.0387289896607399,
-0.08015213161706924,
0.018345586955547333,
0.15252263844013214,
-0.15741196274757385,
-0.09120368212461472,
-0.08132095634937286,
-0.053839653730392456,
-0.10292372107505798,
-0.02271675318479538,
-0.0478929840028286,
0.10653585940599442,
0.10449468344449997,
-0.00761332968249917,
-0.032901741564273834,
0.19101016223430634,
0.05203825980424881,
-0.016523044556379318,
-0.0508209727704525,
0.12135792523622513,
-0.03550386428833008,
0.06557603925466537,
-0.01875138282775879,
0.04016800597310066,
0.0496823713183403,
0.047083474695682526,
-0.0035039710346609354,
0.05152866989374161,
-0.006660157814621925,
0.03495464846491814,
0.07508514821529388,
0.1267777681350708,
0.004579426720738411,
0.04561168700456619,
0.09687381237745285,
-0.14171265065670013,
0.034461453557014465,
0.04653504490852356,
-0.04100695624947548,
-0.0018388463649898767,
0.17201226949691772,
-0.1995985060930252,
0.05688036233186722,
0.15161427855491638,
-0.029293881729245186,
-0.01585116982460022,
-0.040291693061590195,
0.06645109504461288,
-0.016458958387374878,
0.05475566163659096,
-0.03305698186159134,
-0.13995210826396942,
0.004634747747331858,
-0.06196758896112442,
0.03253528103232384,
-0.18630672991275787,
0.007574028335511684,
-0.03875865042209625,
-0.0004542423412203789,
-0.05346344783902168,
0.09550781548023224,
0.012862982228398323,
-0.05343835800886154,
0.01740637607872486,
-0.10318463295698166,
0.034049589186906815,
0.09727046638727188,
-0.08890959620475769,
-0.042581576853990555
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only in **et** on **10.6k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **et**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more in-detail explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
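Because the checkpoint ships without a tokenizer, it can still be used directly as a speech feature extractor. The sketch below is a minimal example, assuming a 16kHz mono waveform; the silent dummy input is only a stand-in for your own audio.
```python
import numpy as np
import torch
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

# The feature extractor only normalizes and batches the raw 16kHz waveform.
feature_extractor = Wav2Vec2FeatureExtractor(
    feature_size=1, sampling_rate=16000, padding_value=0.0, do_normalize=True
)
model = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-base-et-voxpopuli-v2")

speech = np.zeros(16000, dtype=np.float32)  # one second of 16kHz audio as a dummy input
inputs = feature_extractor(speech, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

hidden_states = outputs.last_hidden_state  # shape: (batch, frames, hidden_size)
```
The returned hidden states are contextual frame-level representations; for speech recognition they still need a CTC head and fine-tuning on labeled Estonian data as noted above.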
|
{"language": "et", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-et-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"et",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"et"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #et #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only in et on 10.6k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in et. Check out this blog for a more in-detail explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in et on 10.6k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in et. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #et #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in et on 10.6k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in et. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
251
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #et #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in et on 10.6k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in et. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07505329698324203,
0.10469432920217514,
-0.0028013107366859913,
0.0060276766307652,
0.07829052209854126,
-0.04886259883642197,
0.13685618340969086,
0.04307026416063309,
0.005063355900347233,
0.09409962594509125,
-0.015044915489852428,
-0.04509710147976875,
0.07121676206588745,
0.13315334916114807,
0.059804536402225494,
-0.25483256578445435,
0.0385742112994194,
-0.06385063380002975,
0.052966464310884476,
0.04762904345989227,
0.12273883819580078,
-0.08475243300199509,
0.03043476864695549,
0.05542850121855736,
-0.03193452209234238,
0.03448091819882393,
-0.04885520413517952,
-0.08143681287765503,
0.055345866829156876,
0.047947391867637634,
-0.030964815989136696,
0.026463840156793594,
0.09358357638120651,
-0.18806137144565582,
0.03695838525891304,
0.042822256684303284,
0.028117263689637184,
0.008465495891869068,
0.0970555767416954,
0.01809270866215229,
0.15800002217292786,
-0.01858983375132084,
-0.0033699132036417723,
0.07844521850347519,
-0.05447812378406525,
-0.09120238572359085,
-0.05945922061800957,
0.1489536464214325,
0.10280898213386536,
0.10854306817054749,
-0.08059428632259369,
0.0810113474726677,
-0.01973828859627247,
0.045227378606796265,
0.07714538276195526,
-0.17595860362052917,
-0.052433475852012634,
0.05008787289261818,
0.10959818959236145,
0.0183793343603611,
-0.08840779960155487,
0.07067816704511642,
0.055984701961278915,
-0.010956590995192528,
-0.06397998332977295,
-0.03453667089343071,
0.13031180202960968,
-0.10607466846704483,
-0.11713151633739471,
0.0030736245680600405,
0.1759794056415558,
0.059806060045957565,
-0.07704214751720428,
-0.14814062416553497,
0.013220119290053844,
0.21137316524982452,
-0.05330070108175278,
-0.09689266234636307,
0.009275158867239952,
0.01530076377093792,
0.04892030358314514,
-0.06879577785730362,
-0.07379667460918427,
-0.0061767189763486385,
0.021847380325198174,
0.10676156729459763,
0.02327539771795273,
-0.020275522023439407,
-0.07517183572053909,
0.00007203923451015726,
-0.0943102315068245,
-0.11541052907705307,
-0.007840568199753761,
-0.06566125154495239,
-0.0696413442492485,
-0.03748017176985741,
0.0016011109109967947,
-0.09628802537918091,
0.0280457716435194,
0.10154155641794205,
0.06636504828929901,
0.055912017822265625,
-0.05451598018407822,
-0.03322786092758179,
0.12792254984378815,
0.0651915967464447,
-0.12001816928386688,
-0.018713731318712234,
0.011857028119266033,
-0.019416799768805504,
0.00543247302994132,
-0.036844197660684586,
-0.03690134733915329,
0.016096433624625206,
-0.01764950342476368,
0.04563372582197189,
0.058155741542577744,
-0.03520979359745979,
-0.03470182418823242,
-0.0964144840836525,
0.09912552684545517,
-0.08033951371908188,
0.02449653297662735,
0.04800991341471672,
-0.008668512105941772,
0.09008049964904785,
-0.0647398829460144,
0.08384277671575546,
-0.11122238636016846,
0.001944086980074644,
-0.02686312235891819,
-0.0059971255250275135,
0.02206871286034584,
-0.031030211597681046,
0.03284488245844841,
-0.005375218577682972,
0.00734493276104331,
-0.11839020252227783,
0.002603767905384302,
-0.10177095234394073,
-0.022084614261984825,
-0.08002713322639465,
-0.043305475264787674,
-0.051049958914518356,
0.01615178771317005,
-0.006628566421568394,
-0.005885538179427385,
0.012324475683271885,
-0.01984252780675888,
-0.010364404879510403,
0.007934058085083961,
0.043744999915361404,
0.05693687126040459,
0.08283526450395584,
-0.01853925734758377,
-0.019881607964634895,
-0.09851501137018204,
0.1134176254272461,
-0.07561369240283966,
-0.02037532813847065,
-0.1381748765707016,
-0.040637169033288956,
-0.04067583009600639,
0.034155137836933136,
0.01389332301914692,
0.12569408118724823,
-0.17545202374458313,
-0.07038739323616028,
0.10961148887872696,
-0.12058278918266296,
0.00983148068189621,
0.1818847358226776,
-0.0005867895670235157,
0.07703913748264313,
0.10085813701152802,
0.22260534763336182,
0.019352681934833527,
-0.17576099932193756,
-0.01317425724118948,
-0.05217446759343147,
0.0365375280380249,
0.13146613538265228,
0.06177106499671936,
-0.06426140666007996,
0.06444185227155685,
-0.01894335262477398,
-0.02596934512257576,
-0.08023663610219955,
-0.004497547168284655,
-0.0450250580906868,
0.022554917261004448,
-0.04700015112757683,
0.027823200449347496,
-0.005586749874055386,
-0.023166535422205925,
-0.015756627544760704,
-0.09112681448459625,
-0.06348627060651779,
0.11827851831912994,
-0.06654128432273865,
0.02599821239709854,
-0.10134989023208618,
0.059054289013147354,
0.06731369346380234,
0.003608139231801033,
-0.12527157366275787,
0.11465544998645782,
0.031975552439689636,
-0.0493265800178051,
0.1452099084854126,
0.08438525348901749,
-0.03388197720050812,
0.009460783563554287,
-0.014158166013658047,
0.018053200095891953,
-0.03182506933808327,
0.01448122225701809,
-0.026252683252096176,
-0.1060696616768837,
-0.004017206374555826,
-0.06831622868776321,
0.11317724734544754,
-0.13415725529193878,
-0.012100202031433582,
0.04126784950494766,
0.10640712827444077,
-0.014317238703370094,
-0.038517579436302185,
0.08935478329658508,
0.04351941496133804,
0.02815113589167595,
-0.020192377269268036,
0.0196559838950634,
-0.01972748152911663,
0.001813194015994668,
0.04636523500084877,
-0.15068760514259338,
-0.15848103165626526,
0.09512801468372345,
0.015864582732319832,
-0.01655672676861286,
0.06801053136587143,
0.021879112347960472,
-0.021975889801979065,
-0.048037752509117126,
-0.00017351623682770878,
0.24097990989685059,
-0.014688984490931034,
0.06030381843447685,
-0.0802764892578125,
-0.01034558191895485,
0.015284226275980473,
-0.049179963767528534,
-0.09113127738237381,
0.07998199015855789,
0.0046194386668503284,
-0.08031356334686279,
-0.043023884296417236,
0.05320258066058159,
0.06920047849416733,
0.1555020809173584,
0.0115878377109766,
-0.08535469323396683,
-0.029555464163422585,
-0.06371955573558807,
-0.01646457053720951,
0.04114818200469017,
-0.1462574005126953,
-0.02463231422007084,
0.02330472320318222,
0.009677082300186157,
0.0524267852306366,
-0.024734269827604294,
0.04609198495745659,
0.008581127040088177,
-0.050243839621543884,
-0.07827255129814148,
0.03415331989526749,
-0.0335964597761631,
0.036175742745399475,
-0.009877349250018597,
-0.00014085471048019826,
-0.0459505170583725,
-0.05904976278543472,
-0.14417976140975952,
0.09003477543592453,
-0.06760071963071823,
-0.3118727207183838,
-0.08323061466217041,
-0.05572791025042534,
-0.03105468861758709,
0.014826114289462566,
0.05345265939831734,
-0.11205174028873444,
-0.10971405357122421,
-0.06910160183906555,
0.12690715491771698,
-0.029306650161743164,
-0.06326474994421005,
0.11787210404872894,
-0.005509261507540941,
0.024524347856640816,
-0.101305291056633,
0.016718601807951927,
-0.0400066077709198,
-0.03300223872065544,
-0.029007768258452415,
0.01907406933605671,
0.058922991156578064,
0.127149760723114,
0.022397050634026527,
-0.006217132788151503,
0.00630080234259367,
0.22038082778453827,
-0.13709239661693573,
0.08087849617004395,
0.23643967509269714,
-0.05620824173092842,
-0.007724842056632042,
0.14394326508045197,
-0.010522476397454739,
-0.05180814862251282,
0.04756205901503563,
0.004762992728501558,
-0.02022247202694416,
-0.22545674443244934,
-0.11990055441856384,
-0.04153139144182205,
-0.025397518649697304,
0.04515531286597252,
0.01662052609026432,
0.0002168216451536864,
0.014804120175540447,
-0.0865476205945015,
-0.04203270375728607,
0.06295356154441833,
0.0334567166864872,
0.14077982306480408,
0.008425856940448284,
0.054477982223033905,
-0.04166349396109581,
-0.024523086845874786,
0.10120339691638947,
-0.02945633977651596,
0.04591949284076691,
0.07455762475728989,
0.09618956595659256,
0.06555028259754181,
0.03813304379582405,
0.05237367004156113,
-0.014644601382315159,
-0.020489178597927094,
-0.0020920068491250277,
-0.03104252554476261,
-0.06195592135190964,
0.023359501734375954,
0.044697027653455734,
0.14549779891967773,
-0.13658569753170013,
-0.11894731223583221,
0.03084838017821312,
0.014591369777917862,
0.11715471744537354,
0.10337275266647339,
-0.028604285791516304,
-0.09228330850601196,
0.037481073290109634,
-0.09165505319833755,
-0.033443108201026917,
0.054252736270427704,
0.08464537560939789,
-0.1602403223514557,
0.09231686592102051,
0.07847241312265396,
0.08848275989294052,
-0.03863158077001572,
0.03324974700808525,
-0.05158615857362747,
0.05626978725194931,
0.0006958851008675992,
0.0710822269320488,
-0.17450912296772003,
0.10571592301130295,
0.01421735342592001,
0.08914418518543243,
-0.05382354184985161,
0.02502669207751751,
0.04368864744901657,
0.010548588819801807,
0.12624458968639374,
-0.011657904833555222,
-0.0994143933057785,
0.0029617585241794586,
-0.11437001079320908,
0.019321197643876076,
0.057386428117752075,
-0.06000211462378502,
0.056328583508729935,
-0.0006499867886304855,
-0.00583120109513402,
-0.033503174781799316,
-0.002314175246283412,
-0.2613023519515991,
-0.13827668130397797,
0.0473240427672863,
-0.002364054322242737,
0.050912898033857346,
-0.03729454427957535,
-0.07803250849246979,
-0.1302081048488617,
0.10666295886039734,
-0.011669117957353592,
-0.017479633912444115,
-0.07157300412654877,
0.019577285274863243,
0.09744662791490555,
-0.06158324331045151,
0.014981331303715706,
0.050375983119010925,
0.14487439393997192,
-0.06461972743272781,
-0.03947475180029869,
0.022334236651659012,
-0.10163046419620514,
-0.12457673996686935,
0.015660692006349564,
0.17312301695346832,
0.11435678601264954,
0.06469408422708511,
0.09327232092618942,
0.016978397965431213,
-0.006005585193634033,
-0.09616249054670334,
0.02326081693172455,
0.024844300001859665,
-0.07314147800207138,
0.04587199166417122,
-0.0013783029280602932,
-0.28128498792648315,
-0.15241029858589172,
-0.06336227059364319,
0.07459927350282669,
0.1841239631175995,
-0.02394864335656166,
0.16523776948451996,
0.2792734205722809,
-0.0907491073012352,
-0.22402113676071167,
-0.039499200880527496,
0.0011390818981453776,
0.027412714436650276,
0.04772917181253433,
-0.2059810757637024,
0.1004653126001358,
-0.004433632828295231,
0.013559866696596146,
-0.06166333332657814,
-0.21102802455425262,
-0.13821367919445038,
0.1723572462797165,
-0.025296717882156372,
0.044554002583026886,
-0.022665007039904594,
-0.06536035239696503,
-0.03620782867074013,
-0.05453232303261757,
0.012439972721040249,
-0.09752482920885086,
0.07220089435577393,
0.05609425529837608,
0.01559407077729702,
0.02190389856696129,
0.0145178297534585,
0.11076740175485611,
0.09431978315114975,
-0.02140955626964569,
-0.08042449504137039,
0.01618826389312744,
0.003593489993363619,
-0.012432177551090717,
0.10660439729690552,
0.04959363490343094,
0.014216874726116657,
-0.04424522444605827,
-0.08520469069480896,
-0.06398124247789383,
0.06259185820817947,
-0.07133302092552185,
-0.014439446851611137,
-0.05540781468153,
0.08772380650043488,
0.013504203408956528,
0.0003004697500728071,
-0.07067926973104477,
-0.09392551332712173,
-0.017684033140540123,
0.11498472839593887,
0.22167716920375824,
-0.05648748204112053,
0.0005504961009137332,
-0.04347347468137741,
-0.044579096138477325,
0.04572798311710358,
-0.004191340412944555,
0.04334237053990364,
0.052071020007133484,
0.022188283503055573,
0.09121015667915344,
-0.030572179704904556,
-0.1317431777715683,
0.03189803287386894,
0.03768077865242958,
-0.06670018285512924,
-0.1869996190071106,
-0.046878401190042496,
-0.004458174109458923,
-0.021785324439406395,
-0.029591860249638557,
0.19371354579925537,
-0.014016564004123211,
-0.05508098006248474,
0.0006566076772287488,
0.06166253983974457,
-0.00219556107185781,
0.12267198413610458,
0.0455610565841198,
0.03889614716172218,
-0.08958262205123901,
0.05479291081428528,
0.1185237392783165,
-0.0386028029024601,
0.05020670220255852,
0.08869393914937973,
-0.04613617807626724,
-0.055081807076931,
-0.10014912486076355,
0.0009273809846490622,
0.06259901076555252,
-0.062145110219717026,
-0.004873448051512241,
-0.10123714059591293,
0.008599383756518364,
0.002232976956292987,
0.012375795282423496,
-0.04711709916591644,
-0.04641493409872055,
-0.0009898713324218988,
-0.09240429848432541,
0.06613428890705109,
0.10237758606672287,
-0.02928243577480316,
-0.11038263887166977,
0.10782431066036224,
0.01691475696861744,
0.08275867998600006,
-0.03718530014157295,
-0.06373895704746246,
-0.08564577251672745,
-0.004537672735750675,
-0.08793959766626358,
0.036374542862176895,
-0.13551892340183258,
-0.01350135263055563,
-0.0438787043094635,
-0.0347772054374218,
-0.01310541108250618,
0.07218942791223526,
-0.031322214752435684,
0.003513326635584235,
-0.026987750083208084,
0.08424149453639984,
-0.12639202177524567,
0.07059507817029953,
0.05689658224582672,
-0.046452462673187256,
0.1100061759352684,
0.01965673640370369,
-0.05341193825006485,
0.04043196141719818,
-0.21179689466953278,
-0.05763551592826843,
-0.03197867423295975,
0.0458417683839798,
-0.013076310977339745,
-0.17603352665901184,
-0.0017735291039571166,
0.01623576693236828,
0.01529004704207182,
-0.021722499281167984,
0.04813789203763008,
-0.02719874493777752,
-0.014890587888658047,
-0.06904783844947815,
-0.06397117674350739,
-0.03768642246723175,
0.06414953619241714,
0.06764453649520874,
0.005226682405918837,
0.10304711759090424,
-0.08792107552289963,
0.07658391445875168,
-0.0768408551812172,
0.027220802381634712,
-0.0267360620200634,
0.023418597877025604,
-0.07175969332456589,
-0.07534375041723251,
0.08095414191484451,
-0.01855875924229622,
0.06864091753959656,
0.026703612878918648,
-0.027066562324762344,
0.042304374277591705,
-0.05311411991715431,
-0.05971525236964226,
0.04030369594693184,
0.13326191902160645,
0.04781804978847504,
0.01803961955010891,
-0.009280277416110039,
-0.04326264187693596,
0.007051282562315464,
0.14665737748146057,
0.14520639181137085,
0.16736210882663727,
0.09868413954973221,
0.03493969514966011,
0.06846921890974045,
-0.049699824303388596,
-0.08793777227401733,
0.08690426498651505,
-0.0669153481721878,
0.037464141845703125,
-0.04590452089905739,
-0.0709417536854744,
0.07277042418718338,
-0.13609063625335693,
0.07306896895170212,
-0.02832951955497265,
-0.08529277890920639,
-0.10962413251399994,
-0.13968117535114288,
-0.06392468512058258,
-0.04237116500735283,
0.002957309130579233,
-0.10939956456422806,
0.026336614042520523,
0.007017380557954311,
0.029996655881404877,
-0.0918368548154831,
0.11553224921226501,
-0.11960954964160919,
-0.1265259087085724,
0.1516113579273224,
-0.03610476478934288,
-0.013054920360445976,
0.0013716771500185132,
0.04566625878214836,
0.02529313787817955,
0.09465990215539932,
0.05052270367741585,
0.04611392319202423,
0.017648261040449142,
0.029295271262526512,
-0.09693966805934906,
-0.0661720335483551,
0.03018968366086483,
-0.015556810423731804,
0.09812651574611664,
0.18587346374988556,
0.08985873311758041,
-0.08252644538879395,
0.013459144160151482,
0.13961561024188995,
0.025258993729948997,
-0.11848016828298569,
-0.14749018847942352,
0.0346447229385376,
-0.028780726715922356,
0.0008872911566868424,
0.005162529181689024,
-0.09551189839839935,
0.019856516271829605,
0.20381654798984528,
0.17564424872398376,
-0.03733086213469505,
0.02029835619032383,
-0.011217981576919556,
0.008884682320058346,
0.024645371362566948,
0.0799175277352333,
0.08719904720783234,
0.17705456912517548,
-0.008493851870298386,
0.04736968129873276,
-0.01898430846631527,
-0.09297399967908859,
-0.11575841158628464,
0.09493141621351242,
0.006903642322868109,
-0.03435852751135826,
-0.005620185285806656,
0.18489113450050354,
-0.10114974528551102,
-0.2227419763803482,
-0.1266021430492401,
-0.03965839743614197,
-0.11554689705371857,
0.02635638788342476,
-0.03942158445715904,
0.13504637777805328,
0.053909335285425186,
-0.0050746542401611805,
0.01119125448167324,
0.1757461279630661,
0.036782681941986084,
0.028186587616801262,
-0.024965522810816765,
0.10974825918674469,
-0.09572740644216537,
0.1145128607749939,
-0.003078548237681389,
0.05404994264245033,
0.03504094481468201,
0.03504117205739021,
-0.06552128493785858,
0.03386792913079262,
0.033683158457279205,
0.004213492851704359,
0.051665447652339935,
0.16934546828269958,
-0.005779041908681393,
0.08941740542650223,
0.10781121999025345,
-0.06555987894535065,
0.02581523172557354,
-0.014343034476041794,
0.00013072098954580724,
-0.05959334224462509,
0.1601054072380066,
-0.1508234441280365,
0.12744197249412537,
0.09922041743993759,
-0.07094603776931763,
-0.04502176120877266,
-0.008994176983833313,
0.04886801540851593,
-0.05693734437227249,
0.09338656067848206,
-0.007430474739521742,
-0.17200103402137756,
0.027096981182694435,
-0.13031530380249023,
0.0691145807504654,
-0.2605404257774353,
-0.04125817492604256,
-0.04429001361131668,
-0.01874055154621601,
0.00460016168653965,
0.1095646470785141,
0.08414305001497269,
-0.049101971089839935,
-0.01195972878485918,
-0.038527246564626694,
0.0073104919865727425,
0.09433352947235107,
-0.08540094643831253,
-0.030901461839675903
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only in **fi** on **14.2k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **fi**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
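Until such fine-tuning has been done, the pretrained encoder can still be used to extract speech representations. The snippet below is a minimal sketch (not part of the official documentation); it assumes the checkpoint ships a feature-extractor config, and a default `Wav2Vec2FeatureExtractor()` with 16kHz input can be used if it does not.

```python
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model
import torch

# load the feature extractor and the pretrained (encoder-only) model
feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained("facebook/wav2vec2-base-fi-voxpopuli-v2")
model = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-base-fi-voxpopuli-v2")

# one second of silence as a placeholder; replace with real 16kHz Finnish speech
speech = torch.zeros(16000).numpy()
inputs = feature_extractor(speech, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    hidden_states = model(**inputs).last_hidden_state  # shape: (1, time_steps, 768)
```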
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
|
{"language": "fi", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-fi-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"fi",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"fi"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #fi #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only in fi on 14.2k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in fi. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in fi on 14.2k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in fi. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #fi #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in fi on 14.2k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in fi. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
251
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #fi #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in fi on 14.2k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in fi. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07835821807384491,
0.11275938153266907,
-0.0030143079347908497,
0.008187904953956604,
0.07695046812295914,
-0.05171467363834381,
0.13641931116580963,
0.04390270635485649,
0.003976451698690653,
0.09749335795640945,
-0.018743714317679405,
-0.053762711584568024,
0.07682640850543976,
0.13591423630714417,
0.0538911446928978,
-0.25412487983703613,
0.03914457559585571,
-0.06598079204559326,
0.04367811977863312,
0.04596050828695297,
0.12313011288642883,
-0.07860951125621796,
0.030244147405028343,
0.05845415219664574,
-0.04338204860687256,
0.03002297319471836,
-0.04366350546479225,
-0.07998308539390564,
0.05456540733575821,
0.055930379778146744,
-0.02953055500984192,
0.021904049441218376,
0.09000267833471298,
-0.18909503519535065,
0.03792239725589752,
0.04097369313240051,
0.030210787430405617,
0.011763307265937328,
0.10280503332614899,
0.02280295081436634,
0.1619921624660492,
-0.011549209244549274,
-0.005633647087961435,
0.08229462802410126,
-0.055196113884449005,
-0.0815754383802414,
-0.06413683295249939,
0.15729552507400513,
0.09515159577131271,
0.10686315596103668,
-0.07739920169115067,
0.06912967562675476,
-0.017431359738111496,
0.04575907066464424,
0.07805566489696503,
-0.17762187123298645,
-0.05589883401989937,
0.0545177161693573,
0.10383618623018265,
0.009988617151975632,
-0.08845008909702301,
0.07048668712377548,
0.05286598950624466,
-0.011831031180918217,
-0.06494756042957306,
-0.0349029079079628,
0.1293967366218567,
-0.10958210378885269,
-0.11722273379564285,
0.004208782687783241,
0.16552208364009857,
0.05806660279631615,
-0.07905849069356918,
-0.14643996953964233,
0.012109937146306038,
0.21363875269889832,
-0.059251777827739716,
-0.08977013826370239,
0.007380496244877577,
0.018555819988250732,
0.06399591267108917,
-0.07252036035060883,
-0.07023004442453384,
-0.007888862863183022,
0.02062038704752922,
0.11492560803890228,
0.021208802238106728,
-0.015606190077960491,
-0.07290005683898926,
-0.003907516598701477,
-0.10704222321510315,
-0.1144641786813736,
-0.007167460862547159,
-0.06579434871673584,
-0.0717247799038887,
-0.03574838116765022,
-0.003629053710028529,
-0.09459231048822403,
0.03028867579996586,
0.0932542160153389,
0.059611398726701736,
0.05028221756219864,
-0.05441206693649292,
-0.032286033034324646,
0.12166690826416016,
0.05550622567534447,
-0.12288941442966461,
-0.012696338817477226,
0.016780493780970573,
-0.020941399037837982,
0.0011519492836669087,
-0.04002838209271431,
-0.036268576979637146,
0.013888183981180191,
-0.02472107484936714,
0.04487767070531845,
0.058182332664728165,
-0.03550656512379646,
-0.032586678862571716,
-0.09433703124523163,
0.0922161266207695,
-0.08217009902000427,
0.026395494118332863,
0.04954603314399719,
-0.007856922224164009,
0.08893541246652603,
-0.06452227383852005,
0.08300089836120605,
-0.11160115152597427,
0.003320856485515833,
-0.025792162865400314,
-0.0046526421792805195,
0.022420503199100494,
-0.029254432767629623,
0.03269575908780098,
-0.007339489180594683,
0.008082237094640732,
-0.11756344139575958,
0.003015677910298109,
-0.10232177376747131,
-0.02205476723611355,
-0.08289145678281784,
-0.04897312447428703,
-0.042025018483400345,
0.010427745059132576,
-0.006306306459009647,
-0.00269884429872036,
0.015104036778211594,
-0.017921628430485725,
-0.0071287257596850395,
0.009754998609423637,
0.042937614023685455,
0.05156758055090904,
0.07913698256015778,
-0.013723373413085938,
-0.017230652272701263,
-0.09670107066631317,
0.11420601606369019,
-0.0777352824807167,
-0.02354683168232441,
-0.13667002320289612,
-0.0354660302400589,
-0.03691249340772629,
0.033111102879047394,
0.013496652245521545,
0.12884747982025146,
-0.18002718687057495,
-0.06924422085285187,
0.11857052892446518,
-0.12601536512374878,
0.01292246300727129,
0.18056216835975647,
0.0016262801364064217,
0.07396239787340164,
0.09850030392408371,
0.22137588262557983,
0.026386210694909096,
-0.18239617347717285,
-0.011525429785251617,
-0.05072186514735222,
0.040040887892246246,
0.13210159540176392,
0.06279852986335754,
-0.06063747778534889,
0.05799088254570961,
-0.018223201856017113,
-0.03420742228627205,
-0.075129434466362,
-0.005022226832807064,
-0.04765661805868149,
0.01660134457051754,
-0.0513012558221817,
0.0221160426735878,
-0.007682268042117357,
-0.021545635536313057,
-0.00947342999279499,
-0.08360794186592102,
-0.06312162429094315,
0.12010543793439865,
-0.061905425041913986,
0.024051927030086517,
-0.10168079286813736,
0.0685231164097786,
0.06518229842185974,
0.005129232537001371,
-0.12798987329006195,
0.11737032234668732,
0.028469519689679146,
-0.050975121557712555,
0.14045500755310059,
0.08160009980201721,
-0.03143032267689705,
0.012805340811610222,
-0.013301447033882141,
0.02502736821770668,
-0.030166031792759895,
0.01385781355202198,
-0.026528920978307724,
-0.105583555996418,
-0.008396671153604984,
-0.06663981825113297,
0.12302284687757492,
-0.14551964402198792,
-0.010169094428420067,
0.0440865196287632,
0.10954608768224716,
-0.011544610373675823,
-0.04010949656367302,
0.09052965044975281,
0.041342929005622864,
0.03227776288986206,
-0.019325165078043938,
0.019691357389092445,
-0.013496965169906616,
0.006399539764970541,
0.048387572169303894,
-0.14337079226970673,
-0.14472240209579468,
0.09461256116628647,
0.025266053155064583,
-0.01894506625831127,
0.0650985911488533,
0.021665284410119057,
-0.01938435435295105,
-0.053017664700746536,
0.0029667990747839212,
0.23411189019680023,
-0.01101623009890318,
0.06346377730369568,
-0.08019395172595978,
-0.005835512187331915,
0.020583467558026314,
-0.052285581827163696,
-0.08759382367134094,
0.07993811368942261,
0.0018128658412024379,
-0.07467718422412872,
-0.042793575674295425,
0.040523696690797806,
0.06722689419984818,
0.14916585385799408,
0.008937851525843143,
-0.08905699849128723,
-0.030846478417515755,
-0.06019657105207443,
-0.010898715816438198,
0.043103016912937164,
-0.1360502392053604,
-0.0227106511592865,
0.024124663323163986,
0.0043869842775166035,
0.05081351101398468,
-0.02408244088292122,
0.042556069791316986,
0.012063274160027504,
-0.052802667021751404,
-0.07321125268936157,
0.03504154831171036,
-0.03154146671295166,
0.03914995491504669,
-0.007706951815634966,
-0.0019444598583504558,
-0.04931172356009483,
-0.05937712639570236,
-0.14480191469192505,
0.08569085597991943,
-0.06207023561000824,
-0.3087140619754791,
-0.08708260953426361,
-0.06114676222205162,
-0.03384159505367279,
0.014121875166893005,
0.04743426293134689,
-0.10536700487136841,
-0.10803250968456268,
-0.0715198963880539,
0.1259036511182785,
-0.029137661680579185,
-0.06066255271434784,
0.11648649722337723,
-0.004812229890376329,
0.02898460626602173,
-0.09477606415748596,
0.017715614289045334,
-0.036291223019361496,
-0.02586328610777855,
-0.032231517136096954,
0.02300928719341755,
0.06347905099391937,
0.12280730903148651,
0.02231607772409916,
-0.003552358830347657,
0.011064541526138783,
0.2338392436504364,
-0.13729731738567352,
0.08201225847005844,
0.2394251674413681,
-0.061870936304330826,
-0.007781358901411295,
0.14574654400348663,
-0.005296561401337385,
-0.0528467521071434,
0.04175791144371033,
0.006569490302354097,
-0.018668638542294502,
-0.21588706970214844,
-0.12046387046575546,
-0.044404905289411545,
-0.024598058313131332,
0.04405533894896507,
0.019283808767795563,
0.00086311245104298,
0.013838976621627808,
-0.08249642699956894,
-0.041482049971818924,
0.06215789169073105,
0.03082539327442646,
0.13895343244075775,
0.010595888830721378,
0.05197504535317421,
-0.04262310639023781,
-0.021839560940861702,
0.10501831024885178,
-0.029454154893755913,
0.04372651129961014,
0.0717041939496994,
0.09160878509283066,
0.06465081125497818,
0.04013006016612053,
0.0598343126475811,
-0.016270743682980537,
-0.024134323000907898,
-0.003926586825400591,
-0.026972636580467224,
-0.06508401781320572,
0.02224489487707615,
0.043199922889471054,
0.13768616318702698,
-0.13250835239887238,
-0.12122639268636703,
0.02411350980401039,
0.010169870220124722,
0.1232718750834465,
0.10187878459692001,
-0.023343075066804886,
-0.0980042889714241,
0.036170490086078644,
-0.09480241686105728,
-0.03821876272559166,
0.05267108231782913,
0.09079493582248688,
-0.16057252883911133,
0.0880407840013504,
0.074491485953331,
0.09176328033208847,
-0.04169663041830063,
0.03139850124716759,
-0.05120934173464775,
0.05651453509926796,
0.0014055768260732293,
0.06836243718862534,
-0.17411087453365326,
0.10110834240913391,
0.01671307347714901,
0.0884210616350174,
-0.05204365402460098,
0.026299569755792618,
0.04450830817222595,
0.018585890531539917,
0.1275397390127182,
-0.009668711572885513,
-0.08761248737573624,
-0.01051977276802063,
-0.11707759648561478,
0.01637452095746994,
0.055649012327194214,
-0.06286773830652237,
0.058041226118803024,
-0.003354177577421069,
-0.005569830536842346,
-0.0374566949903965,
-0.003731175558641553,
-0.25555187463760376,
-0.1421087235212326,
0.05281129479408264,
0.002130670240148902,
0.05217543989419937,
-0.03987499698996544,
-0.07780177891254425,
-0.13023874163627625,
0.1114201471209526,
-0.009524796158075333,
-0.017700808122754097,
-0.0769868940114975,
0.020804105326533318,
0.09944146126508713,
-0.058003902435302734,
0.01817072182893753,
0.046419814229011536,
0.15080514550209045,
-0.06769869476556778,
-0.040483683347702026,
0.01698501594364643,
-0.10146912932395935,
-0.12131364643573761,
0.01670306921005249,
0.17372150719165802,
0.11135511845350266,
0.0628506988286972,
0.09408707916736603,
0.017724523320794106,
-0.004123349208384752,
-0.09815756231546402,
0.017265692353248596,
0.027725310996174812,
-0.07281839102506638,
0.038804348558187485,
0.0010419407626613975,
-0.2671933174133301,
-0.14825977385044098,
-0.06504315882921219,
0.07589132338762283,
0.18799427151679993,
-0.025698507204651833,
0.16403034329414368,
0.26907965540885925,
-0.08807946741580963,
-0.21991907060146332,
-0.039756711572408676,
0.0027783215045928955,
0.03127165883779526,
0.04176032543182373,
-0.20284707844257355,
0.09717534482479095,
-0.003187391674146056,
0.011048790998756886,
-0.049182139337062836,
-0.21451425552368164,
-0.1329803615808487,
0.16944724321365356,
-0.02184276655316353,
0.043679460883140564,
-0.029260417446494102,
-0.07136833667755127,
-0.03411197289824486,
-0.061763785779476166,
0.015256513841450214,
-0.10223188996315002,
0.07235396653413773,
0.05648556351661682,
0.013942699879407883,
0.026651721447706223,
0.015407178550958633,
0.1148742064833641,
0.0864470973610878,
-0.023506689816713333,
-0.08023399859666824,
0.021623220294713974,
-0.001345722354017198,
-0.01132507249712944,
0.10564917325973511,
0.05033569410443306,
0.016423096880316734,
-0.04699518159031868,
-0.08594893664121628,
-0.06435433775186539,
0.06303958594799042,
-0.07286131381988525,
-0.012026865966618061,
-0.05438908934593201,
0.08957816660404205,
0.013992637395858765,
0.000815384613815695,
-0.08373688906431198,
-0.09746009856462479,
-0.014991520904004574,
0.1261107176542282,
0.21107153594493866,
-0.04999750480055809,
-0.004202701151371002,
-0.03958369791507721,
-0.04504331573843956,
0.04739363119006157,
-0.003563692094758153,
0.043168358504772186,
0.051236119121313095,
0.024689292535185814,
0.09002619236707687,
-0.03387996926903725,
-0.13053856790065765,
0.02637341432273388,
0.036852121353149414,
-0.07156059145927429,
-0.18477043509483337,
-0.048846691846847534,
-0.003665697993710637,
-0.022113360464572906,
-0.03568665310740471,
0.19612166285514832,
-0.020002340897917747,
-0.05271342396736145,
0.004621479660272598,
0.059161726385354996,
-0.006144174374639988,
0.12102603912353516,
0.04850592464208603,
0.038973368704319,
-0.09339209645986557,
0.049282390624284744,
0.11626945436000824,
-0.03574937582015991,
0.04560237377882004,
0.08759755641222,
-0.0489586777985096,
-0.05440770834684372,
-0.1002020537853241,
-0.0027933157980442047,
0.061231110244989395,
-0.06267646700143814,
-0.00046012323582544923,
-0.10810868442058563,
0.009012838825583458,
0.01434059627354145,
0.012216232717037201,
-0.045171622186899185,
-0.04636013135313988,
-0.0043365140445530415,
-0.09582222998142242,
0.06482939422130585,
0.09916210174560547,
-0.03075389936566353,
-0.1140957772731781,
0.11355867236852646,
0.013243664987385273,
0.08343584090471268,
-0.03778683766722679,
-0.060459334403276443,
-0.0863785669207573,
-0.005924362689256668,
-0.08788292109966278,
0.038964904844760895,
-0.13347797095775604,
-0.009766346774995327,
-0.047232940793037415,
-0.03504699841141701,
-0.009377541951835155,
0.0684703066945076,
-0.030107924714684486,
0.00397200882434845,
-0.0312691405415535,
0.07899930328130722,
-0.12405434995889664,
0.07204756140708923,
0.05868503451347351,
-0.048402443528175354,
0.10876498371362686,
0.02273845113813877,
-0.05278833955526352,
0.036637768149375916,
-0.20683400332927704,
-0.05429144203662872,
-0.030925143510103226,
0.04242013767361641,
-0.011196861043572426,
-0.17910894751548767,
-0.0008727851673029363,
0.01635904610157013,
0.013927629217505455,
-0.01914479024708271,
0.05921529605984688,
-0.025750746950507164,
-0.02002173662185669,
-0.06750216335058212,
-0.055600572377443314,
-0.034842126071453094,
0.06345696747303009,
0.07662706077098846,
0.009402123279869556,
0.0978924110531807,
-0.0856047198176384,
0.07959498465061188,
-0.07550450414419174,
0.027487220242619514,
-0.028386764228343964,
0.023818746209144592,
-0.06709712743759155,
-0.07370644062757492,
0.08019217103719711,
-0.015748348087072372,
0.07501886785030365,
0.02435232885181904,
-0.020249342545866966,
0.04072760045528412,
-0.059814970940351486,
-0.05747988820075989,
0.04121970385313034,
0.14122799038887024,
0.057422176003456116,
0.018674960359930992,
-0.004099322482943535,
-0.043249212205410004,
0.0029116161167621613,
0.14912980794906616,
0.14267998933792114,
0.165162593126297,
0.09667492657899857,
0.031682975590229034,
0.07075001299381256,
-0.04803658276796341,
-0.07814694941043854,
0.0891498401761055,
-0.07449611276388168,
0.03451678156852722,
-0.050728414207696915,
-0.0633312538266182,
0.07573367655277252,
-0.13567522168159485,
0.07637382298707962,
-0.025043847039341927,
-0.0847867801785469,
-0.10985447466373444,
-0.13983912765979767,
-0.0631013810634613,
-0.036782704293727875,
0.0012518917210400105,
-0.1088646948337555,
0.03133078292012215,
0.005929172039031982,
0.028446264564990997,
-0.08961402624845505,
0.11504793167114258,
-0.11998865753412247,
-0.12384805828332901,
0.15504682064056396,
-0.03532874584197998,
-0.01480600330978632,
0.000862975837662816,
0.04454290494322777,
0.025414101779460907,
0.09438338875770569,
0.05142056196928024,
0.04881476238369942,
0.02216997556388378,
0.03122996911406517,
-0.0971963182091713,
-0.06670504808425903,
0.030793072655797005,
-0.01715083234012127,
0.10147462040185928,
0.1860845685005188,
0.08820848166942596,
-0.08052250742912292,
0.010525966063141823,
0.1457940936088562,
0.023055898025631905,
-0.11750613898038864,
-0.15147359669208527,
0.022542282938957214,
-0.03611186519265175,
0.0008470662287436426,
0.0005940611008554697,
-0.09506186097860336,
0.018053095787763596,
0.21514110267162323,
0.17023341357707977,
-0.0472005195915699,
0.023447906598448753,
-0.006181266158819199,
0.007933170534670353,
0.02365638129413128,
0.07791333645582199,
0.08327700197696686,
0.17920474708080292,
-0.009375553578138351,
0.04577462375164032,
-0.019754325971007347,
-0.0971757024526596,
-0.11870379000902176,
0.10565976798534393,
0.0035112362820655107,
-0.03262702375650406,
-0.008158575743436813,
0.1838369518518448,
-0.10368356853723526,
-0.22745972871780396,
-0.12502723932266235,
-0.038604747503995895,
-0.11699505895376205,
0.024044010788202286,
-0.0541488453745842,
0.13445520401000977,
0.0494912713766098,
-0.006423599552363157,
0.011820665560662746,
0.18132704496383667,
0.03621033579111099,
0.03346193954348564,
-0.031492359936237335,
0.10935017466545105,
-0.08658397942781448,
0.10691740363836288,
-0.0007234505610540509,
0.05038992315530777,
0.031636592000722885,
0.035077355802059174,
-0.07062656432390213,
0.031094299629330635,
0.03418802469968796,
-0.00381763675250113,
0.045642510056495667,
0.17497673630714417,
-0.005678069777786732,
0.0845789983868599,
0.10626799613237381,
-0.06678163260221481,
0.024652130901813507,
-0.019504765048623085,
0.0004212821659166366,
-0.0614369697868824,
0.15518821775913239,
-0.14924539625644684,
0.12509077787399292,
0.10586065798997879,
-0.07172121852636337,
-0.043054938316345215,
-0.009829983115196228,
0.05150938034057617,
-0.05793270096182823,
0.09365271031856537,
-0.0069418856874108315,
-0.1698901504278183,
0.02671164833009243,
-0.12814834713935852,
0.06973209232091904,
-0.25860661268234253,
-0.043260153383016586,
-0.04661889746785164,
-0.016964927315711975,
0.006611193995922804,
0.10933239758014679,
0.08776397258043289,
-0.04888423532247543,
-0.011026868596673012,
-0.046998780220746994,
0.011202389374375343,
0.09376244992017746,
-0.08185997605323792,
-0.029437901452183723
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only in **fr** on **22.8k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **fr**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
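As a rough illustration of that note (not part of the original card), the sketch below shows how a CTC head could be attached to this checkpoint once a character vocabulary has been built from labeled French transcripts. The `vocab.json` file is a hypothetical placeholder; the full recipe is in the linked blog.

```python
from transformers import Wav2Vec2CTCTokenizer, Wav2Vec2ForCTC

# hypothetical character vocabulary built from the labeled French transcripts
tokenizer = Wav2Vec2CTCTokenizer(
    "vocab.json", unk_token="[UNK]", pad_token="[PAD]", word_delimiter_token="|"
)

# attach a randomly initialized CTC head on top of the pretrained encoder
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-base-fr-voxpopuli-v2",
    ctc_loss_reduction="mean",
    pad_token_id=tokenizer.pad_token_id,
    vocab_size=len(tokenizer),
)

# keep the convolutional feature encoder frozen during fine-tuning (common practice;
# older versions of transformers call this freeze_feature_extractor)
model.freeze_feature_encoder()
```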
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
|
{"language": "fr", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-fr-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"fr",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"fr"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #fr #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only in fr on 22.8k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in fr. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in fr on 22.8k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in fr. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #fr #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in fr on 22.8k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in fr. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
251
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #fr #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in fr on 22.8k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in fr. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07711395621299744,
0.10316306352615356,
-0.0028962523210793734,
0.010290213860571384,
0.07701979577541351,
-0.05247604846954346,
0.12916789948940277,
0.04075612500309944,
0.004587925039231777,
0.10119981318712234,
-0.014181402511894703,
-0.05417405068874359,
0.07165813446044922,
0.13062353432178497,
0.052206750959157944,
-0.25422361493110657,
0.03689411282539368,
-0.06513313204050064,
0.041655831038951874,
0.04722105711698532,
0.1224484071135521,
-0.08024788647890091,
0.030545884743332863,
0.05990810692310333,
-0.04518613964319229,
0.03004860319197178,
-0.044550418853759766,
-0.07832789421081543,
0.05236965790390968,
0.056569524109363556,
-0.03360600769519806,
0.02750616893172264,
0.09395329654216766,
-0.18969562649726868,
0.037594906985759735,
0.040895819664001465,
0.02508363500237465,
0.0133206220343709,
0.10331641137599945,
0.017161153256893158,
0.1700422614812851,
-0.014179570600390434,
-0.009566092863678932,
0.08248431980609894,
-0.055238619446754456,
-0.0929301455616951,
-0.06336498260498047,
0.16438573598861694,
0.08900652080774307,
0.11006472259759903,
-0.07816090434789658,
0.07053881138563156,
-0.02188437059521675,
0.047486625611782074,
0.07511046528816223,
-0.17509838938713074,
-0.05610404163599014,
0.05469956248998642,
0.09721225500106812,
0.00810147076845169,
-0.0903257429599762,
0.07022031396627426,
0.053284887224435806,
-0.012327946722507477,
-0.06497296690940857,
-0.03674110397696495,
0.13141442835330963,
-0.10973053425550461,
-0.11879466474056244,
0.0024571185931563377,
0.17070616781711578,
0.06250152736902237,
-0.07684002816677094,
-0.14241018891334534,
0.010113051161170006,
0.20907184481620789,
-0.060252558439970016,
-0.09522996097803116,
0.006804307922720909,
0.01913038082420826,
0.06002845987677574,
-0.07288166135549545,
-0.07159468531608582,
-0.0054477727971971035,
0.023048195987939835,
0.1208764985203743,
0.021090636029839516,
-0.015986667945981026,
-0.0686522051692009,
-0.005816919729113579,
-0.10561451315879822,
-0.11453312635421753,
-0.006746205501258373,
-0.06444545835256577,
-0.06863100826740265,
-0.03544751554727554,
-0.003940993454307318,
-0.0972432792186737,
0.031158700585365295,
0.09867289662361145,
0.04863402619957924,
0.053514089435338974,
-0.05582534894347191,
-0.035436101257801056,
0.12305912375450134,
0.054340776056051254,
-0.122175432741642,
-0.010324886068701744,
0.01775343157351017,
-0.01969062350690365,
-0.0002724790829233825,
-0.03673244267702103,
-0.03465801849961281,
0.017621546983718872,
-0.02320043183863163,
0.038176946341991425,
0.05029141902923584,
-0.03434135764837265,
-0.030559740960597992,
-0.09451636672019958,
0.10305893421173096,
-0.08035308867692947,
0.024392010644078255,
0.05497003346681595,
-0.00907533522695303,
0.0894615650177002,
-0.06128428503870964,
0.08135496079921722,
-0.10997382551431656,
0.004378513433039188,
-0.025591585785150528,
-0.005286558996886015,
0.02454223670065403,
-0.027612123638391495,
0.03138740360736847,
-0.00105628976598382,
0.007064443081617355,
-0.11447232216596603,
0.007064148783683777,
-0.10490331053733826,
-0.020772194489836693,
-0.08530306071043015,
-0.044673558324575424,
-0.04304039850831032,
0.01063633430749178,
-0.003563494887202978,
0.00038178323302417994,
0.013550189323723316,
-0.01751650869846344,
-0.009015350602567196,
0.012784790247678757,
0.041146695613861084,
0.05162554606795311,
0.08219567686319351,
-0.015475411899387836,
-0.019234834238886833,
-0.09610599279403687,
0.11548180878162384,
-0.08169962465763092,
-0.02846614271402359,
-0.13749901950359344,
-0.03925387188792229,
-0.04810401052236557,
0.03475799411535263,
0.010455108247697353,
0.12867046892642975,
-0.1661422997713089,
-0.0706813707947731,
0.11661704629659653,
-0.12335852533578873,
0.017434943467378616,
0.1794859766960144,
0.0004904770758002996,
0.06880892813205719,
0.09373115003108978,
0.21915720403194427,
0.028631186112761497,
-0.17870211601257324,
-0.009790213778614998,
-0.042931362986564636,
0.03984815254807472,
0.1309383511543274,
0.06383507698774338,
-0.06264690309762955,
0.05620378255844116,
-0.016856390982866287,
-0.029003214091062546,
-0.07683274894952774,
-0.00631084805354476,
-0.04305766150355339,
0.01740902103483677,
-0.04725545644760132,
0.016109328716993332,
-0.0023920058738440275,
-0.01787143014371395,
-0.010650141164660454,
-0.08577888458967209,
-0.06607413291931152,
0.12005414813756943,
-0.06253877282142639,
0.02041875384747982,
-0.09846839308738708,
0.06396979093551636,
0.06499435752630234,
0.005433111917227507,
-0.13285253942012787,
0.12245173752307892,
0.031274642795324326,
-0.048375003039836884,
0.14458705484867096,
0.08945711702108383,
-0.028248390182852745,
0.011890633031725883,
-0.01361988764256239,
0.02436692640185356,
-0.028993355110287666,
0.01182006485760212,
-0.02368900552392006,
-0.1085851863026619,
-0.006588305812329054,
-0.06880124658346176,
0.12583619356155396,
-0.14988557994365692,
-0.01241297647356987,
0.05119328573346138,
0.11464624851942062,
-0.013693579472601414,
-0.041228991001844406,
0.08889534324407578,
0.037812426686286926,
0.032609403133392334,
-0.01726575754582882,
0.02040964737534523,
-0.015243429690599442,
0.008347800001502037,
0.04780946299433708,
-0.13967403769493103,
-0.14877913892269135,
0.09375937283039093,
0.02251587063074112,
-0.01936899498105049,
0.06119895353913307,
0.018023649230599403,
-0.016555245965719223,
-0.04796246066689491,
0.004426519386470318,
0.23236088454723358,
-0.013858439400792122,
0.06707291305065155,
-0.0792972669005394,
-0.007970034144818783,
0.022220540791749954,
-0.050850532948970795,
-0.09136262536048889,
0.08443395793437958,
0.0019005814101547003,
-0.08725191652774811,
-0.03851081430912018,
0.03404741361737251,
0.07141204178333282,
0.14858904480934143,
0.007293033413589001,
-0.08753842115402222,
-0.03196094557642937,
-0.0564352385699749,
-0.012683690525591373,
0.041564855724573135,
-0.13334250450134277,
-0.022113841027021408,
0.02575181983411312,
0.003375013591721654,
0.0540788359940052,
-0.026146389544010162,
0.0450628362596035,
0.01258768979460001,
-0.05175888165831566,
-0.06801683455705643,
0.03828767314553261,
-0.03470302000641823,
0.03698123246431351,
-0.013054123148322105,
-0.0026078831870108843,
-0.047556642442941666,
-0.057955820113420486,
-0.14344336092472076,
0.08625153452157974,
-0.061994705349206924,
-0.3136201500892639,
-0.08726788312196732,
-0.04990680515766144,
-0.03902944549918175,
0.013620332814753056,
0.046066366136074066,
-0.10822074860334396,
-0.11000075191259384,
-0.07083378732204437,
0.11950109899044037,
-0.032924752682447433,
-0.061977364122867584,
0.11542505025863647,
-0.006696843076497316,
0.0297439843416214,
-0.0951688140630722,
0.016759049147367477,
-0.041789207607507706,
-0.025257956236600876,
-0.03056795708835125,
0.022704634815454483,
0.06304824352264404,
0.12121722102165222,
0.023704752326011658,
-0.000738331291358918,
0.014464561827480793,
0.22854110598564148,
-0.13576118648052216,
0.07954239845275879,
0.23844274878501892,
-0.056298788636922836,
-0.009173696860671043,
0.13938234746456146,
-0.006697363220155239,
-0.054729294031858444,
0.04391197860240936,
0.0085190050303936,
-0.020368872210383415,
-0.21948286890983582,
-0.12163611501455307,
-0.047100864350795746,
-0.027634182944893837,
0.04568774998188019,
0.02134319394826889,
0.0045893448404967785,
0.01252917293459177,
-0.08382173627614975,
-0.039717886596918106,
0.06081289052963257,
0.026546431705355644,
0.14600184559822083,
0.010353969410061836,
0.0521838515996933,
-0.04141702130436897,
-0.023403607308864594,
0.10513213276863098,
-0.03829402104020119,
0.04226667433977127,
0.06904801726341248,
0.09597714990377426,
0.06132396683096886,
0.04074155539274216,
0.05732368305325508,
-0.016684256494045258,
-0.0240003801882267,
-0.005333854351192713,
-0.02672131359577179,
-0.0625588670372963,
0.018565785139799118,
0.04357031732797623,
0.14205247163772583,
-0.1342594176530838,
-0.11836443096399307,
0.033526815474033356,
0.011434835381805897,
0.1200941950082779,
0.09712734073400497,
-0.022046439349651337,
-0.0982707068324089,
0.030196452513337135,
-0.09109955281019211,
-0.03432833403348923,
0.05335115268826485,
0.095004603266716,
-0.15619732439517975,
0.08655428886413574,
0.07326678186655045,
0.08807161450386047,
-0.039326056838035583,
0.03241276741027832,
-0.04951298609375954,
0.06421376764774323,
0.004074749071151018,
0.06921903789043427,
-0.1730775684118271,
0.1047685369849205,
0.016184447333216667,
0.08512326329946518,
-0.053192488849163055,
0.026619482785463333,
0.04221952334046364,
0.014473527669906616,
0.13043096661567688,
-0.009134596213698387,
-0.09831143170595169,
-0.0005813040770590305,
-0.11270935833454132,
0.015962369740009308,
0.06343887746334076,
-0.06112784147262573,
0.05366604030132294,
-0.0039020907133817673,
-0.009133238345384598,
-0.03308473527431488,
-0.005407682619988918,
-0.2629029154777527,
-0.14374549686908722,
0.05036493390798569,
-0.004724748432636261,
0.04109790176153183,
-0.04028789699077606,
-0.07311879843473434,
-0.1330994963645935,
0.1025531068444252,
0.0017156380927190185,
-0.00996119063347578,
-0.08153954148292542,
0.0317297987639904,
0.09750504791736603,
-0.060490336269140244,
0.015977967530488968,
0.04253019392490387,
0.14613083004951477,
-0.06605309247970581,
-0.04050126299262047,
0.01996408775448799,
-0.10422146320343018,
-0.12595883011817932,
0.018960164859890938,
0.17880375683307648,
0.11501774191856384,
0.06401649862527847,
0.09218662977218628,
0.01999479904770851,
-0.0003393053193576634,
-0.09973057359457016,
0.021281272172927856,
0.01642037369310856,
-0.07953526824712753,
0.039579883217811584,
-0.00043983006617054343,
-0.2667149603366852,
-0.14474664628505707,
-0.06077888235449791,
0.0769946426153183,
0.18522512912750244,
-0.0261868666857481,
0.1616910994052887,
0.26752379536628723,
-0.08390162140130997,
-0.2263941913843155,
-0.03617996349930763,
0.005121992900967598,
0.028944391757249832,
0.04008908569812775,
-0.20558986067771912,
0.09861510246992111,
0.0009590319823473692,
0.01108085922896862,
-0.052324697375297546,
-0.2102629691362381,
-0.13314618170261383,
0.1703612357378006,
-0.023683495819568634,
0.045836277306079865,
-0.027311954647302628,
-0.07280103117227554,
-0.03217067942023277,
-0.062354568392038345,
0.016683468595147133,
-0.10561671853065491,
0.07302693277597427,
0.05582529306411743,
-0.0015174623113125563,
0.026041874662041664,
0.013933343812823296,
0.11176731437444687,
0.09163486957550049,
-0.01919170469045639,
-0.07714366912841797,
0.02313155308365822,
-0.0020001064985990524,
-0.005313183646649122,
0.10560339689254761,
0.04852549731731415,
0.017829101532697678,
-0.04706834629178047,
-0.08686711639165878,
-0.06667527556419373,
0.0649166852235794,
-0.071279376745224,
-0.0112611660733819,
-0.056057535111904144,
0.0895581990480423,
0.016077522188425064,
0.0017852025339379907,
-0.07166709750890732,
-0.09814766049385071,
-0.01785358041524887,
0.14435508847236633,
0.2092854380607605,
-0.04249138385057449,
0.0014742773491889238,
-0.04259930178523064,
-0.04249483346939087,
0.04531331732869148,
-0.011941506527364254,
0.043853629380464554,
0.052219804376363754,
0.028351057320833206,
0.09568177908658981,
-0.03546488657593727,
-0.12899751961231232,
0.023860417306423187,
0.03591343015432358,
-0.06960272043943405,
-0.1986767202615738,
-0.05232146754860878,
-0.01634734682738781,
-0.017887795343995094,
-0.03680921718478203,
0.19508691132068634,
-0.021867303177714348,
-0.051268674433231354,
0.0018199667101725936,
0.058195486664772034,
-0.006860882509499788,
0.1222957968711853,
0.05061498284339905,
0.03889138251543045,
-0.09379962831735611,
0.052770841866731644,
0.11641189455986023,
-0.035448603332042694,
0.044642265886068344,
0.08946821093559265,
-0.049497850239276886,
-0.05655917525291443,
-0.09774370491504669,
-0.002654061419889331,
0.06059098243713379,
-0.06117592751979828,
-0.006380589213222265,
-0.1094510406255722,
0.011138840578496456,
0.01135611068457365,
0.01246541179716587,
-0.04856318607926369,
-0.04593848064541817,
-0.00035755313001573086,
-0.09922230243682861,
0.06566498428583145,
0.10104665160179138,
-0.03243011236190796,
-0.1098259761929512,
0.10785911977291107,
0.014034672640264034,
0.07936374843120575,
-0.0384812094271183,
-0.05988960340619087,
-0.08680398762226105,
-0.009381077252328396,
-0.09071267396211624,
0.03290992230176926,
-0.13137824833393097,
-0.0127396946772933,
-0.04367846995592117,
-0.03390661999583244,
-0.008784240111708641,
0.07331911474466324,
-0.033781975507736206,
0.003203294239938259,
-0.027195805683732033,
0.08140062540769577,
-0.12352437525987625,
0.07236398011445999,
0.05752832442522049,
-0.04730243235826492,
0.10650373250246048,
0.028116803616285324,
-0.05247194692492485,
0.039537884294986725,
-0.21753951907157898,
-0.055794887244701385,
-0.030602186918258667,
0.04525216296315193,
-0.009353593923151493,
-0.1732492595911026,
0.004683046136051416,
0.016216177493333817,
0.011912856251001358,
-0.021368036046624184,
0.05347747728228569,
-0.028754156082868576,
-0.016227327287197113,
-0.07521608471870422,
-0.05584920942783356,
-0.036914944648742676,
0.06541304290294647,
0.07190057635307312,
0.012639793567359447,
0.09854321926832199,
-0.08846025168895721,
0.07776165008544922,
-0.0787743553519249,
0.027056563645601273,
-0.02698536030948162,
0.021690063178539276,
-0.07566346973180771,
-0.0723281055688858,
0.08034540712833405,
-0.01281466893851757,
0.07018125057220459,
0.024671560153365135,
-0.0281002726405859,
0.042047347873449326,
-0.05778320133686066,
-0.05921107903122902,
0.03886154294013977,
0.1312396079301834,
0.055647555738687515,
0.019024837762117386,
-0.002156019676476717,
-0.04115023463964462,
0.006834243889898062,
0.1427537351846695,
0.14458024501800537,
0.16864803433418274,
0.09135348349809647,
0.03206171840429306,
0.07597722858190536,
-0.05248117074370384,
-0.0740497037768364,
0.10110592842102051,
-0.0744282528758049,
0.03658387064933777,
-0.05076232925057411,
-0.07335086911916733,
0.07270996272563934,
-0.13702884316444397,
0.07632043957710266,
-0.02676115743815899,
-0.08461623638868332,
-0.10937371850013733,
-0.1363140195608139,
-0.06552856415510178,
-0.042145736515522,
0.005171569995582104,
-0.10900893807411194,
0.02885310724377632,
0.0028126505203545094,
0.030485566705465317,
-0.08861005306243896,
0.11656384915113449,
-0.12383020669221878,
-0.12294633686542511,
0.15511350333690643,
-0.036594975739717484,
-0.019196489825844765,
-0.0012178730685263872,
0.042610861361026764,
0.023033851757645607,
0.08876825124025345,
0.05037796497344971,
0.04637783765792847,
0.025742320343852043,
0.030766302719712257,
-0.09455370903015137,
-0.06744173914194107,
0.03165506199002266,
-0.015195665881037712,
0.09973516315221786,
0.19194084405899048,
0.08867202699184418,
-0.08153857290744781,
0.009758755564689636,
0.1403179168701172,
0.022480659186840057,
-0.11042818427085876,
-0.15116086602210999,
0.02917793020606041,
-0.029868222773075104,
0.005070398561656475,
0.004893808625638485,
-0.09434354305267334,
0.020178239792585373,
0.20697711408138275,
0.17301245033740997,
-0.04512924328446388,
0.025863638147711754,
-0.00440354784950614,
0.0075721126049757,
0.02689376100897789,
0.07248974591493607,
0.08617955446243286,
0.1752997487783432,
-0.007776147220283747,
0.042638346552848816,
-0.020317669957876205,
-0.09856808930635452,
-0.11045926064252853,
0.1021418645977974,
0.009404738433659077,
-0.033581215888261795,
-0.004703882150352001,
0.1788894236087799,
-0.10969635844230652,
-0.23324157297611237,
-0.1173505038022995,
-0.03723055124282837,
-0.11488725990056992,
0.02022060565650463,
-0.06295895576477051,
0.14147181808948517,
0.05151038244366646,
-0.007246667984873056,
0.010170985944569111,
0.17895372211933136,
0.0363558866083622,
0.02985728345811367,
-0.023814693093299866,
0.10710274428129196,
-0.08866185694932938,
0.10941238701343536,
-0.0013833885313943028,
0.053243719041347504,
0.033582914620637894,
0.03312739357352257,
-0.06661227345466614,
0.030484450981020927,
0.0336185023188591,
-0.005017581861466169,
0.041903067380189896,
0.17231862246990204,
-0.002364284126088023,
0.09244991093873978,
0.10358985513448715,
-0.07637350261211395,
0.025670984759926796,
-0.01598490960896015,
-0.0005307369865477085,
-0.061537448316812515,
0.1562240719795227,
-0.1502617597579956,
0.12597650289535522,
0.11181309074163437,
-0.06722556799650192,
-0.04604538530111313,
-0.010219813324511051,
0.054611485451459885,
-0.055450573563575745,
0.10250359773635864,
-0.004443369340151548,
-0.168858140707016,
0.030419999733567238,
-0.1285228133201599,
0.06662239879369736,
-0.2579101324081421,
-0.04430634528398514,
-0.0459076426923275,
-0.01855938509106636,
0.009927754290401936,
0.10905604809522629,
0.08658260107040405,
-0.04492446035146713,
-0.01074821688234806,
-0.0397116057574749,
0.008577153086662292,
0.09224825352430344,
-0.07983530312776566,
-0.02332002855837345
] |
null | null |
transformers
|
# Wav2Vec2-Base-VoxPopuli
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the fr unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/)
# Fine-Tuning
Please refer to [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) on how to fine-tune this model on a specific language. Note that you should replace `"facebook/wav2vec2-large-xlsr-53"` with this checkpoint for fine-tuning.
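A quick, hedged sketch of that substitution (not taken from the blog itself) is shown below; everything else in the blog's recipe stays the same.

```python
from transformers import Wav2Vec2ForCTC

# the fine-tuning blog loads "facebook/wav2vec2-large-xlsr-53";
# for this card, simply point to the VoxPopuli checkpoint instead.
# (the CTC head is newly initialized and still needs to be fine-tuned, as described in the blog)
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-fr-voxpopuli")
```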
|
{"language": "fr", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-fr-voxpopuli
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli",
"fr",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"fr"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #fr #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
|
# Wav2Vec2-Base-VoxPopuli
Facebook's Wav2Vec2 base model pretrained on the fr unlabeled subset of the VoxPopuli corpus.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the official website for more information, here
# Fine-Tuning
Please refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '"facebook/wav2vec2-large-xlsr-53"' with this checkpoint for fine-tuning.
|
[
"# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the fr unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #fr #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n",
"# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the fr unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
69,
132,
57
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #fr #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the fr unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
-0.060822803527116776,
0.03455115854740143,
-0.0043860189616680145,
-0.004201655741780996,
0.12075220048427582,
-0.037583332508802414,
0.06846323609352112,
0.0003358792164362967,
0.04009434953331947,
0.015121749602258205,
0.013089538551867008,
0.02335681952536106,
0.08871017396450043,
0.12284194678068161,
-0.017376579344272614,
-0.28495630621910095,
0.06979720294475555,
0.016329500824213028,
0.06440770626068115,
0.04606316238641739,
0.11083338409662247,
-0.0740489736199379,
0.049446165561676025,
0.05534268915653229,
-0.08776912838220596,
0.020566066727042198,
0.022719774395227432,
-0.0843379870057106,
0.12545588612556458,
0.11470604687929153,
0.08402586728334427,
0.05821237713098526,
0.032612159848213196,
-0.16160979866981506,
0.03393246978521347,
0.0639013946056366,
-0.05989309772849083,
-0.012555108405649662,
0.12028151005506516,
-0.025417843833565712,
0.21522843837738037,
-0.024009892717003822,
-0.03786700591444969,
0.08901684731245041,
-0.13122740387916565,
-0.13077877461910248,
-0.07015258818864822,
0.13334080576896667,
0.13465720415115356,
0.06953578442335129,
-0.07764027267694473,
0.03296986594796181,
-0.036110274493694305,
0.06796412914991379,
0.07206879556179047,
-0.28805044293403625,
-0.03624752536416054,
0.11230677366256714,
0.07114002108573914,
-0.03450039401650429,
-0.10271888971328735,
0.09614700078964233,
0.027866581454873085,
-0.00913492776453495,
-0.01560894027352333,
-0.09255468100309372,
0.031841255724430084,
-0.08479160815477371,
-0.11741536110639572,
0.013325641863048077,
0.18184977769851685,
0.03927544504404068,
-0.06243006885051727,
-0.06231378763914108,
-0.030698280781507492,
0.18876051902770996,
-0.061703797429800034,
-0.170181006193161,
0.009324869140982628,
0.04037664830684662,
0.08267832547426224,
-0.17424561083316803,
-0.06513945758342743,
-0.0030189959798008204,
-0.05625898018479347,
0.10245723277330399,
0.028056731447577477,
-0.018037350848317146,
-0.06263305991888046,
0.011265185661613941,
-0.1030561700463295,
-0.06912887096405029,
0.003564786398783326,
-0.10379073023796082,
-0.08637698739767075,
-0.028113877400755882,
-0.084866963326931,
-0.07489597797393799,
-0.029375698417425156,
0.08687286823987961,
0.004966335836797953,
0.06152736768126488,
-0.08988232910633087,
0.03423011302947998,
0.0020867199636995792,
0.08830041438341141,
-0.12952907383441925,
-0.0130850775167346,
0.00981715228408575,
-0.04530761390924454,
-0.0053228638134896755,
-0.03416052088141441,
-0.07744041830301285,
-0.07027684897184372,
-0.03139031305909157,
0.0594310387969017,
0.0003097035805694759,
0.025811512023210526,
-0.06206764280796051,
-0.09248699247837067,
0.04294408857822418,
-0.07285550236701965,
0.01922195591032505,
0.033665381371974945,
0.00676825363188982,
0.20150229334831238,
0.006831759121268988,
0.06485679745674133,
-0.1441655158996582,
0.017436467111110687,
-0.02029276080429554,
0.0040547349490225315,
-0.008434434421360493,
-0.045684415847063065,
0.03133123368024826,
-0.036120228469371796,
-0.007828367874026299,
-0.13141889870166779,
-0.08729439973831177,
-0.08295222371816635,
0.0031580592039972544,
-0.039061401039361954,
-0.06469544768333435,
-0.03965802490711212,
-0.0037899084854871035,
-0.01974511332809925,
-0.023299798369407654,
-0.021480679512023926,
-0.0188054870814085,
0.0026268393266946077,
-0.021587790921330452,
0.07776437699794769,
-0.05954991281032562,
0.08234226703643799,
0.010940911248326302,
-0.020927496254444122,
-0.14120382070541382,
0.1265985518693924,
-0.08206386119127274,
-0.06029760092496872,
-0.15032626688480377,
-0.03769312798976898,
-0.07384677231311798,
0.06488805264234543,
-0.008936665020883083,
0.16083094477653503,
-0.19304902851581573,
-0.1164809912443161,
0.25182822346687317,
-0.11812616884708405,
0.01734047196805477,
0.1745026856660843,
0.01829831674695015,
0.037311166524887085,
0.1637616604566574,
0.13814382255077362,
0.037271928042173386,
-0.11931072920560837,
0.04608142748475075,
-0.04058908671140671,
-0.01640540547668934,
0.05950982868671417,
0.050545793026685715,
-0.010433129966259003,
-0.005840817466378212,
-0.009772425517439842,
-0.07139867544174194,
-0.05186503380537033,
-0.0014084656722843647,
-0.05979658663272858,
0.03357294201850891,
-0.023045774549245834,
0.08568476140499115,
-0.006815224885940552,
-0.005167112220078707,
0.026639556512236595,
-0.08908116072416306,
-0.025770574808120728,
0.0680980533361435,
-0.054427552968263626,
0.07467969506978989,
-0.10621126741170883,
0.053496766835451126,
0.11558669805526733,
0.062442030757665634,
-0.13794362545013428,
0.05626774579286575,
-0.022204410284757614,
0.08422378450632095,
0.09596532583236694,
0.19043484330177307,
-0.025068018585443497,
-0.04246971756219864,
-0.08657221496105194,
0.02818787656724453,
-0.0247196014970541,
-0.04218003898859024,
-0.02128784917294979,
-0.10196277499198914,
-0.028202146291732788,
-0.04403333365917206,
0.07022694498300552,
-0.18098895251750946,
0.009921696968376637,
0.07947515696287155,
0.07887426763772964,
0.016238180920481682,
0.017556440085172653,
0.0052847531624138355,
0.09490931034088135,
0.03720706328749657,
0.007835173048079014,
0.07193303853273392,
-0.008636278100311756,
-0.05711249262094498,
0.11886144429445267,
-0.06233453378081322,
0.02267022430896759,
0.13943901658058167,
-0.10620176792144775,
-0.009172825142741203,
0.002561565488576889,
0.027893967926502228,
0.002811368787661195,
0.007763516623526812,
-0.017959782853722572,
0.21132898330688477,
0.024200163781642914,
0.08282575011253357,
-0.0818866714835167,
0.021769024431705475,
-0.0135111752897501,
-0.03357955068349838,
-0.06473260372877121,
0.06650174409151077,
0.037062548100948334,
-0.0960683822631836,
0.014273826964199543,
0.11494298279285431,
0.0019045756198465824,
0.14594462513923645,
0.019245367497205734,
-0.019999384880065918,
0.013038940727710724,
-0.05732627213001251,
-0.023041224107146263,
-0.010303851217031479,
-0.16199970245361328,
-0.015672631561756134,
0.02409047819674015,
0.011480189859867096,
0.06857139617204666,
-0.055123262107372284,
0.0003548384993337095,
0.011364882811903954,
-0.07677745074033737,
-0.03754568099975586,
0.0442255474627018,
-0.010542292147874832,
0.07178764790296555,
-0.04888658970594406,
-0.006464521400630474,
-0.015573039650917053,
-0.028932133689522743,
-0.1038145199418068,
0.11829548329114914,
-0.06003805249929428,
-0.3646112382411957,
-0.09806394577026367,
-0.10269883275032043,
-0.09142184257507324,
0.04271002858877182,
0.042105160653591156,
-0.10310791432857513,
-0.07608339190483093,
0.0022309657651931047,
0.16731631755828857,
-0.032762832939624786,
-0.07899445295333862,
0.04749654605984688,
0.0037641036324203014,
-0.007637436967343092,
-0.08890031278133392,
0.005257351323962212,
-0.03576723858714104,
-0.12794792652130127,
-0.012572427280247211,
-0.024240801110863686,
0.03818782791495323,
0.13930407166481018,
0.035397157073020935,
-0.022042160853743553,
-0.02288530021905899,
0.20560519397258759,
-0.13003475964069366,
0.0847276896238327,
0.2623372972011566,
-0.02353169396519661,
0.019307315349578857,
0.14222677052021027,
-0.013404574245214462,
-0.08160383254289627,
-0.006244124844670296,
0.07166314125061035,
-0.003882642835378647,
-0.2628834843635559,
-0.12894266843795776,
-0.06397563219070435,
-0.029635438695549965,
0.027289923280477524,
-0.009195747785270214,
0.021156471222639084,
0.04261155053973198,
-0.10551073402166367,
-0.019221272319555283,
0.05389239639043808,
0.023681640625,
0.21897396445274353,
-0.04362766072154045,
0.14115479588508606,
-0.023996321484446526,
-0.027588747441768646,
0.06925562769174576,
0.02938232570886612,
0.08329053968191147,
0.11523514240980148,
0.05679435282945633,
0.08485179394483566,
0.020144714042544365,
0.02407064102590084,
-0.0010880542686209083,
-0.001317113870754838,
-0.03052408993244171,
-0.05199248343706131,
-0.021218227222561836,
-0.042929962277412415,
0.024674199521541595,
0.10670533031225204,
-0.16945788264274597,
-0.12141018360853195,
0.01632648892700672,
0.03351271152496338,
0.13554896414279938,
0.05624097213149071,
-0.07819780707359314,
-0.03929034620523453,
0.04629254713654518,
-0.08350483328104019,
-0.04855290800333023,
0.0644327774643898,
0.08184194564819336,
-0.14773957431316376,
0.14844682812690735,
0.028861945495009422,
0.09717321395874023,
-0.015679171308875084,
0.05673372000455856,
-0.1596629023551941,
-0.016303356736898422,
0.03354362025856972,
0.06791985780000687,
-0.2476072758436203,
0.21395961940288544,
0.020797736942768097,
0.06251508742570877,
-0.07028850167989731,
0.006303461268544197,
0.04924386739730835,
0.144916370511055,
0.1327885389328003,
-0.0053066168911755085,
-0.06238296627998352,
0.015158473514020443,
-0.011251939460635185,
0.03716086223721504,
0.039126474410295486,
-0.02263922244310379,
0.046238068491220474,
-0.004151014145463705,
0.013328668661415577,
-0.009416917338967323,
0.11844613403081894,
-0.22989365458488464,
-0.1519700586795807,
0.027905501425266266,
0.009925267659127712,
0.11199451982975006,
-0.005221465136855841,
-0.06586874276399612,
-0.11218228936195374,
0.10255827009677887,
-0.0009938711300492287,
-0.0161537304520607,
-0.10721554607152939,
0.036953240633010864,
0.02567722089588642,
-0.09922738373279572,
0.036479927599430084,
0.054763589054346085,
0.13033229112625122,
-0.10224355757236481,
-0.06287258863449097,
0.047482170164585114,
-0.09312115609645844,
-0.060817424207925797,
0.04750869423151016,
0.17894594371318817,
0.10358715802431107,
0.03237732872366905,
0.11454524099826813,
-0.03943881019949913,
0.0503879114985466,
-0.11299055069684982,
0.07049787789583206,
0.007257184479385614,
-0.02150023728609085,
0.02142232470214367,
-0.059001874178647995,
-0.2503075897693634,
-0.10657279193401337,
-0.015280084684491158,
0.1700412929058075,
0.18782538175582886,
0.021505314856767654,
0.15058384835720062,
0.24078701436519623,
-0.09459412097930908,
-0.2550053596496582,
-0.04218075051903725,
-0.020993079990148544,
0.03802693262696266,
0.018878061324357986,
-0.26896968483924866,
0.061033375561237335,
0.05793324112892151,
0.006810430902987719,
-0.07533232122659683,
-0.2011261135339737,
-0.12795792520046234,
0.19724038243293762,
0.02134145237505436,
0.15261025726795197,
-0.09259355813264847,
-0.048704102635383606,
-0.08777270466089249,
-0.08162566274404526,
0.0823243036866188,
-0.1365814059972763,
0.08477243781089783,
0.055858515202999115,
-0.011544077657163143,
0.0066971504129469395,
0.055866777896881104,
0.12065223604440689,
0.06646338105201721,
0.014719542115926743,
-0.02718452177941799,
0.032545916736125946,
0.031883370131254196,
0.031676024198532104,
0.02953159809112549,
0.026918884366750717,
-0.03200589120388031,
-0.053228043019771576,
-0.11085685342550278,
-0.10307209193706512,
0.09893035888671875,
-0.06025414913892746,
-0.004143873695284128,
-0.020668959245085716,
0.10363995283842087,
0.01894741877913475,
0.01467136386781931,
-0.05962013453245163,
-0.14649803936481476,
0.03527165204286575,
0.14041084051132202,
0.23099219799041748,
-0.12729309499263763,
0.010926778428256512,
-0.03713986277580261,
-0.04416022077202797,
0.08866170793771744,
-0.00647704629227519,
0.06993505358695984,
0.03817153349518776,
-0.010001664981245995,
0.09636721760034561,
0.017685018479824066,
-0.07176469266414642,
0.00394812598824501,
0.04428818076848984,
-0.06517747789621353,
-0.25114569067955017,
-0.06841685622930527,
-0.014105918817222118,
0.020797472447156906,
0.013626882806420326,
0.18970634043216705,
-0.013532011769711971,
-0.057582736015319824,
-0.018370434641838074,
0.042583905160427094,
-0.04354197904467583,
0.05125122144818306,
0.03383132815361023,
0.03171364217996597,
-0.12001670897006989,
0.06210305914282799,
0.08718159794807434,
-0.14516261219978333,
0.06522651016712189,
0.03867963328957558,
-0.04927445203065872,
-0.0958995521068573,
-0.13367758691310883,
0.03635955974459648,
-0.015950798988342285,
-0.08358917385339737,
0.030490251258015633,
-0.16570143401622772,
0.03812814876437187,
0.12704414129257202,
0.03139014169573784,
-0.01722828857600689,
-0.057421281933784485,
-0.04849402233958244,
-0.021325239911675453,
0.0005732676363550127,
0.12008743733167648,
-0.06286977976560593,
-0.13608476519584656,
0.14716583490371704,
0.010185924358665943,
0.07416011393070221,
-0.047546446323394775,
-0.03841015323996544,
-0.1400485336780548,
0.007772604003548622,
-0.16302622854709625,
0.014837308786809444,
-0.11957431584596634,
0.0009937516879290342,
-0.043930534273386,
-0.019516905769705772,
-0.04396679997444153,
0.026612965390086174,
-0.10323451459407806,
0.017473984509706497,
-0.004510796628892422,
0.08013641089200974,
-0.10789160430431366,
0.07777968794107437,
0.0824698656797409,
-0.02326241508126259,
0.06824426352977753,
0.020056577399373055,
-0.031932283192873,
0.10950344055891037,
-0.16994860768318176,
-0.052244096994400024,
0.041315700858831406,
0.033363815397024155,
-0.004027821589261293,
-0.14629347622394562,
0.013524767011404037,
0.027957122772932053,
0.03813367709517479,
-0.004961218684911728,
0.08274631202220917,
-0.05529019609093666,
-0.009001457132399082,
-0.04704951494932175,
-0.09354065358638763,
-0.007564351428300142,
0.07676535099744797,
0.10576613992452621,
0.005102421157062054,
0.11267130076885223,
-0.08117181062698364,
0.04822015389800072,
-0.10080249607563019,
0.06681868433952332,
-0.040462203323841095,
-0.027530670166015625,
0.05568850412964821,
-0.14149174094200134,
0.053824618458747864,
0.003324478166177869,
0.09587236493825912,
-0.008726291358470917,
0.014018941670656204,
0.022419948130846024,
-0.10192205011844635,
-0.11566134542226791,
0.03876065835356712,
0.1287747323513031,
0.08374343812465668,
-0.007894864305853844,
0.021999232470989227,
0.001600315677933395,
0.034866396337747574,
0.2034960836172104,
0.21053382754325867,
0.19870828092098236,
0.02696615271270275,
0.0944310799241066,
0.022228490561246872,
-0.0579143762588501,
-0.024129191413521767,
0.014896060340106487,
-0.09040139615535736,
0.026939671486616135,
-0.056199632585048676,
-0.05261639505624771,
0.08586063235998154,
-0.14795872569084167,
0.12360862642526627,
0.023622309789061546,
-0.08469469845294952,
-0.1591879427433014,
-0.1647050678730011,
-0.0661400854587555,
-0.09479612857103348,
-0.013850843533873558,
-0.12955570220947266,
-0.015492519363760948,
0.012338834814727306,
0.02324952743947506,
-0.11707638204097748,
0.10213636606931686,
-0.14681971073150635,
-0.1656615287065506,
0.1813158392906189,
-0.036292675882577896,
0.004260215442627668,
-0.009610634297132492,
0.003823914099484682,
0.0035594638902693987,
0.08532020449638367,
0.014825914055109024,
0.03797071427106857,
-0.011652580462396145,
0.05424879118800163,
-0.07525791972875595,
-0.050641704350709915,
-0.0007043264922685921,
0.026478881016373634,
0.11820878088474274,
0.1972038894891739,
0.0366685688495636,
-0.05644363909959793,
0.007028291001915932,
0.1446678340435028,
0.024105388671159744,
-0.08801250159740448,
-0.14765535295009613,
0.06086105853319168,
0.03848433494567871,
0.030820945277810097,
-0.03999758139252663,
-0.06486140191555023,
0.008453292772173882,
0.28915777802467346,
0.15990294516086578,
-0.061208512634038925,
0.03022274561226368,
0.0074258167296648026,
0.03877118602395058,
0.08431234955787659,
0.09970377385616302,
0.08450964838266373,
0.15093812346458435,
-0.03478638827800751,
-0.01569652371108532,
0.013111989945173264,
-0.06840457022190094,
-0.0757419764995575,
0.15837529301643372,
0.040913671255111694,
-0.08211469650268555,
0.017672432586550713,
0.14900675415992737,
-0.16059072315692902,
-0.09541599452495575,
-0.07930586487054825,
-0.05338526889681816,
-0.10255884379148483,
-0.026738742366433144,
-0.060006964951753616,
0.11046940833330154,
0.1027732715010643,
-0.009683924727141857,
-0.030493592843413353,
0.1916058361530304,
0.05110837146639824,
-0.01458489429205656,
-0.04954151436686516,
0.12235836684703827,
-0.0342380665242672,
0.06237766146659851,
-0.01747354120016098,
0.03954679146409035,
0.05012910068035126,
0.04534243419766426,
-0.0037696710787713528,
0.05028962716460228,
-0.006811344530433416,
0.030627360567450523,
0.06923138350248337,
0.12962856888771057,
0.0059700957499444485,
0.04581201449036598,
0.09320440143346786,
-0.14466392993927002,
0.03465515002608299,
0.047495387494564056,
-0.04127119854092598,
-0.0014864510158076882,
0.1722278892993927,
-0.19878007471561432,
0.05725160241127014,
0.15942500531673431,
-0.027749763801693916,
-0.016457146033644676,
-0.04297599196434021,
0.06723600625991821,
-0.016029635444283485,
0.05387058109045029,
-0.03375452756881714,
-0.13889607787132263,
0.005628934595733881,
-0.06437494605779648,
0.030040452256798744,
-0.18572913110256195,
0.0051294355653226376,
-0.040046896785497665,
0.0006019325228407979,
-0.05002148449420929,
0.09616809338331223,
0.015642507001757622,
-0.05141480639576912,
0.018804332241415977,
-0.10063810646533966,
0.033195242285728455,
0.09500090777873993,
-0.0876065045595169,
-0.03905635327100754
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only on **hr** (Croatian) using **8.1k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **hr**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
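Since this checkpoint is pretraining-only, it can still be used as a frozen speech encoder. The snippet below is a minimal sketch (not part of the original card) of loading it with `transformers` and extracting hidden states from 16 kHz audio; the zero-filled waveform is a placeholder for real speech, and the snippet assumes the repo ships a feature-extractor config (a default `Wav2Vec2FeatureExtractor()` at 16 kHz would work otherwise).

```python
import numpy as np
import torch
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

model_id = "facebook/wav2vec2-base-hr-voxpopuli-v2"

# The checkpoint has no tokenizer or CTC head, so load it as a bare encoder
# (pretraining-only quantizer weights are simply dropped with a warning).
feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained(model_id)
model = Wav2Vec2Model.from_pretrained(model_id)

# Placeholder: one second of silence already sampled at 16 kHz.
speech = np.zeros(16_000, dtype=np.float32)
inputs = feature_extractor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    hidden_states = model(**inputs).last_hidden_state  # shape: (batch, frames, 768)
```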
|
{"language": "hr", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-hr-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"hr",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"hr"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #hr #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only on hr (Croatian) using 8.1k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in hr. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in hr on 8.1k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in hr. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #hr #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in hr on 8.1k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in hr. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
250
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #hr #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in hr on 8.1k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in hr. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.08290942758321762,
0.09867038577795029,
-0.002494994318112731,
0.00034156240872107446,
0.07564247399568558,
-0.03760639578104019,
0.14232634007930756,
0.049520935863256454,
0.009832317940890789,
0.08721789717674255,
-0.015020920895040035,
-0.05499691888689995,
0.07753043621778488,
0.11673393845558167,
0.05052490532398224,
-0.2590514123439789,
0.0387774221599102,
-0.0585654191672802,
0.03894171491265297,
0.044446807354688644,
0.11822350323200226,
-0.08411659300327301,
0.0329018272459507,
0.05798288434743881,
-0.035747725516557693,
0.024238908663392067,
-0.04962189123034477,
-0.07894682139158249,
0.061749715358018875,
0.05473519489169121,
-0.034008417278528214,
0.035500288009643555,
0.10705291479825974,
-0.19638262689113617,
0.03665376454591751,
0.03311873972415924,
0.032823145389556885,
0.017728986218571663,
0.08611399680376053,
0.026865901425480843,
0.1631532609462738,
-0.027176134288311005,
-0.004779891110956669,
0.08170044422149658,
-0.04765601083636284,
-0.09503021091222763,
-0.06341055035591125,
0.1509743183851242,
0.08807545155286789,
0.1101628839969635,
-0.07610318064689636,
0.0831967294216156,
-0.025318259373307228,
0.03948640078306198,
0.07892785221338272,
-0.18540379405021667,
-0.050357360392808914,
0.05348808318376541,
0.09803109616041183,
0.0225080419331789,
-0.0882224589586258,
0.07675180584192276,
0.052615854889154434,
-0.008304785937070847,
-0.0701606273651123,
-0.03718354180455208,
0.1309800148010254,
-0.10296294838190079,
-0.11828004568815231,
0.005839286372065544,
0.1743987798690796,
0.06444335728883743,
-0.07282727211713791,
-0.1445736140012741,
0.017050258815288544,
0.2057165652513504,
-0.05626906827092171,
-0.08847200870513916,
0.005858041811734438,
0.008662394247949123,
0.03941936045885086,
-0.06815870851278305,
-0.07127533853054047,
-0.004376144614070654,
0.02294856868684292,
0.10988035053014755,
0.024552233517169952,
-0.014196936972439289,
-0.06906616687774658,
0.0008141851867549121,
-0.08738990873098373,
-0.11040293425321579,
-0.009379514493048191,
-0.06405660510063171,
-0.07225823402404785,
-0.033263660967350006,
0.000048056703235488385,
-0.08774281293153763,
0.032533444464206696,
0.08394736796617508,
0.05206194519996643,
0.05560025945305824,
-0.05146162211894989,
-0.03387140855193138,
0.11921638250350952,
0.08410105109214783,
-0.12263277173042297,
-0.014927851967513561,
0.014872299507260323,
-0.013429572805762291,
-0.0008481619297526777,
-0.035197045654058456,
-0.043307214975357056,
0.005715412087738514,
-0.014300627633929253,
0.047001928091049194,
0.051219671964645386,
-0.030529038980603218,
-0.029604777693748474,
-0.0931323692202568,
0.09236263483762741,
-0.0783749595284462,
0.01982077769935131,
0.05284516513347626,
-0.008437369018793106,
0.08867987990379333,
-0.05438758805394173,
0.0788959413766861,
-0.10818658024072647,
-0.01088028121739626,
-0.024986134842038155,
-0.006515733897686005,
0.023088479414582253,
-0.024272922426462173,
0.02519397996366024,
0.007716658525168896,
0.007872727699577808,
-0.11644833534955978,
0.009131738916039467,
-0.0991225466132164,
-0.017711149528622627,
-0.08573099225759506,
-0.040567636489868164,
-0.0471159927546978,
0.017356712371110916,
-0.0017207280034199357,
-0.0062897163443267345,
0.017641540616750717,
-0.014647437259554863,
-0.00988455954939127,
0.004693840630352497,
0.04076514393091202,
0.05138782039284706,
0.08334148675203323,
-0.011404733173549175,
-0.019441088661551476,
-0.10503198206424713,
0.11702896654605865,
-0.08176042139530182,
-0.02576095424592495,
-0.14009760320186615,
-0.045530322939157486,
-0.030916916206479073,
0.036869604140520096,
0.009484678506851196,
0.12548202276229858,
-0.17077329754829407,
-0.07088559865951538,
0.10585620999336243,
-0.1161281019449234,
0.008024833165109158,
0.18084163963794708,
-0.004924540873616934,
0.07873984426259995,
0.09449820220470428,
0.21503247320652008,
0.02232237346470356,
-0.17852260172367096,
-0.011204552836716175,
-0.03311963751912117,
0.04433024674654007,
0.12727347016334534,
0.06040801480412483,
-0.06311322003602982,
0.05746395140886307,
-0.016520868986845016,
-0.03320002555847168,
-0.07893089205026627,
-0.010468644089996815,
-0.04613099247217178,
0.022555436939001083,
-0.050170887261629105,
0.02063012309372425,
-0.006107510067522526,
-0.012963561341166496,
-0.0011681505711749196,
-0.09542366117238998,
-0.06793437898159027,
0.12271932512521744,
-0.0702529177069664,
0.023033343255519867,
-0.10052145272493362,
0.05131952464580536,
0.0627916231751442,
0.002479672199115157,
-0.12031101435422897,
0.11783178895711899,
0.029963750392198563,
-0.06351233273744583,
0.14587461948394775,
0.09874401241540909,
-0.03285373002290726,
0.010968777351081371,
-0.0202294010668993,
0.019597113132476807,
-0.029173118993639946,
0.021234208717942238,
-0.028703905642032623,
-0.11103162914514542,
-0.009009080938994884,
-0.06270240992307663,
0.12345969676971436,
-0.13678665459156036,
-0.013618767261505127,
0.039651889353990555,
0.10701078921556473,
-0.015446378849446774,
-0.029262958094477654,
0.08184266835451126,
0.037134937942028046,
0.03551735356450081,
-0.020590800791978836,
0.017529483884572983,
-0.015461390838027,
-0.0065019926987588406,
0.04895065724849701,
-0.14495179057121277,
-0.16449496150016785,
0.09637478739023209,
0.023061633110046387,
-0.013166503980755806,
0.05889855697751045,
0.018808189779520035,
-0.014629822224378586,
-0.04416665807366371,
0.007605532184243202,
0.2461535632610321,
-0.010871551930904388,
0.06404349207878113,
-0.07607972621917725,
-0.006216404028236866,
0.01726202666759491,
-0.05151434615254402,
-0.0849921703338623,
0.0799657478928566,
0.007139906752854586,
-0.0862487182021141,
-0.049422480165958405,
0.040029678493738174,
0.06775617599487305,
0.1568581461906433,
0.002397139323875308,
-0.0853295624256134,
-0.03447795659303665,
-0.0708029493689537,
-0.020574122667312622,
0.0561039038002491,
-0.13806307315826416,
-0.030262203887104988,
0.026326388120651245,
0.010472838766872883,
0.05429121479392052,
-0.02975340746343136,
0.05008477345108986,
0.0028517635073512793,
-0.04883546009659767,
-0.07076861709356308,
0.032818831503391266,
-0.03182188794016838,
0.03968259319663048,
-0.005561708007007837,
-0.006926335394382477,
-0.04474097117781639,
-0.05864974856376648,
-0.14520688354969025,
0.09120088815689087,
-0.06486821174621582,
-0.29952600598335266,
-0.08452346175909042,
-0.04413248971104622,
-0.03163846954703331,
0.006360277067869902,
0.047879718244075775,
-0.11551742255687714,
-0.10704052448272705,
-0.06254804134368896,
0.13693012297153473,
-0.03304339200258255,
-0.059657249599695206,
0.11805544048547745,
-0.003587258281186223,
0.026216398924589157,
-0.09877879917621613,
0.013426998630166054,
-0.04615923762321472,
-0.032615888863801956,
-0.025445984676480293,
0.013500805012881756,
0.05355461686849594,
0.12245075404644012,
0.021612737327814102,
0.003368835197761655,
0.007431448437273502,
0.21594181656837463,
-0.13608182966709137,
0.08252306282520294,
0.22389817237854004,
-0.05739346519112587,
-0.01294354721903801,
0.14269006252288818,
-0.005127406679093838,
-0.05023390054702759,
0.038172267377376556,
-0.004050572402775288,
-0.01784171164035797,
-0.22299177944660187,
-0.12158320099115372,
-0.0391017347574234,
-0.028388192877173424,
0.034673359245061874,
0.011398113332688808,
-0.011104680597782135,
0.010972870513796806,
-0.0790184959769249,
-0.04086214303970337,
0.0664898231625557,
0.024462390691041946,
0.14367537200450897,
0.009284775704145432,
0.05072007328271866,
-0.04222772642970085,
-0.02505267970263958,
0.09571260213851929,
-0.035591963678598404,
0.055310722440481186,
0.06704771518707275,
0.09243287891149521,
0.060733672231435776,
0.038137394934892654,
0.060924071818590164,
-0.015882115811109543,
-0.015328419394791126,
-0.008813157677650452,
-0.02984985150396824,
-0.05745350196957588,
0.014803250320255756,
0.050492215901613235,
0.1510702520608902,
-0.13319122791290283,
-0.11780843138694763,
0.04885333031415939,
0.008581794798374176,
0.13927307724952698,
0.10289221256971359,
-0.031228890642523766,
-0.09679891169071198,
0.031724054366350174,
-0.08734448999166489,
-0.03721228986978531,
0.057230886071920395,
0.07454896718263626,
-0.15819287300109863,
0.08752274513244629,
0.07368621230125427,
0.08219844102859497,
-0.05838092789053917,
0.03806118667125702,
-0.05488736927509308,
0.054505717009305954,
0.0001493089075665921,
0.07653626799583435,
-0.18030820786952972,
0.11517897993326187,
0.01553278136998415,
0.08517254143953323,
-0.05537332594394684,
0.02242075465619564,
0.04552529752254486,
0.013969887979328632,
0.1290925294160843,
-0.009073524735867977,
-0.08763173222541809,
-0.006072141230106354,
-0.11107873916625977,
0.027544138953089714,
0.061182666569948196,
-0.04900030046701431,
0.0586823895573616,
-0.008388797752559185,
-0.006875431630760431,
-0.02969847060739994,
0.0014932843623682857,
-0.25963282585144043,
-0.1384805589914322,
0.05213391035795212,
-0.014122983440756798,
0.05404120683670044,
-0.042093317955732346,
-0.07595206052064896,
-0.129394069314003,
0.09783832728862762,
-0.0019446236547082663,
-0.011712878942489624,
-0.07459237426519394,
0.026709817349910736,
0.10260716825723648,
-0.0622410923242569,
0.009821389801800251,
0.049774836748838425,
0.1454492062330246,
-0.06887225061655045,
-0.04175901040434837,
0.02365906909108162,
-0.10284973680973053,
-0.12970216572284698,
0.01645994558930397,
0.16400471329689026,
0.12087298929691315,
0.0719483345746994,
0.09383007884025574,
0.02023189142346382,
-0.007615027483552694,
-0.09509507566690445,
0.027498774230480194,
0.03347472846508026,
-0.07618989050388336,
0.04330608248710632,
-0.007042123470455408,
-0.2706342935562134,
-0.14330987632274628,
-0.06120331957936287,
0.07846155762672424,
0.1912604719400406,
-0.03002793900668621,
0.15703852474689484,
0.27413153648376465,
-0.08301037549972534,
-0.22772888839244843,
-0.03485913574695587,
0.0009888163767755032,
0.031176798045635223,
0.035458240658044815,
-0.1961640864610672,
0.09877824038267136,
-0.012500676326453686,
0.006993862800300121,
-0.07145733386278152,
-0.19488173723220825,
-0.13119027018547058,
0.18934062123298645,
-0.026873348280787468,
0.04944981634616852,
-0.022427862510085106,
-0.06354200094938278,
-0.04140612110495567,
-0.0532277412712574,
0.0029693657997995615,
-0.1012134701013565,
0.075147345662117,
0.05106377229094505,
0.017709236592054367,
0.02266695350408554,
0.011380227282643318,
0.11104410886764526,
0.09244851768016815,
-0.023763306438922882,
-0.07156093418598175,
0.01623082347214222,
0.0004731716471724212,
-0.016835616901516914,
0.10626044869422913,
0.05029672384262085,
0.01782918907701969,
-0.04550103470683098,
-0.08125197887420654,
-0.06644103676080704,
0.06078829988837242,
-0.0716642439365387,
-0.01569535955786705,
-0.051268406212329865,
0.09071934968233109,
0.01222387608140707,
-0.0023163314908742905,
-0.07062455266714096,
-0.0964098647236824,
-0.027010062709450722,
0.12790825963020325,
0.21894843876361847,
-0.0490570031106472,
0.008759642951190472,
-0.0440126471221447,
-0.045462466776371,
0.04336142539978027,
0.002739455783739686,
0.04152446985244751,
0.055389124900102615,
0.023695262148976326,
0.10026121884584427,
-0.03297144174575806,
-0.12873023748397827,
0.025565044954419136,
0.028696220368146896,
-0.06059645861387253,
-0.18568769097328186,
-0.05174698308110237,
-0.00687794666737318,
-0.024292588233947754,
-0.028150133788585663,
0.19825243949890137,
-0.01696220599114895,
-0.05093613266944885,
0.00023412083100993186,
0.0631362572312355,
-0.006356845609843731,
0.11796943098306656,
0.05090900883078575,
0.041653960943222046,
-0.09235716611146927,
0.057218119502067566,
0.11101775616407394,
-0.033711932599544525,
0.05245799198746681,
0.07981701195240021,
-0.053576886653900146,
-0.05867023766040802,
-0.09465759992599487,
0.010867000557482243,
0.047358471900224686,
-0.05795985087752342,
-0.01729612797498703,
-0.1172654777765274,
0.007447039242833853,
0.0007725178729742765,
0.009872876107692719,
-0.050771281123161316,
-0.05334026739001274,
-0.00014939848915673792,
-0.0956343412399292,
0.06819088757038116,
0.09317336976528168,
-0.03140381723642349,
-0.10387694835662842,
0.10276854038238525,
0.016047954559326172,
0.08262328058481216,
-0.03923199698328972,
-0.06160527840256691,
-0.08704208582639694,
-0.0022918505128473043,
-0.1286892294883728,
0.03462088853120804,
-0.12631188333034515,
-0.011772114783525467,
-0.047728292644023895,
-0.04031791538000107,
-0.008417555131018162,
0.06879722326993942,
-0.036318954080343246,
0.007174935191869736,
-0.028995579108595848,
0.08649657666683197,
-0.11757729947566986,
0.07286221534013748,
0.05155852437019348,
-0.05238430202007294,
0.11554411798715591,
0.02118906006217003,
-0.05382543057203293,
0.04395187646150589,
-0.1841365098953247,
-0.06324891000986099,
-0.039348307996988297,
0.041002340614795685,
-0.0027809569146484137,
-0.17981694638729095,
-0.005504535511136055,
0.015357324853539467,
0.015817951411008835,
-0.01963132806122303,
0.050766926258802414,
-0.019256561994552612,
-0.004013055469840765,
-0.07578805834054947,
-0.05103765428066254,
-0.03778548166155815,
0.070374995470047,
0.06788812577724457,
0.009089243598282337,
0.09232743084430695,
-0.08766186982393265,
0.07784238457679749,
-0.08173944056034088,
0.02216176688671112,
-0.02651219628751278,
0.024260438978672028,
-0.07253136485815048,
-0.08293286710977554,
0.08716455101966858,
-0.020669566467404366,
0.07813107222318649,
0.026947861537337303,
-0.023463992401957512,
0.03963937237858772,
-0.051016662269830704,
-0.07157305628061295,
0.04048436880111694,
0.13340213894844055,
0.05001947656273842,
0.027940351516008377,
-0.004363833460956812,
-0.04012267664074898,
0.00016900055925361812,
0.1477925330400467,
0.15133844316005707,
0.16497668623924255,
0.1032332181930542,
0.0371243990957737,
0.07772604376077652,
-0.058530230075120926,
-0.08092502504587173,
0.09909305721521378,
-0.06633315980434418,
0.04395278915762901,
-0.04649302735924721,
-0.07035801559686661,
0.07337157428264618,
-0.14141425490379333,
0.0686289444565773,
-0.03327971324324608,
-0.08873297274112701,
-0.1135975643992424,
-0.12844233214855194,
-0.06484893709421158,
-0.05359586328268051,
0.008468301966786385,
-0.10699659585952759,
0.02158607542514801,
0.014284701086580753,
0.0316573865711689,
-0.08465372771024704,
0.11269263923168182,
-0.11167239397764206,
-0.1229822039604187,
0.1525939404964447,
-0.03071480430662632,
-0.018515976145863533,
-0.006605294067412615,
0.0471942313015461,
0.03140932694077492,
0.09197510778903961,
0.049736760556697845,
0.043563056737184525,
0.010949235409498215,
0.028723204508423805,
-0.08540930598974228,
-0.06800760328769684,
0.030251333490014076,
-0.015351079404354095,
0.100582055747509,
0.1860363483428955,
0.08314300328493118,
-0.08401169627904892,
0.010782723315060139,
0.14187684655189514,
0.013657830655574799,
-0.12733706831932068,
-0.14370116591453552,
0.05862673744559288,
-0.033719003200531006,
0.004186239559203386,
0.008114033378660679,
-0.09253016859292984,
0.02250257506966591,
0.20301581919193268,
0.18417537212371826,
-0.047544922679662704,
0.02002042718231678,
-0.003673651721328497,
0.007067086640745401,
0.01892954669892788,
0.07644543051719666,
0.0888398289680481,
0.16970646381378174,
-0.014512262307107449,
0.04589039832353592,
-0.018521439284086227,
-0.10319101065397263,
-0.09952135384082794,
0.10665589570999146,
0.011487709358334541,
-0.03037661872804165,
-0.008155792020261288,
0.17408466339111328,
-0.10027241706848145,
-0.24690265953540802,
-0.12668979167938232,
-0.045685362070798874,
-0.11637763679027557,
0.023204736411571503,
-0.0500553734600544,
0.13437330722808838,
0.05375365912914276,
-0.011162124574184418,
0.008684871718287468,
0.18903560936450958,
0.030854789540171623,
0.02937900274991989,
-0.02378709241747856,
0.11350954324007034,
-0.10307110846042633,
0.10287779569625854,
-0.004505775403231382,
0.05528533086180687,
0.03593846410512924,
0.03589118644595146,
-0.06273386627435684,
0.03550869598984718,
0.03289227932691574,
-0.003511901246383786,
0.04877643287181854,
0.1697370558977127,
-0.008727981708943844,
0.09971321374177933,
0.11151748895645142,
-0.07615393400192261,
0.028311079367995262,
-0.022362645715475082,
-0.017651479691267014,
-0.0598771795630455,
0.14683806896209717,
-0.1399666666984558,
0.13282397389411926,
0.10503291338682175,
-0.06838903576135635,
-0.039512600749731064,
-0.007428880315274,
0.05120387300848961,
-0.05000004917383194,
0.09760095924139023,
-0.005113746505230665,
-0.16860370337963104,
0.025243641808629036,
-0.1301700621843338,
0.07127656787633896,
-0.2613621652126312,
-0.04317854344844818,
-0.04587608575820923,
-0.02176591008901596,
0.01408025249838829,
0.11111979186534882,
0.08631866425275803,
-0.039313141256570816,
-0.014758523553609848,
-0.0233285054564476,
0.005585134495049715,
0.09088881313800812,
-0.09408216923475266,
-0.02769876830279827
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only on **hu** (Hungarian) using **17.7k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **hu**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
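As the note above says, speech recognition with this checkpoint first requires a tokenizer and fine-tuning on labeled **hu** text. The sketch below is a hypothetical outline of that first step, loosely following the linked fine-tuning blog; the tiny `vocab` dict is a placeholder that would normally be derived from the training transcripts.

```python
import json
from transformers import Wav2Vec2CTCTokenizer, Wav2Vec2ForCTC

# Placeholder character vocabulary; in practice it is built from the labeled
# Hungarian training transcripts (see the fine-tuning blog linked above).
vocab = {"[PAD]": 0, "[UNK]": 1, "|": 2, "a": 3, "b": 4, "c": 5}
with open("vocab.json", "w") as f:
    json.dump(vocab, f)

tokenizer = Wav2Vec2CTCTokenizer(
    "vocab.json", unk_token="[UNK]", pad_token="[PAD]", word_delimiter_token="|"
)

# Put a freshly initialized CTC head on top of the pretrained encoder; the head
# (and usually the encoder) is then trained on labeled hu speech/text pairs.
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-base-hu-voxpopuli-v2",
    vocab_size=len(tokenizer),
    pad_token_id=tokenizer.pad_token_id,
    ctc_loss_reduction="mean",
)
```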
|
{"language": "hu", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-hu-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"hu",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"hu"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #hu #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only on hu (Hungarian) using 17.7k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in hu. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in hu on 17.7k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in hu. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #hu #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in hu on 17.7k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in hu. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
251
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #hu #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in hu on 17.7k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in hu. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07669567316770554,
0.11371425539255142,
-0.002596998820081353,
0.003941941075026989,
0.07713092863559723,
-0.04688310623168945,
0.13738590478897095,
0.04211502894759178,
0.00414044177159667,
0.09719830006361008,
-0.019267985597252846,
-0.061873819679021835,
0.08007239550352097,
0.12755651772022247,
0.05945393070578575,
-0.26116275787353516,
0.0396992526948452,
-0.06643171608448029,
0.058629732578992844,
0.04495752975344658,
0.11580284684896469,
-0.08496778458356857,
0.030995622277259827,
0.055264491587877274,
-0.03083600103855133,
0.02749077044427395,
-0.051218125969171524,
-0.07967562973499298,
0.05103430151939392,
0.048425428569316864,
-0.02589031495153904,
0.025576896965503693,
0.09986226260662079,
-0.19470323622226715,
0.036649882793426514,
0.037460844963788986,
0.03169342875480652,
0.010967337526381016,
0.09630385041236877,
0.029029393568634987,
0.16481560468673706,
-0.016933102160692215,
-0.008872359059751034,
0.08391690999269485,
-0.05413489416241646,
-0.09196438640356064,
-0.0672941505908966,
0.16280116140842438,
0.09671741724014282,
0.1008034199476242,
-0.07873167842626572,
0.0759003609418869,
-0.02326473779976368,
0.041322849690914154,
0.0810529887676239,
-0.18666131794452667,
-0.05348183959722519,
0.05236034840345383,
0.10737711191177368,
0.023093873634934425,
-0.09142446517944336,
0.07608962059020996,
0.05691144987940788,
-0.010063428431749344,
-0.07095444947481155,
-0.03474077954888344,
0.13232776522636414,
-0.10312294214963913,
-0.11478186398744583,
0.00665312958881259,
0.172687366604805,
0.06057959794998169,
-0.0718354880809784,
-0.14906084537506104,
0.01272520050406456,
0.20155483484268188,
-0.06103690713644028,
-0.09016687422990799,
0.00855769868940115,
0.01731214113533497,
0.05980440229177475,
-0.05936175212264061,
-0.06582354009151459,
-0.0012299376539885998,
0.028722653165459633,
0.10510198026895523,
0.023028545081615448,
-0.01882653497159481,
-0.06972027570009232,
-0.005948858335614204,
-0.07520956546068192,
-0.12070310860872269,
-0.00661437725648284,
-0.06426548957824707,
-0.06987190246582031,
-0.036446861922740936,
0.006443736609071493,
-0.0855078473687172,
0.030027970671653748,
0.09515207260847092,
0.06271591037511826,
0.05914396420121193,
-0.05524132400751114,
-0.030508676543831825,
0.12321562319993973,
0.07443377375602722,
-0.12551499903202057,
-0.02705082669854164,
0.017529312521219254,
-0.015876635909080505,
0.008572193793952465,
-0.037390898913145065,
-0.03701656311750412,
0.015899360179901123,
-0.018309112638235092,
0.04499899223446846,
0.0539000928401947,
-0.036757953464984894,
-0.03470428287982941,
-0.09378620237112045,
0.09508579969406128,
-0.07919500768184662,
0.029788585379719734,
0.0519520528614521,
-0.005362947937101126,
0.09480668604373932,
-0.06244857609272003,
0.08092300593852997,
-0.11137693375349045,
-0.0015084167243912816,
-0.029363244771957397,
-0.0021852704230695963,
0.02203652262687683,
-0.028790686279535294,
0.03576609119772911,
0.007639222778379917,
0.003345042234286666,
-0.11703725904226303,
0.008704486303031445,
-0.10322719067335129,
-0.023376958444714546,
-0.08244651556015015,
-0.042470213025808334,
-0.04586441069841385,
0.016804680228233337,
-0.009472056291997433,
-0.0038008710835129023,
-0.0000306444417219609,
-0.017917854711413383,
-0.007311863359063864,
0.005680184345692396,
0.047433942556381226,
0.058676354587078094,
0.0799684002995491,
-0.017061298713088036,
-0.01237193401902914,
-0.1137976422905922,
0.1151534840464592,
-0.08312291651964188,
-0.006456555332988501,
-0.13214315474033356,
-0.0424882136285305,
-0.03374730795621872,
0.029932595789432526,
0.016964204609394073,
0.12379729747772217,
-0.17349450290203094,
-0.07135242223739624,
0.1058838963508606,
-0.12473301589488983,
0.010217261500656605,
0.18630114197731018,
0.0031139568891376257,
0.07320865988731384,
0.09948944300413132,
0.22497409582138062,
0.03222721442580223,
-0.17428730428218842,
-0.016550052911043167,
-0.0510181188583374,
0.043763477355241776,
0.13177771866321564,
0.05735275521874428,
-0.05704893171787262,
0.052925024181604385,
-0.015749534592032433,
-0.017600372433662415,
-0.07513310015201569,
-0.006133105140179396,
-0.05046151578426361,
0.02173622138798237,
-0.04725830256938934,
0.018362540751695633,
-0.003750585950911045,
-0.019978871569037437,
-0.012387018650770187,
-0.0857030525803566,
-0.06977477669715881,
0.12076865136623383,
-0.0644802674651146,
0.026974117383360863,
-0.10025925934314728,
0.06620178371667862,
0.07418791949748993,
0.004733850248157978,
-0.11583560705184937,
0.12019906938076019,
0.03197497874498367,
-0.05266397073864937,
0.14205606281757355,
0.07646943628787994,
-0.03619016334414482,
0.017494402825832367,
-0.016535930335521698,
0.019005639478564262,
-0.03155047446489334,
0.016640886664390564,
-0.02478061243891716,
-0.10429801046848297,
-0.010610978119075298,
-0.060705021023750305,
0.1181643158197403,
-0.14264515042304993,
-0.014341743662953377,
0.03506365790963173,
0.11656660586595535,
-0.016182394698262215,
-0.038626622408628464,
0.09096390753984451,
0.05064048618078232,
0.03224010765552521,
-0.01961427927017212,
0.021469643339514732,
-0.01938285492360592,
0.00022360673756338656,
0.043700024485588074,
-0.14629507064819336,
-0.15698228776454926,
0.09476079046726227,
0.01526241097599268,
-0.015361460857093334,
0.0690816268324852,
0.020211530849337578,
-0.02131899818778038,
-0.04548574239015579,
0.0037224639672785997,
0.23384927213191986,
-0.015643324702978134,
0.06273898482322693,
-0.08012095093727112,
-0.005466196686029434,
0.014940137974917889,
-0.04816393926739693,
-0.08578737825155258,
0.08130107074975967,
0.008318191394209862,
-0.07847312092781067,
-0.045711684972047806,
0.0386098250746727,
0.07030370086431503,
0.1551363319158554,
0.007362939417362213,
-0.08432638645172119,
-0.030732573941349983,
-0.06010124832391739,
-0.014322346076369286,
0.05125325173139572,
-0.13210557401180267,
-0.024982644245028496,
0.02246781997382641,
0.0062834094278514385,
0.0537322498857975,
-0.027641208842396736,
0.04343137890100479,
0.0025467209052294493,
-0.0491693839430809,
-0.0818616971373558,
0.0334538035094738,
-0.03333393111824989,
0.0364878885447979,
-0.010968797840178013,
-0.0002661132311914116,
-0.0534345917403698,
-0.05765748769044876,
-0.14400045573711395,
0.08189317584037781,
-0.06982892751693726,
-0.31709328293800354,
-0.09108023345470428,
-0.04687675088644028,
-0.03060048073530197,
0.01107621006667614,
0.04770592600107193,
-0.114556685090065,
-0.11315365135669708,
-0.06644503027200699,
0.12605252861976624,
-0.025183526799082756,
-0.06575164943933487,
0.12635378539562225,
-0.0038884119130671024,
0.02969985641539097,
-0.0962064266204834,
0.01788630522787571,
-0.03789952024817467,
-0.021962303668260574,
-0.029954291880130768,
0.018715566024184227,
0.0633232593536377,
0.12454831600189209,
0.027090532705187798,
-0.0035229967907071114,
0.009090456180274487,
0.22008544206619263,
-0.13560453057289124,
0.0826973021030426,
0.2299492508172989,
-0.057473499327898026,
-0.007040307391434908,
0.14794500172138214,
-0.005486206617206335,
-0.0547604039311409,
0.04158881679177284,
-0.0009113909327425063,
-0.015926385298371315,
-0.2145705670118332,
-0.12582194805145264,
-0.04872498661279678,
-0.020974867045879364,
0.042248450219631195,
0.015950044617056847,
-0.01471760030835867,
0.01072234008461237,
-0.08832024037837982,
-0.034524235874414444,
0.058749720454216,
0.031220365315675735,
0.1471489667892456,
0.007843266241252422,
0.04828161001205444,
-0.04385936260223389,
-0.022522766143083572,
0.0998971238732338,
-0.03000732697546482,
0.04545978829264641,
0.07622810453176498,
0.10199546068906784,
0.06303254514932632,
0.046002503484487534,
0.05971663445234299,
-0.021091774106025696,
-0.0183556005358696,
-0.0044185626320540905,
-0.025741521269083023,
-0.06261797249317169,
0.017169523984193802,
0.0420321449637413,
0.14116989076137543,
-0.13106760382652283,
-0.1188214123249054,
0.03348975628614426,
0.016205308958888054,
0.1180204227566719,
0.10366705805063248,
-0.024427052587270737,
-0.09504402428865433,
0.040868837386369705,
-0.09114924818277359,
-0.03477839380502701,
0.058510664850473404,
0.08658530563116074,
-0.1568727344274521,
0.09392799437046051,
0.0796460285782814,
0.09243125468492508,
-0.04266209155321121,
0.030679352581501007,
-0.0560610294342041,
0.06146695837378502,
0.003152661258354783,
0.06980341672897339,
-0.1633320450782776,
0.10383141785860062,
0.014608597382903099,
0.0866505429148674,
-0.04724159091711044,
0.023960554972290993,
0.04592760279774666,
0.01854213885962963,
0.1174187958240509,
-0.005279394332319498,
-0.10251856595277786,
-0.011047674342989922,
-0.1152106374502182,
0.021786103025078773,
0.0538259856402874,
-0.0565643236041069,
0.05770226567983627,
-0.003548798616975546,
-0.006675974000245333,
-0.038683634251356125,
-0.009402703493833542,
-0.2637270987033844,
-0.13495557010173798,
0.04882025718688965,
0.003910204861313105,
0.056779004633426666,
-0.04004516452550888,
-0.07741392403841019,
-0.13992182910442352,
0.1019275113940239,
-0.0004315325350034982,
-0.01403751689940691,
-0.07094505429267883,
0.020064447075128555,
0.10017052292823792,
-0.058684341609478,
0.006343330256640911,
0.05068446695804596,
0.14032791554927826,
-0.067873015999794,
-0.0396430529654026,
0.017691021785140038,
-0.09970694780349731,
-0.12637987732887268,
0.01600121147930622,
0.17753711342811584,
0.11350595206022263,
0.06500770896673203,
0.09149786084890366,
0.016595447435975075,
-0.0018763213884085417,
-0.10258270055055618,
0.014499279670417309,
0.02962125837802887,
-0.07291429489850998,
0.04483357071876526,
-0.0058184354566037655,
-0.2610185146331787,
-0.1497693955898285,
-0.061391498893499374,
0.0708470419049263,
0.18278342485427856,
-0.028394728899002075,
0.1698644757270813,
0.27669599652290344,
-0.08764190971851349,
-0.22623300552368164,
-0.04886263608932495,
-0.0009260470978915691,
0.030907930806279182,
0.04174179583787918,
-0.20132911205291748,
0.10455971956253052,
-0.009552518837153912,
0.011652253568172455,
-0.05809158831834793,
-0.21287469565868378,
-0.1368250846862793,
0.17291077971458435,
-0.028325427323579788,
0.049104638397693634,
-0.023864516988396645,
-0.06436924636363983,
-0.03672866150736809,
-0.05975009873509407,
0.00047162402188405395,
-0.09244290739297867,
0.07253129035234451,
0.05466724932193756,
0.01704721711575985,
0.02195567451417446,
0.014620638452470303,
0.11701732128858566,
0.0907721072435379,
-0.025148898363113403,
-0.08091169595718384,
0.025663096457719803,
0.003228476271033287,
-0.01195097342133522,
0.10329730063676834,
0.0461544394493103,
0.013675285503268242,
-0.05405241623520851,
-0.08092391490936279,
-0.06079062074422836,
0.0610031932592392,
-0.073515385389328,
-0.007363996002823114,
-0.059927377849817276,
0.09213890135288239,
0.016582055017352104,
0.003789418376982212,
-0.07386007159948349,
-0.09113858640193939,
-0.01537830475717783,
0.11443949490785599,
0.21316003799438477,
-0.0714491531252861,
0.010166188701987267,
-0.044170916080474854,
-0.0457785464823246,
0.052681535482406616,
0.00819859653711319,
0.0437258705496788,
0.05590560659766197,
0.021675515919923782,
0.09181887656450272,
-0.0339512974023819,
-0.1274377852678299,
0.029384281486272812,
0.03809887543320656,
-0.06920807808637619,
-0.19201220571994781,
-0.05174817517399788,
-0.01282991748303175,
-0.019072778522968292,
-0.0330231599509716,
0.18982385098934174,
-0.020195258781313896,
-0.053411878645420074,
0.00624487828463316,
0.062887042760849,
-0.005579323507845402,
0.11836273223161697,
0.045712411403656006,
0.04042111709713936,
-0.09414660185575485,
0.049971845000982285,
0.11973779648542404,
-0.024542881175875664,
0.04427507892251015,
0.0981549397110939,
-0.049921005964279175,
-0.052174873650074005,
-0.10615749657154083,
0.008085831068456173,
0.07123301923274994,
-0.06291163712739944,
-0.00847573671489954,
-0.10953199863433838,
0.007040814030915499,
0.003557091113179922,
0.012761208228766918,
-0.046476785093545914,
-0.04626022279262543,
-0.0007003093487583101,
-0.09548086673021317,
0.06514262408018112,
0.10190138965845108,
-0.031805381178855896,
-0.11598920822143555,
0.11212916672229767,
0.019685223698616028,
0.08193278312683105,
-0.03870197758078575,
-0.06311503052711487,
-0.09238681197166443,
-0.005468122195452452,
-0.10268032550811768,
0.03650730475783348,
-0.1420285552740097,
-0.006383984815329313,
-0.04877083748579025,
-0.03789064288139343,
-0.0175011046230793,
0.07073292136192322,
-0.02955607697367668,
0.0031373591627925634,
-0.02908398024737835,
0.08746762573719025,
-0.12111101299524307,
0.07395361363887787,
0.06045416742563248,
-0.04827960208058357,
0.10873562097549438,
0.01822195202112198,
-0.05311744287610054,
0.04021020978689194,
-0.19974704086780548,
-0.0650683268904686,
-0.03246831148862839,
0.045685842633247375,
-0.009553846903145313,
-0.1805349886417389,
-0.0028396486304700375,
0.01597455143928528,
0.013182885013520718,
-0.01841786503791809,
0.053425487130880356,
-0.025573162361979485,
-0.024295656010508537,
-0.07525186985731125,
-0.06019324064254761,
-0.03524450957775116,
0.06299178302288055,
0.06688576191663742,
0.0053277104161679745,
0.09252213686704636,
-0.09111357480287552,
0.07751542329788208,
-0.08345172554254532,
0.022799184545874596,
-0.030173785984516144,
0.025387361645698547,
-0.07048073410987854,
-0.07624758780002594,
0.08319180458784103,
-0.015240765176713467,
0.07315090298652649,
0.02088976465165615,
-0.028534771874547005,
0.04066389054059982,
-0.04350094869732857,
-0.0758860856294632,
0.039625950157642365,
0.13713288307189941,
0.0549934059381485,
0.01826128363609314,
-0.006831314414739609,
-0.04603523761034012,
0.004441701807081699,
0.13960261642932892,
0.14242954552173615,
0.16167962551116943,
0.0932239517569542,
0.029802286997437477,
0.06869054585695267,
-0.0509473979473114,
-0.08979358524084091,
0.10314526408910751,
-0.07881351560354233,
0.03614169731736183,
-0.049743033945560455,
-0.06939414143562317,
0.0790150836110115,
-0.1356135904788971,
0.07420755177736282,
-0.029236294329166412,
-0.0871468186378479,
-0.10825485736131668,
-0.13951069116592407,
-0.06586410105228424,
-0.05282736197113991,
0.004818524233996868,
-0.10690934956073761,
0.022931357845664024,
0.0011616091942414641,
0.02697104401886463,
-0.09057948738336563,
0.11418671905994415,
-0.11480995267629623,
-0.12221898138523102,
0.14620040357112885,
-0.033822186291217804,
-0.012297531589865685,
-0.0014047077856957912,
0.038904279470443726,
0.02636977657675743,
0.09300536662340164,
0.050946902483701706,
0.04540146887302399,
0.018557989969849586,
0.028926365077495575,
-0.09516263753175735,
-0.06668754667043686,
0.031420886516571045,
-0.017677508294582367,
0.10393349081277847,
0.18068072199821472,
0.0906623974442482,
-0.08021173626184464,
0.00952499732375145,
0.13851815462112427,
0.024246862158179283,
-0.11981675028800964,
-0.15368381142616272,
0.03984261304140091,
-0.03666738048195839,
-0.005619856063276529,
0.004433250054717064,
-0.08956342190504074,
0.016752373427152634,
0.21108485758304596,
0.17472587525844574,
-0.04579650238156319,
0.01909109391272068,
-0.014917443506419659,
0.007663060910999775,
0.019620381295681,
0.08483964204788208,
0.08865824341773987,
0.1764914095401764,
-0.0056127700954675674,
0.03710203990340233,
-0.028942886739969254,
-0.10027698427438736,
-0.12348688393831253,
0.09736736863851547,
0.007207293063402176,
-0.033100832253694534,
-0.010406429879367352,
0.18618910014629364,
-0.10942236334085464,
-0.2211901694536209,
-0.12240190804004669,
-0.04683377966284752,
-0.11429183930158615,
0.023477530106902122,
-0.037615809589624405,
0.13443826138973236,
0.04414109140634537,
-0.00553864985704422,
0.01068610604852438,
0.1884177029132843,
0.035285212099552155,
0.03707946836948395,
-0.025801535695791245,
0.1084563210606575,
-0.10048335790634155,
0.11160443723201752,
-0.003117573680356145,
0.04636155813932419,
0.03556572273373604,
0.041509442031383514,
-0.06456679850816727,
0.03317705914378166,
0.03545701131224632,
-0.005558217875659466,
0.04213038086891174,
0.17369332909584045,
-0.005222667008638382,
0.09372846782207489,
0.10787796229124069,
-0.06501761823892593,
0.021114425733685493,
-0.025479750707745552,
0.00458112359046936,
-0.054746512323617935,
0.15508240461349487,
-0.14474406838417053,
0.1286645531654358,
0.10245988517999649,
-0.07381915301084518,
-0.04106986150145531,
-0.007270810194313526,
0.04798182472586632,
-0.06112907826900482,
0.09183686226606369,
-0.00337237142957747,
-0.17241021990776062,
0.020920807495713234,
-0.13178792595863342,
0.06766407936811447,
-0.24549445509910583,
-0.04086543619632721,
-0.04550132900476456,
-0.015111343003809452,
0.0045072115026414394,
0.11431198567152023,
0.08258306235074997,
-0.049100227653980255,
-0.011920129880309105,
-0.05150667950510979,
0.009340070188045502,
0.09399589896202087,
-0.08444618433713913,
-0.028212610632181168
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only on **it** (Italian) using **21.9k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **it**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
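Even though the checkpoint ships without a tokenizer, it can already be used as an acoustic feature extractor. The snippet below is a minimal sketch, not part of the original card: the feature-extractor settings are assumptions matching the standard Wav2Vec2 base configuration, and the input is a dummy waveform standing in for real 16kHz Italian speech.

```python
import torch
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

# assumption: standard base-model preprocessing settings, since the checkpoint
# itself only contains the pretrained acoustic model
feature_extractor = Wav2Vec2FeatureExtractor(
    feature_size=1, sampling_rate=16000, padding_value=0.0, do_normalize=True
)
model = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-base-it-voxpopuli-v2")
model.eval()

# one second of silence as a stand-in for real 16kHz Italian audio
dummy_audio = torch.zeros(16000).numpy()
inputs = feature_extractor(dummy_audio, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    hidden_states = model(inputs.input_values).last_hidden_state  # shape: (1, frames, 768)
```

For speech recognition, these representations would still need the tokenizer and fine-tuning step described in the note above.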
|
{"language": "it", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-it-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"it",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"it"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #it #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only on Italian (it), using 21.9k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in it. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in it on 21.9k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in it. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #it #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in it on 21.9k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in it. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
251
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #it #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in it on 21.9k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in it. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07402291148900986,
0.10791771858930588,
-0.0028074427973479033,
0.004236092325299978,
0.07543395459651947,
-0.04973645880818367,
0.1342596858739853,
0.04495513439178467,
0.0004913624725304544,
0.09749486297369003,
-0.013839476741850376,
-0.04921945929527283,
0.0731927677989006,
0.13142870366573334,
0.061980172991752625,
-0.2555970251560211,
0.03839869424700737,
-0.06534286588430405,
0.05154493451118469,
0.04895507171750069,
0.12117188423871994,
-0.08443280309438705,
0.02839432656764984,
0.05340872332453728,
-0.03522414341568947,
0.029849113896489143,
-0.046129193156957626,
-0.08230580389499664,
0.052981309592723846,
0.047320667654275894,
-0.030571576207876205,
0.023846399039030075,
0.09127821028232574,
-0.18801768124103546,
0.037457216531038284,
0.04251677915453911,
0.02850557118654251,
0.011311623267829418,
0.099809929728508,
0.02134787105023861,
0.1634403020143509,
-0.021235182881355286,
-0.005800284445285797,
0.08309376984834671,
-0.05744415521621704,
-0.0943622887134552,
-0.06440600752830505,
0.16138862073421478,
0.09855590015649796,
0.108766570687294,
-0.08003907650709152,
0.07698902487754822,
-0.019779087975621223,
0.0447365902364254,
0.07374914735555649,
-0.17800472676753998,
-0.05607075244188309,
0.05747620016336441,
0.10810518264770508,
0.020522141829133034,
-0.08767250180244446,
0.07279601693153381,
0.04979414865374565,
-0.011900502257049084,
-0.06621019542217255,
-0.03652176633477211,
0.13527579605579376,
-0.10716177523136139,
-0.11728150397539139,
-0.0008099868427962065,
0.17407268285751343,
0.0599135085940361,
-0.07523218542337418,
-0.15017692744731903,
0.012831142172217369,
0.2085348665714264,
-0.05633661150932312,
-0.0948631763458252,
0.008972364477813244,
0.02052801102399826,
0.04940751567482948,
-0.0698293149471283,
-0.07111505419015884,
-0.006852477323263884,
0.0219799242913723,
0.11474873870611191,
0.018194295465946198,
-0.01787894032895565,
-0.07254090160131454,
-0.0017628911882638931,
-0.0919535905122757,
-0.11639228463172913,
-0.005886946804821491,
-0.06568630039691925,
-0.06956276297569275,
-0.036494314670562744,
0.0001791365648387,
-0.10207558423280716,
0.030113350600004196,
0.0974789634346962,
0.06833045184612274,
0.05413847416639328,
-0.0568736232817173,
-0.03174152225255966,
0.1281970590353012,
0.0690482035279274,
-0.12423400580883026,
-0.01724489964544773,
0.012945829890668392,
-0.01871149241924286,
0.009581120684742928,
-0.03766822814941406,
-0.03552195429801941,
0.01558645348995924,
-0.015858452767133713,
0.046919338405132294,
0.06142332777380943,
-0.0343913696706295,
-0.034769169986248016,
-0.09597225487232208,
0.09438773989677429,
-0.08101901412010193,
0.026822933927178383,
0.04801433905959129,
-0.006236893590539694,
0.09378287196159363,
-0.06504078954458237,
0.07888998091220856,
-0.11053891479969025,
0.002615047851577401,
-0.027835888788104057,
-0.004839894827455282,
0.01940319500863552,
-0.0276283361017704,
0.031121548265218735,
-0.0026319746393710375,
0.0040138885378837585,
-0.11446817219257355,
0.006216172594577074,
-0.10328815132379532,
-0.01990497298538685,
-0.08243860304355621,
-0.04190700873732567,
-0.04535915330052376,
0.016921013593673706,
-0.004996528849005699,
-0.005531043279916048,
0.013972727581858635,
-0.018688950687646866,
-0.006401183549314737,
0.006354776676744223,
0.04515735059976578,
0.05692661181092262,
0.0819987878203392,
-0.018509812653064728,
-0.015435484237968922,
-0.10445469617843628,
0.11763951182365417,
-0.07743413001298904,
-0.020610446110367775,
-0.13836641609668732,
-0.036696892231702805,
-0.039732400327920914,
0.03298637643456459,
0.013937494717538357,
0.12493915110826492,
-0.1783307045698166,
-0.06998106092214584,
0.11594267189502716,
-0.12303607165813446,
0.011939548887312412,
0.17890380322933197,
0.00012566604709718376,
0.07801321893930435,
0.10134528577327728,
0.22097180783748627,
0.019512267783284187,
-0.17085950076580048,
-0.0161725003272295,
-0.049594696611166,
0.03483249992132187,
0.13420794904232025,
0.062111325562000275,
-0.06611013412475586,
0.059533555060625076,
-0.018092012032866478,
-0.02591290883719921,
-0.07884292304515839,
-0.0030365558341145515,
-0.04567395895719528,
0.019147997722029686,
-0.04411547631025314,
0.020448915660381317,
-0.004707928281277418,
-0.02483954094350338,
-0.012924209237098694,
-0.09262748807668686,
-0.06886667758226395,
0.11926011741161346,
-0.06232992559671402,
0.024130254983901978,
-0.09989457577466965,
0.06280606985092163,
0.0661194920539856,
0.006697637028992176,
-0.12783420085906982,
0.11851060390472412,
0.03159042075276375,
-0.04802808165550232,
0.14591047167778015,
0.07680216431617737,
-0.033278774470090866,
0.007037511095404625,
-0.012770689092576504,
0.0187558364123106,
-0.02991136722266674,
0.013916943222284317,
-0.025827748700976372,
-0.1034717932343483,
-0.007440970279276371,
-0.06591702252626419,
0.1145171970129013,
-0.13058803975582123,
-0.015316614881157875,
0.04300587624311447,
0.10527845472097397,
-0.01512379851192236,
-0.04042385146021843,
0.09234942495822906,
0.04275421053171158,
0.03190939873456955,
-0.021067364141345024,
0.022026818245649338,
-0.016079407185316086,
-0.00041778371087275445,
0.04738534614443779,
-0.15175761282444,
-0.1613563448190689,
0.09677060693502426,
0.015227840282022953,
-0.017227906733751297,
0.0662212073802948,
0.023141726851463318,
-0.01973619870841503,
-0.045231305062770844,
0.003375199157744646,
0.2355189472436905,
-0.011308805085718632,
0.06067740172147751,
-0.07989895343780518,
-0.007206811103969812,
0.01682230643928051,
-0.05047252029180527,
-0.0889083594083786,
0.07767659425735474,
-0.00019619916565716267,
-0.086077980697155,
-0.04287705197930336,
0.05015641078352928,
0.07118944078683853,
0.1553782820701599,
0.010032166726887226,
-0.08579277992248535,
-0.034357212483882904,
-0.06376573443412781,
-0.012239698320627213,
0.04213419929146767,
-0.14052574336528778,
-0.023752424865961075,
0.022619295865297318,
0.009266117587685585,
0.05042831972241402,
-0.024153422564268112,
0.04618677496910095,
0.008635377511382103,
-0.04939254745841026,
-0.07131656259298325,
0.03358249366283417,
-0.034702375531196594,
0.037511132657527924,
-0.005831129848957062,
0.005499611608684063,
-0.04581532999873161,
-0.059490855783224106,
-0.14187948405742645,
0.08836540579795837,
-0.06583299487829208,
-0.31140539050102234,
-0.08729300647974014,
-0.054612863808870316,
-0.03271932154893875,
0.013957441784441471,
0.04816216975450516,
-0.10852111130952835,
-0.10838042199611664,
-0.06794404983520508,
0.1257602423429489,
-0.0329953171312809,
-0.0652279406785965,
0.11342111974954605,
-0.0052304561249911785,
0.026572247967123985,
-0.09827826917171478,
0.015959039330482483,
-0.038108937442302704,
-0.032382287085056305,
-0.02966431714594364,
0.020911643281579018,
0.059069257229566574,
0.12482774257659912,
0.023938169702887535,
-0.0038391128182411194,
0.009313077665865421,
0.21824811398983002,
-0.13872869312763214,
0.0820380225777626,
0.23598332703113556,
-0.05664446949958801,
-0.007742059882730246,
0.14213484525680542,
-0.007423374801874161,
-0.05239862948656082,
0.04372382164001465,
0.003603794611990452,
-0.020306725054979324,
-0.22081629931926727,
-0.12268193811178207,
-0.04166707396507263,
-0.021986903622746468,
0.04554326459765434,
0.018003586679697037,
-0.001234445720911026,
0.015159114263951778,
-0.08433299511671066,
-0.043003205209970474,
0.05939429998397827,
0.03321438655257225,
0.14156405627727509,
0.008720321580767632,
0.05398384481668472,
-0.042629074305295944,
-0.021063728258013725,
0.10390495508909225,
-0.030269291251897812,
0.04049461707472801,
0.07471422106027603,
0.10096573084592819,
0.0637730211019516,
0.03762226924300194,
0.056263770908117294,
-0.01863459125161171,
-0.017532523721456528,
-0.0038611229974776506,
-0.029686467722058296,
-0.06430349498987198,
0.016644936054944992,
0.04243176802992821,
0.14539267122745514,
-0.13248491287231445,
-0.11676955223083496,
0.02961539663374424,
0.013943885453045368,
0.1246308982372284,
0.09836430102586746,
-0.024655234068632126,
-0.09441680461168289,
0.03675316646695137,
-0.09072040021419525,
-0.03504721447825432,
0.053433746099472046,
0.0846320316195488,
-0.16070556640625,
0.0887017473578453,
0.07677412778139114,
0.08842219412326813,
-0.043580759316682816,
0.030868668109178543,
-0.05329008400440216,
0.054391056299209595,
0.002334046410396695,
0.0725729838013649,
-0.1739795058965683,
0.10426637530326843,
0.01718343235552311,
0.08810702711343765,
-0.051820702850818634,
0.02493658848106861,
0.044827524572610855,
0.010478053241968155,
0.12717701494693756,
-0.009688879363238811,
-0.10184662789106369,
-0.00045924450387246907,
-0.11723026633262634,
0.017256779596209526,
0.05834474042057991,
-0.05922910198569298,
0.05551772564649582,
-0.004051898140460253,
-0.006988666020333767,
-0.03723428025841713,
-0.001092580147087574,
-0.2573049068450928,
-0.13880762457847595,
0.04653964564204216,
-0.002205999568104744,
0.05813058093190193,
-0.03878454491496086,
-0.07437919825315475,
-0.12341491878032684,
0.11020954698324203,
-0.005558885168284178,
-0.015042872168123722,
-0.07238908857107162,
0.022527918219566345,
0.09820904582738876,
-0.06023826450109482,
0.014017519541084766,
0.04480642452836037,
0.1432517021894455,
-0.06520035117864609,
-0.041744232177734375,
0.023976167663931847,
-0.10096411406993866,
-0.12262066453695297,
0.014642558060586452,
0.1744268387556076,
0.11203952878713608,
0.0654551163315773,
0.09341438859701157,
0.02103479951620102,
-0.00549385417252779,
-0.09789977222681046,
0.024877183139324188,
0.025705425068736076,
-0.07136755436658859,
0.04360247775912285,
0.0010161534883081913,
-0.2732020318508148,
-0.14972257614135742,
-0.0621270053088665,
0.08136259019374847,
0.18437053263187408,
-0.02709101140499115,
0.16304069757461548,
0.2780075669288635,
-0.09066548943519592,
-0.2268323451280594,
-0.043706003576517105,
-0.0002106027095578611,
0.027844207361340523,
0.04272838681936264,
-0.2076692432165146,
0.09946548938751221,
-0.00042844327981583774,
0.011133232153952122,
-0.06017206236720085,
-0.21694858372211456,
-0.13494279980659485,
0.17001673579216003,
-0.02465163543820381,
0.04141484573483467,
-0.028240548446774483,
-0.06851696968078613,
-0.03403765708208084,
-0.04829341545701027,
0.013638527132570744,
-0.09442149102687836,
0.07546614110469818,
0.05505344271659851,
0.010071516036987305,
0.023626290261745453,
0.012482091784477234,
0.11075977981090546,
0.08758026361465454,
-0.02240290306508541,
-0.08139165490865707,
0.01921837404370308,
0.005156312603503466,
-0.013422295451164246,
0.10568442940711975,
0.049259692430496216,
0.01621752604842186,
-0.04453485459089279,
-0.08407878875732422,
-0.06495930999517441,
0.05894424766302109,
-0.0720803290605545,
-0.013587559573352337,
-0.05527472496032715,
0.09103026986122131,
0.012641798704862595,
0.0008448821608908474,
-0.07111985236406326,
-0.09719296544790268,
-0.014764842577278614,
0.12117104977369308,
0.21651574969291687,
-0.05572039633989334,
-0.0038573206402361393,
-0.04389356076717377,
-0.04492907226085663,
0.04837580770254135,
-0.0021690053399652243,
0.04221208766102791,
0.04934306815266609,
0.02489718236029148,
0.08964736014604568,
-0.03271225839853287,
-0.13102902472019196,
0.03033868782222271,
0.036013517528772354,
-0.06902951002120972,
-0.19057568907737732,
-0.04738987609744072,
0.002232246333733201,
-0.02271142229437828,
-0.03669186308979988,
0.19184421002864838,
-0.013529211282730103,
-0.05540483072400093,
0.003639594418928027,
0.059705108404159546,
-0.006209764163941145,
0.11839881539344788,
0.04423554614186287,
0.04041428118944168,
-0.08971775323152542,
0.055839672684669495,
0.12053092569112778,
-0.040029022842645645,
0.04858753830194473,
0.09343704581260681,
-0.046834807842969894,
-0.05433773621916771,
-0.09838838875293732,
-0.0003123731876257807,
0.06509756296873093,
-0.06047038361430168,
-0.002822350012138486,
-0.1036001667380333,
0.008175257593393326,
0.008343309164047241,
0.012469845823943615,
-0.04398972913622856,
-0.04911664128303528,
-0.0031195145566016436,
-0.09380912035703659,
0.06529951095581055,
0.09587408602237701,
-0.029690014198422432,
-0.10919082164764404,
0.1097535789012909,
0.019467225298285484,
0.07946224510669708,
-0.03721512109041214,
-0.06813167780637741,
-0.08928517997264862,
-0.004745183512568474,
-0.09062311798334122,
0.035475537180900574,
-0.13733986020088196,
-0.013351586647331715,
-0.04689084738492966,
-0.02895537205040455,
-0.010104968212544918,
0.06795407831668854,
-0.02940591424703598,
0.00037720962427556515,
-0.029373889788985252,
0.08352348208427429,
-0.12593671679496765,
0.07237409800291061,
0.05619693174958229,
-0.045762594789266586,
0.1076638326048851,
0.02030068077147007,
-0.05020119994878769,
0.036169491708278656,
-0.2080821394920349,
-0.05681479349732399,
-0.035159897059202194,
0.04212847352027893,
-0.01316788885742426,
-0.17943735420703888,
0.00048129784408956766,
0.015799041837453842,
0.010770212858915329,
-0.019327638670802116,
0.04906819388270378,
-0.0285459216684103,
-0.012381711043417454,
-0.06669484078884125,
-0.0609610378742218,
-0.038062140345573425,
0.06687566637992859,
0.07044532895088196,
0.005253042560070753,
0.09988255053758621,
-0.0896078422665596,
0.07676650583744049,
-0.08005033433437347,
0.026663610711693764,
-0.029540374875068665,
0.023173121735453606,
-0.07616465538740158,
-0.07554253935813904,
0.07891099900007248,
-0.015699243173003197,
0.06984741985797882,
0.028385359793901443,
-0.027544649317860603,
0.043932829052209854,
-0.05129528418183327,
-0.06017940491437912,
0.03906167298555374,
0.13791219890117645,
0.05151606351137161,
0.017752226442098618,
-0.0012875963002443314,
-0.0408395454287529,
0.002520230133086443,
0.14561009407043457,
0.14213570952415466,
0.16710259020328522,
0.10089598596096039,
0.03150571510195732,
0.06807749718427658,
-0.04772728681564331,
-0.08899575471878052,
0.09151211380958557,
-0.06987429410219193,
0.03592822328209877,
-0.04876343160867691,
-0.07081998884677887,
0.07164685428142548,
-0.13662996888160706,
0.07568758726119995,
-0.0284600630402565,
-0.08335529267787933,
-0.10692687332630157,
-0.1449286788702011,
-0.06329496949911118,
-0.04258463531732559,
0.0018495813710615039,
-0.10773878544569016,
0.027545727789402008,
0.012590478174388409,
0.02703060582280159,
-0.09316834807395935,
0.11562144756317139,
-0.121057890355587,
-0.12593013048171997,
0.1523473858833313,
-0.03510899841785431,
-0.013205688446760178,
-0.0010988516733050346,
0.046977199614048004,
0.02185765467584133,
0.09290491044521332,
0.052464861422777176,
0.05014565959572792,
0.02135007455945015,
0.02961529791355133,
-0.09936517477035522,
-0.06758011877536774,
0.032174333930015564,
-0.016320345923304558,
0.09982778131961823,
0.1892961710691452,
0.08792197704315186,
-0.08359456062316895,
0.01281939260661602,
0.14137564599514008,
0.02520267479121685,
-0.11617108434438705,
-0.14629988372325897,
0.038928646594285965,
-0.0373225212097168,
-0.003023313358426094,
0.0033542837481945753,
-0.09220337122678757,
0.020610462874174118,
0.2035858929157257,
0.17319241166114807,
-0.040945958346128464,
0.021391235291957855,
-0.009766129776835442,
0.0077963219955563545,
0.021874666213989258,
0.08303329348564148,
0.08372144401073456,
0.17707397043704987,
-0.009799649938941002,
0.051456015557050705,
-0.02509576827287674,
-0.09877265989780426,
-0.11741413921117783,
0.09487614035606384,
0.006495131645351648,
-0.031080272048711777,
-0.006749399472028017,
0.18586669862270355,
-0.10629274696111679,
-0.2157692164182663,
-0.12294629216194153,
-0.04103255644440651,
-0.11580127477645874,
0.023785771802067757,
-0.04539661854505539,
0.13507314026355743,
0.05324804037809372,
-0.003821701742708683,
0.011625798419117928,
0.17535659670829773,
0.03542948514223099,
0.031190618872642517,
-0.03127375617623329,
0.10921259224414825,
-0.09028562158346176,
0.11655856668949127,
-0.0032766652293503284,
0.04598712921142578,
0.035196755081415176,
0.03385591134428978,
-0.06817042082548141,
0.033580027520656586,
0.03414252772927284,
0.0023447894491255283,
0.04845263063907623,
0.16975244879722595,
-0.004279065411537886,
0.09610432386398315,
0.10834519565105438,
-0.06302592903375626,
0.027097605168819427,
-0.02501896396279335,
0.0010148000437766314,
-0.061362139880657196,
0.15748369693756104,
-0.14790120720863342,
0.12980809807777405,
0.10489557683467865,
-0.07131298631429672,
-0.04566154628992081,
-0.010403933003544807,
0.04929853603243828,
-0.0579647570848465,
0.09774380177259445,
-0.008043281733989716,
-0.16664835810661316,
0.026623675599694252,
-0.12451300024986267,
0.06782189756631851,
-0.26311326026916504,
-0.04311184957623482,
-0.04118689149618149,
-0.01730954833328724,
0.005982270464301109,
0.10693353414535522,
0.08207924664020538,
-0.04838905110955238,
-0.01158951222896576,
-0.03799496218562126,
0.009678107686340809,
0.09162922948598862,
-0.08414005488157272,
-0.02778439037501812
] |
null | null |
transformers
|
# Wav2Vec2-Base-VoxPopuli
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the Italian (it) unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/)
# Fine-Tuning
Please refer to [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) on how to fine-tune this model on a specific language. Note that you should replace `"facebook/wav2vec2-large-xlsr-53"` with this checkpoint for fine-tuning.
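As a rough illustration of that checkpoint swap, here is a hedged sketch: the dropout and masking values are taken from the linked blog and are illustrative rather than prescribed by this card, and the pad id and vocabulary size are placeholders for whatever tokenizer you build for your target language.

```python
from transformers import Wav2Vec2ForCTC

# placeholders — in practice these come from the tokenizer you create for fine-tuning
PAD_TOKEN_ID = 0
VOCAB_SIZE = 32

model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-base-it-voxpopuli",  # <-- this checkpoint instead of "facebook/wav2vec2-large-xlsr-53"
    attention_dropout=0.1,
    hidden_dropout=0.1,
    mask_time_prob=0.05,
    ctc_loss_reduction="mean",
    pad_token_id=PAD_TOKEN_ID,
    vocab_size=VOCAB_SIZE,
)
model.freeze_feature_encoder()  # the convolutional front-end is usually kept frozen during fine-tuning
```

The rest of the training loop (data collator, Trainer arguments, evaluation with WER) follows the blog unchanged.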
|
{"language": "it", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-it-voxpopuli
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli",
"it",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"it"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #it #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
|
# Wav2Vec2-Base-VoxPopuli
Facebook's Wav2Vec2 base model pretrained on the Italian (it) unlabeled subset of the VoxPopuli corpus.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the official website for more information, here
# Fine-Tuning
Please refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '"facebook/wav2vec2-large-xlsr-53"' with this checkpoint for fine-tuning.
|
[
"# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the it unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #it #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n",
"# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the it unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
69,
132,
57
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #it #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the it unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
-0.05940120294690132,
0.03734838590025902,
-0.004387782886624336,
-0.007748189847916365,
0.12088974565267563,
-0.0344688706099987,
0.0723329558968544,
0.002697639400139451,
0.03853825107216835,
0.012451889924705029,
0.01286226324737072,
0.026379475370049477,
0.08949165791273117,
0.12452036887407303,
-0.013732347637414932,
-0.2836898863315582,
0.07049687951803207,
0.016170864924788475,
0.07079099118709564,
0.04737408459186554,
0.10981027036905289,
-0.07549183070659637,
0.048260606825351715,
0.05119266360998154,
-0.0845135822892189,
0.020639214664697647,
0.02474263869225979,
-0.086113840341568,
0.12560445070266724,
0.11087574809789658,
0.08650119602680206,
0.0571121871471405,
0.030182085931301117,
-0.1625368595123291,
0.03453401103615761,
0.06581618636846542,
-0.0600401870906353,
-0.013778096064925194,
0.1193418949842453,
-0.0204927958548069,
0.21449267864227295,
-0.026149872690439224,
-0.03727666288614273,
0.08918551355600357,
-0.13131210207939148,
-0.12951865792274475,
-0.07114376872777939,
0.13313886523246765,
0.13961109519004822,
0.06836128234863281,
-0.07931701093912125,
0.035760022699832916,
-0.03564887121319771,
0.06839156150817871,
0.07012331485748291,
-0.28871235251426697,
-0.036157529801130295,
0.1122565045952797,
0.07470422983169556,
-0.027962561696767807,
-0.10245368629693985,
0.09762334078550339,
0.025732846930623055,
-0.008567447774112225,
-0.01863010786473751,
-0.09256385266780853,
0.03473259508609772,
-0.0829213336110115,
-0.11542035639286041,
0.010122398845851421,
0.18197043240070343,
0.03941483423113823,
-0.061514467000961304,
-0.06589525938034058,
-0.03072267584502697,
0.18870379030704498,
-0.06071917712688446,
-0.16937017440795898,
0.010358658619225025,
0.041440870612859726,
0.07770471274852753,
-0.1758723258972168,
-0.06557632237672806,
-0.005107388366013765,
-0.055919237434864044,
0.09821777790784836,
0.026392951607704163,
-0.017983119934797287,
-0.06487035006284714,
0.012554611079394817,
-0.0941462516784668,
-0.06800566613674164,
0.002986603183671832,
-0.10434699058532715,
-0.08933974802494049,
-0.028761906549334526,
-0.08331289887428284,
-0.0768631100654602,
-0.02790963649749756,
0.08717241883277893,
0.014375220984220505,
0.06208006292581558,
-0.09104649722576141,
0.03575606644153595,
0.0038202800787985325,
0.09381379932165146,
-0.1308637261390686,
-0.016389163210988045,
0.007791070733219385,
-0.043586768209934235,
-0.0013784990878775716,
-0.03508235514163971,
-0.0774453803896904,
-0.07089333981275558,
-0.025722410529851913,
0.06406594067811966,
0.007243991829454899,
0.02587796002626419,
-0.06353262066841125,
-0.09253144264221191,
0.03870135918259621,
-0.07452807575464249,
0.0218733549118042,
0.0302441269159317,
0.010127444751560688,
0.2045872062444687,
0.00355518888682127,
0.06327782571315765,
-0.1433705985546112,
0.015919549390673637,
-0.021733375266194344,
0.00428173178806901,
-0.011985581368207932,
-0.04641271010041237,
0.030789470300078392,
-0.03786643221974373,
-0.009040207602083683,
-0.131987527012825,
-0.08965476602315903,
-0.08282306790351868,
0.0027463457081466913,
-0.0362587608397007,
-0.06307101249694824,
-0.0397646389901638,
-0.0006840544519945979,
-0.019669391214847565,
-0.027226172387599945,
-0.02227986603975296,
-0.01970481686294079,
0.0038901991210877895,
-0.02551274187862873,
0.08045339584350586,
-0.05742820352315903,
0.08067753165960312,
0.010878262110054493,
-0.019497720524668694,
-0.14307542145252228,
0.1280476152896881,
-0.07939482480287552,
-0.056426554918289185,
-0.14897356927394867,
-0.03456002101302147,
-0.06915267556905746,
0.06534483283758163,
-0.007267654407769442,
0.1606975644826889,
-0.20095625519752502,
-0.11663985252380371,
0.2507156431674957,
-0.1200479194521904,
0.016218863427639008,
0.1746252328157425,
0.019051019102334976,
0.03954383730888367,
0.16829662024974823,
0.1387234628200531,
0.033951498568058014,
-0.11345256119966507,
0.041473984718322754,
-0.042640455067157745,
-0.020394189283251762,
0.061492059379816055,
0.04894745722413063,
-0.011626164428889751,
-0.007872060872614384,
-0.009844856336712837,
-0.07075027376413345,
-0.051916468888521194,
0.00014222237223293632,
-0.060606665909290314,
0.03379515931010246,
-0.020764710381627083,
0.0857660174369812,
-0.007702820934355259,
-0.008759131655097008,
0.02645202912390232,
-0.09298912435770035,
-0.024960195645689964,
0.06912297755479813,
-0.05398159101605415,
0.07677356153726578,
-0.10842887312173843,
0.05322869494557381,
0.11514724791049957,
0.06395750492811203,
-0.13756178319454193,
0.05555574223399162,
-0.02291754074394703,
0.08642612397670746,
0.09503622353076935,
0.18444961309432983,
-0.026569239795207977,
-0.04672307148575783,
-0.0858975499868393,
0.024025538936257362,
-0.02689274773001671,
-0.04161336272954941,
-0.022222090512514114,
-0.09921589493751526,
-0.029132120311260223,
-0.04267638921737671,
0.06512401252985,
-0.17590966820716858,
0.008136180229485035,
0.0748211145401001,
0.07424838095903397,
0.016396112740039825,
0.017657997086644173,
0.005147200543433428,
0.09630356729030609,
0.036883383989334106,
0.00572765851393342,
0.07244230806827545,
-0.007395056542009115,
-0.06087449938058853,
0.11768238246440887,
-0.06683389097452164,
0.018502483144402504,
0.14070677757263184,
-0.10933755338191986,
-0.007492614910006523,
0.005992654711008072,
0.030314190313220024,
0.002052613999694586,
0.010459999553859234,
-0.019298134371638298,
0.20975430309772491,
0.027070302516222,
0.08031978458166122,
-0.08101005107164383,
0.024604545906186104,
-0.015930000692605972,
-0.03257669880986214,
-0.06345998495817184,
0.06195788457989693,
0.03657800704240799,
-0.09561945497989655,
0.011499093845486641,
0.12303350865840912,
0.0007158042280934751,
0.15133507549762726,
0.021211761981248856,
-0.019783804193139076,
0.0120194461196661,
-0.062320418655872345,
-0.023228084668517113,
-0.01143725123256445,
-0.16512292623519897,
-0.017204945906996727,
0.02167012356221676,
0.013600229285657406,
0.06711035966873169,
-0.05389437824487686,
0.0009666726691648364,
0.00895300880074501,
-0.07567904889583588,
-0.03970868140459061,
0.04177339747548103,
-0.011370089836418629,
0.07295577973127365,
-0.04699642211198807,
-0.0017517119413241744,
-0.0155079560354352,
-0.029836442321538925,
-0.10263749212026596,
0.11914969235658646,
-0.0611782930791378,
-0.36353322863578796,
-0.09842242300510406,
-0.10340645164251328,
-0.08794280886650085,
0.04242904856801033,
0.04243243858218193,
-0.10331621021032333,
-0.07501812279224396,
0.0027470500208437443,
0.17130962014198303,
-0.03272213414311409,
-0.07954224944114685,
0.04651128500699997,
0.003248597262427211,
-0.009180049411952496,
-0.08875266462564468,
0.005597198847681284,
-0.033497147262096405,
-0.13351860642433167,
-0.010829668492078781,
-0.026577003300189972,
0.03547435253858566,
0.1408466100692749,
0.036024853587150574,
-0.022884123027324677,
-0.025722956284880638,
0.19945310056209564,
-0.1320766657590866,
0.08680691570043564,
0.2607499063014984,
-0.022462986409664154,
0.019538354128599167,
0.14342227578163147,
-0.013733973726630211,
-0.08049003034830093,
-0.005634179338812828,
0.0711105465888977,
-0.0028975026216357946,
-0.2646539509296417,
-0.12720711529254913,
-0.06080186739563942,
-0.025309931486845016,
0.027235303074121475,
-0.009828277863562107,
0.018075108528137207,
0.041996151208877563,
-0.10539048165082932,
-0.019620981067419052,
0.052684515714645386,
0.02548619545996189,
0.21462295949459076,
-0.04546075686812401,
0.14177000522613525,
-0.0242487620562315,
-0.023181727156043053,
0.07089104503393173,
0.03272748738527298,
0.0817660391330719,
0.1181202083826065,
0.059599727392196655,
0.08646083623170853,
0.016870668157935143,
0.024134086444973946,
-0.002709505148231983,
0.0015965461498126388,
-0.029400069266557693,
-0.052385710179805756,
-0.020765874534845352,
-0.04486644268035889,
0.02338235266506672,
0.10967808216810226,
-0.17039383947849274,
-0.12077580392360687,
0.01752397045493126,
0.034601010382175446,
0.13938650488853455,
0.05520929396152496,
-0.08002721518278122,
-0.03648709878325462,
0.04950806125998497,
-0.08424645662307739,
-0.049437016248703,
0.06353351473808289,
0.07945040613412857,
-0.148639976978302,
0.14851251244544983,
0.03017774410545826,
0.09647831320762634,
-0.019165389239788055,
0.0558173730969429,
-0.16070502996444702,
-0.021138347685337067,
0.03275027871131897,
0.06884358078241348,
-0.24923297762870789,
0.212227001786232,
0.022375447675585747,
0.06523426622152328,
-0.06759463250637054,
0.004503151401877403,
0.05177313834428787,
0.14427818357944489,
0.13162246346473694,
-0.005954492837190628,
-0.06765056401491165,
0.014035413041710854,
-0.012462346814572811,
0.03825550153851509,
0.036343421787023544,
-0.020289460197091103,
0.045893486589193344,
-0.005745386239141226,
0.013710327446460724,
-0.012778162024915218,
0.1211935430765152,
-0.2287476807832718,
-0.1490202397108078,
0.025249425321817398,
0.014040912501513958,
0.12031512707471848,
-0.00563953910022974,
-0.06608200818300247,
-0.1048831194639206,
0.10634683072566986,
-0.003983698319643736,
-0.016689715906977654,
-0.10408817231655121,
0.03527260944247246,
0.02419205568730831,
-0.09937077015638351,
0.03627600520849228,
0.05605990067124367,
0.13027872145175934,
-0.1012372374534607,
-0.06368177384138107,
0.04915420338511467,
-0.09142497926950455,
-0.05843138322234154,
0.04663363844156265,
0.1757233589887619,
0.10191722214221954,
0.03325025364756584,
0.11613334715366364,
-0.03985518217086792,
0.048304419964551926,
-0.11251228302717209,
0.07263752073049545,
0.013057355768978596,
-0.018161021173000336,
0.023639999330043793,
-0.05703587085008621,
-0.25307124853134155,
-0.10777077823877335,
-0.014626782387495041,
0.17445138096809387,
0.18662074208259583,
0.020740944892168045,
0.14908717572689056,
0.24514956772327423,
-0.09916607290506363,
-0.25615912675857544,
-0.04542256519198418,
-0.02272828109562397,
0.03826656937599182,
0.018110042437911034,
-0.2710079550743103,
0.05893377214670181,
0.05786782503128052,
0.007654815912246704,
-0.07929931581020355,
-0.20233070850372314,
-0.12831823527812958,
0.19787974655628204,
0.02063080109655857,
0.14911183714866638,
-0.09367501735687256,
-0.04746793210506439,
-0.08975744247436523,
-0.07258020341396332,
0.08090680092573166,
-0.13430729508399963,
0.08529307693243027,
0.05576686933636665,
-0.0072168405167758465,
0.005675794091075659,
0.055726367980241776,
0.11981778591871262,
0.06470675021409988,
0.014926275238394737,
-0.028191272169351578,
0.030145376920700073,
0.0330546498298645,
0.027599293738603592,
0.029030634090304375,
0.026405056938529015,
-0.032244972884655,
-0.05249376595020294,
-0.1101670190691948,
-0.10260074585676193,
0.09440039843320847,
-0.05990128591656685,
-0.0036445565056055784,
-0.018399370834231377,
0.10417753458023071,
0.015720469877123833,
0.01579667255282402,
-0.05713783949613571,
-0.1472293585538864,
0.035398293286561966,
0.12616556882858276,
0.23375479876995087,
-0.13550129532814026,
0.009060186333954334,
-0.03804314136505127,
-0.04548445716500282,
0.09045484662055969,
-0.0029980980325490236,
0.06950312107801437,
0.03490322828292847,
-0.010776623152196407,
0.09506502002477646,
0.019057950004935265,
-0.07269766181707382,
0.00687711127102375,
0.045390237122774124,
-0.0653020441532135,
-0.24607394635677338,
-0.06619881838560104,
-0.003021423239260912,
0.01792972721159458,
0.013060197234153748,
0.18870992958545685,
-0.008757433854043484,
-0.05938083678483963,
-0.017519941553473473,
0.04328463599085808,
-0.04253918677568436,
0.049078766256570816,
0.030172256752848625,
0.03230477496981621,
-0.11821917444467545,
0.06416410952806473,
0.08716581761837006,
-0.14907531440258026,
0.06691096723079681,
0.03959597647190094,
-0.04784132540225983,
-0.09608334302902222,
-0.13424469530582428,
0.038981497287750244,
-0.011973879300057888,
-0.0847364291548729,
0.032320693135261536,
-0.16146019101142883,
0.03657809644937515,
0.12436981499195099,
0.031126368790864944,
-0.013824181631207466,
-0.059310149401426315,
-0.05007052421569824,
-0.017933297902345657,
-0.0005758749321103096,
0.11542300879955292,
-0.06171349436044693,
-0.13786056637763977,
0.1466006189584732,
0.013238604180514812,
0.07412848621606827,
-0.04645124450325966,
-0.04273856058716774,
-0.14100085198879242,
0.00940911378711462,
-0.16061241924762726,
0.017069756984710693,
-0.12256405502557755,
0.000807864882517606,
-0.04341994598507881,
-0.01586155965924263,
-0.04595864564180374,
0.022830821573734283,
-0.10116329044103622,
0.01490744948387146,
-0.005266203079372644,
0.08098453283309937,
-0.10859835147857666,
0.07886625826358795,
0.08087858557701111,
-0.02329283580183983,
0.06893037259578705,
0.016277465969324112,
-0.029377566650509834,
0.10956022143363953,
-0.16285264492034912,
-0.05281247943639755,
0.03967408835887909,
0.03145609050989151,
-0.005995942745357752,
-0.14861950278282166,
0.013103152625262737,
0.027601566165685654,
0.03706827387213707,
-0.004002722911536694,
0.07919114828109741,
-0.054731447249650955,
-0.006390886381268501,
-0.042992912232875824,
-0.09513850510120392,
-0.008387746289372444,
0.07768969982862473,
0.10468290001153946,
0.0003942593466490507,
0.11437855660915375,
-0.0821826383471489,
0.0464656800031662,
-0.10247840732336044,
0.06769856065511703,
-0.04219062998890877,
-0.02647477015852928,
0.057209305465221405,
-0.14392465353012085,
0.05359654501080513,
0.0019679877441376448,
0.09551884979009628,
-0.006455579772591591,
0.01515455823391676,
0.024301566183567047,
-0.09902031719684601,
-0.11708175390958786,
0.03792969509959221,
0.13258177042007446,
0.0815218910574913,
-0.008900442160665989,
0.023901766166090965,
0.0003406499745324254,
0.03228132799267769,
0.20295076072216034,
0.20856979489326477,
0.19888824224472046,
0.03223215416073799,
0.0948043242096901,
0.019245443865656853,
-0.055981650948524475,
-0.03119693323969841,
0.009146451018750668,
-0.09111388027667999,
0.026935763657093048,
-0.05526553466916084,
-0.053007692098617554,
0.08756363391876221,
-0.14842136204242706,
0.12297406047582626,
0.022584034129977226,
-0.08384594321250916,
-0.1584656983613968,
-0.16907520592212677,
-0.06553670018911362,
-0.09590530395507812,
-0.014018054120242596,
-0.12882669270038605,
-0.01676574908196926,
0.018303750082850456,
0.021266581490635872,
-0.11890968680381775,
0.10197369009256363,
-0.14451298117637634,
-0.16694800555706024,
0.18173164129257202,
-0.03581393510103226,
0.006264984142035246,
-0.010604667477309704,
0.005252320319414139,
0.003893284359946847,
0.08650551736354828,
0.016262758523225784,
0.03898736834526062,
-0.013712557032704353,
0.05655250698328018,
-0.0775022879242897,
-0.05063583329319954,
0.0006756782531738281,
0.025677213445305824,
0.11637676507234573,
0.1952631175518036,
0.03701058402657509,
-0.05802226439118385,
0.009130852296948433,
0.14624610543251038,
0.025212958455085754,
-0.0901852697134018,
-0.1464129537343979,
0.0637921392917633,
0.03508581966161728,
0.026434797793626785,
-0.04150779917836189,
-0.06397193670272827,
0.008946138434112072,
0.2873135805130005,
0.15820153057575226,
-0.059316426515579224,
0.02784654311835766,
0.004929228685796261,
0.038645144551992416,
0.08019181340932846,
0.10586085915565491,
0.08306479454040527,
0.14977262914180756,
-0.03353928402066231,
-0.012490343302488327,
0.010783281177282333,
-0.0693252757191658,
-0.07642536610364914,
0.1561063826084137,
0.03940845653414726,
-0.07968365401029587,
0.0166554544121027,
0.15212693810462952,
-0.15789254009723663,
-0.09076949954032898,
-0.08181116729974747,
-0.051737602800130844,
-0.10213687270879745,
-0.02453652024269104,
-0.05014001950621605,
0.1073029488325119,
0.10297848284244537,
-0.008522062562406063,
-0.030108721926808357,
0.18974900245666504,
0.049802515655756,
-0.015199504792690277,
-0.05418519675731659,
0.12385325878858566,
-0.03518786281347275,
0.06555124372243881,
-0.019216112792491913,
0.03441832214593887,
0.05038318410515785,
0.04590091109275818,
-0.004970143083482981,
0.05243443697690964,
-0.0068536195904016495,
0.033584367483854294,
0.07283312827348709,
0.12874801456928253,
0.0056478530168533325,
0.04638366401195526,
0.09606117010116577,
-0.13755173981189728,
0.035129155963659286,
0.0413811095058918,
-0.04282284900546074,
-0.0022700424306094646,
0.1737300455570221,
-0.19707369804382324,
0.05869237706065178,
0.15545611083507538,
-0.02922745980322361,
-0.015841329470276833,
-0.043016765266656876,
0.06441459059715271,
-0.017608901485800743,
0.052644506096839905,
-0.03585245460271835,
-0.13566166162490845,
0.0041675749234855175,
-0.063743956387043,
0.03155135363340378,
-0.1887275129556656,
0.00642869109287858,
-0.035964030772447586,
0.0007693332736380398,
-0.053610075265169144,
0.09533437341451645,
0.013768783770501614,
-0.05269036814570427,
0.018751518800854683,
-0.10020594298839569,
0.03477763384580612,
0.09561745822429657,
-0.09035410732030869,
-0.04250181466341019
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only on Lithuanian (**lt**), using **14.4k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **lt**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
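Since fine-tuning first requires creating that missing tokenizer, the following is a minimal, hypothetical sketch of the setup step; the vocabulary shown is a placeholder, and a real one would be extracted from your labeled Lithuanian transcripts as described in the linked blog.

```python
import json
from transformers import Wav2Vec2CTCTokenizer, Wav2Vec2FeatureExtractor, Wav2Vec2Processor

# placeholder character vocabulary — build the real one from your Lithuanian transcripts
vocab = {"<pad>": 0, "<unk>": 1, "|": 2, "a": 3, "b": 4, "c": 5}
with open("vocab.json", "w") as vocab_file:
    json.dump(vocab, vocab_file)

tokenizer = Wav2Vec2CTCTokenizer(
    "vocab.json", unk_token="<unk>", pad_token="<pad>", word_delimiter_token="|"
)
feature_extractor = Wav2Vec2FeatureExtractor(
    feature_size=1, sampling_rate=16000, padding_value=0.0, do_normalize=True
)
processor = Wav2Vec2Processor(feature_extractor=feature_extractor, tokenizer=tokenizer)
processor.save_pretrained("./wav2vec2-base-lt-voxpopuli-v2-finetuned")  # hypothetical output directory
```

The processor can then be paired with this checkpoint and labeled Lithuanian data for CTC fine-tuning.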
|
{"language": "lt", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-lt-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"lt",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"lt"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #lt #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only on Lithuanian (lt), using 14.4k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in lt. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in lt on 14.4k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in lt. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #lt #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in lt on 14.4k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in lt. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
253
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #lt #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in lt on 14.4k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in lt. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07501654326915741,
0.1057787761092186,
-0.002841763198375702,
0.01156411599367857,
0.07990258187055588,
-0.054835833609104156,
0.13415174186229706,
0.047840606421232224,
0.010014291852712631,
0.09897634387016296,
-0.02631797268986702,
-0.0538385845720768,
0.06529495865106583,
0.13227424025535583,
0.06141849234700203,
-0.2538413405418396,
0.032700251787900925,
-0.07081567496061325,
0.03518548235297203,
0.04924163222312927,
0.11373811960220337,
-0.08480961620807648,
0.027929645031690598,
0.050069410353899,
-0.029877403751015663,
0.037262752652168274,
-0.046867676079273224,
-0.08443407714366913,
0.05235850438475609,
0.04509110003709793,
-0.03134046122431755,
0.02592889778316021,
0.08296789973974228,
-0.18113026022911072,
0.037116777151823044,
0.0386020801961422,
0.025071987882256508,
0.0031252787448465824,
0.102187879383564,
0.017201049253344536,
0.16205807030200958,
-0.02827928401529789,
-0.006095844320952892,
0.08225878328084946,
-0.05912336707115173,
-0.09442553669214249,
-0.06559653580188751,
0.1645258367061615,
0.09561411291360855,
0.10776405036449432,
-0.07978902012109756,
0.09168555587530136,
-0.02219797670841217,
0.04401779919862747,
0.0816563218832016,
-0.18070147931575775,
-0.05144544690847397,
0.05560895800590515,
0.10263562202453613,
0.0243734922260046,
-0.08912010490894318,
0.07535839080810547,
0.04614732787013054,
-0.014668557792901993,
-0.05678918957710266,
-0.03563442453742027,
0.11971347779035568,
-0.09982026368379593,
-0.11909899115562439,
0.010742226615548134,
0.1825634390115738,
0.05142463371157646,
-0.06949815154075623,
-0.14086845517158508,
0.011744925752282143,
0.20677851140499115,
-0.04984910041093826,
-0.09188000857830048,
0.010902021080255508,
0.021428748965263367,
0.04487336054444313,
-0.05994322523474693,
-0.06923291087150574,
0.0033171619288623333,
0.012321369722485542,
0.09715250134468079,
0.015669524669647217,
-0.026599492877721786,
-0.06797440350055695,
-0.00012225274986121804,
-0.09431406855583191,
-0.12064436823129654,
-0.018515780568122864,
-0.06876552850008011,
-0.06948979943990707,
-0.04236762970685959,
-0.0022534560412168503,
-0.09637326747179031,
0.028363216668367386,
0.09317526966333389,
0.07119816541671753,
0.04608342796564102,
-0.06011635437607765,
-0.036865297704935074,
0.11523770540952682,
0.052871223539114,
-0.11841585487127304,
-0.0198568943887949,
0.014515572227537632,
-0.025862814858555794,
0.009638472460210323,
-0.03265814110636711,
-0.031342826783657074,
0.01905405893921852,
-0.03498902544379234,
0.052896905690431595,
0.05430120229721069,
-0.031639158725738525,
-0.03986038640141487,
-0.09451951086521149,
0.09605056047439575,
-0.08494865149259567,
0.027697749435901642,
0.04888546094298363,
-0.005452463403344154,
0.09778789430856705,
-0.05803702399134636,
0.07592064142227173,
-0.11437804251909256,
0.004154522903263569,
-0.02773047238588333,
-0.0034898011945188046,
0.021561605855822563,
-0.023012060672044754,
0.03673359751701355,
0.009774095378816128,
0.003199635772034526,
-0.12042577564716339,
-0.002221707021817565,
-0.09824147820472717,
-0.0274063590914011,
-0.07922298461198807,
-0.040589839220047,
-0.04838021844625473,
0.01389983482658863,
-0.001720936968922615,
-0.005103690084069967,
0.0018930051010102034,
-0.021839573979377747,
-0.005284805782139301,
0.011790091171860695,
0.05249979346990585,
0.07078561931848526,
0.08224154263734818,
-0.029622068628668785,
-0.024250688031315804,
-0.09255742281675339,
0.11750241369009018,
-0.08008270710706711,
-0.01393617782741785,
-0.1366981565952301,
-0.030811043456196785,
-0.04051656275987625,
0.03536202013492584,
0.01082436554133892,
0.13651296496391296,
-0.17940999567508698,
-0.07788629829883575,
0.13160571455955505,
-0.11868323385715485,
-0.0001508240238763392,
0.1737874448299408,
0.001892804866656661,
0.0768081471323967,
0.10204090178012848,
0.20385143160820007,
0.04166225716471672,
-0.16893154382705688,
-0.026568766683340073,
-0.058767516165971756,
0.04008215293288231,
0.12152756005525589,
0.0613413043320179,
-0.055103570222854614,
0.07178039103746414,
-0.020191451534628868,
-0.035707369446754456,
-0.07284802198410034,
0.0022921524941921234,
-0.047527942806482315,
0.013257235288619995,
-0.04066168889403343,
0.028913326561450958,
-0.0121274683624506,
-0.013113201595842838,
-0.014776380732655525,
-0.09138786792755127,
-0.041300415992736816,
0.12520991265773773,
-0.059838663786649704,
0.025092944502830505,
-0.09374663978815079,
0.061258506029844284,
0.06612227857112885,
0.007552990224212408,
-0.12396521121263504,
0.10796356201171875,
0.027413003146648407,
-0.05641075596213341,
0.14467458426952362,
0.08300411701202393,
-0.028283240273594856,
-0.001669713412411511,
-0.017820041626691818,
0.022187702357769012,
-0.01921749673783779,
0.004948799032717943,
-0.020974256098270416,
-0.10989446938037872,
0.008479529991745949,
-0.06490788608789444,
0.10411898791790009,
-0.14185848832130432,
-0.011815952137112617,
0.051604077219963074,
0.1188686341047287,
-0.008105739951133728,
-0.03710557892918587,
0.09460826963186264,
0.04390083625912666,
0.024691781029105186,
-0.01717711053788662,
0.01651838794350624,
-0.019573919475078583,
-0.0013358143623918295,
0.06511644273996353,
-0.1389007866382599,
-0.1588880568742752,
0.10204838216304779,
0.021690962836146355,
-0.014241163618862629,
0.06421858817338943,
0.029848193749785423,
-0.022757992148399353,
-0.04619049280881882,
0.006201358512043953,
0.22119110822677612,
-0.01275134552270174,
0.06281674653291702,
-0.07956964522600174,
-0.020966866984963417,
0.009235994890332222,
-0.04210473224520683,
-0.08532639592885971,
0.08232468366622925,
0.00021086206834297627,
-0.07673712074756622,
-0.028210973367094994,
0.053100574761629105,
0.07500872015953064,
0.16386008262634277,
0.007729119621217251,
-0.09110529720783234,
-0.022820930927991867,
-0.058299895375967026,
-0.008822834119200706,
0.02541443519294262,
-0.1347873955965042,
-0.026231911033391953,
0.026835843920707703,
0.00518013583496213,
0.03847135975956917,
-0.01848185807466507,
0.037132833153009415,
0.010634180158376694,
-0.04602903127670288,
-0.079045869410038,
0.04480448737740517,
-0.03015291504561901,
0.03534939885139465,
-0.00792444683611393,
0.020568344742059708,
-0.0432254783809185,
-0.05710579827427864,
-0.13667985796928406,
0.081483855843544,
-0.07138106971979141,
-0.324171245098114,
-0.09200327843427658,
-0.0562802217900753,
-0.03118872456252575,
0.01089811697602272,
0.054162174463272095,
-0.11646339297294617,
-0.11392703652381897,
-0.06828932464122772,
0.12227234989404678,
-0.012903864495456219,
-0.06357680261135101,
0.12044346332550049,
-0.0029400228522717953,
0.02411583811044693,
-0.09747662395238876,
0.02427351288497448,
-0.026534680277109146,
-0.0326702781021595,
-0.025515003129839897,
0.026173191145062447,
0.05367155745625496,
0.12774786353111267,
0.03301553800702095,
-0.012625008821487427,
0.011873764917254448,
0.2030220776796341,
-0.14544469118118286,
0.07645872235298157,
0.22710581123828888,
-0.055644840002059937,
-0.0070412070490419865,
0.14888933300971985,
-0.003718247637152672,
-0.051841817796230316,
0.05005590617656708,
0.008362513966858387,
-0.013671396300196648,
-0.23671242594718933,
-0.12226283550262451,
-0.053524114191532135,
-0.02856515161693096,
0.047517307102680206,
0.02639361284673214,
-0.0003109503595624119,
0.013460631482303143,
-0.09200088679790497,
-0.038837529718875885,
0.04851468652486801,
0.03856191411614418,
0.15569810569286346,
0.014805118553340435,
0.05497618764638901,
-0.046369489282369614,
-0.01987577974796295,
0.11134406924247742,
-0.04012526944279671,
0.05899393558502197,
0.06799217313528061,
0.10385938733816147,
0.06075949966907501,
0.03416125103831291,
0.054699432104825974,
-0.02700946480035782,
-0.017103997990489006,
-0.004773743450641632,
-0.023129990324378014,
-0.06972639262676239,
0.031022852286696434,
0.04542675241827965,
0.13161519169807434,
-0.12941472232341766,
-0.11100754886865616,
0.02159149758517742,
0.025811001658439636,
0.11766352504491806,
0.09705048054456711,
-0.02084857039153576,
-0.10462693125009537,
0.041758373379707336,
-0.10009336471557617,
-0.02892114967107773,
0.047048281878232956,
0.08723260462284088,
-0.15476730465888977,
0.09582608938217163,
0.07523377984762192,
0.08664572238922119,
-0.03377837315201759,
0.025494027882814407,
-0.04761233925819397,
0.05812589451670647,
0.0005142757436260581,
0.06203687936067581,
-0.1659412831068039,
0.10528845340013504,
0.014653491787612438,
0.0819873958826065,
-0.05994422361254692,
0.019701393321156502,
0.05415820702910423,
0.013136757537722588,
0.1240968406200409,
-0.003090379061177373,
-0.0954764261841774,
0.011994183994829655,
-0.1111334040760994,
0.012360484339296818,
0.05847328528761864,
-0.06698939204216003,
0.0605335533618927,
-0.007518337108194828,
-0.01572468690574169,
-0.035092346370220184,
-0.01156544778496027,
-0.23440489172935486,
-0.13544467091560364,
0.048880647867918015,
-0.0013634354108944535,
0.06013955548405647,
-0.038164980709552765,
-0.08620649576187134,
-0.12694194912910461,
0.10692354291677475,
-0.0065515730530023575,
-0.02087237872183323,
-0.07148189097642899,
0.011349140666425228,
0.09939505904912949,
-0.06231750175356865,
0.022263014689087868,
0.050244566053152084,
0.14416581392288208,
-0.054770223796367645,
-0.045523446053266525,
0.023712540045380592,
-0.09684713929891586,
-0.1251745969057083,
0.012057273648679256,
0.18021269142627716,
0.114726223051548,
0.05976051837205887,
0.07909378409385681,
0.022533942013978958,
0.00019210563914384693,
-0.09023796021938324,
0.012657507322728634,
0.03012540005147457,
-0.07900413125753403,
0.046306245028972626,
-0.004663263913244009,
-0.2633156180381775,
-0.14967024326324463,
-0.07919225096702576,
0.07294944673776627,
0.18597261607646942,
-0.0219257902354002,
0.17558854818344116,
0.2644774317741394,
-0.0909251868724823,
-0.22260375320911407,
-0.036025550216436386,
-0.0032046930864453316,
0.024747753515839577,
0.06216314062476158,
-0.20725984871387482,
0.10722056776285172,
-0.005044986959546804,
0.01600179634988308,
-0.03923096880316734,
-0.21572762727737427,
-0.14046427607536316,
0.15962031483650208,
-0.03601902350783348,
0.04714246839284897,
-0.036076534539461136,
-0.07149362564086914,
-0.02469445951282978,
-0.04395590350031853,
0.015431192703545094,
-0.09277155250310898,
0.08203374594449997,
0.04656090959906578,
0.02207074873149395,
0.021814389154314995,
0.01731167361140251,
0.10527816414833069,
0.07913218438625336,
-0.02385890483856201,
-0.08838178962469101,
0.030076585710048676,
0.00008280951442429796,
-0.015470820479094982,
0.08983474969863892,
0.04137570410966873,
0.008689388632774353,
-0.043818727135658264,
-0.08807110786437988,
-0.06537048518657684,
0.0661010891199112,
-0.06646860390901566,
-0.015868760645389557,
-0.0595969632267952,
0.09466036409139633,
0.020556088536977768,
-0.0059905811212956905,
-0.06336978822946548,
-0.09685222804546356,
-0.012848274782299995,
0.10660304874181747,
0.21986982226371765,
-0.03737061098217964,
0.001450133160687983,
-0.043155331164598465,
-0.045606814324855804,
0.04790395498275757,
-0.030008243396878242,
0.04869060963392258,
0.053060855716466904,
0.031014952808618546,
0.07897994667291641,
-0.03299139812588692,
-0.12734100222587585,
0.02983034960925579,
0.047811757773160934,
-0.06745367497205734,
-0.17647601664066315,
-0.050213806331157684,
-0.01659131795167923,
-0.02088231034576893,
-0.03519754856824875,
0.19783437252044678,
-0.02725359797477722,
-0.05882526934146881,
0.013328549452126026,
0.055324483662843704,
-0.006705799140036106,
0.11883102357387543,
0.04314752668142319,
0.036353498697280884,
-0.08747724443674088,
0.05538107454776764,
0.11812186986207962,
-0.03828520327806473,
0.045695506036281586,
0.11241473257541656,
-0.04918043315410614,
-0.060204021632671356,
-0.0947968065738678,
-0.01106974296271801,
0.048256803303956985,
-0.04762120544910431,
-0.00014624244067817926,
-0.09999221563339233,
0.016184039413928986,
0.02643994428217411,
0.00772485975176096,
-0.04900364205241203,
-0.0444030687212944,
0.0035017398186028004,
-0.08881551027297974,
0.07075811177492142,
0.10428863018751144,
-0.029362499713897705,
-0.1200343519449234,
0.10915222764015198,
0.010652540251612663,
0.07373282313346863,
-0.03873228281736374,
-0.06796971708536148,
-0.09332139045000076,
-0.007399496156722307,
-0.09339818358421326,
0.029652278870344162,
-0.1411726176738739,
-0.007899248041212559,
-0.04776516184210777,
-0.03768754377961159,
-0.01733340509235859,
0.07241218537092209,
-0.03302440419793129,
0.0042058746330440044,
-0.029576323926448822,
0.09382923692464828,
-0.12739655375480652,
0.068766288459301,
0.05877091735601425,
-0.04476890340447426,
0.10850758105516434,
0.02201012894511223,
-0.05590793490409851,
0.04207301512360573,
-0.21540239453315735,
-0.05439070612192154,
-0.025174345821142197,
0.05194634944200516,
-0.009307595901191235,
-0.1769869327545166,
0.0010999629739671946,
0.020425044000148773,
0.01996416598558426,
-0.01876874454319477,
0.034047164022922516,
-0.028311364352703094,
-0.019776439294219017,
-0.06757886707782745,
-0.05303870514035225,
-0.034848280251026154,
0.057856012135744095,
0.07519138604402542,
0.0061125680804252625,
0.10477795451879501,
-0.09392985701560974,
0.07143764197826385,
-0.07699543237686157,
0.025012344121932983,
-0.027964197099208832,
0.014759097248315811,
-0.06782938539981842,
-0.07585900276899338,
0.07555615156888962,
-0.013856954872608185,
0.07581575959920883,
0.011812001466751099,
-0.03487826883792877,
0.051510389894247055,
-0.050796028226614,
-0.0718124657869339,
0.03683341294527054,
0.14643433690071106,
0.05678322911262512,
0.0145066874101758,
-0.005618010647594929,
-0.04368606209754944,
0.002917232923209667,
0.13430418074131012,
0.1394045203924179,
0.1628287136554718,
0.09831374883651733,
0.025306429713964462,
0.07383890450000763,
-0.04724597930908203,
-0.0895753800868988,
0.07543007284402847,
-0.08463068306446075,
0.039383310824632645,
-0.05204235017299652,
-0.05461122468113899,
0.07619966566562653,
-0.13479292392730713,
0.0733661875128746,
-0.03333716094493866,
-0.07836706191301346,
-0.10213126242160797,
-0.1463802307844162,
-0.06582468748092651,
-0.05817447975277901,
0.0020318401511758566,
-0.10865418612957001,
0.04276371747255325,
0.009915328584611416,
0.04064371436834335,
-0.09132611006498337,
0.11456995457410812,
-0.11613232642412186,
-0.13028407096862793,
0.15159504115581512,
-0.04001692682504654,
-0.016770953312516212,
-0.005209822207689285,
0.04394068941473961,
0.014833577908575535,
0.09449306130409241,
0.04027572274208069,
0.05021325498819351,
0.020770128816366196,
0.020432109013199806,
-0.10950670391321182,
-0.06763729453086853,
0.033369746059179306,
-0.002734162611886859,
0.08912152796983719,
0.1884714961051941,
0.09008590877056122,
-0.07917703688144684,
0.015258284285664558,
0.1543193757534027,
0.01685309410095215,
-0.1194205954670906,
-0.15078039467334747,
0.02285930886864662,
-0.032790422439575195,
-0.008556709624826908,
-0.0008982112049125135,
-0.09566952288150787,
0.015184217132627964,
0.2130064070224762,
0.16435232758522034,
-0.03234807774424553,
0.02729792520403862,
-0.012559938244521618,
0.012205004692077637,
0.017187422141432762,
0.0789487212896347,
0.0925813689827919,
0.17587076127529144,
-0.006669015623629093,
0.04461999237537384,
-0.015288968570530415,
-0.0880301222205162,
-0.11515242606401443,
0.09021205455064774,
-0.0007233931100927293,
-0.03691258281469345,
-0.006275436375290155,
0.18298760056495667,
-0.11577670276165009,
-0.21062885224819183,
-0.11600271612405777,
-0.0412675216794014,
-0.11735191196203232,
0.015251712873578072,
-0.04694826900959015,
0.13839936256408691,
0.035817164927721024,
0.004639001563191414,
0.005692039616405964,
0.18852685391902924,
0.04344706982374191,
0.027941619977355003,
-0.034879378974437714,
0.11430879682302475,
-0.09493907541036606,
0.09747673571109772,
-0.0025997182819992304,
0.04278780519962311,
0.03236805275082588,
0.042000096291303635,
-0.0697765201330185,
0.018086226657032967,
0.03933269530534744,
-0.02476787194609642,
0.04732789844274521,
0.1750955432653427,
-0.0075302524492144585,
0.09645407646894455,
0.11084763705730438,
-0.051515184342861176,
0.022745968773961067,
0.005761913023889065,
0.01585841365158558,
-0.06527211517095566,
0.15310901403427124,
-0.15034419298171997,
0.131031796336174,
0.1066373661160469,
-0.07049360126256943,
-0.043263062834739685,
-0.012945612892508507,
0.05316472798585892,
-0.05827797204256058,
0.0880042091012001,
-0.007085952442139387,
-0.17300964891910553,
0.02840401791036129,
-0.10196856409311295,
0.07616736739873886,
-0.2639372646808624,
-0.04478783160448074,
-0.042337313294410706,
-0.014873849228024483,
-0.004254610743373632,
0.12100686877965927,
0.08921343088150024,
-0.04757527634501457,
-0.0071801356971263885,
-0.059031710028648376,
0.014722172170877457,
0.0930464118719101,
-0.08551117032766342,
-0.019948270171880722
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only on **lv** speech, using **13.1k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer, as it was pretrained on audio alone. To use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **lv**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
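Even before fine-tuning, the pretrained checkpoint can serve as a speech feature extractor. The snippet below is a minimal sketch rather than an official usage example: only the checkpoint name comes from this card, while the default feature-extractor settings and the dummy waveform are illustrative assumptions.
```python
# Minimal sketch: extract frame-level representations from the pretrained
# Latvian checkpoint. Only the checkpoint name is taken from this card; the
# default Wav2Vec2FeatureExtractor settings and the dummy waveform below
# are illustrative assumptions.
import torch
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

checkpoint = "facebook/wav2vec2-base-lv-voxpopuli-v2"

# The repo ships no tokenizer, so only a feature extractor (defaults: mono,
# 16 kHz) and the acoustic encoder are needed here.
feature_extractor = Wav2Vec2FeatureExtractor()
model = Wav2Vec2Model.from_pretrained(checkpoint)

# One second of silence stands in for real 16 kHz Latvian speech.
speech = torch.zeros(16000).numpy()
inputs = feature_extractor(speech, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    hidden_states = model(**inputs).last_hidden_state  # (batch, frames, hidden_size)
print(hidden_states.shape)
```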
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
|
{"language": "lv", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-lv-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"lv",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"lv"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #lv #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only on lv speech, using 13.1k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer, as it was pretrained on audio alone. To use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in lv. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in lv on 13.1k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in lv. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #lv #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in lv on 13.1k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in lv. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
253
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #lv #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in lv on 13.1k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in lv. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.0736682340502739,
0.10661173611879349,
-0.003014428075402975,
0.010772347450256348,
0.07801350206136703,
-0.056207627058029175,
0.13111190497875214,
0.0472584068775177,
0.012363345362246037,
0.09552694857120514,
-0.021727781742811203,
-0.0468369759619236,
0.06449118256568909,
0.12961545586585999,
0.06345582008361816,
-0.25401467084884644,
0.03109404630959034,
-0.06932863593101501,
0.038443673402071,
0.048014264553785324,
0.1165701225399971,
-0.09002291411161423,
0.029444891959428787,
0.04916297271847725,
-0.022450149059295654,
0.03688683733344078,
-0.048068419098854065,
-0.08408273756504059,
0.054681841284036636,
0.04353850707411766,
-0.03264080360531807,
0.02680983766913414,
0.08585439622402191,
-0.1842750608921051,
0.03691920265555382,
0.03767921403050423,
0.024986756965517998,
0.006061172112822533,
0.1053408831357956,
0.016840685158967972,
0.16743531823158264,
-0.035796117037534714,
-0.006865530274808407,
0.08128589391708374,
-0.05770177021622658,
-0.09828399121761322,
-0.06652691215276718,
0.16489282250404358,
0.0938236191868782,
0.10299645364284515,
-0.07789222151041031,
0.08623877912759781,
-0.01855643279850483,
0.0492703802883625,
0.09013509005308151,
-0.18290740251541138,
-0.047354839742183685,
0.06556528061628342,
0.09633240848779678,
0.014735717326402664,
-0.08811876177787781,
0.07694714516401291,
0.04368394613265991,
-0.018488015979528427,
-0.058243248611688614,
-0.03353361785411835,
0.10783446580171585,
-0.09789713472127914,
-0.11790522187948227,
0.002933668438345194,
0.1820114552974701,
0.05069844052195549,
-0.06594014167785645,
-0.13606952130794525,
0.010020099580287933,
0.19947117567062378,
-0.05297026038169861,
-0.0918855220079422,
0.010079611092805862,
0.018419040367007256,
0.03872251138091087,
-0.06062788888812065,
-0.0696442499756813,
-0.0015130330575630069,
0.015331369824707508,
0.09548159688711166,
0.01768791861832142,
-0.021219050511717796,
-0.07159791141748428,
0.0015628673136234283,
-0.09836389124393463,
-0.1221255362033844,
-0.01413851696997881,
-0.07228965312242508,
-0.06967242062091827,
-0.0406203418970108,
0.00029707388603128493,
-0.11141077429056168,
0.030595071613788605,
0.08908023685216904,
0.06762421876192093,
0.042913008481264114,
-0.060040365904569626,
-0.03504582494497299,
0.1154278889298439,
0.05796497315168381,
-0.11809659004211426,
-0.018512878566980362,
0.01953333429992199,
-0.029504993930459023,
0.013081956654787064,
-0.030231276527047157,
-0.03260468319058418,
0.013817016035318375,
-0.026720084249973297,
0.05597768351435661,
0.048824090510606766,
-0.028062641620635986,
-0.03850860893726349,
-0.09599971026182175,
0.0906425416469574,
-0.08303109556436539,
0.025307008996605873,
0.04985816404223442,
-0.0035615703091025352,
0.09371445327997208,
-0.05410996079444885,
0.07150158286094666,
-0.11679065227508545,
0.004579356871545315,
-0.02304169535636902,
-0.0031367214396595955,
0.02137514017522335,
-0.024405427277088165,
0.037772852927446365,
0.015289883129298687,
0.004264045041054487,
-0.12120614945888519,
-0.007407744415104389,
-0.09504655748605728,
-0.02748238854110241,
-0.0776999443769455,
-0.03814917430281639,
-0.04605143144726753,
0.017976844683289528,
-0.004699238110333681,
-0.003189502051100135,
0.008003613911569118,
-0.022791709750890732,
-0.0013807776849716902,
0.009453270584344864,
0.05389314517378807,
0.07722891122102737,
0.08395931124687195,
-0.03272278979420662,
-0.026104848831892014,
-0.08517468720674515,
0.114031583070755,
-0.08134330809116364,
-0.014385982416570187,
-0.13936230540275574,
-0.03140128403902054,
-0.04216986522078514,
0.031400520354509354,
0.013342106714844704,
0.13329172134399414,
-0.1845998615026474,
-0.07463715225458145,
0.1332404911518097,
-0.12080030888319016,
0.0010857494780793786,
0.17389421164989471,
0.0003269512963015586,
0.07239776849746704,
0.10605420917272568,
0.20446011424064636,
0.03753463178873062,
-0.17146867513656616,
-0.02483738400042057,
-0.05615805834531784,
0.04010603576898575,
0.12201451510190964,
0.059448760002851486,
-0.052701398730278015,
0.06638146936893463,
-0.015653032809495926,
-0.030649220570921898,
-0.0761287659406662,
0.0014873761683702469,
-0.05060507729649544,
0.010524348355829716,
-0.042171355336904526,
0.03531923145055771,
-0.009470954537391663,
-0.012830573134124279,
-0.015271066688001156,
-0.08871542662382126,
-0.05621938407421112,
0.12981078028678894,
-0.05982984974980354,
0.026429707184433937,
-0.09406675398349762,
0.06364534795284271,
0.06190051883459091,
0.012177342548966408,
-0.1226026862859726,
0.11634984612464905,
0.027728155255317688,
-0.044971127063035965,
0.14040805399417877,
0.09137293696403503,
-0.027934731915593147,
-0.004306841176003218,
-0.016662757843732834,
0.021611828356981277,
-0.023615606129169464,
0.003907547798007727,
-0.019434839487075806,
-0.10778429359197617,
0.003576282411813736,
-0.06589893996715546,
0.09093088656663895,
-0.14102713763713837,
-0.016928063705563545,
0.050152890384197235,
0.11974849551916122,
-0.005697519984096289,
-0.035312291234731674,
0.09710388630628586,
0.04446682333946228,
0.026405181735754013,
-0.01457467395812273,
0.020085278898477554,
-0.024491796270012856,
0.005709346849471331,
0.06290212273597717,
-0.13365155458450317,
-0.15721584856510162,
0.09980832040309906,
0.014567380771040916,
-0.010190187953412533,
0.06230968236923218,
0.02821943163871765,
-0.01946839690208435,
-0.049005672335624695,
0.008130394853651524,
0.21787187457084656,
-0.008353416807949543,
0.06440506875514984,
-0.08148163557052612,
-0.019844012334942818,
0.009454033337533474,
-0.04557393491268158,
-0.08395548164844513,
0.08704787492752075,
0.000649942725431174,
-0.07788901776075363,
-0.03108569234609604,
0.056242331862449646,
0.07050125300884247,
0.16884100437164307,
0.005254514515399933,
-0.08998284488916397,
-0.021737342700362206,
-0.06067174673080444,
-0.0119280144572258,
0.026990890502929688,
-0.13943833112716675,
-0.027371913194656372,
0.02964472584426403,
0.001843530684709549,
0.0375935360789299,
-0.020511368289589882,
0.03549028933048248,
0.011006982997059822,
-0.05138358846306801,
-0.07395917922258377,
0.05140506476163864,
-0.02861127257347107,
0.03689570352435112,
-0.013919769786298275,
0.020560424774885178,
-0.04518289864063263,
-0.05880559980869293,
-0.13563601672649384,
0.08085636794567108,
-0.07044671475887299,
-0.3191949129104614,
-0.09022863209247589,
-0.05607203394174576,
-0.03774841129779816,
0.010840089991688728,
0.05397926643490791,
-0.11725601553916931,
-0.11537162214517593,
-0.06689135730266571,
0.12964056432247162,
-0.01642134040594101,
-0.0630212128162384,
0.12209520488977432,
0.002114207949489355,
0.022504594177007675,
-0.10060764104127884,
0.02346159517765045,
-0.03309573978185654,
-0.03741076961159706,
-0.024764221161603928,
0.032142553478479385,
0.05316350981593132,
0.12485483288764954,
0.029201414436101913,
-0.009180132299661636,
0.012099805288016796,
0.20895974338054657,
-0.1384984850883484,
0.0716235563158989,
0.2331993728876114,
-0.04944499954581261,
-0.008034899830818176,
0.15000541508197784,
-0.0035124835558235645,
-0.050472114235162735,
0.047554366290569305,
0.007564228028059006,
-0.01829947903752327,
-0.23927444219589233,
-0.1276937574148178,
-0.05167737975716591,
-0.02631952427327633,
0.048561498522758484,
0.027387728914618492,
-0.002404941711574793,
0.012267795391380787,
-0.09452497214078903,
-0.03564823418855667,
0.051715560257434845,
0.0377906858921051,
0.16005860269069672,
0.013594943098723888,
0.05299331247806549,
-0.04846053570508957,
-0.020021455362439156,
0.10636195540428162,
-0.031157951802015305,
0.05849962309002876,
0.06351171433925629,
0.10118898749351501,
0.06390516459941864,
0.03230524808168411,
0.05600881204009056,
-0.025575360283255577,
-0.02321532368659973,
-0.00612372811883688,
-0.024067794904112816,
-0.07290251553058624,
0.023737356066703796,
0.04412856698036194,
0.129184752702713,
-0.12917979061603546,
-0.11198047548532486,
0.023843292146921158,
0.025483502075076103,
0.12088919430971146,
0.09930995106697083,
-0.013607170432806015,
-0.1053585559129715,
0.04261614754796028,
-0.09842798858880997,
-0.024345435202121735,
0.049737751483917236,
0.08117067068815231,
-0.15329864621162415,
0.09355432540178299,
0.07625866681337357,
0.08695120364427567,
-0.04069869965314865,
0.029214495792984962,
-0.05900187790393829,
0.05821715667843819,
0.00347801111638546,
0.07053864747285843,
-0.17119638621807098,
0.1137450784444809,
0.016606226563453674,
0.08340970426797867,
-0.05973140522837639,
0.019057000055909157,
0.050667520612478256,
0.012019246816635132,
0.12678302824497223,
-0.005314288195222616,
-0.10033057630062103,
-0.0008239939925260842,
-0.10640579462051392,
0.013068754225969315,
0.0505993515253067,
-0.05847292020916939,
0.05784982442855835,
-0.00874529592692852,
-0.015290872193872929,
-0.03817032650113106,
-0.0002389798464719206,
-0.24148179590702057,
-0.13374446332454681,
0.04795830324292183,
-0.002778751077130437,
0.06186242401599884,
-0.03550104424357414,
-0.08345796912908554,
-0.1251227706670761,
0.09571295976638794,
-0.006558532826602459,
-0.021579792723059654,
-0.06660518050193787,
0.015661684796214104,
0.09898847341537476,
-0.06584151089191437,
0.019195696339011192,
0.048113979399204254,
0.14499208331108093,
-0.061198972165584564,
-0.04810598865151405,
0.015554003417491913,
-0.09876388311386108,
-0.13281844556331635,
0.013924320228397846,
0.18519935011863708,
0.11059841513633728,
0.05789519101381302,
0.08227024972438812,
0.021798254922032356,
-0.0018895864486694336,
-0.08971868455410004,
0.01571185700595379,
0.04342310503125191,
-0.07692228257656097,
0.036684419959783554,
-0.007526886183768511,
-0.26105913519859314,
-0.1465764343738556,
-0.07854427397251129,
0.08093100786209106,
0.19163067638874054,
-0.024607203900814056,
0.18501296639442444,
0.26777148246765137,
-0.0923725962638855,
-0.22567422688007355,
-0.04738948121666908,
-0.004282101523131132,
0.02571926638484001,
0.0659199059009552,
-0.208342507481575,
0.10461367666721344,
-0.005878277122974396,
0.016207821667194366,
-0.04945351928472519,
-0.22063924372196198,
-0.14142675697803497,
0.16472503542900085,
-0.03341590613126755,
0.05321888625621796,
-0.03079954721033573,
-0.07078690081834793,
-0.026626477017998695,
-0.048993565142154694,
0.01895057037472725,
-0.08964159339666367,
0.08480773866176605,
0.04426051676273346,
0.02911757305264473,
0.024595897644758224,
0.017999688163399696,
0.10287676006555557,
0.07726994156837463,
-0.020764613524079323,
-0.08714651316404343,
0.0283013004809618,
-0.000875412137247622,
-0.018004709854722023,
0.09475468099117279,
0.04109906405210495,
0.013073274865746498,
-0.046830981969833374,
-0.08563434332609177,
-0.06685028225183487,
0.06596639752388,
-0.06755968183279037,
-0.016410067677497864,
-0.05891431123018265,
0.09300250560045242,
0.023417236283421516,
-0.00776131683960557,
-0.051389895379543304,
-0.09254084527492523,
-0.017923293635249138,
0.1100408211350441,
0.21709124743938446,
-0.04180934652686119,
0.0030750662554055452,
-0.042743019759655,
-0.044951461255550385,
0.05147576332092285,
-0.029361603781580925,
0.052935488522052765,
0.05021537095308304,
0.030705425888299942,
0.07617111504077911,
-0.036221012473106384,
-0.12995390594005585,
0.02690522000193596,
0.043869443237781525,
-0.07368273288011551,
-0.17195305228233337,
-0.04990271478891373,
-0.0015422491123899817,
-0.015228445641696453,
-0.02777094952762127,
0.200473353266716,
-0.027057023718953133,
-0.060011766850948334,
0.01222885400056839,
0.058243632316589355,
-0.0047934032045304775,
0.12149950116872787,
0.04627341032028198,
0.03859962150454521,
-0.08897464722394943,
0.05613190680742264,
0.11977145075798035,
-0.03430374711751938,
0.047551073133945465,
0.10848230123519897,
-0.046698421239852905,
-0.0636206567287445,
-0.09534667432308197,
-0.002667756052687764,
0.05399102717638016,
-0.046459607779979706,
0.0008896795916371047,
-0.09380222111940384,
0.013699604198336601,
0.026806684210896492,
0.010610147379338741,
-0.045666035264730453,
-0.045686714351177216,
-0.0014909342862665653,
-0.08694221824407578,
0.07695195823907852,
0.1062697172164917,
-0.032917290925979614,
-0.12189926207065582,
0.10563486069440842,
0.008034327067434788,
0.0766104981303215,
-0.03837504982948303,
-0.06847714632749557,
-0.0942290723323822,
-0.003946099430322647,
-0.09487560391426086,
0.03240622207522392,
-0.14382736384868622,
-0.006086863111704588,
-0.04879579693078995,
-0.03563765063881874,
-0.015354437753558159,
0.0739688128232956,
-0.0349423848092556,
0.007657693699002266,
-0.033921193331480026,
0.09589390456676483,
-0.13150392472743988,
0.06640802323818207,
0.05441998317837715,
-0.03930163383483887,
0.10696788877248764,
0.014196023344993591,
-0.05406307801604271,
0.0443895198404789,
-0.23119984567165375,
-0.05481567606329918,
-0.026498857885599136,
0.05065631493926048,
-0.011194556951522827,
-0.174721822142601,
0.0007061965297907591,
0.019237054511904716,
0.016885656863451004,
-0.02067917212843895,
0.03867517039179802,
-0.03147563338279724,
-0.024143975228071213,
-0.0659107193350792,
-0.0537201426923275,
-0.03454314544796944,
0.059689849615097046,
0.07109975069761276,
0.010401277802884579,
0.10199464857578278,
-0.09496372938156128,
0.07357969880104065,
-0.07820239663124084,
0.026443466544151306,
-0.02700907737016678,
0.014109984040260315,
-0.07579897344112396,
-0.07578355073928833,
0.07478801906108856,
-0.014835724607110023,
0.07719077914953232,
0.02326914109289646,
-0.03489210829138756,
0.04845710098743439,
-0.049424007534980774,
-0.07030288875102997,
0.035712722688913345,
0.1465204507112503,
0.04938142001628876,
0.02116401679813862,
-0.00778349069878459,
-0.04221785441040993,
0.0038696574047207832,
0.14880873262882233,
0.13759057223796844,
0.16456684470176697,
0.09969813376665115,
0.02305186726152897,
0.07203194499015808,
-0.04611126706004143,
-0.08494572341442108,
0.0720132440328598,
-0.08031097054481506,
0.04151704162359238,
-0.05869963392615318,
-0.05087913200259209,
0.07509579509496689,
-0.1360505223274231,
0.0752616822719574,
-0.029080374166369438,
-0.07778961956501007,
-0.10383372008800507,
-0.16031372547149658,
-0.06759510189294815,
-0.062312301248311996,
0.005876739509403706,
-0.10778428614139557,
0.04088543355464935,
0.005835814867168665,
0.03907202556729317,
-0.09597077965736389,
0.10501071065664291,
-0.10768336057662964,
-0.12989801168441772,
0.1483735889196396,
-0.03952975198626518,
-0.019041163846850395,
-0.0035596773959696293,
0.04727553203701973,
0.01716318167746067,
0.08927164226770401,
0.036925770342350006,
0.05056047439575195,
0.02237819880247116,
0.01659081131219864,
-0.10557176172733307,
-0.06902843713760376,
0.03221208229660988,
-0.00438191881403327,
0.09562572091817856,
0.19620288908481598,
0.09085598587989807,
-0.07920525968074799,
0.014356644824147224,
0.15740951895713806,
0.022467942908406258,
-0.11622881889343262,
-0.14958727359771729,
0.022346120327711105,
-0.030533190816640854,
-0.006677661091089249,
0.0015968135558068752,
-0.0951094701886177,
0.015928838402032852,
0.21328340470790863,
0.16777493059635162,
-0.02517860196530819,
0.030800126492977142,
-0.018260827288031578,
0.012615091167390347,
0.013684525154531002,
0.0791473388671875,
0.09055884182453156,
0.17825588583946228,
-0.00826091319322586,
0.04375838860869408,
-0.016292298212647438,
-0.08933372050523758,
-0.11609788984060287,
0.09720565378665924,
-0.005476381164044142,
-0.03961755707859993,
-0.004629285074770451,
0.17977911233901978,
-0.11858096718788147,
-0.21168631315231323,
-0.11872757226228714,
-0.0452582910656929,
-0.11464807391166687,
0.016007056459784508,
-0.04129971191287041,
0.1414497345685959,
0.03633434697985649,
0.0028196880593895912,
0.008616715669631958,
0.1886105239391327,
0.04190815985202789,
0.025288578122854233,
-0.0368582122027874,
0.11645715683698654,
-0.08545847982168198,
0.10022705793380737,
-0.003513673786073923,
0.045042894780635834,
0.03086186945438385,
0.040301669389009476,
-0.06901118904352188,
0.021498870104551315,
0.03925538435578346,
-0.022354252636432648,
0.04539592191576958,
0.17892014980316162,
-0.008889751508831978,
0.1017446219921112,
0.11131061613559723,
-0.05553111806511879,
0.019201036542654037,
0.0055851079523563385,
0.01798972114920616,
-0.06321568042039871,
0.15833039581775665,
-0.1493179053068161,
0.12995046377182007,
0.10719320178031921,
-0.0712800845503807,
-0.04560965672135353,
-0.018430281430482864,
0.055200740694999695,
-0.06350769102573395,
0.08165629953145981,
-0.008231759071350098,
-0.17111936211585999,
0.03017900325357914,
-0.10225729644298553,
0.07447128742933273,
-0.25233638286590576,
-0.046107467263936996,
-0.040678031742572784,
-0.01703956536948681,
-0.0047313435934484005,
0.12221334874629974,
0.0886533185839653,
-0.05101390555500984,
-0.004013568162918091,
-0.05072523653507233,
0.018474219366908073,
0.08660801500082016,
-0.08864519000053406,
-0.02349947765469551
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only on **mt** speech, using **9.1k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer, as it was pretrained on audio alone. To use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **mt**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
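For the fine-tuning path described above, a rough outline of the setup might look as follows. This is a hedged sketch, not the card's prescribed recipe: only the checkpoint name is taken from this repository, and the placeholder transcript, vocabulary handling, and keyword arguments merely follow the approach in the linked blog.
```python
# Hedged sketch of preparing this checkpoint for CTC fine-tuning on Maltese.
# Only the checkpoint name comes from this card; the placeholder transcript,
# vocabulary handling, and keyword arguments are illustrative assumptions
# following the fine-tuning blog linked above.
import json
from transformers import (
    Wav2Vec2CTCTokenizer,
    Wav2Vec2FeatureExtractor,
    Wav2Vec2Processor,
    Wav2Vec2ForCTC,
)

transcripts = ["placeholder transcript"]  # replace with your labeled Maltese text

# 1. Build a character-level vocabulary; "|" marks word boundaries.
chars = sorted(set("".join(transcripts).replace(" ", "|")))
vocab = {c: i for i, c in enumerate(chars)}
vocab["[UNK]"] = len(vocab)
vocab["[PAD]"] = len(vocab)
with open("vocab.json", "w") as f:
    json.dump(vocab, f)

tokenizer = Wav2Vec2CTCTokenizer(
    "vocab.json", unk_token="[UNK]", pad_token="[PAD]", word_delimiter_token="|"
)
processor = Wav2Vec2Processor(
    feature_extractor=Wav2Vec2FeatureExtractor(), tokenizer=tokenizer
)

# 2. Load the pretrained encoder and attach a freshly initialised CTC head.
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-base-mt-voxpopuli-v2",
    vocab_size=len(processor.tokenizer),
    pad_token_id=processor.tokenizer.pad_token_id,
    ctc_loss_reduction="mean",
)
model.freeze_feature_encoder()  # commonly frozen when labeled data is scarce
```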
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
|
{"language": "mt", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-mt-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"mt",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"mt"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #mt #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only on mt speech, using 9.1k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer, as it was pretrained on audio alone. To use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in mt. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in mt on 9.1k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in mt. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #mt #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in mt on 9.1k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in mt. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
73,
253
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #mt #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in mt on 9.1k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in mt. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07868025451898575,
0.10225392878055573,
-0.0025330022908747196,
0.005838644225150347,
0.08357600122690201,
-0.06417461484670639,
0.12738285958766937,
0.042621172964572906,
0.018366089090704918,
0.09734471887350082,
-0.023171523585915565,
-0.038989339023828506,
0.0631084218621254,
0.14195531606674194,
0.05464952811598778,
-0.24502894282341003,
0.03533259779214859,
-0.07784240692853928,
0.04028274118900299,
0.053418513387441635,
0.11410205811262131,
-0.08610744774341583,
0.02899216115474701,
0.04688344895839691,
-0.02937435172498226,
0.038628511130809784,
-0.04746035858988762,
-0.09080318361520767,
0.04775520786643028,
0.04770710691809654,
-0.040109407156705856,
0.03235551714897156,
0.07414326071739197,
-0.16949522495269775,
0.03860617056488991,
0.04306747019290924,
0.026675580069422722,
0.003297738265246153,
0.10733013600111008,
0.010460940189659595,
0.15963506698608398,
-0.026698557659983635,
-0.0016083520604297519,
0.08712410181760788,
-0.06308113038539886,
-0.076762855052948,
-0.06829158961772919,
0.1452009081840515,
0.10731072723865509,
0.10734306275844574,
-0.08067566901445389,
0.0972151979804039,
-0.027275603264570236,
0.047564294189214706,
0.07559388130903244,
-0.17245642840862274,
-0.04572077468037605,
0.06960782408714294,
0.09669369459152222,
0.033796824514865875,
-0.09032613784074783,
0.07435080409049988,
0.0433388389647007,
-0.01855396293103695,
-0.05694984644651413,
-0.033759232610464096,
0.12191178649663925,
-0.09743261337280273,
-0.12427359074354172,
0.010372956283390522,
0.2007959932088852,
0.05315541848540306,
-0.06794670969247818,
-0.12933042645454407,
0.010023660026490688,
0.20669497549533844,
-0.04197205975651741,
-0.08534690737724304,
0.00515134958550334,
0.023031245917081833,
0.02304372936487198,
-0.05237457901239395,
-0.061755575239658356,
0.0023703433107584715,
0.0034918354358524084,
0.07864317297935486,
0.0015205881791189313,
-0.024452565237879753,
-0.07027814537286758,
0.00016377567953895777,
-0.10104577988386154,
-0.12595638632774353,
-0.01812678575515747,
-0.0710144191980362,
-0.06682037562131882,
-0.05081475153565407,
-0.006435919553041458,
-0.12003274261951447,
0.021306145936250687,
0.08542067557573318,
0.08352265506982803,
0.056777700781822205,
-0.06568809598684311,
-0.03369835764169693,
0.12396638095378876,
0.05183373764157295,
-0.1139628067612648,
-0.041136980056762695,
0.014722744934260845,
-0.026370251551270485,
0.007600302342325449,
-0.03830837085843086,
-0.026746707037091255,
0.02444254606962204,
-0.03351534530520439,
0.055327773094177246,
0.051214467734098434,
-0.025962017476558685,
-0.03074507787823677,
-0.09684573858976364,
0.10876958072185516,
-0.08665173500776291,
0.026506178081035614,
0.0489390529692173,
-0.010151281021535397,
0.09944119304418564,
-0.05450616776943207,
0.07855068892240524,
-0.11175581067800522,
0.018502261489629745,
-0.03113263100385666,
-0.0007993734325282276,
0.02026251144707203,
-0.022819409146904945,
0.04259505495429039,
0.025841424241662025,
0.0022703029680997133,
-0.12481574714183807,
-0.001763907028362155,
-0.10020764917135239,
-0.031622838228940964,
-0.07217510789632797,
-0.029739415273070335,
-0.04757259041070938,
0.015681395307183266,
0.0025352861266583204,
-0.008512523956596851,
0.015654360875487328,
-0.024416876956820488,
-0.007677956949919462,
0.004124327562749386,
0.049196358770132065,
0.07048625499010086,
0.08601129800081253,
-0.03126005455851555,
-0.020832082256674767,
-0.11143334209918976,
0.11962132155895233,
-0.08216702193021774,
-0.005177067127078772,
-0.14094477891921997,
-0.020930038765072823,
-0.04558997601270676,
0.025212954729795456,
0.01460042130202055,
0.13834841549396515,
-0.17471107840538025,
-0.07743096351623535,
0.1188240498304367,
-0.11932109296321869,
-0.004196576774120331,
0.17003001272678375,
-0.0003114150313194841,
0.0767618864774704,
0.10361378639936447,
0.20103374123573303,
0.028163522481918335,
-0.16798774898052216,
-0.0271748136729002,
-0.06539352983236313,
0.04288821294903755,
0.11944516748189926,
0.06806571781635284,
-0.05452866852283478,
0.06605444103479385,
-0.025140399113297462,
-0.022165613248944283,
-0.07228939235210419,
0.006194809917360544,
-0.048689547926187515,
0.014649205841124058,
-0.03825264051556587,
0.016347359865903854,
-0.01560705341398716,
-0.015011657029390335,
-0.020837096497416496,
-0.08078925311565399,
-0.031923532485961914,
0.1246427446603775,
-0.053329434245824814,
0.022163020446896553,
-0.09443145245313644,
0.04800402373075485,
0.04884152114391327,
0.010795548558235168,
-0.12654435634613037,
0.11896523088216782,
0.023813743144273758,
-0.05938607454299927,
0.14456269145011902,
0.07084570825099945,
-0.027447689324617386,
-0.0016448451206088066,
-0.01993691362440586,
0.02511940337717533,
-0.012353784404695034,
0.012870954349637032,
-0.03037809394299984,
-0.1130150780081749,
0.010265887714922428,
-0.06410644203424454,
0.1071246862411499,
-0.12752439081668854,
-0.012123935855925083,
0.0575983040034771,
0.120571069419384,
-0.006039222702383995,
-0.04285112023353577,
0.09398582577705383,
0.04088640585541725,
0.024910859763622284,
-0.015717172995209694,
0.01293504610657692,
-0.014893433079123497,
-0.005209342110902071,
0.07892642170190811,
-0.15061649680137634,
-0.16360120475292206,
0.1085624247789383,
0.01277901604771614,
-0.009671343490481377,
0.05111084133386612,
0.03671200945973396,
-0.02594459429383278,
-0.04564381018280983,
-0.0040473430417478085,
0.21350722014904022,
-0.01234649121761322,
0.06063738837838173,
-0.08395951986312866,
-0.02704569697380066,
0.008969254791736603,
-0.04273261874914169,
-0.09368544071912766,
0.08203739672899246,
0.007103906478732824,
-0.10772071778774261,
-0.023560866713523865,
0.08223624527454376,
0.07524899393320084,
0.1717786341905594,
0.010628653690218925,
-0.09423461556434631,
-0.02614854834973812,
-0.06574811041355133,
-0.00895919930189848,
0.018830031156539917,
-0.1529853343963623,
-0.024478713050484657,
0.024611173197627068,
0.019614266231656075,
0.041467905044555664,
-0.011825737543404102,
0.0392557792365551,
0.012431744486093521,
-0.03626440837979317,
-0.08608486503362656,
0.037087757140398026,
-0.03170449659228325,
0.03566138073801994,
-0.007359440438449383,
0.030536117032170296,
-0.04414171725511551,
-0.06208845600485802,
-0.14176474511623383,
0.08010311424732208,
-0.07956569641828537,
-0.3139921724796295,
-0.09258751571178436,
-0.06348275393247604,
-0.039257604628801346,
0.009056221693754196,
0.05545146390795708,
-0.11388799548149109,
-0.11064167320728302,
-0.07407377660274506,
0.13050004839897156,
-0.01657157391309738,
-0.0680389255285263,
0.13048867881298065,
-0.0008263510535471141,
0.01846463978290558,
-0.09473086148500443,
0.023389428853988647,
-0.022462788969278336,
-0.03566839173436165,
-0.02753761224448681,
0.016623273491859436,
0.05487145110964775,
0.13683263957500458,
0.03401374816894531,
-0.013077366165816784,
0.014331053011119366,
0.21308119595050812,
-0.14776267111301422,
0.07803557813167572,
0.22931277751922607,
-0.04819609969854355,
-0.008004137314856052,
0.14907702803611755,
-0.006454873830080032,
-0.04802297055721283,
0.05587073415517807,
0.009979288093745708,
-0.020916657522320747,
-0.24102535843849182,
-0.12240783125162125,
-0.054837875068187714,
-0.023525163531303406,
0.04786314815282822,
0.023704439401626587,
0.003879070747643709,
0.016824860125780106,
-0.09712866693735123,
-0.05018680542707443,
0.05520006641745567,
0.04111103340983391,
0.1384849101305008,
0.013298757374286652,
0.06310655176639557,
-0.045992497354745865,
-0.01228692289441824,
0.11234916001558304,
-0.03167370706796646,
0.051024194806814194,
0.07056080549955368,
0.09693241864442825,
0.06297264248132706,
0.02207830920815468,
0.05570751428604126,
-0.02233170159161091,
-0.012634948827326298,
-0.006817178335040808,
-0.030843347311019897,
-0.0778837651014328,
0.024274270981550217,
0.04813499003648758,
0.13221201300621033,
-0.128256157040596,
-0.1078246608376503,
0.02175283059477806,
0.03148166090250015,
0.11526751518249512,
0.09657067060470581,
-0.02071058750152588,
-0.10599572956562042,
0.04535976052284241,
-0.10049650818109512,
-0.019209668040275574,
0.04979512840509415,
0.09297880530357361,
-0.15759046375751495,
0.10085473209619522,
0.07793374359607697,
0.0955577939748764,
-0.019156768918037415,
0.02226685918867588,
-0.05771952122449875,
0.05401848256587982,
-0.007767677307128906,
0.05961737781763077,
-0.18262392282485962,
0.100601926445961,
0.02114603854715824,
0.08248265832662582,
-0.06743678450584412,
0.01561929564923048,
0.06313521414995193,
0.016006553545594215,
0.11460497230291367,
-0.000876719830557704,
-0.12294904887676239,
0.011014092713594437,
-0.11038506031036377,
0.010099893435835838,
0.052843376994132996,
-0.05696297809481621,
0.0581032857298851,
-0.0017454223707318306,
-0.01587660051882267,
-0.0364341139793396,
0.0063546644523739815,
-0.23575656116008759,
-0.1363259106874466,
0.047373976558446884,
0.004713007714599371,
0.05728641897439957,
-0.03527569770812988,
-0.08219332247972488,
-0.11143340915441513,
0.09286562353372574,
0.0036235542502254248,
-0.017734095454216003,
-0.07353703677654266,
0.00033453162177465856,
0.09425012767314911,
-0.06259392201900482,
0.0156096201390028,
0.046133942902088165,
0.14743533730506897,
-0.05364994704723358,
-0.04929862916469574,
0.03452874347567558,
-0.10149931162595749,
-0.12623374164104462,
0.002760897157713771,
0.18032072484493256,
0.11418429017066956,
0.06223611533641815,
0.07725544273853302,
0.024173619225621223,
0.004180767573416233,
-0.0893123522400856,
0.01290588267147541,
0.030824007466435432,
-0.08080209046602249,
0.05555824562907219,
-0.001758621889166534,
-0.2601780295372009,
-0.15374398231506348,
-0.08639726042747498,
0.07338502258062363,
0.18157924711704254,
-0.01504502259194851,
0.17295703291893005,
0.28207093477249146,
-0.08837397396564484,
-0.2308257818222046,
-0.02416164241731167,
-0.008413772098720074,
0.018076438456773758,
0.06450752168893814,
-0.1998741328716278,
0.10824307799339294,
0.009987005963921547,
0.016121074557304382,
-0.03824048489332199,
-0.2238290011882782,
-0.1407330185174942,
0.1470109522342682,
-0.02965593710541725,
0.047436755150556564,
-0.03882914036512375,
-0.07054618746042252,
-0.030793121084570885,
-0.0575251542031765,
0.009678179398179054,
-0.09728391468524933,
0.08459527790546417,
0.05815470591187477,
0.02698577754199505,
0.016691207885742188,
0.011706762947142124,
0.11109568178653717,
0.08874644339084625,
-0.025597158819437027,
-0.08941975235939026,
0.038416776806116104,
-0.00004799293674295768,
-0.020651426166296005,
0.08434303849935532,
0.033946286886930466,
0.006670807488262653,
-0.03984317556023598,
-0.08562511950731277,
-0.06593102216720581,
0.0612056739628315,
-0.06592006236314774,
-0.018455378711223602,
-0.05468010902404785,
0.09558208286762238,
0.028846751898527145,
-0.012234017252922058,
-0.05584082752466202,
-0.10700517147779465,
-0.007445650175213814,
0.1246747300028801,
0.22085672616958618,
-0.05829764902591705,
0.0064909677021205425,
-0.04001713916659355,
-0.048037756234407425,
0.05043407157063484,
-0.037873100489377975,
0.0590437576174736,
0.05072049796581268,
0.029731392860412598,
0.08355072140693665,
-0.03165953978896141,
-0.1306319683790207,
0.0320243164896965,
0.045151520520448685,
-0.06330505758523941,
-0.1538325697183609,
-0.047944020479917526,
0.016924839466810226,
-0.012600159272551537,
-0.03125733882188797,
0.20124536752700806,
-0.02307865209877491,
-0.054287929087877274,
0.009921485558152199,
0.05783908814191818,
-0.006806643679738045,
0.1304255872964859,
0.03672359883785248,
0.039049938321113586,
-0.08938849717378616,
0.06441456079483032,
0.11880162358283997,
-0.04731379821896553,
0.042181596159935,
0.11576050519943237,
-0.047907982021570206,
-0.06139833852648735,
-0.10863067954778671,
-0.007655146066099405,
0.04453015699982643,
-0.04411191865801811,
0.006513137370347977,
-0.09957891702651978,
0.01743479073047638,
0.036527253687381744,
0.004382258281111717,
-0.04671308770775795,
-0.04211673140525818,
-0.0019349188078194857,
-0.08826442807912827,
0.07054270058870316,
0.09585533291101456,
-0.027844466269016266,
-0.11163605004549026,
0.10781510919332504,
0.008022238500416279,
0.07033733278512955,
-0.03969191759824753,
-0.07171892374753952,
-0.09417133033275604,
-0.005860396660864353,
-0.09148204326629639,
0.02696634642779827,
-0.148232564330101,
-0.010180036537349224,
-0.048261601477861404,
-0.035920653492212296,
-0.0250462107360363,
0.07372153550386429,
-0.034249208867549896,
0.0026081663090735674,
-0.036261219531297684,
0.09359743446111679,
-0.1211109608411789,
0.06681044399738312,
0.059486452490091324,
-0.044981278479099274,
0.0991235002875328,
0.024372367188334465,
-0.055209964513778687,
0.03557424247264862,
-0.2159433513879776,
-0.05349814519286156,
-0.02411166951060295,
0.05319375544786453,
-0.007036891300231218,
-0.17064043879508972,
0.002017226768657565,
0.018950793892145157,
0.023316357284784317,
-0.017725158482789993,
0.04025384783744812,
-0.03060108795762062,
-0.02653617225587368,
-0.06568652391433716,
-0.05925258249044418,
-0.037774212658405304,
0.05846847593784332,
0.07768955826759338,
0.006076799239963293,
0.10543494671583176,
-0.09190709888935089,
0.06768782436847687,
-0.07662932574748993,
0.026238996535539627,
-0.02163495123386383,
0.010533123277127743,
-0.06460592895746231,
-0.07222992181777954,
0.07650739699602127,
-0.013515445403754711,
0.08054909110069275,
0.008190244436264038,
-0.04612933471798897,
0.061533812433481216,
-0.05167191103100777,
-0.06334757059812546,
0.03327542543411255,
0.14062491059303284,
0.05611710250377655,
0.014917564578354359,
-0.001903548720292747,
-0.03597753867506981,
0.004697203636169434,
0.1380401849746704,
0.13510672748088837,
0.16729463636875153,
0.10699508339166641,
0.03090711683034897,
0.07021655887365341,
-0.0388798788189888,
-0.08576664328575134,
0.0632992684841156,
-0.08057066053152084,
0.0406050905585289,
-0.05700506642460823,
-0.06556589901447296,
0.07839637994766235,
-0.13725511729717255,
0.07191779464483261,
-0.03926081955432892,
-0.07228295505046844,
-0.10329066216945648,
-0.13958287239074707,
-0.06737897545099258,
-0.05514474958181381,
-0.0016406619688495994,
-0.10846858471632004,
0.046396464109420776,
0.013990292325615883,
0.045894913375377655,
-0.09281579405069351,
0.11149460822343826,
-0.13035140931606293,
-0.1358787715435028,
0.1490698903799057,
-0.0463426448404789,
-0.012529239989817142,
-0.0032856902107596397,
0.046506114304065704,
0.02830209955573082,
0.09137000888586044,
0.042887602001428604,
0.050672177225351334,
0.016973240301012993,
0.01721678115427494,
-0.10397608578205109,
-0.07256918400526047,
0.029900776222348213,
-0.005574716720730066,
0.08736073225736618,
0.1986500769853592,
0.08838649839162827,
-0.07201273739337921,
0.01757016032934189,
0.14972448348999023,
0.0164498221129179,
-0.10705932974815369,
-0.15743404626846313,
0.02638772875070572,
-0.03105424903333187,
-0.004727235529571772,
-0.007744995877146721,
-0.09844429790973663,
0.011402382515370846,
0.20232249796390533,
0.15948720276355743,
-0.04214101284742355,
0.024581672623753548,
-0.01777143031358719,
0.013405983336269855,
0.01649775356054306,
0.08119450509548187,
0.09087396413087845,
0.15953052043914795,
-0.0035347493831068277,
0.056617651134729385,
-0.01788983680307865,
-0.09086388349533081,
-0.11140142381191254,
0.0814364030957222,
-0.007037978619337082,
-0.040478676557540894,
-0.0012326296418905258,
0.18420691788196564,
-0.1094033420085907,
-0.205706387758255,
-0.126936137676239,
-0.04432522505521774,
-0.11717096716165543,
0.015867406502366066,
-0.04286111146211624,
0.14424175024032593,
0.037470437586307526,
0.011440220288932323,
0.012361519038677216,
0.17161113023757935,
0.04546282812952995,
0.035704970359802246,
-0.045662060379981995,
0.1141529232263565,
-0.08972275257110596,
0.10076916217803955,
-0.0026294721756130457,
0.026710938662290573,
0.036463066935539246,
0.04048441722989082,
-0.06669336557388306,
0.029011324048042297,
0.04481316730380058,
-0.026221364736557007,
0.04401402920484543,
0.18259240686893463,
-0.011563990265130997,
0.12137847393751144,
0.1125512570142746,
-0.06422176212072372,
0.02080575004220009,
-0.0019663218408823013,
0.022275429219007492,
-0.061891138553619385,
0.15500874817371368,
-0.14625218510627747,
0.13519595563411713,
0.10749213397502899,
-0.0717066079378128,
-0.04394499957561493,
-0.012098868377506733,
0.049735769629478455,
-0.06115875393152237,
0.07171542197465897,
-0.015445496886968613,
-0.1795099675655365,
0.029536021873354912,
-0.11601684987545013,
0.07700192928314209,
-0.2688239514827728,
-0.04156871512532234,
-0.04392926022410393,
-0.020876212045550346,
-0.0020637589041143656,
0.12199652940034866,
0.08168919384479523,
-0.05223073810338974,
-0.008301082998514175,
-0.07460080087184906,
0.02026735432446003,
0.09417669475078583,
-0.0846656784415245,
-0.021459879353642464
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only on **nl** speech, using **19.0k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer, as it was pretrained on audio alone. To use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **nl**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
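Without any fine-tuning, the checkpoint can also produce utterance-level representations. The sketch below is an illustrative assumption (only the checkpoint name comes from this card); it simply mean-pools the encoder's frame-level outputs into a single vector.
```python
# Hedged sketch: mean-pool the encoder outputs of the Dutch checkpoint into a
# single utterance embedding. Only the checkpoint name comes from this card;
# the dummy waveform and the pooling choice are illustrative assumptions.
import torch
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

model = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-base-nl-voxpopuli-v2")
feature_extractor = Wav2Vec2FeatureExtractor()  # defaults: mono, 16 kHz

speech = torch.zeros(16000).numpy()  # stand-in for real 16 kHz Dutch speech
inputs = feature_extractor(speech, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    frames = model(**inputs).last_hidden_state  # (batch, frames, hidden_size)
utterance_embedding = frames.mean(dim=1)        # (batch, hidden_size)
print(utterance_embedding.shape)
```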
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
|
{"language": "nl", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-nl-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"nl",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"nl"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #nl #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only on nl speech, using 19.0k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer, as it was pretrained on audio alone. To use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in nl. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
For more information, see the official website.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in nl on 19.0k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in nl. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #nl #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in nl on 19.0k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in nl. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
253
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #nl #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in nl on 19.0k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in nl. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.0710948035120964,
0.11351588368415833,
-0.002869914984330535,
0.01135080587118864,
0.07181218266487122,
-0.06046873703598976,
0.1325390785932541,
0.04337720945477486,
0.015791060402989388,
0.09836999326944351,
-0.022179681807756424,
-0.046169113367795944,
0.0644698217511177,
0.13271939754486084,
0.06186749041080475,
-0.25229305028915405,
0.03527422994375229,
-0.0694115161895752,
0.03528843820095062,
0.048599280416965485,
0.11335081607103348,
-0.08622399717569351,
0.026365680620074272,
0.05187344178557396,
-0.02307778224349022,
0.03946830704808235,
-0.05087443068623543,
-0.08571209758520126,
0.05256790295243263,
0.044906649738550186,
-0.029179811477661133,
0.02600768953561783,
0.08318629115819931,
-0.18525974452495575,
0.03686274588108063,
0.036065276712179184,
0.02460566721856594,
0.003156952327117324,
0.10507051646709442,
0.015846142545342445,
0.17472398281097412,
-0.028522387146949768,
-0.005950611084699631,
0.0808643028140068,
-0.06096704304218292,
-0.09761187434196472,
-0.06311985105276108,
0.16661010682582855,
0.09539349377155304,
0.10732591897249222,
-0.08049989491701126,
0.08943510055541992,
-0.023483624681830406,
0.04803033173084259,
0.07445365935564041,
-0.17760930955410004,
-0.04940330237150192,
0.058581817895174026,
0.10103390365839005,
0.011222701519727707,
-0.09123441576957703,
0.07506844401359558,
0.04464714229106903,
-0.0179910808801651,
-0.06344250589609146,
-0.03721673786640167,
0.11877121031284332,
-0.10030204057693481,
-0.11900065094232559,
0.0053633213974535465,
0.17741376161575317,
0.05631123483181,
-0.06722874939441681,
-0.14136768877506256,
0.012161179445683956,
0.2069118171930313,
-0.0510137714445591,
-0.09370379894971848,
0.00830440130084753,
0.021650878712534904,
0.03760083392262459,
-0.05962669104337692,
-0.0667886808514595,
0.00782823283225298,
0.013189292512834072,
0.10608746111392975,
0.013343689031898975,
-0.024898888543248177,
-0.06742646545171738,
-0.003550842171534896,
-0.08610925078392029,
-0.12308135628700256,
-0.015222844667732716,
-0.0673343613743782,
-0.06941322982311249,
-0.04179005324840546,
0.0009899527067318559,
-0.10235310345888138,
0.03205879032611847,
0.09883646667003632,
0.06889313459396362,
0.0480077899992466,
-0.06540133059024811,
-0.035418570041656494,
0.11949718743562698,
0.05513256788253784,
-0.11704742163419724,
-0.025752974674105644,
0.016388559713959694,
-0.029381297528743744,
0.015720823779702187,
-0.031580034643411636,
-0.031336840242147446,
0.022311894223093987,
-0.03343302011489868,
0.051567159593105316,
0.05195367708802223,
-0.0301531795412302,
-0.039935093373060226,
-0.09722302109003067,
0.09748990088701248,
-0.08161379396915436,
0.02820846065878868,
0.05014686658978462,
-0.004307645373046398,
0.09531717747449875,
-0.05840590223670006,
0.07803241163492203,
-0.11367303878068924,
0.007059162016957998,
-0.025505676865577698,
-0.0062432377599179745,
0.025306377559900284,
-0.0195788461714983,
0.04243643209338188,
0.019040104001760483,
0.007000515703111887,
-0.12059932202100754,
-0.004754047375172377,
-0.09764686971902847,
-0.028078677132725716,
-0.07724201679229736,
-0.03871740773320198,
-0.05027377977967262,
0.015583342872560024,
-0.0031981724314391613,
-0.006066800560802221,
-0.0016974032623693347,
-0.02419171668589115,
-0.0040775020606815815,
0.011560059152543545,
0.05089897662401199,
0.07020770758390427,
0.08051739633083344,
-0.036999642848968506,
-0.02538488805294037,
-0.08989185839891434,
0.11503344029188156,
-0.08109108358621597,
-0.017330287024378777,
-0.13853245973587036,
-0.03188108652830124,
-0.047657258808612823,
0.034878309816122055,
0.013374412432312965,
0.1332123726606369,
-0.17233514785766602,
-0.07862876355648041,
0.13123154640197754,
-0.12027736008167267,
0.0030982571188360453,
0.17549632489681244,
-0.0015812793280929327,
0.0716121718287468,
0.10354028642177582,
0.21063128113746643,
0.02885361574590206,
-0.16732841730117798,
-0.02683696337044239,
-0.06134909763932228,
0.04057978093624115,
0.12317678332328796,
0.05965520814061165,
-0.05184422805905342,
0.06434493511915207,
-0.02028484083712101,
-0.01958111859858036,
-0.0702151507139206,
0.004328189883381128,
-0.049558065831661224,
0.016614068299531937,
-0.04074378311634064,
0.03136736899614334,
-0.004952649585902691,
-0.018742263317108154,
-0.01692109927535057,
-0.0918201431632042,
-0.04712587594985962,
0.12727035582065582,
-0.060891564935445786,
0.02390969730913639,
-0.09440404921770096,
0.058322884142398834,
0.06287708133459091,
0.006969695910811424,
-0.1213233470916748,
0.10845107585191727,
0.027011021971702576,
-0.05457662045955658,
0.14624294638633728,
0.08455340564250946,
-0.028143297880887985,
-0.000679602031596005,
-0.018575018271803856,
0.020265968516469002,
-0.02369757555425167,
0.006541958544403315,
-0.020852942019701004,
-0.10738947987556458,
0.006853325292468071,
-0.06504663079977036,
0.10232195258140564,
-0.14782391488552094,
-0.01860382780432701,
0.049304522573947906,
0.1258140206336975,
-0.00867467001080513,
-0.03922577574849129,
0.0971614420413971,
0.047691039741039276,
0.0260180477052927,
-0.014208227396011353,
0.01747787557542324,
-0.02318977564573288,
-0.00309289270080626,
0.05994979292154312,
-0.13975849747657776,
-0.15745145082473755,
0.09916520118713379,
0.013995619490742683,
-0.011163572780787945,
0.06527193635702133,
0.02704152837395668,
-0.02331801876425743,
-0.04435717687010765,
0.007647763472050428,
0.22301433980464935,
-0.010662956163287163,
0.062246449291706085,
-0.08185374736785889,
-0.021778173744678497,
0.006583092268556356,
-0.044986456632614136,
-0.08656419813632965,
0.08953022956848145,
-0.003056416753679514,
-0.08962434530258179,
-0.02718704380095005,
0.05837727338075638,
0.07079560309648514,
0.16629858314990997,
0.006222811993211508,
-0.0885130912065506,
-0.02514101192355156,
-0.059837620705366135,
-0.010617033578455448,
0.027226058766245842,
-0.13705164194107056,
-0.019639166072010994,
0.027731915935873985,
0.003941288683563471,
0.041117843240499496,
-0.0163510050624609,
0.034666530787944794,
0.009539053775370121,
-0.04653142765164375,
-0.0731881856918335,
0.045876603573560715,
-0.027847828343510628,
0.03488776087760925,
-0.011434199288487434,
0.014361602254211903,
-0.04239066317677498,
-0.058942198753356934,
-0.1332959532737732,
0.083314448595047,
-0.06712090969085693,
-0.33029574155807495,
-0.0916755348443985,
-0.05440778657793999,
-0.04033171385526657,
0.009204871952533722,
0.05336419865489006,
-0.11517679691314697,
-0.11639977246522903,
-0.06993749737739563,
0.12626855075359344,
-0.014591792598366737,
-0.06846808642148972,
0.11948393285274506,
-0.003421889618039131,
0.02336670272052288,
-0.09901301562786102,
0.021776797249913216,
-0.030280739068984985,
-0.031212305650115013,
-0.028629980981349945,
0.0278103519231081,
0.05731790140271187,
0.12668846547603607,
0.032152291387319565,
-0.012367848306894302,
0.015232980251312256,
0.2068650722503662,
-0.14579863846302032,
0.0738675519824028,
0.22773925960063934,
-0.05431310832500458,
-0.008660164661705494,
0.1478707492351532,
-0.0026469912845641375,
-0.0521833673119545,
0.05053750425577164,
0.013646777719259262,
-0.015989424660801888,
-0.2353527694940567,
-0.13002830743789673,
-0.051674555987119675,
-0.026293516159057617,
0.048647310584783554,
0.027395904064178467,
0.004814265761524439,
0.010374965146183968,
-0.09433270245790482,
-0.04358014464378357,
0.05148443579673767,
0.03889760375022888,
0.16132599115371704,
0.016354424878954887,
0.05529312044382095,
-0.047239553183317184,
-0.026958899572491646,
0.11003571003675461,
-0.03558379039168358,
0.04367872327566147,
0.07031631469726562,
0.0997832641005516,
0.06019959971308708,
0.03021799959242344,
0.05411994084715843,
-0.025996048003435135,
-0.01668105088174343,
-0.00633965153247118,
-0.0231078639626503,
-0.07104777544736862,
0.02452698163688183,
0.04389575123786926,
0.13257677853107452,
-0.13115188479423523,
-0.10877308994531631,
0.015298163518309593,
0.025295404717326164,
0.10965771973133087,
0.10219372063875198,
-0.019732458516955376,
-0.10538703203201294,
0.040781810879707336,
-0.10229029506444931,
-0.025387264788150787,
0.051864542067050934,
0.08234542608261108,
-0.1561201959848404,
0.09793739765882492,
0.07495852559804916,
0.0883178785443306,
-0.04104376211762428,
0.026891307905316353,
-0.05858765169978142,
0.05625133961439133,
0.0030660328920930624,
0.06608179211616516,
-0.15829172730445862,
0.11170390248298645,
0.014471165835857391,
0.08167470991611481,
-0.06030693277716637,
0.017849376425147057,
0.05017003044486046,
0.005880600772798061,
0.12456586956977844,
-0.004859408363699913,
-0.10989292711019516,
-0.0030271669384092093,
-0.11291305720806122,
0.012064623646438122,
0.05443699285387993,
-0.06296227127313614,
0.05897769704461098,
-0.003568435786291957,
-0.014384813606739044,
-0.03540117293596268,
-0.007671946194022894,
-0.2404463291168213,
-0.13710282742977142,
0.04697820544242859,
0.0037943744100630283,
0.06077216938138008,
-0.03568467125296593,
-0.0832180604338646,
-0.12255597114562988,
0.11002402007579803,
-0.0014011351158842444,
-0.018331537023186684,
-0.07172243297100067,
0.016387425363063812,
0.10166412591934204,
-0.0639917179942131,
0.016284609213471413,
0.0479777567088604,
0.13789018988609314,
-0.05889178812503815,
-0.046842895448207855,
0.023042263463139534,
-0.09616434574127197,
-0.12742452323436737,
0.012861483730375767,
0.18366457521915436,
0.12054403871297836,
0.058121468871831894,
0.08406640589237213,
0.0222356915473938,
0.0028620476368814707,
-0.09225424379110336,
0.011184750124812126,
0.02846759371459484,
-0.07639725506305695,
0.04371248185634613,
-0.0010022640926763415,
-0.25017285346984863,
-0.14808768033981323,
-0.07546123117208481,
0.0749276727437973,
0.18814826011657715,
-0.019366523250937462,
0.1780715137720108,
0.2714335322380066,
-0.09092136472463608,
-0.22611582279205322,
-0.04549241438508034,
-0.006190950516611338,
0.02404855377972126,
0.06078951433300972,
-0.20699463784694672,
0.11675158143043518,
0.0017735006986185908,
0.015131926164031029,
-0.05000167340040207,
-0.21731388568878174,
-0.13958756625652313,
0.16628767549991608,
-0.03583209961652756,
0.052007757127285004,
-0.031665172427892685,
-0.06930381804704666,
-0.02662709355354309,
-0.046226173639297485,
0.02292768657207489,
-0.09160993993282318,
0.08101584762334824,
0.048214931041002274,
0.02651459537446499,
0.02363445609807968,
0.017278654500842094,
0.10499091446399689,
0.08047506958246231,
-0.018768122419714928,
-0.08799518644809723,
0.03442758321762085,
-0.003288334934040904,
-0.015096237882971764,
0.09198612719774246,
0.03792716935276985,
0.008372007869184017,
-0.05055195093154907,
-0.09027576446533203,
-0.065693698823452,
0.07142084836959839,
-0.06711025536060333,
-0.016460660845041275,
-0.059789177030324936,
0.09615907073020935,
0.02487996220588684,
-0.004106735810637474,
-0.04720659554004669,
-0.09373579919338226,
-0.012966804206371307,
0.11513375490903854,
0.2223031371831894,
-0.03668733313679695,
0.0047444007359445095,
-0.043274201452732086,
-0.04639963433146477,
0.04752948135137558,
-0.021525928750634193,
0.05198238044977188,
0.0546371266245842,
0.03030804544687271,
0.07690906524658203,
-0.035525377839803696,
-0.1285976767539978,
0.026151731610298157,
0.045302070677280426,
-0.07061713188886642,
-0.17655286192893982,
-0.050255365669727325,
-0.01021566055715084,
-0.014338931068778038,
-0.0321025513112545,
0.20089921355247498,
-0.024475518614053726,
-0.057796888053417206,
0.013511956669390202,
0.05692083016037941,
-0.006182589568197727,
0.11932265013456345,
0.043193474411964417,
0.03811627998948097,
-0.08709950745105743,
0.05639154091477394,
0.12214373052120209,
-0.04210713878273964,
0.04846515879034996,
0.11151936650276184,
-0.04410537704825401,
-0.06101721152663231,
-0.09836824983358383,
-0.002811167389154434,
0.046748895198106766,
-0.050984520465135574,
0.0023462376557290554,
-0.09573354572057724,
0.012786262668669224,
0.03052612952888012,
0.009780539199709892,
-0.04672912508249283,
-0.043153103440999985,
0.0017845353577286005,
-0.08856351673603058,
0.07374662905931473,
0.10972911864519119,
-0.031092388555407524,
-0.11356010288000107,
0.10192465037107468,
0.008697712793946266,
0.07150041311979294,
-0.03956835716962814,
-0.06833919137716293,
-0.09863478690385818,
-0.006764462683349848,
-0.09301809966564178,
0.03199069947004318,
-0.1466386765241623,
-0.006256544031202793,
-0.04869348183274269,
-0.03772789612412453,
-0.016352349892258644,
0.07086261361837387,
-0.03460850566625595,
0.005578516982495785,
-0.02771424502134323,
0.09377481788396835,
-0.13224154710769653,
0.06583352386951447,
0.0591827891767025,
-0.040946345776319504,
0.10620840638875961,
0.018556766211986542,
-0.0533866360783577,
0.04308556020259857,
-0.22526684403419495,
-0.05065074563026428,
-0.025708142668008804,
0.05252799764275551,
-0.012068891897797585,
-0.17513133585453033,
0.0006185817182995379,
0.022460639476776123,
0.02105659805238247,
-0.019401689991354942,
0.034842927008867264,
-0.032079990953207016,
-0.025917725637555122,
-0.06883200258016586,
-0.061271123588085175,
-0.03517041727900505,
0.05903809145092964,
0.07178834080696106,
0.005522145889699459,
0.10460685193538666,
-0.09686178714036942,
0.07313727587461472,
-0.0753045603632927,
0.026633063331246376,
-0.024382496252655983,
0.013274533674120903,
-0.0705869123339653,
-0.0738932341337204,
0.07442065328359604,
-0.013638585805892944,
0.07852360606193542,
0.012635319493710995,
-0.03552333265542984,
0.049397632479667664,
-0.057526834309101105,
-0.06554145365953445,
0.03508121520280838,
0.14889457821846008,
0.055846232920885086,
0.013685590587556362,
-0.0037438920699059963,
-0.045241281390190125,
0.0033077464904636145,
0.1453768014907837,
0.13984960317611694,
0.17359191179275513,
0.09770238399505615,
0.026786604896187782,
0.07051313668489456,
-0.04360251873731613,
-0.08816106617450714,
0.07421384751796722,
-0.08019434660673141,
0.03903999179601669,
-0.05476042628288269,
-0.051432497799396515,
0.07089818269014359,
-0.13400548696517944,
0.07413199543952942,
-0.02720683254301548,
-0.07784057408571243,
-0.105486661195755,
-0.15895208716392517,
-0.06813118606805801,
-0.05496710166335106,
0.004661183804273605,
-0.11005601286888123,
0.04013707488775253,
0.004536106251180172,
0.041372403502464294,
-0.09802231937646866,
0.11077503859996796,
-0.11563906818628311,
-0.12735781073570251,
0.14889194071292877,
-0.03994685411453247,
-0.018361004069447517,
-0.0029202448204159737,
0.045957501977682114,
0.01141631044447422,
0.09618407487869263,
0.03971182554960251,
0.05131961405277252,
0.023062260821461678,
0.01934463158249855,
-0.10823830962181091,
-0.069671131670475,
0.03202300891280174,
-0.0034071689005941153,
0.09765345603227615,
0.19157551229000092,
0.09120001643896103,
-0.07794857025146484,
0.01324983686208725,
0.15024308860301971,
0.021370956674218178,
-0.11533714830875397,
-0.147535040974617,
0.030928930267691612,
-0.03596407175064087,
-0.007838777266442776,
-0.0029904625844210386,
-0.09660274535417557,
0.014126282185316086,
0.21278588473796844,
0.16191336512565613,
-0.028621546924114227,
0.02873992547392845,
-0.015849566087126732,
0.012448173016309738,
0.016610493883490562,
0.07559079676866531,
0.08858645707368851,
0.18038330972194672,
-0.005388408433645964,
0.035307835787534714,
-0.01650456339120865,
-0.09189813584089279,
-0.11592710763216019,
0.09166015684604645,
-0.002012303564697504,
-0.04181190952658653,
-0.009908335283398628,
0.18361937999725342,
-0.1278078258037567,
-0.21153120696544647,
-0.11655936390161514,
-0.04351295158267021,
-0.11750930547714233,
0.01611918769776821,
-0.04832931235432625,
0.14026892185211182,
0.036184828728437424,
0.004998764488846064,
0.004481810610741377,
0.1808830052614212,
0.043825723230838776,
0.026452641934156418,
-0.029685495421290398,
0.10926211625337601,
-0.08322470635175705,
0.1029980406165123,
-0.0009738006046973169,
0.04481054097414017,
0.03669038787484169,
0.040307436138391495,
-0.0667676106095314,
0.021564973518252373,
0.04287837818264961,
-0.024237731471657753,
0.045170433819293976,
0.17799463868141174,
-0.006074400153011084,
0.10483993589878082,
0.11310369521379471,
-0.060360778123140335,
0.019149085506796837,
0.000011051317414967343,
0.016063915565609932,
-0.06231587007641792,
0.15561671555042267,
-0.14863023161888123,
0.13143549859523773,
0.10895053297281265,
-0.07039578258991241,
-0.04681761935353279,
-0.014044110663235188,
0.05624983832240105,
-0.06324388086795807,
0.08111327886581421,
-0.007583707105368376,
-0.17156879603862762,
0.02957131899893284,
-0.1083262488245964,
0.0723041221499443,
-0.2596970200538635,
-0.04308759421110153,
-0.03928356617689133,
-0.01437733881175518,
-0.002044477267190814,
0.12294246256351471,
0.0848124623298645,
-0.052549879997968674,
-0.005262909457087517,
-0.05799192562699318,
0.017333200201392174,
0.0879795253276825,
-0.08204024285078049,
-0.019013475626707077
] |
null | null |
transformers
|
# Wav2Vec2-Base-VoxPopuli
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the nl unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
For more information, see the official website [here](https://github.com/facebookresearch/voxpopuli/).
# Fine-Tuning
Please refer to [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) on how to fine-tune this model on a specific language. Note that you should replace `"facebook/wav2vec2-large-xlsr-53"` with this checkpoint for fine-tuning.
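As a rough illustration of that checkpoint swap, the sketch below wires this model into the CTC fine-tuning setup described in the blog post. The vocabulary file `vocab.json` and all hyperparameters are placeholders that would come out of the blog's data-preparation steps, not values shipped with this repository.

```python
from transformers import (
    Wav2Vec2CTCTokenizer,
    Wav2Vec2FeatureExtractor,
    Wav2Vec2ForCTC,
    Wav2Vec2Processor,
)

# hypothetical character vocabulary built from the Dutch training transcripts
tokenizer = Wav2Vec2CTCTokenizer(
    "vocab.json", unk_token="[UNK]", pad_token="[PAD]", word_delimiter_token="|"
)
feature_extractor = Wav2Vec2FeatureExtractor(
    feature_size=1, sampling_rate=16000, padding_value=0.0, do_normalize=True
)
processor = Wav2Vec2Processor(feature_extractor=feature_extractor, tokenizer=tokenizer)

# load this VoxPopuli checkpoint (instead of "facebook/wav2vec2-large-xlsr-53") with a fresh CTC head
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-base-nl-voxpopuli",
    ctc_loss_reduction="mean",
    pad_token_id=processor.tokenizer.pad_token_id,
    vocab_size=len(processor.tokenizer),
)
model.freeze_feature_encoder()  # keep the convolutional feature encoder frozen while fine-tuning
```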
|
{"language": "nl", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-nl-voxpopuli
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli",
"nl",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"nl"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #nl #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
|
# Wav2Vec2-Base-VoxPopuli
Facebook's Wav2Vec2 base model pretrained on the nl unlabeled subset of the VoxPopuli corpus.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
For more information, see the official website.
# Fine-Tuning
Please refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '"facebook/wav2vec2-large-xlsr-53"' with this checkpoint for fine-tuning.
|
[
"# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the nl unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #nl #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n",
"# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the nl unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
69,
133,
57
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #nl #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the nl unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
-0.061380140483379364,
0.02793460711836815,
-0.004597926512360573,
0.002772475825622678,
0.11542867869138718,
-0.0326576791703701,
0.07890889048576355,
0.0014844215475022793,
0.041498709470033646,
0.01610957272350788,
0.01345157716423273,
0.035119254142045975,
0.08451467007398605,
0.11283507198095322,
-0.01112778764218092,
-0.29399698972702026,
0.06555283069610596,
0.023885957896709442,
0.08218806236982346,
0.05066429451107979,
0.11536602675914764,
-0.07625988870859146,
0.037993431091308594,
0.055082544684410095,
-0.08770262449979782,
0.010899219661951065,
0.02361634187400341,
-0.08007670938968658,
0.13123087584972382,
0.10614138096570969,
0.0919680967926979,
0.059900879859924316,
0.02773408219218254,
-0.15949669480323792,
0.03248780965805054,
0.055534280836582184,
-0.060725532472133636,
-0.007988116703927517,
0.12105149030685425,
-0.03360170125961304,
0.2233949601650238,
-0.028048086911439896,
-0.03511056676506996,
0.08707624673843384,
-0.13070879876613617,
-0.1433366984128952,
-0.07127740234136581,
0.14112834632396698,
0.14243291318416595,
0.06750963628292084,
-0.07299397140741348,
0.03724442049860954,
-0.039423394948244095,
0.07423736155033112,
0.08987890928983688,
-0.29814618825912476,
-0.03704395890235901,
0.11997637152671814,
0.07154124230146408,
-0.03959580883383751,
-0.10143405199050903,
0.09113743156194687,
0.020912759006023407,
-0.0004116575582884252,
-0.012157420627772808,
-0.0902746394276619,
0.02896830067038536,
-0.08537685126066208,
-0.11756224930286407,
0.002737625502049923,
0.18074215948581696,
0.04223661497235298,
-0.058506108820438385,
-0.07162782549858093,
-0.031711310148239136,
0.1879253089427948,
-0.05809500813484192,
-0.16229112446308136,
0.0032573100179433823,
0.035270001739263535,
0.07339540123939514,
-0.16595502197742462,
-0.07666966319084167,
-0.007238190155476332,
-0.04974037781357765,
0.10952827334403992,
0.031503982841968536,
-0.017366282641887665,
-0.06805279105901718,
0.02234634757041931,
-0.09678031504154205,
-0.0702834203839302,
0.006265583448112011,
-0.10349713265895844,
-0.07653097808361053,
-0.02436574175953865,
-0.07886278629302979,
-0.0884137749671936,
-0.03283274918794632,
0.08612673729658127,
0.005853660870343447,
0.05541326105594635,
-0.0772441178560257,
0.04073324799537659,
0.0003221977676730603,
0.08950859308242798,
-0.12639638781547546,
-0.005540683399885893,
0.010867638513445854,
-0.05131184309720993,
0.0018139301100745797,
-0.03354113548994064,
-0.07649882882833481,
-0.0717058852314949,
-0.01459470670670271,
0.06335590779781342,
0.011007880792021751,
0.023986713960766792,
-0.07279061526060104,
-0.0913463905453682,
0.04096486419439316,
-0.06780724972486496,
0.018158946186304092,
0.028096914291381836,
0.0031954078003764153,
0.18695002794265747,
0.00988696701824665,
0.060598038136959076,
-0.14655208587646484,
0.02467361092567444,
-0.018477080389857292,
0.008309568278491497,
-0.006754506379365921,
-0.052029795944690704,
0.034146156162023544,
-0.03550663962960243,
-0.00886788684874773,
-0.1368737667798996,
-0.09016513079404831,
-0.08079277724027634,
0.0008254591375589371,
-0.03354287147521973,
-0.0735822319984436,
-0.033681806176900864,
-0.0025931466370821,
-0.018033625558018684,
-0.025222882628440857,
-0.02694079279899597,
-0.015491117723286152,
0.017583124339580536,
-0.03189597651362419,
0.07718918472528458,
-0.06010404974222183,
0.08154004067182541,
0.0071363868191838264,
-0.02120056562125683,
-0.1432945877313614,
0.12275193631649017,
-0.07369158416986465,
-0.06285075098276138,
-0.14799810945987701,
-0.04638715833425522,
-0.07002701610326767,
0.05527498945593834,
-0.000006047265287634218,
0.15311414003372192,
-0.19358517229557037,
-0.1130976527929306,
0.24988102912902832,
-0.11368530988693237,
0.023027902469038963,
0.1748388111591339,
0.02011699602007866,
0.03847402706742287,
0.1674356907606125,
0.14425456523895264,
0.045954663306474686,
-0.126468226313591,
0.04420322924852371,
-0.04711524024605751,
-0.023967554792761803,
0.04905825853347778,
0.05024879425764084,
-0.007546383887529373,
0.002530657220631838,
-0.007147388998419046,
-0.06011250987648964,
-0.04815950617194176,
-0.006759830750524998,
-0.05933612957596779,
0.030356530100107193,
-0.026942413300275803,
0.1000504344701767,
0.00263188942335546,
-0.00703756557777524,
0.02239132672548294,
-0.08624804019927979,
-0.02270931750535965,
0.06879769265651703,
-0.05451834946870804,
0.0746411681175232,
-0.1000981256365776,
0.05974818393588066,
0.11417195200920105,
0.05607953295111656,
-0.14806057512760162,
0.03597462922334671,
-0.021718362346291542,
0.0943978875875473,
0.0921233519911766,
0.1883343756198883,
-0.021477803587913513,
-0.037649039179086685,
-0.0877607986330986,
0.019560853019356728,
-0.03730715066194534,
-0.04442137852311134,
-0.02991015836596489,
-0.08834893256425858,
-0.035592205822467804,
-0.03858403488993645,
0.06181161478161812,
-0.1716788113117218,
0.006261738948523998,
0.07340294867753983,
0.07622378319501877,
0.017166445031762123,
0.015029856003820896,
0.013275517150759697,
0.10102564096450806,
0.038523051887750626,
0.012010673061013222,
0.08243651688098907,
-0.004606747999787331,
-0.05786275863647461,
0.10986299067735672,
-0.06261005997657776,
0.009715727530419827,
0.13244754076004028,
-0.10735750198364258,
-0.006277070846408606,
0.005225921515375376,
0.018833357840776443,
-0.001616665511392057,
-0.0009774072095751762,
-0.0186582300812006,
0.20155109465122223,
0.025832392275333405,
0.07925697416067123,
-0.08219464868307114,
0.020208677276968956,
-0.0162113755941391,
-0.03716687858104706,
-0.06072787940502167,
0.08125307410955429,
0.029701832681894302,
-0.08565548062324524,
0.00870317593216896,
0.10885032266378403,
-0.003276093630120158,
0.15080250799655914,
0.023402709513902664,
-0.01952064409852028,
0.009676054120063782,
-0.05492565035820007,
-0.020679447799921036,
-0.011142426170408726,
-0.15099364519119263,
-0.01004782598465681,
0.030482223257422447,
0.01530336495488882,
0.06439011543989182,
-0.06291645765304565,
-0.00445755897089839,
0.009344808757305145,
-0.07864584773778915,
-0.045768339186906815,
0.05005483701825142,
-0.005335074383765459,
0.0808224007487297,
-0.049357131123542786,
-0.00838963221758604,
-0.01228153333067894,
-0.026021694764494896,
-0.10506916046142578,
0.11714846640825272,
-0.05681623890995979,
-0.3612931966781616,
-0.09565555304288864,
-0.10322891920804977,
-0.09690389037132263,
0.036049000918865204,
0.050936583429574966,
-0.09204734861850739,
-0.06836871057748795,
0.0076691205613315105,
0.16578756272792816,
-0.03109440952539444,
-0.08057527989149094,
0.039543237537145615,
0.0036539456341415644,
-0.01466936245560646,
-0.09419296681880951,
0.0010898157488554716,
-0.04105352610349655,
-0.12376496940851212,
0.0022901613265275955,
-0.026214636862277985,
0.041691139340400696,
0.1426394134759903,
0.032041341066360474,
-0.017752962186932564,
-0.025273289531469345,
0.21596357226371765,
-0.12285757809877396,
0.08822891861200333,
0.27537018060684204,
-0.012301675044000149,
0.027638940140604973,
0.13936646282672882,
-0.007622706238180399,
-0.07361616939306259,
-0.0053534251637756824,
0.06833700835704803,
-0.014260942116379738,
-0.26136118173599243,
-0.1351962387561798,
-0.06499301642179489,
-0.022141505032777786,
0.02693653106689453,
-0.006841170135885477,
0.03155384212732315,
0.04183153063058853,
-0.10765770822763443,
-0.023195233196020126,
0.065264493227005,
0.02958485670387745,
0.20750010013580322,
-0.03811753913760185,
0.1401364505290985,
-0.02723274752497673,
-0.02644268237054348,
0.0675261840224266,
0.029484398663043976,
0.0883839949965477,
0.10466771572828293,
0.06162875518202782,
0.0911359116435051,
0.03256798908114433,
0.02559070661664009,
0.011657330207526684,
-0.0063515049405395985,
-0.026014083996415138,
-0.046195391565561295,
-0.026199672371149063,
-0.03886256739497185,
0.01845046877861023,
0.11304329335689545,
-0.16408482193946838,
-0.12763266265392303,
0.00983602274209261,
0.028687603771686554,
0.1453661024570465,
0.06144487485289574,
-0.08342947065830231,
-0.044667087495326996,
0.039694204926490784,
-0.07866702973842621,
-0.048707425594329834,
0.055630821734666824,
0.07780127227306366,
-0.15285837650299072,
0.14678914844989777,
0.034542035311460495,
0.10042882710695267,
-0.01924552209675312,
0.0520956851541996,
-0.16383416950702667,
-0.020205091685056686,
0.03777125105261803,
0.07501699775457382,
-0.24471451342105865,
0.22390124201774597,
0.018785987049341202,
0.06391321122646332,
-0.07252766191959381,
-0.005774503108114004,
0.042343929409980774,
0.12663094699382782,
0.13699717819690704,
-0.005157495848834515,
-0.03876255080103874,
0.00819686334580183,
-0.016491703689098358,
0.02904590219259262,
0.031582433730363846,
-0.022624175995588303,
0.045171767473220825,
-0.00043181367800571024,
0.013312420807778835,
-0.009774579666554928,
0.11215665191411972,
-0.2312534749507904,
-0.13875627517700195,
0.029243305325508118,
0.02593252994120121,
0.10827556997537613,
-0.005642303265631199,
-0.07005305588245392,
-0.10391520708799362,
0.11159144341945648,
-0.0030008498579263687,
-0.029987882822752,
-0.10416151583194733,
0.03931086137890816,
0.021797792986035347,
-0.1094992458820343,
0.02955704554915428,
0.05149560049176216,
0.13046622276306152,
-0.10527542233467102,
-0.06281492859125137,
0.039019837975502014,
-0.08798495680093765,
-0.05826423317193985,
0.049337249249219894,
0.18884161114692688,
0.10640014708042145,
0.03457237780094147,
0.11598239839076996,
-0.04123683646321297,
0.04340628162026405,
-0.1167299821972847,
0.06754932552576065,
0.008089970797300339,
-0.009826311841607094,
0.01943308115005493,
-0.06068184971809387,
-0.24426551163196564,
-0.10853884369134903,
-0.015517383813858032,
0.18415877223014832,
0.1878817081451416,
0.008908846415579319,
0.15368583798408508,
0.24283762276172638,
-0.08865974843502045,
-0.25043585896492004,
-0.058305300772190094,
-0.015994763001799583,
0.041190698742866516,
0.025557179003953934,
-0.26163941621780396,
0.056118980050086975,
0.05987352505326271,
0.006936569698154926,
-0.09294562041759491,
-0.21881476044654846,
-0.13067643344402313,
0.19032804667949677,
0.034074533730745316,
0.15165196359157562,
-0.08852718025445938,
-0.05002638325095177,
-0.08456482738256454,
-0.10114143788814545,
0.09239348024129868,
-0.1302850842475891,
0.09024057537317276,
0.048603903502225876,
0.0020877879578620195,
0.008489252999424934,
0.050741296261548996,
0.1214725598692894,
0.06789393723011017,
0.015081535093486309,
-0.03234916552901268,
0.027186527848243713,
0.030061379075050354,
0.021930718794465065,
0.03880893066525459,
0.0341833271086216,
-0.02873467281460762,
-0.07025039196014404,
-0.10792402923107147,
-0.11587342619895935,
0.09703604131937027,
-0.06051989644765854,
-0.009535842575132847,
-0.01753590628504753,
0.1002560555934906,
0.01832350715994835,
0.01577463001012802,
-0.0682259127497673,
-0.13998880982398987,
0.036482300609350204,
0.12465295195579529,
0.22647817432880402,
-0.1355476975440979,
-0.01130372378975153,
-0.04779564216732979,
-0.04660986736416817,
0.0828009843826294,
0.0039010273758322,
0.06439394503831863,
0.0460994578897953,
-0.0022711625788360834,
0.0909883975982666,
0.021858353167772293,
-0.06947819143533707,
0.006515346467494965,
0.044169243425130844,
-0.0751674696803093,
-0.24882574379444122,
-0.06924591958522797,
-0.007005407474935055,
0.02057028003036976,
0.018228529021143913,
0.18709440529346466,
-0.010477825999259949,
-0.06057262420654297,
-0.015558806248009205,
0.03670014068484306,
-0.04179684817790985,
0.054934173822402954,
0.03519101068377495,
0.040512196719646454,
-0.11484809964895248,
0.05434529483318329,
0.0936737060546875,
-0.1313130259513855,
0.05676945671439171,
0.048719797283411026,
-0.05201888456940651,
-0.09604205936193466,
-0.1340368390083313,
0.026928506791591644,
-0.023133251816034317,
-0.08275431394577026,
0.029517553746700287,
-0.16189254820346832,
0.03722347691655159,
0.1288931667804718,
0.035635121166706085,
-0.009716721251606941,
-0.05562051758170128,
-0.05328735336661339,
-0.027490712702274323,
0.0025233179330825806,
0.12668757140636444,
-0.06387206166982651,
-0.13666953146457672,
0.15063080191612244,
0.010226095095276833,
0.08953053504228592,
-0.04500433802604675,
-0.04454995319247246,
-0.13794180750846863,
0.015406102873384953,
-0.15543167293071747,
0.012436057440936565,
-0.12106838077306747,
0.0009758329833857715,
-0.05218971148133278,
-0.017517713829874992,
-0.03740131855010986,
0.03078640252351761,
-0.10321976244449615,
0.016383539885282516,
-0.0066176606342196465,
0.0811595693230629,
-0.11701831966638565,
0.06970427930355072,
0.07864492386579514,
-0.020716506987810135,
0.0742957592010498,
0.019759472459554672,
-0.03186340257525444,
0.10000908374786377,
-0.16945378482341766,
-0.050037071108818054,
0.03866923972964287,
0.02819029428064823,
-0.002487858757376671,
-0.15260107815265656,
0.01018996350467205,
0.026451895013451576,
0.033875688910484314,
-0.0011142451548948884,
0.07623261958360672,
-0.06276120245456696,
-0.009061013348400593,
-0.04801337420940399,
-0.09916231036186218,
-0.017536047846078873,
0.07939835637807846,
0.09956449270248413,
0.013802781701087952,
0.11203506588935852,
-0.0778271034359932,
0.055761221796274185,
-0.10481379926204681,
0.0679786279797554,
-0.04126814380288124,
-0.027110299095511436,
0.05331643670797348,
-0.13744273781776428,
0.053248655050992966,
0.004754588007926941,
0.0708310604095459,
-0.005351238884031773,
0.0034863746259361506,
0.014444813132286072,
-0.08652975410223007,
-0.1137741357088089,
0.02621343545615673,
0.12220318615436554,
0.07216458767652512,
-0.009255209006369114,
0.03149836137890816,
0.004816181026399136,
0.026744894683361053,
0.21710504591464996,
0.22028671205043793,
0.19609734416007996,
0.04543618857860565,
0.08244747668504715,
0.019999582320451736,
-0.04713742434978485,
-0.04323757812380791,
-0.00650319829583168,
-0.07218587398529053,
0.026751471683382988,
-0.05615130439400673,
-0.04677765071392059,
0.08767468482255936,
-0.14931903779506683,
0.12355559319257736,
0.030137289315462112,
-0.08615224808454514,
-0.15328732132911682,
-0.1778942495584488,
-0.06259243935346603,
-0.08509942144155502,
-0.015092155896127224,
-0.12822160124778748,
-0.02304176799952984,
0.01596672087907791,
0.023123836144804955,
-0.11677031219005585,
0.09156101942062378,
-0.15265172719955444,
-0.1565748155117035,
0.18006566166877747,
-0.04024375230073929,
0.00816506426781416,
-0.010554654523730278,
0.009013266302645206,
-0.002018943428993225,
0.08748781681060791,
0.015199845656752586,
0.040272507816553116,
-0.012064129114151001,
0.047454457730054855,
-0.07600651681423187,
-0.054059673100709915,
-0.002588855568319559,
0.030158869922161102,
0.12856963276863098,
0.20415851473808289,
0.03510088101029396,
-0.05778845027089119,
0.005777246784418821,
0.14880846440792084,
0.032348617911338806,
-0.10361611098051071,
-0.14071357250213623,
0.0715528056025505,
0.030381930992007256,
0.013801705092191696,
-0.03392666205763817,
-0.06234360486268997,
0.014415605925023556,
0.2848206162452698,
0.15669995546340942,
-0.04594119265675545,
0.0326017364859581,
-0.0001627168821869418,
0.037862423807382584,
0.08785108476877213,
0.10205953568220139,
0.08823158591985703,
0.1756218522787094,
-0.041976988315582275,
-0.00861790869385004,
0.00024058706185314804,
-0.06177501007914543,
-0.08775017410516739,
0.15164683759212494,
0.036557555198669434,
-0.08400198072195053,
0.01049233227968216,
0.15281033515930176,
-0.1550145447254181,
-0.07037157565355301,
-0.07670126855373383,
-0.058366209268569946,
-0.1037474200129509,
-0.019544508308172226,
-0.05144709721207619,
0.09976624697446823,
0.10660287737846375,
-0.016752133145928383,
-0.02488749288022518,
0.18396013975143433,
0.05111892148852348,
-0.015002132393419743,
-0.042800456285476685,
0.12162734568119049,
-0.007787366397678852,
0.06590859591960907,
-0.016143379732966423,
0.051183171570301056,
0.05684204399585724,
0.04468078911304474,
-0.010319608263671398,
0.057252053171396255,
-0.0016171254683285952,
0.03501236438751221,
0.07204795628786087,
0.13209716975688934,
0.006273925304412842,
0.03146637603640556,
0.08776108175516129,
-0.1553107500076294,
0.029732806608080864,
0.05516544356942177,
-0.034302808344364166,
-0.00616081990301609,
0.1754968762397766,
-0.1897033452987671,
0.05583780258893967,
0.1555209755897522,
-0.030893899500370026,
-0.024103734642267227,
-0.04517963156104088,
0.06207795441150665,
-0.015541515313088894,
0.04399823769927025,
-0.033100176602602005,
-0.1368778944015503,
0.011963210068643093,
-0.0661691203713417,
0.026883279904723167,
-0.183328777551651,
-0.005904313176870346,
-0.037211786955595016,
-0.007203810382634401,
-0.052230726927518845,
0.09401541948318481,
0.010327628813683987,
-0.05683566629886627,
0.020415598526597023,
-0.09403049200773239,
0.02786348946392536,
0.09444555640220642,
-0.0895831510424614,
-0.04568695276975632
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only on **pl** speech, using **21.2k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled (transcribed) speech data in **pl**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
For more information, see the official website [here](https://github.com/facebookresearch/voxpopuli/).
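Even before any fine-tuning, the checkpoint can turn Polish utterances into fixed-size embeddings, e.g. for clustering or retrieval experiments. A minimal sketch follows; the random waveform is a stand-in for real 16kHz Polish speech, and the repository is assumed to provide a standard feature-extractor config.

```python
import torch
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

model_id = "facebook/wav2vec2-base-pl-voxpopuli-v2"
feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained(model_id)
model = Wav2Vec2Model.from_pretrained(model_id).eval()

waveform = torch.randn(32000)  # two seconds of dummy audio at 16kHz
inputs = feature_extractor(waveform.numpy(), sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    frame_features = model(**inputs).last_hidden_state  # (1, num_frames, 768)

# average over time to obtain a single utterance-level embedding
utterance_embedding = frame_features.mean(dim=1)  # (1, 768)
```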
|
{"language": "pl", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-pl-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"pl",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"pl"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #pl #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only on pl speech, using 21.2k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled (transcribed) speech data in pl. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
For more information, see the official website.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in pl on 21.2k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in pl. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #pl #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in pl on 21.2k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in pl. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
251
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #pl #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in pl on 21.2k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in pl. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07666877657175064,
0.09922461211681366,
-0.0026992009952664375,
0.00482570193707943,
0.07416500896215439,
-0.05154355242848396,
0.1377018839120865,
0.04416605830192566,
0.004858025815337896,
0.09847766906023026,
-0.012274288572371006,
-0.05010096728801727,
0.07399848103523254,
0.1305036097764969,
0.06295498460531235,
-0.2603112757205963,
0.039805818349123,
-0.06339768320322037,
0.05336008593440056,
0.051075179129838943,
0.12208878993988037,
-0.08454550802707672,
0.02829873003065586,
0.05919729545712471,
-0.03205778822302818,
0.02897980622947216,
-0.04601477086544037,
-0.08106766641139984,
0.054662108421325684,
0.04988296329975128,
-0.033813223242759705,
0.025863585993647575,
0.09516465663909912,
-0.19376660883426666,
0.03680048882961273,
0.04014049097895622,
0.028645772486925125,
0.010709344409406185,
0.10252518951892853,
0.019791489467024803,
0.16063104569911957,
-0.018305504694581032,
-0.004665676038712263,
0.08215302973985672,
-0.05572962388396263,
-0.0954127386212349,
-0.06406814604997635,
0.1628560572862625,
0.09274373948574066,
0.11024107784032822,
-0.07850820571184158,
0.08319560438394547,
-0.02015666477382183,
0.04321596771478653,
0.07524870336055756,
-0.1860291063785553,
-0.05480998009443283,
0.05175716057419777,
0.10675036907196045,
0.019604327157139778,
-0.08621001988649368,
0.0743287056684494,
0.05073355510830879,
-0.012736872769892216,
-0.06736525893211365,
-0.03469355031847954,
0.13853903114795685,
-0.10709232836961746,
-0.11756470054388046,
-0.002841944806277752,
0.17289407551288605,
0.05804082378745079,
-0.07433586567640305,
-0.15400980412960052,
0.014581925235688686,
0.20705640316009521,
-0.05362136289477348,
-0.0935216173529625,
0.007597500458359718,
0.02080959640443325,
0.048790205270051956,
-0.06974335759878159,
-0.07137789577245712,
-0.00550841772928834,
0.02364211156964302,
0.11246805638074875,
0.021611442789435387,
-0.016434604302048683,
-0.06970127671957016,
-0.0014601704897359014,
-0.09261965751647949,
-0.11630988866090775,
-0.006671376060694456,
-0.06532619148492813,
-0.06684111058712006,
-0.03570285439491272,
-0.00121846585534513,
-0.1009172648191452,
0.026975225657224655,
0.0984371080994606,
0.06580723822116852,
0.05585181713104248,
-0.056667644530534744,
-0.03092256374657154,
0.12774807214736938,
0.07250091433525085,
-0.1237565129995346,
-0.01147976703941822,
0.014634075574576855,
-0.017950447276234627,
0.00776755390688777,
-0.034986745566129684,
-0.036281708627939224,
0.018331347033381462,
-0.013663462363183498,
0.04558451846241951,
0.05551527068018913,
-0.034338515251874924,
-0.0363803468644619,
-0.0958893895149231,
0.09025977551937103,
-0.08085257560014725,
0.026240456849336624,
0.0477306991815567,
-0.007520745508372784,
0.09767284989356995,
-0.06308069080114365,
0.08007975667715073,
-0.11340239644050598,
0.0021297975908964872,
-0.027237102389335632,
-0.002204103162512183,
0.02226407267153263,
-0.022357013076543808,
0.031712889671325684,
0.0056633176282048225,
0.0017626547487452626,
-0.11336500942707062,
0.0016001758631318808,
-0.10256640613079071,
-0.020663797855377197,
-0.0809798613190651,
-0.045510947704315186,
-0.04165104776620865,
0.021656226366758347,
-0.007614925503730774,
-0.008053263649344444,
0.008502529002726078,
-0.017699021846055984,
-0.005777154117822647,
0.008434603922069073,
0.040538687258958817,
0.056059274822473526,
0.0831497460603714,
-0.021735994145274162,
-0.016885660588741302,
-0.11545029282569885,
0.1147271916270256,
-0.07845933735370636,
-0.023307641968131065,
-0.1387154459953308,
-0.04214019328355789,
-0.035388778895139694,
0.03275183588266373,
0.012744421139359474,
0.12078014016151428,
-0.17175202071666718,
-0.07026944309473038,
0.11663831025362015,
-0.1253194361925125,
0.009005360305309296,
0.17970313131809235,
0.00036019805702380836,
0.07606086134910583,
0.09922119975090027,
0.21995535492897034,
0.022994766011834145,
-0.17031417787075043,
-0.014242187142372131,
-0.05150790140032768,
0.04244851693511009,
0.1321313977241516,
0.06311002373695374,
-0.06610181927680969,
0.05800639092922211,
-0.017063897103071213,
-0.025158798322081566,
-0.08089055866003036,
-0.0032361752819269896,
-0.04564421623945236,
0.017508696764707565,
-0.04554720222949982,
0.0208311565220356,
-0.0027669058181345463,
-0.02398085780441761,
-0.012580453418195248,
-0.09285270422697067,
-0.06392717361450195,
0.11819753050804138,
-0.0656009092926979,
0.024622097611427307,
-0.09967801719903946,
0.06568791717290878,
0.0663285180926323,
0.0021055233664810658,
-0.12913651764392853,
0.11358033865690231,
0.03071603551506996,
-0.04596305266022682,
0.14677022397518158,
0.0787312239408493,
-0.03133013844490051,
0.008198384195566177,
-0.014916639775037766,
0.018663005903363228,
-0.03184324875473976,
0.01165335439145565,
-0.024323932826519012,
-0.10297811776399612,
-0.007253015413880348,
-0.06624360382556915,
0.1155233234167099,
-0.13092376291751862,
-0.015419457107782364,
0.04422210901975632,
0.1117224246263504,
-0.016298506408929825,
-0.041513592004776,
0.09019997715950012,
0.042592231184244156,
0.032383110374212265,
-0.018109995871782303,
0.020846037194132805,
-0.018776586279273033,
0.0011952966451644897,
0.050227534025907516,
-0.14496614038944244,
-0.15714961290359497,
0.09670478850603104,
0.01487739197909832,
-0.015056856907904148,
0.05791059136390686,
0.019332008436322212,
-0.018204359337687492,
-0.04794805124402046,
0.0027884449809789658,
0.242943674325943,
-0.011562694795429707,
0.062031760811805725,
-0.08331906795501709,
-0.008027218282222748,
0.015591028146445751,
-0.047861792147159576,
-0.0883345678448677,
0.07841312885284424,
0.0072382246144115925,
-0.08075136691331863,
-0.040065035223960876,
0.044057466089725494,
0.07579969614744186,
0.14949633181095123,
0.005964772310107946,
-0.08569908887147903,
-0.029683535918593407,
-0.059127435088157654,
-0.014898923225700855,
0.04444120451807976,
-0.13726718723773956,
-0.021669277921319008,
0.02505224570631981,
0.008852456696331501,
0.05106765404343605,
-0.028396235778927803,
0.04524058476090431,
0.008028417825698853,
-0.05143143981695175,
-0.07047845423221588,
0.0323474295437336,
-0.03124202974140644,
0.037318047136068344,
-0.00976224523037672,
0.00493103452026844,
-0.045789800584316254,
-0.0561540462076664,
-0.1437494307756424,
0.08601560443639755,
-0.06405160576105118,
-0.3180088996887207,
-0.08762853592634201,
-0.0474385991692543,
-0.035871472209692,
0.011386006139218807,
0.04725303128361702,
-0.10888832807540894,
-0.10947652906179428,
-0.0688559040427208,
0.12543189525604248,
-0.035197507590055466,
-0.06354701519012451,
0.11517012119293213,
-0.006782399024814367,
0.028802787885069847,
-0.09941068291664124,
0.015439883805811405,
-0.03815310075879097,
-0.0300814900547266,
-0.029912959784269333,
0.02325248345732689,
0.05643933266401291,
0.12318140268325806,
0.022187048569321632,
-0.0025430729147046804,
0.008043360896408558,
0.2160378098487854,
-0.1355607956647873,
0.07819240540266037,
0.24083170294761658,
-0.057710036635398865,
-0.007776547688990831,
0.13885341584682465,
-0.008200126700103283,
-0.05395832657814026,
0.04282331094145775,
0.0028677000664174557,
-0.01909024454653263,
-0.22314231097698212,
-0.1258171945810318,
-0.04216507822275162,
-0.023969562724232674,
0.047117795795202255,
0.01841152273118496,
-0.004498400259763002,
0.01637214608490467,
-0.08691330999135971,
-0.04133257269859314,
0.056927818804979324,
0.031312618404626846,
0.14874273538589478,
0.006157918833196163,
0.05472269654273987,
-0.04248236119747162,
-0.02429353818297386,
0.10169931501150131,
-0.038310080766677856,
0.041224196553230286,
0.07476415485143661,
0.09439371526241302,
0.06144586205482483,
0.04072173684835434,
0.05498137325048447,
-0.01588856428861618,
-0.019416604191064835,
-0.004276377614587545,
-0.027605604380369186,
-0.062807098031044,
0.013315808959305286,
0.04347486421465874,
0.14841289818286896,
-0.13145896792411804,
-0.11685125529766083,
0.02613569237291813,
0.014134721830487251,
0.1166115254163742,
0.10008015483617783,
-0.024740926921367645,
-0.09339521825313568,
0.034580815583467484,
-0.09393016248941422,
-0.03719416260719299,
0.05629412829875946,
0.07810583710670471,
-0.16225376725196838,
0.09104462713003159,
0.07419844716787338,
0.08865731209516525,
-0.04899245500564575,
0.030847223475575447,
-0.05243672430515289,
0.0547175258398056,
0.003938779234886169,
0.07364257425069809,
-0.17083784937858582,
0.10514713823795319,
0.015377312898635864,
0.08475153893232346,
-0.05717446655035019,
0.02726008929312229,
0.040993962436914444,
0.01050024013966322,
0.1253407895565033,
-0.009875833056867123,
-0.09397665411233902,
0.006300566252321005,
-0.12072015553712845,
0.01867509074509144,
0.060450125485658646,
-0.05952407047152519,
0.0552588514983654,
-0.005029565654695034,
-0.0031706616282463074,
-0.03672555461525917,
-0.007588800508528948,
-0.26221004128456116,
-0.1398588865995407,
0.050235480070114136,
-0.0032930050510913134,
0.06104576587677002,
-0.03972232714295387,
-0.07750692218542099,
-0.1356634497642517,
0.11651279777288437,
0.00601712753996253,
-0.017562810331583023,
-0.07520721107721329,
0.022970806807279587,
0.10156776010990143,
-0.05875518545508385,
0.015125212259590626,
0.04612242430448532,
0.14066781103610992,
-0.06683707982301712,
-0.03811230883002281,
0.0208534337580204,
-0.09790776669979095,
-0.127129465341568,
0.014646535739302635,
0.17899395525455475,
0.10928349196910858,
0.06407073140144348,
0.09332308918237686,
0.021628228947520256,
-0.004270107019692659,
-0.10026927292346954,
0.021517684683203697,
0.024486716836690903,
-0.07141196727752686,
0.04240279272198677,
0.000334674259647727,
-0.25900983810424805,
-0.14758475124835968,
-0.064420685172081,
0.08239448815584183,
0.1838740110397339,
-0.025772150605916977,
0.166573628783226,
0.26846426725387573,
-0.08549943566322327,
-0.22547195851802826,
-0.0413445346057415,
0.0011715557193383574,
0.026517275720834732,
0.04420788958668709,
-0.20614883303642273,
0.09877115488052368,
-0.0030708860140293837,
0.010456079617142677,
-0.06183677911758423,
-0.2156604677438736,
-0.1365271657705307,
0.16908758878707886,
-0.024951105937361717,
0.04103632643818855,
-0.027562899515032768,
-0.06950382888317108,
-0.03514356538653374,
-0.05165527015924454,
0.011386706493794918,
-0.0911189615726471,
0.07457085698843002,
0.0521380677819252,
0.015142791904509068,
0.023944931104779243,
0.01205539982765913,
0.11188336461782455,
0.08440347760915756,
-0.02535966783761978,
-0.08134133368730545,
0.028426721692085266,
0.005225093103945255,
-0.009748471900820732,
0.10800795257091522,
0.0441940575838089,
0.015719834715127945,
-0.05612886697053909,
-0.08660601824522018,
-0.06322401762008667,
0.05958813056349754,
-0.0712270438671112,
-0.014643445611000061,
-0.0557386577129364,
0.0890725776553154,
0.011226953007280827,
0.000234093502513133,
-0.07066605240106583,
-0.09726201742887497,
-0.018323490396142006,
0.12458780407905579,
0.21860919892787933,
-0.049670614302158356,
-0.008925589732825756,
-0.04616877809166908,
-0.04436701908707619,
0.04693678021430969,
-0.0031945551745593548,
0.04370309039950371,
0.052352357655763626,
0.024172067642211914,
0.08892381936311722,
-0.03291066363453865,
-0.1310322880744934,
0.027661962434649467,
0.03575696796178818,
-0.07160241901874542,
-0.19649294018745422,
-0.04809296131134033,
0.0019477802561596036,
-0.021608348935842514,
-0.03375917300581932,
0.19299523532390594,
-0.01786641776561737,
-0.0567183755338192,
0.00428823521360755,
0.05868959426879883,
-0.005707876291126013,
0.1190127432346344,
0.0442071333527565,
0.039071306586265564,
-0.09222394973039627,
0.05361877381801605,
0.11906274408102036,
-0.037089187651872635,
0.04616298899054527,
0.09271709620952606,
-0.04511174559593201,
-0.05466397479176521,
-0.09561453759670258,
-0.0053293523378670216,
0.062396105378866196,
-0.059411972761154175,
-0.00616253400221467,
-0.11060463637113571,
0.010608909651637077,
0.009621703997254372,
0.011655953712761402,
-0.0480525866150856,
-0.04720013588666916,
-0.003570160362869501,
-0.09385064244270325,
0.06399758160114288,
0.09938589483499527,
-0.02844390459358692,
-0.10804833471775055,
0.11068855971097946,
0.017206355929374695,
0.07927548885345459,
-0.036742281168699265,
-0.0644882470369339,
-0.08953018486499786,
-0.0037173572927713394,
-0.10009153187274933,
0.03128072991967201,
-0.1325014978647232,
-0.013824854977428913,
-0.0439717099070549,
-0.03172314912080765,
-0.007825908251106739,
0.06936203688383102,
-0.031176628544926643,
0.0017390010179951787,
-0.029854590073227882,
0.08261895924806595,
-0.12804313004016876,
0.074033223092556,
0.059037644416093826,
-0.045330386608839035,
0.10946562141180038,
0.021364539861679077,
-0.052615512162446976,
0.0360000804066658,
-0.2028910368680954,
-0.05779646709561348,
-0.03368918225169182,
0.044501375406980515,
-0.013587098568677902,
-0.17674922943115234,
0.001772381248883903,
0.018987122923135757,
0.010948105715215206,
-0.01700960285961628,
0.05594506859779358,
-0.030139952898025513,
-0.013086396269500256,
-0.06597152352333069,
-0.060713402926921844,
-0.03499070554971695,
0.06376544386148453,
0.06441809237003326,
0.007487240247428417,
0.09847335517406464,
-0.0892617255449295,
0.07870419323444366,
-0.08220496028661728,
0.027023829519748688,
-0.027728013694286346,
0.025573739781975746,
-0.07042941451072693,
-0.07667803764343262,
0.07940584421157837,
-0.014701586216688156,
0.07657399028539658,
0.029887525364756584,
-0.03298821300268173,
0.04337971657514572,
-0.04662370681762695,
-0.05590838938951492,
0.03681742772459984,
0.13703468441963196,
0.05259164050221443,
0.02151007018983364,
-0.0031842573080211878,
-0.04336560517549515,
0.0021684879902750254,
0.14211244881153107,
0.14498673379421234,
0.16305029392242432,
0.09399791806936264,
0.030934350565075874,
0.07089786231517792,
-0.04773484915494919,
-0.08341773599386215,
0.08657937496900558,
-0.0675637498497963,
0.036221958696842194,
-0.047516901046037674,
-0.061839211732149124,
0.07203412055969238,
-0.1350245177745819,
0.07460065186023712,
-0.027119044214487076,
-0.08531184494495392,
-0.10831426829099655,
-0.14224860072135925,
-0.06530448794364929,
-0.045987844467163086,
0.005682621616870165,
-0.10814574360847473,
0.026728402823209763,
0.00874403677880764,
0.027911443263292313,
-0.09059842675924301,
0.11350142955780029,
-0.11998528242111206,
-0.12674827873706818,
0.14944180846214294,
-0.032343775033950806,
-0.01058576162904501,
-0.0010540700750425458,
0.04417933151125908,
0.020426977425813675,
0.08873771876096725,
0.0523054264485836,
0.050712697207927704,
0.01894211769104004,
0.02849964052438736,
-0.0936475619673729,
-0.06566688418388367,
0.0290159173309803,
-0.017561081796884537,
0.1012396365404129,
0.19436465203762054,
0.08572772890329361,
-0.08005354553461075,
0.00832356046885252,
0.13706953823566437,
0.023388653993606567,
-0.11733122169971466,
-0.14333483576774597,
0.04777674004435539,
-0.0382440909743309,
-0.001011946122162044,
0.005062692333012819,
-0.09107669442892075,
0.023077940568327904,
0.20960871875286102,
0.17262399196624756,
-0.043119534850120544,
0.022692907601594925,
-0.009324333630502224,
0.007454587612301111,
0.02608746662735939,
0.08174959570169449,
0.0867154598236084,
0.1765962839126587,
-0.012863594107329845,
0.04794884845614433,
-0.024943066760897636,
-0.09665166586637497,
-0.1185046061873436,
0.09580408781766891,
0.005345837213099003,
-0.034423936158418655,
-0.00931564811617136,
0.18359431624412537,
-0.10735539346933365,
-0.20909379422664642,
-0.11802922189235687,
-0.040363188832998276,
-0.1169622465968132,
0.023066256195306778,
-0.04288755729794502,
0.13374121487140656,
0.05133439227938652,
-0.004930868744850159,
0.012596780434250832,
0.17811919748783112,
0.038096122443675995,
0.028162160888314247,
-0.02571970596909523,
0.11054255068302155,
-0.09294755011796951,
0.1125151589512825,
-0.001183612970635295,
0.05593268945813179,
0.03360069915652275,
0.035224705934524536,
-0.06600333750247955,
0.030691297724843025,
0.03549858555197716,
-0.005860757082700729,
0.04504893720149994,
0.16818295419216156,
-0.0024307852145284414,
0.09450145810842514,
0.10716117173433304,
-0.06011929735541344,
0.026884786784648895,
-0.017625033855438232,
0.002573988400399685,
-0.060225751250982285,
0.15669697523117065,
-0.14902544021606445,
0.12983518838882446,
0.10348284244537354,
-0.07189297676086426,
-0.046235620975494385,
-0.011122909374535084,
0.04999900981783867,
-0.05803380161523819,
0.09441544860601425,
-0.006060075480490923,
-0.17091257870197296,
0.02376510575413704,
-0.11767957359552383,
0.06685367226600647,
-0.261613667011261,
-0.04328286275267601,
-0.042995698750019073,
-0.01794314570724964,
0.004962373524904251,
0.11010894924402237,
0.08132974803447723,
-0.048651255667209625,
-0.010504274629056454,
-0.03784697875380516,
0.009797703474760056,
0.091661736369133,
-0.08414088934659958,
-0.024531591683626175
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only on **pt** (Portuguese) speech: **17.5k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **pt**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
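Since the checkpoint ships without a tokenizer, it can still be loaded as a bare encoder to extract speech representations. Below is a minimal sketch of that use; the random waveform is only a placeholder, and the feature-extractor settings are assumed defaults rather than a config taken from this repository:

```python
import torch
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

# assumed default 16kHz feature-extractor settings (not a config shipped with this checkpoint)
feature_extractor = Wav2Vec2FeatureExtractor(feature_size=1, sampling_rate=16000, do_normalize=True)
model = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-base-pt-voxpopuli-v2")

waveform = torch.randn(16000)  # placeholder: 1 second of 16kHz mono audio
inputs = feature_extractor(waveform.numpy(), sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    hidden_states = model(inputs.input_values).last_hidden_state  # shape (1, frames, 768)
```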
|
{"language": "pt", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-pt-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"pt",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"pt"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #pt #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only on pt (Portuguese) speech: 17.5k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in pt. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in pt on 17.5k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in pt. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #pt #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in pt on 17.5k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in pt. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
251
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #pt #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in pt on 17.5k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in pt. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07610859721899033,
0.10090933740139008,
-0.00290420139208436,
0.005961683113127947,
0.07533302903175354,
-0.049061305820941925,
0.1321265548467636,
0.04889702424407005,
-0.001602356554940343,
0.09842492640018463,
-0.014226565137505531,
-0.04451153427362442,
0.07539253681898117,
0.13504497706890106,
0.06176644191145897,
-0.2601030766963959,
0.039615314453840256,
-0.05962291732430458,
0.05842443183064461,
0.05340956524014473,
0.12044557929039001,
-0.08477488160133362,
0.028817269951105118,
0.05644429102540016,
-0.030723990872502327,
0.027966462075710297,
-0.04622097313404083,
-0.08247371762990952,
0.05461857467889786,
0.047237101942300797,
-0.034935835748910904,
0.02712896838784218,
0.08975791931152344,
-0.19235946238040924,
0.03623092547059059,
0.04633740335702896,
0.029534483328461647,
0.011885683983564377,
0.10477697104215622,
0.024570610374212265,
0.16143205761909485,
-0.01724093034863472,
0.000893317221198231,
0.08065313845872879,
-0.055308904498815536,
-0.10684389621019363,
-0.06188257783651352,
0.1665666252374649,
0.10024673491716385,
0.10589899122714996,
-0.08041763305664062,
0.09230337291955948,
-0.01989581808447838,
0.04438231885433197,
0.08425088226795197,
-0.18571226298809052,
-0.057574186474084854,
0.0538712814450264,
0.10528060793876648,
0.02464892528951168,
-0.08298055082559586,
0.07081178575754166,
0.05210962891578674,
-0.0131369698792696,
-0.06519144028425217,
-0.03609580174088478,
0.1485036462545395,
-0.10993577539920807,
-0.12042679637670517,
-0.002796172397211194,
0.182109534740448,
0.058750346302986145,
-0.07354338467121124,
-0.1541944444179535,
0.014570736326277256,
0.20701923966407776,
-0.05833200737833977,
-0.09845439344644547,
0.00620085746049881,
0.017584795132279396,
0.04915672540664673,
-0.06689310073852539,
-0.07268523424863815,
-0.005898214876651764,
0.02825857698917389,
0.09421347081661224,
0.02524867281317711,
-0.021020859479904175,
-0.07254379242658615,
0.0017720834584906697,
-0.09897611290216446,
-0.11117848753929138,
-0.007157640531659126,
-0.0700978934764862,
-0.06547500938177109,
-0.03820222616195679,
0.003881596028804779,
-0.09366168826818466,
0.027572570368647575,
0.10280604660511017,
0.06892767548561096,
0.05765357241034508,
-0.06279073655605316,
-0.03177386894822121,
0.12398333102464676,
0.06758145987987518,
-0.12774114310741425,
-0.010604815557599068,
0.011546929366886616,
-0.019335387274622917,
0.0004937750636599958,
-0.03461434319615364,
-0.03205244988203049,
0.01706613600254059,
-0.009491357952356339,
0.04755399003624916,
0.05900222435593605,
-0.0315648652613163,
-0.03462574630975723,
-0.09816284477710724,
0.1000557392835617,
-0.07852349430322647,
0.026536013931035995,
0.04819871112704277,
-0.004206845071166754,
0.10096658766269684,
-0.05795235186815262,
0.08015328645706177,
-0.11564021557569504,
0.003377641784027219,
-0.026996642351150513,
-0.005536204669624567,
0.02276855707168579,
-0.02528228797018528,
0.032637692987918854,
-0.0019460220355540514,
0.0022215074859559536,
-0.11677592992782593,
0.00880083441734314,
-0.10264191031455994,
-0.026370136067271233,
-0.08101989328861237,
-0.04767888784408569,
-0.04375138506293297,
0.019627828150987625,
-0.008385831490159035,
-0.007760579232126474,
0.01768356002867222,
-0.0176265649497509,
-0.003082470502704382,
0.005728567019104958,
0.04384680837392807,
0.06482003629207611,
0.08261830359697342,
-0.020447226241230965,
-0.018248029053211212,
-0.10906892269849777,
0.11198367178440094,
-0.07770424336194992,
-0.024899905547499657,
-0.143252894282341,
-0.04402147978544235,
-0.041935913264751434,
0.032563187181949615,
0.009471092373132706,
0.12659050524234772,
-0.1723635047674179,
-0.07195162773132324,
0.12509529292583466,
-0.12519340217113495,
0.0047393497079610825,
0.17933295667171478,
0.0024482521694153547,
0.0813852995634079,
0.10003267973661423,
0.21835239231586456,
0.026686087250709534,
-0.17989318072795868,
-0.0071047511883080006,
-0.05058060958981514,
0.03891898691654205,
0.12833568453788757,
0.06211165338754654,
-0.06779148429632187,
0.05180568993091583,
-0.016309184953570366,
-0.0374847948551178,
-0.08359718322753906,
-0.005821710452437401,
-0.045988619327545166,
0.01760946214199066,
-0.046674299985170364,
0.022951913997530937,
-0.0016890506958588958,
-0.022416146472096443,
-0.010890222154557705,
-0.09282811731100082,
-0.059868618845939636,
0.11960025876760483,
-0.06398407369852066,
0.025446567684412003,
-0.10116463154554367,
0.060042645782232285,
0.06489972770214081,
0.004892778117209673,
-0.12583337724208832,
0.12047129124403,
0.031307559460401535,
-0.049010612070560455,
0.14333346486091614,
0.08360377699136734,
-0.027996104210615158,
0.003005994949489832,
-0.015383433550596237,
0.017605027183890343,
-0.028123443946242332,
0.01396419107913971,
-0.028247816488146782,
-0.10524030774831772,
-0.0036701345816254616,
-0.06417717039585114,
0.11069824546575546,
-0.12772004306316376,
-0.0124362763017416,
0.04146120697259903,
0.10995353758335114,
-0.014426413923501968,
-0.04184119030833244,
0.08981916308403015,
0.043321527540683746,
0.03494354337453842,
-0.01769699901342392,
0.02325371466577053,
-0.019816039130091667,
0.005480356514453888,
0.04893158748745918,
-0.14656826853752136,
-0.14940214157104492,
0.09842582792043686,
0.017237456515431404,
-0.017091043293476105,
0.05950358882546425,
0.019773289561271667,
-0.019695132970809937,
-0.046915020793676376,
0.0007707744371145964,
0.2300928831100464,
-0.011978354305028915,
0.06009818613529205,
-0.08423281461000443,
-0.011907076463103294,
0.019541172310709953,
-0.04958038404583931,
-0.0913580134510994,
0.08134534955024719,
0.010287478566169739,
-0.06778450310230255,
-0.04160714149475098,
0.048999808728694916,
0.07740272581577301,
0.14881537854671478,
0.008322460576891899,
-0.08940727263689041,
-0.029955532401800156,
-0.05787673220038414,
-0.012322760187089443,
0.03712315112352371,
-0.14915893971920013,
-0.021886521950364113,
0.027662860229611397,
0.009580099023878574,
0.052047815173864365,
-0.025996223092079163,
0.04258952662348747,
0.007930475287139416,
-0.0478597991168499,
-0.0718357264995575,
0.037779487669467926,
-0.03061298094689846,
0.039487581700086594,
-0.010333182290196419,
0.0033502441365271807,
-0.047636695206165314,
-0.05471447855234146,
-0.1449177861213684,
0.08464989066123962,
-0.06587457656860352,
-0.321437269449234,
-0.08341876417398453,
-0.04184633493423462,
-0.030531682074069977,
0.01094320509582758,
0.05115776136517525,
-0.11480754613876343,
-0.10809361189603806,
-0.06864668428897858,
0.1235494315624237,
-0.035492200404405594,
-0.06057249754667282,
0.1105920597910881,
-0.0047114393673837185,
0.02936726063489914,
-0.0965207889676094,
0.017626209184527397,
-0.03861003741621971,
-0.03897476941347122,
-0.028503552079200745,
0.0176237840205431,
0.05830688774585724,
0.130454882979393,
0.026195470243692398,
-0.002194041619077325,
0.007362855598330498,
0.217384472489357,
-0.13996070623397827,
0.08187230676412582,
0.2387552559375763,
-0.05482694134116173,
-0.008124416694045067,
0.13511262834072113,
-0.01075579784810543,
-0.053811416029930115,
0.04692834988236427,
0.005313246510922909,
-0.020159240812063217,
-0.22437340021133423,
-0.1208990141749382,
-0.04076552763581276,
-0.018033495172858238,
0.05068822577595711,
0.01757541298866272,
0.0016834925627335906,
0.017800752073526382,
-0.08834709227085114,
-0.041968755424022675,
0.06369155645370483,
0.03282413259148598,
0.14000359177589417,
0.002836367813870311,
0.05653566122055054,
-0.041240986436605453,
-0.023986104875802994,
0.10286752134561539,
-0.033138107508420944,
0.04185008257627487,
0.07374190539121628,
0.08779235184192657,
0.06640750169754028,
0.049709927290678024,
0.053467486053705215,
-0.01574227586388588,
-0.021398795768618584,
-0.00213029608130455,
-0.02903430163860321,
-0.06249169260263443,
0.016887623816728592,
0.04317194223403931,
0.14840024709701538,
-0.1383221447467804,
-0.11930268257856369,
0.025202147662639618,
0.01589348167181015,
0.12407612055540085,
0.09766469895839691,
-0.023064106702804565,
-0.09025317430496216,
0.034397225826978683,
-0.09002364426851273,
-0.03524583950638771,
0.05698404833674431,
0.07909022271633148,
-0.16391406953334808,
0.0904371589422226,
0.0736732929944992,
0.08887595683336258,
-0.049310192465782166,
0.03161604702472687,
-0.05139369145035744,
0.04910976067185402,
0.0019520141649991274,
0.0713244304060936,
-0.18041394650936127,
0.10510188341140747,
0.013610277324914932,
0.08563752472400665,
-0.056503571569919586,
0.023829929530620575,
0.03746464475989342,
0.011982915922999382,
0.13006451725959778,
-0.01123733725398779,
-0.08961649239063263,
-0.001405928866006434,
-0.11827542632818222,
0.018299050629138947,
0.05718550086021423,
-0.06114768236875534,
0.051751770079135895,
-0.004287141375243664,
-0.0042731971479952335,
-0.03437940403819084,
-0.004069285932928324,
-0.26097843050956726,
-0.13393831253051758,
0.05107033997774124,
-0.0096414340659976,
0.04785040393471718,
-0.04068724438548088,
-0.0761231854557991,
-0.13771271705627441,
0.10429205000400543,
-0.009145215153694153,
-0.017719177529215813,
-0.0722327008843422,
0.01593378745019436,
0.09998416900634766,
-0.05936283618211746,
0.015030907467007637,
0.049028780311346054,
0.14299890398979187,
-0.06427710503339767,
-0.0380798876285553,
0.02402380108833313,
-0.10008450597524643,
-0.13010278344154358,
0.01795271784067154,
0.18111369013786316,
0.11233562231063843,
0.06393580138683319,
0.09322930872440338,
0.01698087900876999,
-0.0024847027380019426,
-0.09875787049531937,
0.025643151253461838,
0.02522869035601616,
-0.08105725049972534,
0.037838391959667206,
0.0005352801526896656,
-0.2664231061935425,
-0.15082037448883057,
-0.06279018521308899,
0.07412995398044586,
0.18531082570552826,
-0.026083575561642647,
0.16596129536628723,
0.2697666585445404,
-0.08883026242256165,
-0.2232847958803177,
-0.03857943415641785,
-0.000156737063662149,
0.01732231304049492,
0.04789198562502861,
-0.20441009104251862,
0.0991743803024292,
0.0017265318892896175,
0.011026347987353802,
-0.06637447327375412,
-0.2243916392326355,
-0.13839298486709595,
0.1786261349916458,
-0.024567510932683945,
0.041323017328977585,
-0.02890399843454361,
-0.06864124536514282,
-0.03166218101978302,
-0.05157291516661644,
0.009627247229218483,
-0.10099788010120392,
0.07104617357254028,
0.05165248364210129,
0.013081962242722511,
0.024955498054623604,
0.012451752088963985,
0.1091085746884346,
0.09145978838205338,
-0.023039406165480614,
-0.08164642006158829,
0.026204485446214676,
-0.005943332798779011,
-0.010004599578678608,
0.10707313567399979,
0.05766190215945244,
0.013473814353346825,
-0.05359199270606041,
-0.08596765249967575,
-0.06742680817842484,
0.060179878026247025,
-0.07310374081134796,
-0.016368843615055084,
-0.055094361305236816,
0.09031566232442856,
0.010075654834508896,
-0.0010698450496420264,
-0.07183117419481277,
-0.09844297915697098,
-0.013757403939962387,
0.12490908801555634,
0.21846726536750793,
-0.04788198694586754,
-0.010579302906990051,
-0.045295439660549164,
-0.046201951801776886,
0.049091897904872894,
-0.004017913714051247,
0.04474245756864548,
0.050204746425151825,
0.0221259705722332,
0.09003135561943054,
-0.03383726254105568,
-0.13447442650794983,
0.026685455814003944,
0.03565124422311783,
-0.06996788084506989,
-0.19588932394981384,
-0.0453573614358902,
-0.007350131869316101,
-0.0205264650285244,
-0.031824372708797455,
0.1930210143327713,
-0.01900619827210903,
-0.05690645053982735,
0.004017008934170008,
0.06349723786115646,
-0.0047559915110468864,
0.12691396474838257,
0.04623721167445183,
0.038733989000320435,
-0.09215739369392395,
0.05531531572341919,
0.11783956736326218,
-0.037304606288671494,
0.04649297893047333,
0.08642348647117615,
-0.044624730944633484,
-0.05589790269732475,
-0.09019441157579422,
-0.009092976339161396,
0.0648784413933754,
-0.058958880603313446,
-0.0012634180020540953,
-0.10496467351913452,
0.013211384415626526,
-0.0015363438287749887,
0.01005297526717186,
-0.04681766405701637,
-0.042753174901008606,
-0.0010090115247294307,
-0.09378428012132645,
0.06506732106208801,
0.1013028472661972,
-0.029296323657035828,
-0.10992588847875595,
0.11726700514554977,
0.018736597150564194,
0.07792653888463974,
-0.03682761639356613,
-0.06361357867717743,
-0.09197408705949783,
-0.0010227154707536101,
-0.08796795457601547,
0.03138662874698639,
-0.1353328675031662,
-0.013598024845123291,
-0.04326848313212395,
-0.03317348286509514,
-0.01063778530806303,
0.06969410181045532,
-0.032415054738521576,
0.004768026992678642,
-0.02775135636329651,
0.08680817484855652,
-0.12794482707977295,
0.07518624514341354,
0.059505246579647064,
-0.045943181961774826,
0.10839860886335373,
0.017551414668560028,
-0.05450005456805229,
0.04069668799638748,
-0.20661823451519012,
-0.052339326590299606,
-0.031936366111040115,
0.043383222073316574,
-0.012824391014873981,
-0.173471599817276,
0.00253778207115829,
0.01766841858625412,
0.010951288975775242,
-0.016221268102526665,
0.05276158079504967,
-0.032236248254776,
-0.01169661246240139,
-0.06629752367734909,
-0.061060819774866104,
-0.034463848918676376,
0.0641510933637619,
0.06619778275489807,
0.00754997693002224,
0.09599664807319641,
-0.09143590182065964,
0.0777786448597908,
-0.08246564865112305,
0.025414420291781425,
-0.026673495769500732,
0.02340642176568508,
-0.05823274329304695,
-0.07153693586587906,
0.08125419169664383,
-0.015275091864168644,
0.07404933124780655,
0.029525091871619225,
-0.027536604553461075,
0.044091448187828064,
-0.0509360209107399,
-0.06465966999530792,
0.0380430743098259,
0.13406261801719666,
0.05236852541565895,
0.019878851249814034,
-0.0026923948898911476,
-0.0414663702249527,
0.0013733112718909979,
0.1481001228094101,
0.14667564630508423,
0.16406455636024475,
0.09469791501760483,
0.029078802093863487,
0.07043857872486115,
-0.045885905623435974,
-0.08538725227117538,
0.09033721685409546,
-0.06755343079566956,
0.04221576824784279,
-0.04974277690052986,
-0.0689270868897438,
0.0752856656908989,
-0.13758154213428497,
0.07261747866868973,
-0.026744946837425232,
-0.08397188782691956,
-0.11362127214670181,
-0.14419393241405487,
-0.06575793772935867,
-0.048405181616544724,
0.0035024003591388464,
-0.1087622195482254,
0.022104792296886444,
0.003700655186548829,
0.031506411731243134,
-0.09119462966918945,
0.11374126374721527,
-0.1156875416636467,
-0.12658585608005524,
0.15379638969898224,
-0.03534741327166557,
-0.0169411338865757,
-0.0075191897340118885,
0.04293879121541977,
0.024267490953207016,
0.09211782366037369,
0.051833633333444595,
0.04483532905578613,
0.012710588984191418,
0.027166571468114853,
-0.0963662639260292,
-0.0670623630285263,
0.029190408065915108,
-0.017116576433181763,
0.09712690860033035,
0.19203011691570282,
0.08817862719297409,
-0.08181913197040558,
0.009246512316167355,
0.13967664539813995,
0.024865707382559776,
-0.11083338409662247,
-0.14878229796886444,
0.03889797255396843,
-0.0350281298160553,
0.00000809237826615572,
0.0083429841324687,
-0.08945634216070175,
0.02136208489537239,
0.21510571241378784,
0.17287206649780273,
-0.035147689282894135,
0.021682534366846085,
-0.007787539158016443,
0.009281344711780548,
0.02470329776406288,
0.08682327717542648,
0.0903225690126419,
0.16985775530338287,
-0.010039164684712887,
0.05373217910528183,
-0.019099269062280655,
-0.09621642529964447,
-0.11319532990455627,
0.09104354679584503,
0.011033969931304455,
-0.032804012298583984,
-0.007966317236423492,
0.18864350020885468,
-0.10579031705856323,
-0.21576809883117676,
-0.1213524341583252,
-0.0323503315448761,
-0.11621490120887756,
0.024940185248851776,
-0.045833755284547806,
0.13132038712501526,
0.05055892467498779,
-0.0032837579492479563,
0.013142420910298824,
0.18155892193317413,
0.038872040808200836,
0.029880477115511894,
-0.037471722811460495,
0.11150119453668594,
-0.08649229258298874,
0.11485777795314789,
-0.0016853297129273415,
0.04943494126200676,
0.0311385840177536,
0.033944230526685715,
-0.06790155172348022,
0.0294051356613636,
0.034755099564790726,
-0.0006330108153633773,
0.04473673924803734,
0.1687471717596054,
-0.004940842278301716,
0.09415434300899506,
0.109861359000206,
-0.06708745658397675,
0.02330833487212658,
-0.023137874901294708,
0.005515942815691233,
-0.06292007863521576,
0.16133233904838562,
-0.15102730691432953,
0.1255439966917038,
0.10472944378852844,
-0.07253215461969376,
-0.046284668147563934,
-0.013739206828176975,
0.05187705531716347,
-0.063288114964962,
0.09908737987279892,
-0.004946399480104446,
-0.1677941232919693,
0.02367934212088585,
-0.12800590693950653,
0.06909280270338058,
-0.2577019929885864,
-0.04002333804965019,
-0.04439409449696541,
-0.020212870091199875,
0.0009005667525343597,
0.11209259182214737,
0.0745314285159111,
-0.04674028232693672,
-0.010791667737066746,
-0.04086359590291977,
0.008891668170690536,
0.09470468014478683,
-0.08477167040109634,
-0.020994193851947784
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only on **ro** (Romanian) speech: **17.9k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **ro**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
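For fine-tuning, one would typically build a character-level tokenizer from labeled **ro** text and load the checkpoint with a freshly initialized CTC head. A hedged sketch of that setup, where `vocab.json` is assumed to be a file you create from your own Romanian transcripts:

```python
from transformers import Wav2Vec2CTCTokenizer, Wav2Vec2ForCTC

# vocab.json is an assumed, user-built character vocabulary for Romanian
tokenizer = Wav2Vec2CTCTokenizer(
    "vocab.json", unk_token="[UNK]", pad_token="[PAD]", word_delimiter_token="|"
)

# the CTC head on top of the pretrained encoder is newly initialized and must be trained
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-base-ro-voxpopuli-v2",
    ctc_loss_reduction="mean",
    pad_token_id=tokenizer.pad_token_id,
    vocab_size=len(tokenizer),
)
model.freeze_feature_encoder()  # common practice: keep the convolutional feature encoder frozen
```

On older `transformers` releases the last call is named `freeze_feature_extractor` instead of `freeze_feature_encoder`.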
|
{"language": "ro", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-ro-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"ro",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"ro"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #ro #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only on ro (Romanian) speech: 17.9k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in ro. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in ro on 17.9k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in ro. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #ro #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in ro on 17.9k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in ro. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
251
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #ro #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in ro on 17.9k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in ro. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07003821432590485,
0.10125242173671722,
-0.0027908440679311752,
0.005005856975913048,
0.07894091308116913,
-0.051427699625492096,
0.13243773579597473,
0.04606064781546593,
-0.004446492064744234,
0.09741467237472534,
-0.013215779326856136,
-0.04360103979706764,
0.07463198900222778,
0.13406500220298767,
0.05678313970565796,
-0.2632535994052887,
0.03491852432489395,
-0.06944793462753296,
0.058846332132816315,
0.04948864504694939,
0.123969167470932,
-0.08127613365650177,
0.03390193358063698,
0.05548246204853058,
-0.034407153725624084,
0.03437706455588341,
-0.04536137729883194,
-0.08400769531726837,
0.05592815577983856,
0.0507533997297287,
-0.025455212220549583,
0.025006644427776337,
0.09711821377277374,
-0.19151364266872406,
0.03799930587410927,
0.042685914784669876,
0.027328526601195335,
0.015109874308109283,
0.09957260638475418,
0.011966770514845848,
0.1672525852918625,
-0.0219432320445776,
-0.006446768995374441,
0.08171143382787704,
-0.05591098591685295,
-0.0970611572265625,
-0.06805504113435745,
0.15012310445308685,
0.09369104355573654,
0.10805131494998932,
-0.07890915870666504,
0.08184230327606201,
-0.02056039683520794,
0.04708973318338394,
0.0962628722190857,
-0.1786225140094757,
-0.058364104479551315,
0.06373213231563568,
0.10780815035104752,
0.0180123932659626,
-0.0941159650683403,
0.0737638846039772,
0.05199025943875313,
-0.01414693146944046,
-0.06716440618038177,
-0.03469438478350639,
0.12962402403354645,
-0.10533970594406128,
-0.11636154353618622,
0.002668998437002301,
0.16371259093284607,
0.05850072577595711,
-0.07246284931898117,
-0.14532138407230377,
0.011104756966233253,
0.2039337158203125,
-0.0565970242023468,
-0.08908382058143616,
0.00790421199053526,
0.01410535629838705,
0.04015889763832092,
-0.07123848795890808,
-0.06948687881231308,
-0.004792503081262112,
0.024857329204678535,
0.11066839098930359,
0.021161403506994247,
-0.018214445561170578,
-0.07445713132619858,
-0.0014139265986159444,
-0.09370056539773941,
-0.11570426821708679,
-0.007621330674737692,
-0.06894011050462723,
-0.06878357380628586,
-0.029578369110822678,
0.002603924134746194,
-0.0858192965388298,
0.03001958504319191,
0.10848306119441986,
0.06450308114290237,
0.05595066770911217,
-0.0533200241625309,
-0.034335728734731674,
0.1202496886253357,
0.0692615732550621,
-0.11891806870698929,
-0.02660258486866951,
0.016278091818094254,
-0.02028363198041916,
0.005713487509638071,
-0.0360325388610363,
-0.0311537254601717,
0.018102072179317474,
-0.021645506843924522,
0.04490641877055168,
0.058837972581386566,
-0.03387660160660744,
-0.03186026215553284,
-0.08711166679859161,
0.08962477743625641,
-0.0816296935081482,
0.02762513980269432,
0.049081433564424515,
-0.009616081602871418,
0.09385336935520172,
-0.06126931682229042,
0.07960548251867294,
-0.10880012065172195,
-0.0023824002128094435,
-0.02414175495505333,
-0.006579983048141003,
0.017109263688325882,
-0.030752433463931084,
0.03198695927858353,
-0.011168052442371845,
0.006993849296122789,
-0.11919616907835007,
0.010078978724777699,
-0.10530812293291092,
-0.024406488984823227,
-0.08517895638942719,
-0.04291470721364021,
-0.05122276395559311,
0.01604306884109974,
-0.004067922476679087,
-0.003159263404086232,
0.020775562152266502,
-0.012684267945587635,
-0.00558594660833478,
0.0128393042832613,
0.04514462873339653,
0.06385671347379684,
0.08172257989645004,
-0.021505624055862427,
-0.020965583622455597,
-0.09607663750648499,
0.1147213876247406,
-0.08153434842824936,
-0.02270178310573101,
-0.14316727221012115,
-0.04099342226982117,
-0.03879319876432419,
0.03192244842648506,
0.013482789508998394,
0.11884784698486328,
-0.16147640347480774,
-0.06924387812614441,
0.11920107901096344,
-0.11924288421869278,
0.02327398769557476,
0.18112416565418243,
-0.0028459227178245783,
0.07696274667978287,
0.10062942653894424,
0.2198970913887024,
0.020707380026578903,
-0.17453768849372864,
-0.013981427997350693,
-0.04903551936149597,
0.0380830317735672,
0.13120290637016296,
0.06164535507559776,
-0.05939966440200806,
0.05402740091085434,
-0.01592337153851986,
-0.026772186160087585,
-0.07803888618946075,
-0.005810316652059555,
-0.046120092272758484,
0.022353462874889374,
-0.04255049303174019,
0.02168937586247921,
-0.0013511518482118845,
-0.021001461893320084,
-0.014093215577304363,
-0.09485866874456406,
-0.07590700685977936,
0.12436970323324203,
-0.06597528606653214,
0.023938458412885666,
-0.1011778712272644,
0.06279495358467102,
0.06586988270282745,
0.004996707662940025,
-0.12824486196041107,
0.1168452724814415,
0.028061198070645332,
-0.04411829635500908,
0.14509445428848267,
0.08608604967594147,
-0.0291863102465868,
0.006197639275342226,
-0.01305113174021244,
0.018794970586895943,
-0.03114493191242218,
0.011109837330877781,
-0.023055842146277428,
-0.10111577063798904,
-0.00922184344381094,
-0.0699680969119072,
0.11323300004005432,
-0.14681376516819,
-0.01610751822590828,
0.03590243682265282,
0.10979531705379486,
-0.012704779393970966,
-0.03822171688079834,
0.08958376199007034,
0.042636655271053314,
0.028900889679789543,
-0.019138414412736893,
0.020353591069579124,
-0.02183547057211399,
0.005910934880375862,
0.051319628953933716,
-0.14694896340370178,
-0.16819611191749573,
0.0912165716290474,
0.011801245622336864,
-0.017816059291362762,
0.07141754031181335,
0.01933445781469345,
-0.017354896292090416,
-0.051859837025403976,
0.006103387102484703,
0.23733942210674286,
-0.013195819221436977,
0.06227705627679825,
-0.07966448366641998,
-0.0049167415127158165,
0.01822279952466488,
-0.055159129202365875,
-0.08867327123880386,
0.09134604036808014,
0.0032200422137975693,
-0.08612582832574844,
-0.04086017236113548,
0.03740346431732178,
0.06620950996875763,
0.1512182205915451,
0.012925648130476475,
-0.08529392629861832,
-0.03142232075333595,
-0.0624857023358345,
-0.012814678251743317,
0.033629633486270905,
-0.12028959393501282,
-0.02483191154897213,
0.026622742414474487,
0.005744528491050005,
0.05011243373155594,
-0.026393530890345573,
0.04507163539528847,
0.007975011132657528,
-0.04824848473072052,
-0.07407515496015549,
0.039699453860521317,
-0.033136531710624695,
0.03837951272726059,
-0.012557419016957283,
-0.006814764346927404,
-0.049800872802734375,
-0.05817079544067383,
-0.13685008883476257,
0.0877850130200386,
-0.06124451011419296,
-0.3063237965106964,
-0.08713914453983307,
-0.06511955708265305,
-0.029045192524790764,
0.010333424434065819,
0.04988056793808937,
-0.11315789818763733,
-0.10908466577529907,
-0.06646975874900818,
0.12320137768983841,
-0.03470602259039879,
-0.0688483715057373,
0.11174648255109787,
-0.0010609125019982457,
0.022796835750341415,
-0.09883782267570496,
0.01789776422083378,
-0.0426890105009079,
-0.03961896896362305,
-0.024866243824362755,
0.013736961409449577,
0.05957872420549393,
0.11550471931695938,
0.02205648273229599,
-0.005460753571242094,
0.006269071251153946,
0.20790661871433258,
-0.13700605928897858,
0.07802312821149826,
0.2367534637451172,
-0.04927248880267143,
-0.004353798925876617,
0.13460060954093933,
-0.013829193077981472,
-0.05133849382400513,
0.044419266283512115,
0.005794590804725885,
-0.024637414142489433,
-0.2291867733001709,
-0.11661923676729202,
-0.04303204268217087,
-0.025238532572984695,
0.048595085740089417,
0.017574476078152657,
-0.009746421128511429,
0.015499838627874851,
-0.08550594002008438,
-0.04388315603137016,
0.0642903670668602,
0.031414736062288284,
0.1395220309495926,
0.009327187202870846,
0.05435212701559067,
-0.04767235368490219,
-0.023716863244771957,
0.10020450502634048,
-0.032567013055086136,
0.053193099796772,
0.07075811922550201,
0.09752151370048523,
0.06138323247432709,
0.04725106433033943,
0.05952385440468788,
-0.014548377133905888,
-0.021436480805277824,
-0.0004193889326415956,
-0.03193610906600952,
-0.06152556091547012,
0.014529540203511715,
0.04405638948082924,
0.14248500764369965,
-0.13571341335773468,
-0.11394744366407394,
0.03366228938102722,
0.009296473115682602,
0.12410075962543488,
0.10036534070968628,
-0.03363754600286484,
-0.09111262857913971,
0.033707961440086365,
-0.0914071574807167,
-0.03176604583859444,
0.05754691734910011,
0.07823269069194794,
-0.16350439190864563,
0.09289556741714478,
0.07056274265050888,
0.08930196613073349,
-0.040734633803367615,
0.02923632599413395,
-0.0599241703748703,
0.06473230570554733,
0.0028508168179541826,
0.07580164819955826,
-0.16468022763729095,
0.11378080397844315,
0.012661300599575043,
0.08565175533294678,
-0.05138830095529556,
0.023024512454867363,
0.043226610869169235,
0.0008262548944912851,
0.12863650918006897,
-0.010837961919605732,
-0.10861006379127502,
-0.006595311686396599,
-0.1195794865489006,
0.020999401807785034,
0.060314077883958817,
-0.06273005157709122,
0.05960068479180336,
-0.008495880290865898,
-0.00839870423078537,
-0.03896024823188782,
-0.016538754105567932,
-0.24939827620983124,
-0.1403684765100479,
0.044479984790086746,
-0.003690890269353986,
0.04791169613599777,
-0.03800845146179199,
-0.07684081047773361,
-0.12402962148189545,
0.10383599996566772,
-0.004054916091263294,
-0.01021488942205906,
-0.07514636218547821,
0.0313090942800045,
0.10040703415870667,
-0.06532936543226242,
0.009526083245873451,
0.04454555734992027,
0.1425946205854416,
-0.06806854158639908,
-0.041951946914196014,
0.020972706377506256,
-0.10122387856245041,
-0.13057652115821838,
0.021338902413845062,
0.1733512133359909,
0.11635171622037888,
0.06606319546699524,
0.09308411180973053,
0.02288075163960457,
-0.008921447210013866,
-0.09466290473937988,
0.021803654730319977,
0.027520736679434776,
-0.07688573747873306,
0.04672816023230553,
0.0043103438802063465,
-0.2773377299308777,
-0.1522703766822815,
-0.0611453503370285,
0.07860534638166428,
0.18675343692302704,
-0.027342084795236588,
0.1703367531299591,
0.2760113477706909,
-0.08902841806411743,
-0.22509030997753143,
-0.04878945276141167,
0.0014889466110616922,
0.02869739569723606,
0.05361257120966911,
-0.2033553570508957,
0.09340782463550568,
-0.006680368445813656,
0.011248674243688583,
-0.07303134351968765,
-0.21047362685203552,
-0.1296539008617401,
0.1692124903202057,
-0.027370283380150795,
0.05009028688073158,
-0.02258358895778656,
-0.06350365281105042,
-0.033388204872608185,
-0.04192483052611351,
0.00849731732159853,
-0.09355244785547256,
0.07642629742622375,
0.05454040691256523,
0.024324098601937294,
0.024330612272024155,
0.014235027134418488,
0.10634231567382812,
0.08129983395338058,
-0.02413955330848694,
-0.08037018775939941,
0.011466696858406067,
0.004680287558585405,
-0.014154694974422455,
0.10023927688598633,
0.04675377905368805,
0.016083313152194023,
-0.0550585575401783,
-0.08560148626565933,
-0.06464405357837677,
0.06246770918369293,
-0.06952536851167679,
-0.00790588278323412,
-0.05846472084522247,
0.09234961867332458,
0.014345388859510422,
0.002966441446915269,
-0.056008778512477875,
-0.0960196778178215,
-0.015508663840591908,
0.125593900680542,
0.2104243040084839,
-0.03289434313774109,
-0.0005608086357824504,
-0.045569710433483124,
-0.04314246028661728,
0.0468914620578289,
-0.010002536699175835,
0.0389702133834362,
0.04941742122173309,
0.02399315871298313,
0.08732375502586365,
-0.03550872951745987,
-0.1372462660074234,
0.03610823675990105,
0.037051476538181305,
-0.06465009599924088,
-0.18743109703063965,
-0.04419511556625366,
-0.0012121122563257813,
-0.02162037044763565,
-0.03204703330993652,
0.19855985045433044,
-0.018450377508997917,
-0.058300938457250595,
0.0007308107451535761,
0.06869577616453171,
-0.005088415462523699,
0.12072347849607468,
0.04535909742116928,
0.03920988366007805,
-0.09174198657274246,
0.05069172754883766,
0.1227332279086113,
-0.03380802273750305,
0.048945773392915726,
0.09302489459514618,
-0.04528568685054779,
-0.0534878633916378,
-0.10057162493467331,
-0.004559227731078863,
0.0572938472032547,
-0.05822233110666275,
-0.0037563000805675983,
-0.09766068309545517,
0.007218662649393082,
-0.003578576724976301,
0.01278297882527113,
-0.04946846887469292,
-0.04553784430027008,
-0.004112392198294401,
-0.09761228412389755,
0.06329459697008133,
0.10262677818536758,
-0.032524727284908295,
-0.10537117719650269,
0.09688341617584229,
0.019073059782385826,
0.08422466367483139,
-0.037250254303216934,
-0.06380604952573776,
-0.09095660597085953,
-0.002903927816078067,
-0.08352077752351761,
0.038188636302948,
-0.13353313505649567,
-0.010456115938723087,
-0.04926150664687157,
-0.03444343060255051,
-0.013933146372437477,
0.07029233127832413,
-0.03373832628130913,
0.004559885244816542,
-0.029576117172837257,
0.08696021139621735,
-0.12516658008098602,
0.07132347673177719,
0.060555677860975266,
-0.044259972870349884,
0.10532154887914658,
0.022195078432559967,
-0.048357605934143066,
0.03859551623463631,
-0.2125362604856491,
-0.05944083258509636,
-0.03465935215353966,
0.041489407420158386,
-0.009877397678792477,
-0.17891034483909607,
0.0019137318013235927,
0.013291608542203903,
0.016859153285622597,
-0.021511701866984367,
0.049980711191892624,
-0.03471913933753967,
-0.006087072193622589,
-0.07409501820802689,
-0.06602701544761658,
-0.038208842277526855,
0.06720785051584244,
0.05886179208755493,
0.011323493905365467,
0.10181045532226562,
-0.08849430084228516,
0.08251703530550003,
-0.07766766101121902,
0.02723991870880127,
-0.03083595633506775,
0.01846347190439701,
-0.07024470716714859,
-0.07704073935747147,
0.0811309814453125,
-0.015073484741151333,
0.06568586826324463,
0.035604458302259445,
-0.027046693488955498,
0.03992485627532005,
-0.06049870699644089,
-0.06300785392522812,
0.035772982984781265,
0.12299122661352158,
0.05408525466918945,
0.021652216091752052,
0.0031977775506675243,
-0.04565390199422836,
0.0054975938983261585,
0.14958466589450836,
0.1428363174200058,
0.17544947564601898,
0.10665513575077057,
0.031555693596601486,
0.07010035961866379,
-0.05319250002503395,
-0.08089364320039749,
0.09422440081834793,
-0.07128097862005234,
0.03468099609017372,
-0.05037180706858635,
-0.07563798129558563,
0.08670917898416519,
-0.13090495765209198,
0.07091783732175827,
-0.029502911493182182,
-0.0809779241681099,
-0.11501454561948776,
-0.14746010303497314,
-0.06971445679664612,
-0.04237695410847664,
0.008466220460832119,
-0.1051102951169014,
0.030136164277791977,
0.011354438029229641,
0.0279300007969141,
-0.08633241057395935,
0.1062973216176033,
-0.11659449338912964,
-0.12672528624534607,
0.14972908794879913,
-0.035725630819797516,
-0.013516026549041271,
0.00440853089094162,
0.04041515663266182,
0.026544587686657906,
0.09205673635005951,
0.05424928665161133,
0.04726775735616684,
0.023337122052907944,
0.03527890518307686,
-0.09706752002239227,
-0.07010176032781601,
0.02863134816288948,
-0.011170286685228348,
0.10522495955228806,
0.18670500814914703,
0.08882477134466171,
-0.08051953464746475,
0.010507878847420216,
0.1480407565832138,
0.024431709200143814,
-0.10864107310771942,
-0.14476542174816132,
0.03761928528547287,
-0.03285733610391617,
0.00005736263483413495,
-0.0007137876818887889,
-0.08878806233406067,
0.02128741890192032,
0.20638494193553925,
0.18683773279190063,
-0.02743603102862835,
0.020013414323329926,
-0.01363281812518835,
0.009711670689284801,
0.024666931480169296,
0.08087514340877533,
0.08675297349691391,
0.1768788844347,
-0.013888323679566383,
0.04483393207192421,
-0.018419576808810234,
-0.09287161380052567,
-0.11435452103614807,
0.09393267333507538,
0.006104767322540283,
-0.03288833424448967,
-0.005502105690538883,
0.1861969381570816,
-0.10213260352611542,
-0.23500189185142517,
-0.11577049642801285,
-0.04071243107318878,
-0.11621630191802979,
0.02557843178510666,
-0.03876012563705444,
0.13875454664230347,
0.05788617581129074,
-0.00601015891879797,
0.006989980582147837,
0.17691367864608765,
0.036342818289995193,
0.02732611633837223,
-0.02843705378472805,
0.10986028611660004,
-0.08926278352737427,
0.1107364371418953,
-0.00562610849738121,
0.047364477068185806,
0.037358157336711884,
0.03006201982498169,
-0.06828586012125015,
0.033956028521060944,
0.03460967168211937,
-0.0026851207949221134,
0.04852786287665367,
0.17279984056949615,
-0.008621707558631897,
0.09922196716070175,
0.10844766348600388,
-0.06342708319425583,
0.028488921001553535,
-0.02038407139480114,
0.005326556041836739,
-0.0586751364171505,
0.1583092361688614,
-0.1480434238910675,
0.12740452587604523,
0.10800051689147949,
-0.07237923890352249,
-0.04447079449892044,
-0.01244937814772129,
0.0507039837539196,
-0.05962871015071869,
0.09052356332540512,
-0.0087708355858922,
-0.1673009693622589,
0.0200848076492548,
-0.1264685094356537,
0.06934559345245361,
-0.2602750062942505,
-0.03804606944322586,
-0.04723165184259415,
-0.018729856237769127,
0.004552091006189585,
0.11102917790412903,
0.0771694928407669,
-0.04933689534664154,
-0.008550814352929592,
-0.051727283746004105,
0.008070661686360836,
0.09259919077157974,
-0.08971741795539856,
-0.02743295580148697
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only on **sk** (Slovak) speech: **12.1k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **sk**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
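The character vocabulary for such a tokenizer is usually extracted from the labeled Slovak text itself. A rough, self-contained sketch with placeholder transcripts:

```python
import json

# placeholder transcripts; in practice these come from your labeled sk dataset
transcriptions = ["dobrý deň", "ďakujem pekne"]

# character-level vocabulary; spaces are mapped to the word delimiter "|"
chars = sorted(set("".join(transcriptions).replace(" ", "|")))
vocab = {c: i for i, c in enumerate(chars)}
vocab["[UNK]"] = len(vocab)
vocab["[PAD]"] = len(vocab)

with open("vocab.json", "w", encoding="utf-8") as f:
    json.dump(vocab, f, ensure_ascii=False)
```

The resulting `vocab.json` can then be passed to `Wav2Vec2CTCTokenizer` before fine-tuning the checkpoint on the labeled data.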
|
{"language": "sk", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-sk-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"sk",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"sk"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #sk #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only on sk (Slovak) speech: 12.1k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in sk. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in sk on 12.1k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in sk. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #sk #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in sk on 12.1k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in sk. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
251
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #sk #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in sk on 12.1k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in sk. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07763978093862534,
0.09865199029445648,
-0.0026695257984101772,
-0.0014226268976926804,
0.07958639413118362,
-0.05070554465055466,
0.13842333853244781,
0.04058186337351799,
0.01867060735821724,
0.10229440778493881,
-0.013392757624387741,
-0.03808460012078285,
0.0739201083779335,
0.13565485179424286,
0.06267588585615158,
-0.2528286576271057,
0.044400908052921295,
-0.0702095627784729,
0.03245377168059349,
0.05099189653992653,
0.12015223503112793,
-0.08780568838119507,
0.028120804578065872,
0.05682860687375069,
-0.03218185901641846,
0.023079290986061096,
-0.04446867108345032,
-0.08735517412424088,
0.052645400166511536,
0.04732668399810791,
-0.030933581292629242,
0.028360504657030106,
0.09327349811792374,
-0.18962852656841278,
0.038248445838689804,
0.03879006206989288,
0.03676486015319824,
0.008511926978826523,
0.0875580906867981,
0.023458678275346756,
0.1565294712781906,
-0.02623111568391323,
-0.002125433413311839,
0.0759381428360939,
-0.050804655998945236,
-0.08076436817646027,
-0.06485907733440399,
0.16256162524223328,
0.1068742498755455,
0.10432004928588867,
-0.07670719176530838,
0.07967043668031693,
-0.022719981148838997,
0.0427253283560276,
0.0798102468252182,
-0.17137929797172546,
-0.05458489805459976,
0.057017479091882706,
0.09147638827562332,
0.015876920893788338,
-0.08719737082719803,
0.07494424283504486,
0.045978475362062454,
-0.014643154107034206,
-0.055616721510887146,
-0.036960769444704056,
0.12204843014478683,
-0.10135316848754883,
-0.12065370380878448,
-0.0030209815595299006,
0.1732867956161499,
0.05797729268670082,
-0.07410912960767746,
-0.1506410539150238,
0.015064707957208157,
0.20446768403053284,
-0.06034763529896736,
-0.0905936136841774,
0.0035009311977773905,
0.012155722826719284,
0.04446424916386604,
-0.07412777841091156,
-0.07484794408082962,
-0.011056200601160526,
0.020823299884796143,
0.1281801462173462,
0.023853814229369164,
-0.013200298883020878,
-0.07064010202884674,
0.0015365870203822851,
-0.09546942263841629,
-0.11649960279464722,
-0.007367331068962812,
-0.06730948388576508,
-0.076559878885746,
-0.03273683786392212,
0.0029609776102006435,
-0.10306628048419952,
0.031112102791666985,
0.10616865754127502,
0.07587581872940063,
0.05397514998912811,
-0.04186910763382912,
-0.03366287797689438,
0.12368687242269516,
0.07173387706279755,
-0.12133236229419708,
-0.00734524242579937,
0.01880655065178871,
-0.015618087723851204,
0.007595947943627834,
-0.04364820197224617,
-0.03704812005162239,
0.013589482754468918,
-0.016445763409137726,
0.045427799224853516,
0.058561719954013824,
-0.025975877419114113,
-0.027361376211047173,
-0.09332186728715897,
0.09621728211641312,
-0.08377978950738907,
0.02208443358540535,
0.045095887035131454,
-0.0028399608563631773,
0.09645497798919678,
-0.06338515877723694,
0.08399137109518051,
-0.10862204432487488,
0.0132339121773839,
-0.02681419625878334,
-0.002614536788314581,
0.0242096446454525,
-0.02182133123278618,
0.03478847071528435,
-0.005479041021317244,
0.00400482676923275,
-0.11548123508691788,
0.000977015937678516,
-0.10501789301633835,
-0.030433112755417824,
-0.07557306438684464,
-0.05419635400176048,
-0.043291013687849045,
0.009273595176637173,
0.00041511026211082935,
-0.009062462486326694,
0.015770738944411278,
-0.01964726857841015,
-0.006450260058045387,
0.0010924761882051826,
0.042545486241579056,
0.06106671318411827,
0.0777084156870842,
-0.02049946039915085,
-0.015483325347304344,
-0.12462645024061203,
0.11932607740163803,
-0.09084859490394592,
-0.018533164635300636,
-0.1363244503736496,
-0.03883395344018936,
-0.0495835617184639,
0.030619027093052864,
0.013924728147685528,
0.13331757485866547,
-0.178177148103714,
-0.06907521188259125,
0.1147165447473526,
-0.12717843055725098,
0.004712245427072048,
0.1759752780199051,
-0.003942158073186874,
0.07753831148147583,
0.09763608872890472,
0.220563605427742,
0.0239796731621027,
-0.18560510873794556,
-0.02179625630378723,
-0.04951336979866028,
0.03844073787331581,
0.12560570240020752,
0.06486382335424423,
-0.0662914589047432,
0.06384889781475067,
-0.012463293969631195,
-0.04652974382042885,
-0.07351814210414886,
-0.0011644175974652171,
-0.04559946805238724,
0.017372040078043938,
-0.04811551794409752,
0.0295907910913229,
-0.008007168769836426,
-0.016453344374895096,
-0.011223534122109413,
-0.08608411997556686,
-0.045033302158117294,
0.12007535248994827,
-0.06099925562739372,
0.027702637016773224,
-0.10086723417043686,
0.061067573726177216,
0.0627613440155983,
0.003978298511356115,
-0.12729066610336304,
0.09980443865060806,
0.02779899351298809,
-0.05308098718523979,
0.15146306157112122,
0.06822030991315842,
-0.03194345533847809,
0.001483386498875916,
-0.010899499990046024,
0.02197885885834694,
-0.03342936560511589,
0.02003186009824276,
-0.0314563512802124,
-0.1007762998342514,
-0.005939613562077284,
-0.0596683993935585,
0.11818239837884903,
-0.12148404866456985,
-0.008635083213448524,
0.0485818088054657,
0.11593586206436157,
-0.01976791024208069,
-0.040034111589193344,
0.09494627267122269,
0.042064134031534195,
0.03522568568587303,
-0.017226841300725937,
0.01809781603515148,
-0.014254243113100529,
-0.00469222804531455,
0.05662145838141441,
-0.14078755676746368,
-0.17336076498031616,
0.0977979451417923,
0.018901849165558815,
-0.014906471595168114,
0.06254365295171738,
0.024415666237473488,
-0.020403914153575897,
-0.051256537437438965,
-0.0026022731326520443,
0.23140370845794678,
-0.006779736373573542,
0.05298483744263649,
-0.07818449288606644,
-0.015775609761476517,
0.015835197642445564,
-0.05225758999586105,
-0.08290862292051315,
0.07649198174476624,
0.01472472958266735,
-0.0681786984205246,
-0.03820941224694252,
0.04202820733189583,
0.06643567979335785,
0.14908550679683685,
0.0001565972634125501,
-0.08632279932498932,
-0.03088228963315487,
-0.06823374330997467,
-0.01798339933156967,
0.045184679329395294,
-0.13858284056186676,
-0.026953432708978653,
0.027988404035568237,
0.005209965631365776,
0.04851986840367317,
-0.027323300018906593,
0.044260889291763306,
0.003489572321996093,
-0.05145213380455971,
-0.06425752490758896,
0.0432516485452652,
-0.030848586931824684,
0.03591405227780342,
-0.008158204145729542,
0.014565694145858288,
-0.0481874905526638,
-0.055603109300136566,
-0.14208249747753143,
0.08564262837171555,
-0.06663107872009277,
-0.3005216717720032,
-0.09715300053358078,
-0.03306666016578674,
-0.03980659320950508,
0.008205428719520569,
0.043999698013067245,
-0.11061763763427734,
-0.10811921954154968,
-0.0694044977426529,
0.12347845733165741,
-0.02411508932709694,
-0.060383401811122894,
0.11435478925704956,
-0.011646920815110207,
0.015977049246430397,
-0.09300374239683151,
0.01355742197483778,
-0.03598504140973091,
-0.021961679682135582,
-0.0266719963401556,
0.028237544000148773,
0.06705474108457565,
0.13303951919078827,
0.023494139313697815,
-0.006113297306001186,
0.008207147940993309,
0.22263795137405396,
-0.13810130953788757,
0.09417673200368881,
0.234615758061409,
-0.05990242585539818,
-0.0042302049696445465,
0.14045323431491852,
-0.009921502321958542,
-0.044970206916332245,
0.046690747141838074,
0.010669248178601265,
-0.028112545609474182,
-0.22122271358966827,
-0.11909942328929901,
-0.043434299528598785,
-0.02751002088189125,
0.039502281695604324,
0.01652030646800995,
-0.00001712129778752569,
0.024543683975934982,
-0.08741451799869537,
-0.048374779522418976,
0.059858787804841995,
0.03524726256728172,
0.15409643948078156,
0.013201179914176464,
0.055298492312431335,
-0.04732705280184746,
-0.021821575239300728,
0.10131825506687164,
-0.03211434558033943,
0.037517037242650986,
0.06056420877575874,
0.0925702378153801,
0.0678711012005806,
0.04351058229804039,
0.06023359298706055,
-0.02128622680902481,
-0.015129782259464264,
-0.0010917403269559145,
-0.024389272555708885,
-0.06407982110977173,
0.010778899304568768,
0.04758472740650177,
0.13342443108558655,
-0.14262349903583527,
-0.11192783713340759,
0.0364035964012146,
0.016220953315496445,
0.13035443425178528,
0.09668569266796112,
-0.012803100980818272,
-0.10377783328294754,
0.03885853663086891,
-0.08807193487882614,
-0.029862506315112114,
0.052024420350790024,
0.0861087441444397,
-0.16750188171863556,
0.09565742313861847,
0.07686108350753784,
0.08616360276937485,
-0.03375741466879845,
0.02606702782213688,
-0.057183437049388885,
0.054485999047756195,
-0.0007739983848296106,
0.06864050775766373,
-0.16898831725120544,
0.0994637981057167,
0.012959892861545086,
0.08590579032897949,
-0.05688495188951492,
0.023255502805113792,
0.049290742725133896,
0.005042918957769871,
0.12620794773101807,
-0.004058378748595715,
-0.1323419213294983,
-0.0076949624344706535,
-0.12175614386796951,
0.016204554587602615,
0.051827818155288696,
-0.06461424380540848,
0.060156360268592834,
-0.0012028481578454375,
-0.0052963439375162125,
-0.040017832070589066,
-0.006921031046658754,
-0.24537698924541473,
-0.13303188979625702,
0.046890389174222946,
0.00039361085509881377,
0.04288659989833832,
-0.04037253558635712,
-0.07516145706176758,
-0.13647308945655823,
0.09113174676895142,
-0.007162578869611025,
-0.02631489373743534,
-0.07465627044439316,
0.01718871109187603,
0.10655004531145096,
-0.06731835752725601,
0.016105707734823227,
0.04852665960788727,
0.144901841878891,
-0.06183675676584244,
-0.043175045400857925,
0.01490490697324276,
-0.09706329554319382,
-0.13088847696781158,
0.02371695637702942,
0.17855367064476013,
0.1102316826581955,
0.0642688199877739,
0.08746185153722763,
0.025080222636461258,
0.0018855358939617872,
-0.10653364658355713,
0.022278932854533195,
0.03350844234228134,
-0.06907612085342407,
0.0519568994641304,
0.0007772545795887709,
-0.26793280243873596,
-0.15459668636322021,
-0.06611597537994385,
0.078148253262043,
0.1891423910856247,
-0.028526796028017998,
0.16799768805503845,
0.2683766186237335,
-0.08682003617286682,
-0.23553593456745148,
-0.041470520198345184,
0.00037991764838807285,
0.02719121240079403,
0.04577042534947395,
-0.19922032952308655,
0.1072545275092125,
-0.0024659568443894386,
0.009421717375516891,
-0.06844770163297653,
-0.22005261480808258,
-0.1466711014509201,
0.17341427505016327,
-0.02368682622909546,
0.04517549276351929,
-0.029915140941739082,
-0.0709439367055893,
-0.03211108222603798,
-0.06438734382390976,
0.01228439249098301,
-0.094358891248703,
0.06649907678365707,
0.05626462772488594,
0.026581943035125732,
0.025965390726923943,
0.010832983069121838,
0.11575399339199066,
0.09958842396736145,
-0.02431131713092327,
-0.08331284672021866,
0.016501478850841522,
0.006124836392700672,
-0.017644153907895088,
0.1097235158085823,
0.05186358094215393,
0.01822514273226261,
-0.046280235052108765,
-0.08355465531349182,
-0.06897568702697754,
0.058868519961833954,
-0.07068027555942535,
-0.015330571681261063,
-0.06428181380033493,
0.0895407646894455,
0.016358770430088043,
-0.001629836275242269,
-0.0680801272392273,
-0.09958590567111969,
-0.015043498948216438,
0.11525341123342514,
0.21576938033103943,
-0.04302387684583664,
-0.008915826678276062,
-0.040885090827941895,
-0.047733742743730545,
0.04193811118602753,
-0.008208693005144596,
0.044186998158693314,
0.05201917141675949,
0.030007949098944664,
0.08466314524412155,
-0.03586757928133011,
-0.13082106411457062,
0.031982067972421646,
0.03362130746245384,
-0.06720542162656784,
-0.1888112872838974,
-0.04352029040455818,
-0.007080191280692816,
-0.02352570742368698,
-0.035316187888383865,
0.19258266687393188,
-0.01769680716097355,
-0.05573417991399765,
0.0009079891024157405,
0.05998648330569267,
0.000045071865315549076,
0.12240392714738846,
0.052558403462171555,
0.04277476668357849,
-0.08885384351015091,
0.05188756436109543,
0.11251058429479599,
-0.0343189537525177,
0.05020986124873161,
0.08724351227283478,
-0.05515278875827789,
-0.053864941000938416,
-0.09207996726036072,
-0.012539093382656574,
0.0510563999414444,
-0.05830707028508186,
-0.004203354474157095,
-0.1065523698925972,
0.004158922005444765,
0.012908932752907276,
0.011207797564566135,
-0.046010710299015045,
-0.04123866558074951,
-0.0004104022227693349,
-0.09025312215089798,
0.06904514133930206,
0.10035262256860733,
-0.0299041997641325,
-0.11579804867506027,
0.11734028905630112,
0.013112972490489483,
0.08804839104413986,
-0.038374174386262894,
-0.05926698446273804,
-0.08606696873903275,
-0.004399141296744347,
-0.09067520499229431,
0.03591836243867874,
-0.14366813004016876,
-0.014314132742583752,
-0.04355321452021599,
-0.040054451674222946,
-0.011236529797315598,
0.07096786797046661,
-0.03231579810380936,
0.0059031895361840725,
-0.03505762666463852,
0.08333418518304825,
-0.12101402878761292,
0.06949298083782196,
0.0674315020442009,
-0.04932707920670509,
0.10543953627347946,
0.03127867728471756,
-0.05155588313937187,
0.047904036939144135,
-0.1931174397468567,
-0.048700157552957535,
-0.030301015824079514,
0.04670511558651924,
-0.01673022471368313,
-0.16762776672840118,
-0.0036457106471061707,
0.015181229449808598,
0.02517104707658291,
-0.016509246081113815,
0.04893219470977783,
-0.02377268299460411,
-0.01655041240155697,
-0.06305380165576935,
-0.05511614680290222,
-0.03912198543548584,
0.06637762486934662,
0.08013102412223816,
0.013381806202232838,
0.0950855016708374,
-0.09080138057470322,
0.07733617722988129,
-0.07777798920869827,
0.027412403374910355,
-0.03029957413673401,
0.024201588705182076,
-0.06247193366289139,
-0.0774327740073204,
0.0804266706109047,
-0.013679166324436665,
0.07279690355062485,
0.020703662186861038,
-0.03717575967311859,
0.045771755278110504,
-0.04802247881889343,
-0.05169687792658806,
0.04615962132811546,
0.1415807604789734,
0.053862567991018295,
0.015098120085895061,
-0.002238804940134287,
-0.04244588688015938,
-0.002108404878526926,
0.1402912735939026,
0.15241681039333344,
0.16171154379844666,
0.09831748902797699,
0.03544748201966286,
0.07300181686878204,
-0.0419745147228241,
-0.0772646889090538,
0.09111744910478592,
-0.08351115882396698,
0.03385155275464058,
-0.05146024376153946,
-0.05444366857409477,
0.07456328719854355,
-0.14322452247142792,
0.07722857594490051,
-0.024753987789154053,
-0.08714514970779419,
-0.11410991102457047,
-0.1483977884054184,
-0.06385970860719681,
-0.042456433176994324,
0.005536927375942469,
-0.10356464236974716,
0.020174533128738403,
0.012489733286201954,
0.0275798961520195,
-0.08845563977956772,
0.12176729738712311,
-0.1205575093626976,
-0.12226417660713196,
0.1507231444120407,
-0.03500436991453171,
-0.020594825968146324,
0.0024497383274137974,
0.04335341602563858,
0.0339396595954895,
0.0976332426071167,
0.052271053194999695,
0.042818449437618256,
0.016124911606311798,
0.030717525631189346,
-0.09376375377178192,
-0.06574718654155731,
0.03030472807586193,
-0.014953775331377983,
0.1074870154261589,
0.18183939158916473,
0.09348893910646439,
-0.07921107113361359,
0.006203567609190941,
0.1421361267566681,
0.023403480648994446,
-0.12493634968996048,
-0.15278613567352295,
0.016784870997071266,
-0.038012437522411346,
-0.006278866436332464,
0.0073074158281087875,
-0.09633886069059372,
0.021172326058149338,
0.20052510499954224,
0.16729027032852173,
-0.039584700018167496,
0.02300025150179863,
-0.01964929886162281,
0.008735100738704205,
0.018451204523444176,
0.08242808282375336,
0.08202795684337616,
0.1771504133939743,
-0.009464764036238194,
0.047937989234924316,
-0.01835906133055687,
-0.10154932737350464,
-0.11746915429830551,
0.09368257224559784,
-0.00043248001020401716,
-0.03816644102334976,
-0.007352166809141636,
0.1847027689218521,
-0.11183632165193558,
-0.2269781529903412,
-0.12240777909755707,
-0.03775870054960251,
-0.11528895795345306,
0.028357094153761864,
-0.055530767887830734,
0.13536740839481354,
0.04439738765358925,
-0.004999136086553335,
0.015248652547597885,
0.1740751713514328,
0.04011061042547226,
0.03368974104523659,
-0.021092453971505165,
0.1165059506893158,
-0.07644352316856384,
0.10363379865884781,
-0.0017098974203690886,
0.06099419668316841,
0.035322289913892746,
0.03495599329471588,
-0.06739263236522675,
0.0337311252951622,
0.036009423434734344,
-0.00343489833176136,
0.04371289536356926,
0.16996003687381744,
-0.0038914659526199102,
0.07966698706150055,
0.1114078015089035,
-0.06744497269392014,
0.025147613137960434,
-0.021181989461183548,
-0.0006426961626857519,
-0.0663357526063919,
0.15775421261787415,
-0.15678036212921143,
0.1296219378709793,
0.098052978515625,
-0.07300060987472534,
-0.04506193846464157,
-0.005827201995998621,
0.04450464993715286,
-0.06619630008935928,
0.09128766506910324,
-0.006853754166513681,
-0.16792753338813782,
0.02588152140378952,
-0.13684548437595367,
0.06976991146802902,
-0.2593415677547455,
-0.04169847443699837,
-0.04474669694900513,
-0.013678199611604214,
0.012708127498626709,
0.11181285977363586,
0.07774823904037476,
-0.050121646374464035,
-0.010307613760232925,
-0.04752957448363304,
0.008763954974710941,
0.09596122056245804,
-0.08549551665782928,
-0.026417464017868042
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only in **sl** on **11.3k** unlabeled data of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **sl**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
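As a rough, hedged sketch of the fine-tuning setup described in the note above (not an official recipe), the pretrained checkpoint can be paired with a freshly built CTC tokenizer. The `vocab.json` file is hypothetical here: you would create it from the characters of your own labeled Slovenian transcripts, as in the linked blog post.

```python
from transformers import (
    Wav2Vec2CTCTokenizer,
    Wav2Vec2FeatureExtractor,
    Wav2Vec2Processor,
    Wav2Vec2ForCTC,
)

# vocab.json is a placeholder you must build from your own labeled sl transcripts
tokenizer = Wav2Vec2CTCTokenizer(
    "vocab.json", unk_token="[UNK]", pad_token="[PAD]", word_delimiter_token="|"
)
feature_extractor = Wav2Vec2FeatureExtractor(
    feature_size=1, sampling_rate=16_000, padding_value=0.0, do_normalize=True
)
processor = Wav2Vec2Processor(feature_extractor=feature_extractor, tokenizer=tokenizer)

# A randomly initialized CTC head is added on top of the pretrained encoder
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-base-sl-voxpopuli-v2",
    ctc_loss_reduction="mean",
    pad_token_id=processor.tokenizer.pad_token_id,
    vocab_size=len(processor.tokenizer),
)
```

From here, the model would be trained on (audio, transcript) pairs with the usual `Trainer` loop; the blog post linked above walks through that part in detail.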
|
{"language": "sl", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-sl-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"sl",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"sl"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #sl #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only in sl on 11.3k unlabeled data of the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in sl. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in sl on 11.3k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in sl. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #sl #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in sl on 11.3k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in sl. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
251
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #sl #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in sl on 11.3k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in sl. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07698066532611847,
0.10439711064100266,
-0.002720531774684787,
0.004586948547512293,
0.0813659057021141,
-0.05187935382127762,
0.13938511908054352,
0.03863164782524109,
0.00834662839770317,
0.09804551303386688,
-0.01083016861230135,
-0.04073362424969673,
0.06834591180086136,
0.1256818324327469,
0.062084898352622986,
-0.2598106563091278,
0.04261999577283859,
-0.0748542919754982,
0.04991738498210907,
0.04787161946296692,
0.1215287521481514,
-0.08557131141424179,
0.026192670688033104,
0.05445079505443573,
-0.030268272385001183,
0.030821535736322403,
-0.046226128935813904,
-0.08121930807828903,
0.052533864974975586,
0.04916141927242279,
-0.03274289518594742,
0.026844799518585205,
0.09944039583206177,
-0.1932334303855896,
0.03716505691409111,
0.0348828099668026,
0.032145559787750244,
0.006179573480039835,
0.09679354727268219,
0.017791446298360825,
0.1704694926738739,
-0.026472724974155426,
-0.002196322428062558,
0.07695453613996506,
-0.05163547396659851,
-0.09387266635894775,
-0.06616688519716263,
0.15174514055252075,
0.09717073291540146,
0.10325703024864197,
-0.07576588541269302,
0.07289162278175354,
-0.02700197882950306,
0.05032423883676529,
0.07306049019098282,
-0.17738713324069977,
-0.05509041249752045,
0.059488605707883835,
0.10188555717468262,
0.014123295433819294,
-0.08492699265480042,
0.07982274144887924,
0.04721564054489136,
-0.014153508469462395,
-0.06091635674238205,
-0.03263267129659653,
0.12978532910346985,
-0.09842019528150558,
-0.12049442529678345,
-0.005821361672133207,
0.173013836145401,
0.055527061223983765,
-0.07451267540454865,
-0.1548442542552948,
0.015783105045557022,
0.20770637691020966,
-0.05548863857984543,
-0.08844338357448578,
0.008552351966500282,
0.01623326912522316,
0.04679788649082184,
-0.07276853173971176,
-0.06918424367904663,
-0.009706972166895866,
0.026461301371455193,
0.1182202473282814,
0.020919762551784515,
-0.014150573872029781,
-0.07063332945108414,
-0.0012277328642085195,
-0.09259911626577377,
-0.11605032533407211,
-0.0067632258869707584,
-0.0638275146484375,
-0.07171731442213058,
-0.030173080042004585,
-0.0013689732877537608,
-0.10514887422323227,
0.03047178126871586,
0.10946851223707199,
0.07587788999080658,
0.05523575469851494,
-0.048705097287893295,
-0.0330609492957592,
0.12260080873966217,
0.06992118805646896,
-0.12056787312030792,
-0.01363739650696516,
0.015741901472210884,
-0.02007846161723137,
0.015938306227326393,
-0.03948825225234032,
-0.037736255675554276,
0.01754857785999775,
-0.020559394732117653,
0.0468321219086647,
0.058779265731573105,
-0.02889322303235531,
-0.033079326152801514,
-0.0900406688451767,
0.08476103097200394,
-0.0869351327419281,
0.026972822844982147,
0.046981509774923325,
-0.009856143034994602,
0.09113766998052597,
-0.05860323831439018,
0.08158800005912781,
-0.10910751670598984,
0.012547300197184086,
-0.02455986477434635,
-0.0006380343693308532,
0.015133903361856937,
-0.025028647854924202,
0.03402872756123543,
-0.0003992865968029946,
0.004270333796739578,
-0.11292467266321182,
-0.016657214611768723,
-0.10487683862447739,
-0.02553604170680046,
-0.0777285099029541,
-0.04527316614985466,
-0.05099805071949959,
0.01583479903638363,
-0.0022070971317589283,
-0.00978873111307621,
0.010230764746665955,
-0.02024727500975132,
-0.006238649599254131,
0.008309245109558105,
0.03882332518696785,
0.05188269540667534,
0.0801999494433403,
-0.019675957038998604,
-0.01694909855723381,
-0.1151498481631279,
0.1204565241932869,
-0.08743943274021149,
-0.014145868830382824,
-0.13060718774795532,
-0.03977509215474129,
-0.04287229850888252,
0.035524431616067886,
0.01650817319750786,
0.1285439282655716,
-0.1780930757522583,
-0.06768506020307541,
0.11495670676231384,
-0.12670719623565674,
0.018495524302124977,
0.1809127926826477,
-0.0014129483606666327,
0.06394872814416885,
0.09986605495214462,
0.22730427980422974,
0.029273279011249542,
-0.1701052188873291,
-0.015566135756671429,
-0.04756997898221016,
0.039826713502407074,
0.13017573952674866,
0.06553218513727188,
-0.06291366368532181,
0.06606493890285492,
-0.010168562643229961,
-0.027907216921448708,
-0.07703736424446106,
-0.0015294248005375266,
-0.044416896998882294,
0.014848067425191402,
-0.046847593039274216,
0.029840299859642982,
-0.004633145872503519,
-0.021773017942905426,
-0.014293813146650791,
-0.09158910810947418,
-0.04489762336015701,
0.12045969069004059,
-0.06355487555265427,
0.0243679191917181,
-0.10293935984373093,
0.06966086477041245,
0.0705195888876915,
0.0024999664165079594,
-0.1308237761259079,
0.10263874381780624,
0.02602252922952175,
-0.0361236035823822,
0.1449475735425949,
0.07677443325519562,
-0.03098953887820244,
0.008231617510318756,
-0.010902772657573223,
0.023787474259734154,
-0.029090099036693573,
0.012693307362496853,
-0.024621741846203804,
-0.10042526572942734,
-0.011944444850087166,
-0.06698641926050186,
0.12018416076898575,
-0.13885317742824554,
-0.008879940956830978,
0.0470091886818409,
0.11681412905454636,
-0.01797160506248474,
-0.044831812381744385,
0.09428537636995316,
0.04431123659014702,
0.032846055924892426,
-0.01705349050462246,
0.02097705565392971,
-0.01894156076014042,
-0.005833649076521397,
0.048606179654598236,
-0.14558854699134827,
-0.15611813962459564,
0.09427232295274734,
0.01717989705502987,
-0.008163709193468094,
0.06011802330613136,
0.02424461767077446,
-0.01748953014612198,
-0.04953625425696373,
-0.002499975496903062,
0.23523271083831787,
-0.012429513968527317,
0.057705242186784744,
-0.0782439261674881,
-0.008887579664587975,
0.009342962875962257,
-0.04833557829260826,
-0.08744409680366516,
0.08186671137809753,
0.005405800882726908,
-0.06782936304807663,
-0.039049021899700165,
0.03320116922259331,
0.07042787969112396,
0.1513354331254959,
0.0033556923735886812,
-0.08507031202316284,
-0.029978325590491295,
-0.06144199147820473,
-0.01792766898870468,
0.04069077596068382,
-0.14133985340595245,
-0.026142796501517296,
0.025153320282697678,
0.006814759690314531,
0.050389982759952545,
-0.027385029941797256,
0.04416331648826599,
0.009214064106345177,
-0.05330285057425499,
-0.07871915400028229,
0.034688178449869156,
-0.028784871101379395,
0.035892318934202194,
-0.014952556230127811,
0.00838842149823904,
-0.04657476767897606,
-0.05792127922177315,
-0.1419464349746704,
0.08469153195619583,
-0.06803970038890839,
-0.30859798192977905,
-0.09485241025686264,
-0.04136227071285248,
-0.037537939846515656,
0.012206592597067356,
0.04888073727488518,
-0.10447358340024948,
-0.11255058646202087,
-0.07034831494092941,
0.11717324703931808,
-0.026608847081661224,
-0.0645274966955185,
0.1181674674153328,
-0.008958633057773113,
0.02143087424337864,
-0.09831557422876358,
0.015130107291042805,
-0.0358249694108963,
-0.023104147985577583,
-0.032388150691986084,
0.02914285846054554,
0.062269426882267,
0.11904515326023102,
0.02408645674586296,
-0.004788482096046209,
0.007799235172569752,
0.2123260200023651,
-0.1324094831943512,
0.07546340674161911,
0.2451469600200653,
-0.0638132393360138,
-0.007098977919667959,
0.13989901542663574,
-0.009008608758449554,
-0.05420948937535286,
0.045348841696977615,
0.002508556004613638,
-0.02746085450053215,
-0.22169190645217896,
-0.1221824437379837,
-0.044827815145254135,
-0.02329316921532154,
0.034818898886442184,
0.019932376220822334,
-0.02102840133011341,
0.016338469460606575,
-0.08664649724960327,
-0.04571036994457245,
0.06507863104343414,
0.03085215389728546,
0.15496397018432617,
0.005328752100467682,
0.05712456256151199,
-0.04669783636927605,
-0.02003192901611328,
0.10262251645326614,
-0.04591593146324158,
0.03809886425733566,
0.06468704342842102,
0.09505926817655563,
0.062201835215091705,
0.044700074940919876,
0.05282068997621536,
-0.01861504279077053,
-0.018105341121554375,
-0.0009180197375826538,
-0.026556676253676414,
-0.06425260007381439,
0.004156890790909529,
0.04302442818880081,
0.1438959389925003,
-0.13325607776641846,
-0.11265005171298981,
0.042428117245435715,
0.01594437099993229,
0.12336450815200806,
0.09507840871810913,
-0.030938932672142982,
-0.09536070376634598,
0.04185020178556442,
-0.0915856584906578,
-0.03173806890845299,
0.05625581368803978,
0.09148507565259933,
-0.16009701788425446,
0.09668023884296417,
0.07879780232906342,
0.08570228517055511,
-0.04166565462946892,
0.031479235738515854,
-0.05288277566432953,
0.05334616079926491,
0.005227893125265837,
0.07179675996303558,
-0.1709626168012619,
0.0955730751156807,
0.01288667879998684,
0.08439745008945465,
-0.05542267486453056,
0.02938191033899784,
0.04646967351436615,
0.010878372006118298,
0.12620557844638824,
-0.008381241001188755,
-0.12294645607471466,
0.0072630466893315315,
-0.12244386970996857,
0.017604300752282143,
0.05828244984149933,
-0.06078585982322693,
0.057420264929533005,
-0.007505504880100489,
-0.006843739189207554,
-0.0406346395611763,
-0.01152015570551157,
-0.2540742754936218,
-0.13959923386573792,
0.04772654175758362,
0.0006960976752452552,
0.05216073617339134,
-0.04182663932442665,
-0.07504288852214813,
-0.12978585064411163,
0.1123802661895752,
-0.0045752148143947124,
-0.01912921853363514,
-0.07954064756631851,
0.02756642922759056,
0.09989993274211884,
-0.05842379108071327,
0.016627848148345947,
0.0517839677631855,
0.14045657217502594,
-0.05998515710234642,
-0.04121672362089157,
0.01878940686583519,
-0.09656698256731033,
-0.12467041611671448,
0.019623033702373505,
0.17374439537525177,
0.1048898994922638,
0.06478088349103928,
0.09057571738958359,
0.024478498846292496,
0.003323896322399378,
-0.10086045414209366,
0.016156723722815514,
0.04564967751502991,
-0.07222099602222443,
0.05295390263199806,
-0.0005103301373310387,
-0.2752009332180023,
-0.13992862403392792,
-0.06568345427513123,
0.08311732113361359,
0.18497024476528168,
-0.028950244188308716,
0.1693451851606369,
0.26287147402763367,
-0.09128668904304504,
-0.23418530821800232,
-0.040819551795721054,
-0.001445756177417934,
0.03231555595993996,
0.05597418546676636,
-0.20028641819953918,
0.10267968475818634,
-0.00015226143295876682,
0.010979260317981243,
-0.061178259551525116,
-0.22002246975898743,
-0.1424981951713562,
0.16665513813495636,
-0.024240445345640182,
0.04660233110189438,
-0.02651340886950493,
-0.07485098391771317,
-0.039636701345443726,
-0.05408520996570587,
0.0007052837172523141,
-0.09129634499549866,
0.07066072523593903,
0.05732910707592964,
0.023730361834168434,
0.02967790514230728,
0.01121665257960558,
0.11261823028326035,
0.08660095185041428,
-0.018181197345256805,
-0.08710069954395294,
0.022308118641376495,
-0.002710624597966671,
-0.014713539741933346,
0.11102323234081268,
0.04178161919116974,
0.016742663457989693,
-0.06117628887295723,
-0.08475954830646515,
-0.05916197597980499,
0.06022534519433975,
-0.0705694705247879,
-0.007508562412112951,
-0.055751606822013855,
0.08407372236251831,
0.012723803520202637,
0.0005467823939397931,
-0.06454991549253464,
-0.10222659260034561,
-0.02093169279396534,
0.11969076097011566,
0.2230309098958969,
-0.04330545663833618,
0.0018090939847752452,
-0.04470452666282654,
-0.04092753306031227,
0.03946038335561752,
-0.0044326200149953365,
0.04337601736187935,
0.05307653546333313,
0.029978366568684578,
0.08551022410392761,
-0.03399458900094032,
-0.12243804335594177,
0.031522125005722046,
0.037635814398527145,
-0.07292195409536362,
-0.19351959228515625,
-0.04730051010847092,
-0.002417243318632245,
-0.02221481129527092,
-0.03358810767531395,
0.19577033817768097,
-0.015123787336051464,
-0.05379314720630646,
0.00007849621761124581,
0.06273646652698517,
-0.0027644596993923187,
0.11775068938732147,
0.042687658220529556,
0.04095957800745964,
-0.09050197899341583,
0.056276608258485794,
0.11468472331762314,
-0.040063872933387756,
0.04687775298953056,
0.09205178916454315,
-0.04736871272325516,
-0.05349138751626015,
-0.10136127471923828,
-0.005289794411510229,
0.064571313560009,
-0.06041807308793068,
-0.01206360012292862,
-0.10821900516748428,
0.009075045585632324,
0.01626747101545334,
0.01222633384168148,
-0.04452526196837425,
-0.04426219314336777,
0.00028635055059567094,
-0.08909871429204941,
0.06900589913129807,
0.10245858132839203,
-0.03087037429213524,
-0.11400960385799408,
0.11114683747291565,
0.011646101251244545,
0.0764215886592865,
-0.038543976843357086,
-0.059798162430524826,
-0.09161896258592606,
-0.0008529440965503454,
-0.09526459872722626,
0.03680741414427757,
-0.13725893199443817,
-0.011666485108435154,
-0.04124831408262253,
-0.03730159252882004,
-0.009560889564454556,
0.07235933095216751,
-0.03146921843290329,
0.00381630752235651,
-0.03028736263513565,
0.08368761837482452,
-0.13014841079711914,
0.06922610104084015,
0.05825385823845863,
-0.04576793685555458,
0.10795693844556808,
0.030601274222135544,
-0.05291267856955528,
0.04065343737602234,
-0.20465408265590668,
-0.05675477907061577,
-0.030221598222851753,
0.04451402649283409,
-0.014337523840367794,
-0.16678066551685333,
0.00044231259380467236,
0.018274137750267982,
0.01779485121369362,
-0.017375122755765915,
0.05259883776307106,
-0.029670998454093933,
-0.014932693913578987,
-0.0598912239074707,
-0.0629911944270134,
-0.03712595999240875,
0.06228046119213104,
0.0701538547873497,
0.01103413850069046,
0.10147830843925476,
-0.09098786115646362,
0.072522833943367,
-0.07620831578969955,
0.029720021411776543,
-0.028853412717580795,
0.02145802602171898,
-0.07798121124505997,
-0.07671461999416351,
0.08150757849216461,
-0.013776733539998531,
0.07046280056238174,
0.030958788469433784,
-0.03628799691796303,
0.043894968926906586,
-0.04343416541814804,
-0.04838959127664566,
0.041494280099868774,
0.1292755901813507,
0.05017441511154175,
0.021553661674261093,
0.0015727642457932234,
-0.04374437779188156,
0.002359787467867136,
0.13645169138908386,
0.14706115424633026,
0.16818638145923615,
0.09832864254713058,
0.036747343838214874,
0.07205700129270554,
-0.042645931243896484,
-0.08925990015268326,
0.08435707539319992,
-0.07346245646476746,
0.03855958580970764,
-0.05730470269918442,
-0.061337053775787354,
0.07321810722351074,
-0.13892123103141785,
0.07602189481258392,
-0.027988672256469727,
-0.08512242883443832,
-0.1156795546412468,
-0.1530480533838272,
-0.06339888274669647,
-0.041378624737262726,
0.007130855228751898,
-0.10959857702255249,
0.028357602655887604,
0.016849003732204437,
0.02692793309688568,
-0.08779449015855789,
0.11141042411327362,
-0.1295405924320221,
-0.11842828243970871,
0.144988551735878,
-0.038242094218730927,
-0.014725860208272934,
0.00099472445435822,
0.03669634461402893,
0.02446002885699272,
0.09437894076108932,
0.05150536447763443,
0.04605212062597275,
0.02178659290075302,
0.026874087750911713,
-0.09391359239816666,
-0.06438402086496353,
0.02868729643523693,
-0.014825348742306232,
0.10408418625593185,
0.18427012860774994,
0.09120658785104752,
-0.08301174640655518,
0.00726296054199338,
0.15039731562137604,
0.018876492977142334,
-0.11672112345695496,
-0.14568348228931427,
0.028268422931432724,
-0.035478025674819946,
-0.001446560607291758,
0.002923655556514859,
-0.0956486314535141,
0.02378815971314907,
0.19646503031253815,
0.17170169949531555,
-0.038873013108968735,
0.0226342361420393,
-0.018139246851205826,
0.009262691251933575,
0.02082790620625019,
0.07955703884363174,
0.08432003110647202,
0.1743929386138916,
-0.00591084873303771,
0.0439482219517231,
-0.018705839291214943,
-0.0984465703368187,
-0.12119130045175552,
0.08887527883052826,
-0.004176171030849218,
-0.03172934055328369,
-0.004984316881746054,
0.1845245063304901,
-0.11369860172271729,
-0.2129148244857788,
-0.11837350577116013,
-0.036272067576646805,
-0.11562039703130722,
0.02527621015906334,
-0.03842842951416969,
0.1365259438753128,
0.0477282777428627,
-0.007061537820845842,
0.010667745023965836,
0.17257651686668396,
0.03717401251196861,
0.028557850047945976,
-0.02434576116502285,
0.10995662212371826,
-0.07979180663824081,
0.11346672475337982,
-0.0038045374676585197,
0.055791884660720825,
0.03530912473797798,
0.03706691414117813,
-0.06955911219120026,
0.028859050944447517,
0.03443801775574684,
-0.010090124793350697,
0.046539053320884705,
0.17003488540649414,
-0.008508987724781036,
0.10563653707504272,
0.10433699935674667,
-0.06042807176709175,
0.027908354997634888,
-0.012934934347867966,
0.0015841200947761536,
-0.057991813868284225,
0.1572049856185913,
-0.15302394330501556,
0.12886428833007812,
0.10358099639415741,
-0.06971666216850281,
-0.04506978020071983,
-0.01107894629240036,
0.04906412959098816,
-0.0667295977473259,
0.08978409320116043,
-0.004936843644827604,
-0.17042715847492218,
0.02399107627570629,
-0.11944334208965302,
0.06677768379449844,
-0.2610224783420563,
-0.045160870999097824,
-0.04724203422665596,
-0.016662387177348137,
0.007599522825330496,
0.11008995026350021,
0.08856312930583954,
-0.04977133497595787,
-0.011229455471038818,
-0.05763532593846321,
0.011696086265146732,
0.09578564018011093,
-0.08755215257406235,
-0.02738303691148758
] |
null | null |
transformers
|
# Wav2Vec2-base-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained only in **sv** on **16.3k** unlabeled data of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **sv**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
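Because the model expects 16 kHz input, audio at other sampling rates should be resampled first. The snippet below is a minimal illustration under that assumption; `torchaudio` and the `example.wav` path are placeholders for whatever audio loading you actually use.

```python
import torch
import torchaudio
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

waveform, sr = torchaudio.load("example.wav")  # placeholder path
if sr != 16_000:
    waveform = torchaudio.functional.resample(waveform, orig_freq=sr, new_freq=16_000)

# Default wav2vec2-base preprocessing settings (assumed, not loaded from the checkpoint)
feature_extractor = Wav2Vec2FeatureExtractor(
    feature_size=1, sampling_rate=16_000, padding_value=0.0, do_normalize=True
)
model = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-base-sv-voxpopuli-v2")
model.eval()

inputs = feature_extractor(
    waveform.squeeze(0).numpy(), sampling_rate=16_000, return_tensors="pt"
)
with torch.no_grad():
    features = model(inputs.input_values).last_hidden_state  # (1, frames, 768)
```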
|
{"language": "sv", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-sv-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"sv",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"sv"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #sv #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-base-VoxPopuli-V2
Facebook's Wav2Vec2 base model pretrained only in sv on 16.3k unlabeled data of the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in sv. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in sv on 16.3k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in sv. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #sv #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in sv on 16.3k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in sv. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
251
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #sv #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-base-VoxPopuli-V2\n\nFacebook's Wav2Vec2 base model pretrained only in sv on 16.3k unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in sv. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07679860293865204,
0.10716700553894043,
-0.0027788635343313217,
0.0018164973007515073,
0.08374935388565063,
-0.05092080309987068,
0.13497382402420044,
0.04113892465829849,
0.0014721702318638563,
0.09892378002405167,
-0.009381087496876717,
-0.050462380051612854,
0.07218795269727707,
0.13224580883979797,
0.05973760411143303,
-0.25370967388153076,
0.041185371577739716,
-0.06892663240432739,
0.04279979318380356,
0.04503004252910614,
0.12388334423303604,
-0.08729388564825058,
0.030238714069128036,
0.057732146233320236,
-0.032332051545381546,
0.028209595009684563,
-0.04148610308766365,
-0.0810098946094513,
0.05149761959910393,
0.0481409952044487,
-0.03637145087122917,
0.03037567250430584,
0.10117534548044205,
-0.18573790788650513,
0.03638574853539467,
0.03624320030212402,
0.030519604682922363,
0.009918511845171452,
0.10073106735944748,
0.0237844530493021,
0.1538321077823639,
-0.025259854272007942,
-0.008357571437954903,
0.07872128486633301,
-0.05095265060663223,
-0.08568595349788666,
-0.06247206777334213,
0.1495690643787384,
0.09737402945756912,
0.10209779441356659,
-0.07845443487167358,
0.07105428725481033,
-0.023492716252803802,
0.044962622225284576,
0.0726723000407219,
-0.17504099011421204,
-0.056135546416044235,
0.060554370284080505,
0.09918458014726639,
0.01762993261218071,
-0.08225186914205551,
0.08259398490190506,
0.05112561583518982,
-0.013362699188292027,
-0.05755504593253136,
-0.034827202558517456,
0.1246010810136795,
-0.10123606771230698,
-0.11781229078769684,
0.000030113242246443406,
0.187088742852211,
0.05667963996529579,
-0.0710386335849762,
-0.15109506249427795,
0.015947023406624794,
0.21206697821617126,
-0.05715115740895271,
-0.09632454812526703,
0.009975498542189598,
0.014650111086666584,
0.050126150250434875,
-0.07134318351745605,
-0.07160886377096176,
-0.01109106931835413,
0.0346980094909668,
0.10445896536111832,
0.01882064715027809,
-0.017481312155723572,
-0.07227782905101776,
-0.0037610852159559727,
-0.11013161391019821,
-0.1149870753288269,
-0.005121614784002304,
-0.06657487899065018,
-0.07410790026187897,
-0.03477337583899498,
-0.00216377223841846,
-0.11722247302532196,
0.030197899788618088,
0.09463603794574738,
0.06768626719713211,
0.05370412394404411,
-0.04166661947965622,
-0.03508995473384857,
0.12053987383842468,
0.07027574628591537,
-0.12168370932340622,
-0.013338450342416763,
0.01608189195394516,
-0.024029895663261414,
0.006015064660459757,
-0.038639795035123825,
-0.04098377749323845,
0.014645986258983612,
-0.02430890128016472,
0.04424668103456497,
0.05198145657777786,
-0.03276844322681427,
-0.029312781989574432,
-0.09189636260271072,
0.08943013846874237,
-0.08206679672002792,
0.02535802125930786,
0.047756172716617584,
-0.0031242757104337215,
0.08958150446414948,
-0.06436298042535782,
0.07820598036050797,
-0.11405870318412781,
0.007985075004398823,
-0.02368275448679924,
-0.0035179525148123503,
0.019908323884010315,
-0.028066005557775497,
0.030628127977252007,
-0.0007018142496235669,
0.00415176572278142,
-0.11439107358455658,
-0.0017125445883721113,
-0.10478989779949188,
-0.02432125248014927,
-0.08041490614414215,
-0.046470608562231064,
-0.050691716372966766,
0.01668534427881241,
-0.002586427377536893,
-0.006087014451622963,
0.0155924828723073,
-0.017498357221484184,
-0.013852233067154884,
0.006413235329091549,
0.04132911562919617,
0.057999081909656525,
0.08157707005739212,
-0.0203629732131958,
-0.01814144290983677,
-0.10228469222784042,
0.1167348101735115,
-0.08207318931818008,
-0.021631918847560883,
-0.13149374723434448,
-0.03861914202570915,
-0.03983374685049057,
0.033729828894138336,
0.015091044828295708,
0.13074041903018951,
-0.1796455681324005,
-0.07132945209741592,
0.11402775347232819,
-0.1240929663181305,
0.014237316325306892,
0.17744684219360352,
0.0021294306498020887,
0.06627126783132553,
0.09941857308149338,
0.22698874771595,
0.029962381348013878,
-0.17261184751987457,
-0.00931009091436863,
-0.04519069567322731,
0.03651461377739906,
0.12783491611480713,
0.06405840069055557,
-0.06403474509716034,
0.06367979943752289,
-0.009461267851293087,
-0.04364428669214249,
-0.08069068193435669,
-0.0038125712890177965,
-0.047309812158346176,
0.014425893314182758,
-0.04674963653087616,
0.02566974237561226,
-0.005782983265817165,
-0.015731237828731537,
-0.015170615166425705,
-0.08472481369972229,
-0.059646524488925934,
0.11835218220949173,
-0.06680084764957428,
0.02462122030556202,
-0.09782806038856506,
0.06865368038415909,
0.06468788534402847,
0.009185469709336758,
-0.12779292464256287,
0.11684007942676544,
0.030666088685393333,
-0.04758680984377861,
0.14128975570201874,
0.09024621546268463,
-0.029682381078600883,
0.009600607678294182,
-0.013584724627435207,
0.021722296252846718,
-0.02724415436387062,
0.01431160420179367,
-0.020317353308200836,
-0.10197412222623825,
-0.008119944483041763,
-0.06673045456409454,
0.10937236249446869,
-0.1382485032081604,
-0.01049004029482603,
0.0419391505420208,
0.11205647885799408,
-0.016287032514810562,
-0.042210694402456284,
0.08859694749116898,
0.04289720207452774,
0.03091614693403244,
-0.016546064987778664,
0.019416142255067825,
-0.019491374492645264,
-0.002326170913875103,
0.05212174728512764,
-0.13819387555122375,
-0.15599708259105682,
0.09266000986099243,
0.019041769206523895,
-0.011721632443368435,
0.06431259959936142,
0.024015046656131744,
-0.01440085656940937,
-0.04918452352285385,
0.0011775847524404526,
0.2264760583639145,
-0.013301810249686241,
0.0599377304315567,
-0.07959102094173431,
-0.007889551110565662,
0.017792392522096634,
-0.05216487869620323,
-0.09130654484033585,
0.0843217521905899,
0.004689543507993221,
-0.0704672783613205,
-0.04062344506382942,
0.03826255723834038,
0.07012674957513809,
0.15701590478420258,
0.003909616731107235,
-0.0865853950381279,
-0.029658349230885506,
-0.06381066143512726,
-0.016278181225061417,
0.04420146718621254,
-0.13795222342014313,
-0.02815953455865383,
0.026799548417329788,
0.008729906752705574,
0.04818207398056984,
-0.024586698040366173,
0.04313407838344574,
0.008340897969901562,
-0.054868340492248535,
-0.070730060338974,
0.035149455070495605,
-0.03560825064778328,
0.036385878920555115,
-0.014137495309114456,
0.01201669778674841,
-0.04741079732775688,
-0.05940600484609604,
-0.14540937542915344,
0.08375709503889084,
-0.06884080916643143,
-0.3119267523288727,
-0.09344327449798584,
-0.04106689617037773,
-0.041059624403715134,
0.018092481419444084,
0.048220645636320114,
-0.11541547626256943,
-0.11367595195770264,
-0.06731455773115158,
0.11979243904352188,
-0.029786698520183563,
-0.06356887519359589,
0.11892089247703552,
-0.0025031804107129574,
0.031210821121931076,
-0.09589516371488571,
0.016656292602419853,
-0.04131299629807472,
-0.020803460851311684,
-0.0311027429997921,
0.03158313408493996,
0.06743951141834259,
0.1220070868730545,
0.022116098552942276,
-0.002351926639676094,
0.011057311668992043,
0.21963472664356232,
-0.13217653334140778,
0.0765409991145134,
0.23984676599502563,
-0.056482378393411636,
-0.010339743457734585,
0.13969506323337555,
-0.00684399576857686,
-0.051120173186063766,
0.040062516927719116,
-0.0016463598003610969,
-0.025689322501420975,
-0.22161325812339783,
-0.12682479619979858,
-0.04636140540242195,
-0.030965769663453102,
0.03907225653529167,
0.01779746636748314,
-0.017109885811805725,
0.016519758850336075,
-0.08531180769205093,
-0.045942626893520355,
0.0636700689792633,
0.02980995737016201,
0.15038339793682098,
0.007886044681072235,
0.052776988595724106,
-0.04294603690505028,
-0.019375117495656013,
0.10261665284633636,
-0.0400116965174675,
0.03827744722366333,
0.06849822402000427,
0.09784693270921707,
0.06221415475010872,
0.03996202349662781,
0.05447180196642876,
-0.01937832124531269,
-0.014579638838768005,
-0.005270985420793295,
-0.025980623438954353,
-0.0638684332370758,
0.01186463050544262,
0.04398510977625847,
0.142984539270401,
-0.13532985746860504,
-0.11822253465652466,
0.04661807790398598,
0.01919032633304596,
0.12174172699451447,
0.09708458930253983,
-0.02582741156220436,
-0.09974348545074463,
0.041202329099178314,
-0.08695918321609497,
-0.031120620667934418,
0.05206752568483353,
0.09934224933385849,
-0.15789754688739777,
0.09299075603485107,
0.07661522924900055,
0.08804302662611008,
-0.03532203659415245,
0.03687109798192978,
-0.052831023931503296,
0.06490986049175262,
0.006216129288077354,
0.07653330266475677,
-0.17870397865772247,
0.10023859888315201,
0.014865984208881855,
0.09066526591777802,
-0.055189523845911026,
0.029350493103265762,
0.03797207772731781,
0.01587194949388504,
0.13409115374088287,
-0.006014870014041662,
-0.11748795956373215,
0.005212732590734959,
-0.11084165424108505,
0.018395695835351944,
0.05664177983999252,
-0.059992775321006775,
0.05053287371993065,
-0.0038011623546481133,
-0.0071845571510493755,
-0.038361500948667526,
-0.009948552586138248,
-0.2579629421234131,
-0.13980048894882202,
0.04835261404514313,
-0.0025699681136757135,
0.05060095712542534,
-0.03912774473428726,
-0.07262873649597168,
-0.14282986521720886,
0.10644996911287308,
-0.008469654247164726,
-0.0150809520855546,
-0.07632291316986084,
0.026063203811645508,
0.09516187012195587,
-0.06047264486551285,
0.013691791333258152,
0.050608858466148376,
0.15268144011497498,
-0.06826885044574738,
-0.03958388417959213,
0.015239297412335873,
-0.10413087904453278,
-0.1273985654115677,
0.01769426092505455,
0.17682519555091858,
0.10758858919143677,
0.06394056975841522,
0.0916958600282669,
0.02084655873477459,
0.0005499862018041313,
-0.10008979588747025,
0.025077732279896736,
0.035498108714818954,
-0.07695063203573227,
0.03832409530878067,
-0.0010494297603145242,
-0.27232080698013306,
-0.14305517077445984,
-0.062232282012701035,
0.07992438226938248,
0.18727964162826538,
-0.03086346946656704,
0.17334359884262085,
0.27066898345947266,
-0.09056161344051361,
-0.23220258951187134,
-0.04075980186462402,
-0.00034956700983457267,
0.03428409993648529,
0.053475696593523026,
-0.1990652233362198,
0.09940076619386673,
-0.004282895941287279,
0.011124820448458195,
-0.07259438931941986,
-0.21007253229618073,
-0.1402585655450821,
0.16468827426433563,
-0.024574873968958855,
0.044871553778648376,
-0.018824463710188866,
-0.07321622222661972,
-0.03688003122806549,
-0.05393914133310318,
0.006640185136348009,
-0.09058575332164764,
0.07139264047145844,
0.05703607201576233,
0.012208020314574242,
0.02562819980084896,
0.01162240281701088,
0.11429653316736221,
0.08962592482566833,
-0.023338867351412773,
-0.08109690994024277,
0.0290303323417902,
0.009347925893962383,
-0.012593110091984272,
0.10758741199970245,
0.05065782740712166,
0.01611773669719696,
-0.04947127029299736,
-0.08210048824548721,
-0.06417369842529297,
0.0637405589222908,
-0.06946968287229538,
-0.014073547907173634,
-0.053029291331768036,
0.085543192923069,
0.014250350184738636,
-0.0006507569341920316,
-0.06461207568645477,
-0.1005883440375328,
-0.02287750132381916,
0.12099312245845795,
0.21950431168079376,
-0.05335362255573273,
-0.00020522516570053995,
-0.04004836454987526,
-0.041095323860645294,
0.045190900564193726,
-0.004099106416106224,
0.04647155851125717,
0.0511467270553112,
0.026234081014990807,
0.08786693960428238,
-0.03620067611336708,
-0.12375985831022263,
0.03024284727871418,
0.03608112037181854,
-0.07659037411212921,
-0.1914374828338623,
-0.05108394846320152,
-0.01687326468527317,
-0.01926048845052719,
-0.03150831162929535,
0.193736732006073,
-0.01725638099014759,
-0.051435746252536774,
0.0005643838085234165,
0.06356288492679596,
-0.00573303597047925,
0.1244942918419838,
0.047014348208904266,
0.041620757430791855,
-0.09430735558271408,
0.05670126527547836,
0.11490822583436966,
-0.0373927466571331,
0.049269020557403564,
0.08551465719938278,
-0.05157161504030228,
-0.05307028815150261,
-0.09669049829244614,
0.0003091610851697624,
0.0698266476392746,
-0.06267564743757248,
-0.010454172268509865,
-0.11005380749702454,
0.009256800636649132,
-0.011351245455443859,
0.012466954067349434,
-0.04961610957980156,
-0.04659799858927727,
-0.003275325521826744,
-0.09384056180715561,
0.0673205628991127,
0.10369613766670227,
-0.03268560767173767,
-0.1155773475766182,
0.1071404293179512,
0.009625737555325031,
0.08068100363016129,
-0.03953450918197632,
-0.05960676819086075,
-0.08782242983579636,
-0.0038772777188569307,
-0.08846418559551239,
0.03207062557339668,
-0.13890887796878815,
-0.013479801826179028,
-0.04384762421250343,
-0.0343361459672451,
-0.01172465831041336,
0.07546195387840271,
-0.031228823587298393,
0.005496426485478878,
-0.03396911546587944,
0.08640088886022568,
-0.1222422644495964,
0.07521820068359375,
0.05261223390698433,
-0.044199611991643906,
0.1067611500620842,
0.023641735315322876,
-0.054358433932065964,
0.043546032160520554,
-0.21638686954975128,
-0.06079646199941635,
-0.030739150941371918,
0.046187859028577805,
-0.01150990929454565,
-0.16531458497047424,
-0.00018207305402029306,
0.02179533615708351,
0.011340244673192501,
-0.021625546738505363,
0.04240499809384346,
-0.026416640728712082,
-0.010974995791912079,
-0.06437110155820847,
-0.05859483405947685,
-0.03548344597220421,
0.06815704703330994,
0.07253672182559967,
0.00929302629083395,
0.09821426868438721,
-0.09180132299661636,
0.07639311999082565,
-0.07876777648925781,
0.027439391240477562,
-0.025707945227622986,
0.0205695703625679,
-0.08374075591564178,
-0.07355222851037979,
0.08294055610895157,
-0.016400160267949104,
0.07231605798006058,
0.03337732329964638,
-0.025393687188625336,
0.0454971082508564,
-0.041282981634140015,
-0.05930435284972191,
0.044836387038230896,
0.13201037049293518,
0.04904726520180702,
0.020850930362939835,
-0.005920827854424715,
-0.03947789594531059,
0.005894732195883989,
0.14500057697296143,
0.13968510925769806,
0.1597621589899063,
0.09861062467098236,
0.03709699586033821,
0.07228880375623703,
-0.04492584243416786,
-0.09075556695461273,
0.10485571622848511,
-0.07427176833152771,
0.04551202803850174,
-0.05156504735350609,
-0.08020620048046112,
0.07146401703357697,
-0.14287178218364716,
0.07394389063119888,
-0.029508713632822037,
-0.08724192529916763,
-0.11219948530197144,
-0.1581537425518036,
-0.06706927716732025,
-0.047305814921855927,
0.006239231210201979,
-0.11107239127159119,
0.02797638066112995,
0.006877253297716379,
0.026700224727392197,
-0.0926491990685463,
0.1117686927318573,
-0.11428350955247879,
-0.12153249979019165,
0.14815029501914978,
-0.03793875500559807,
-0.014883690513670444,
0.0005670128157362342,
0.03904923051595688,
0.029660172760486603,
0.09350219368934631,
0.05132129043340683,
0.04609568417072296,
0.0254630409181118,
0.02348395623266697,
-0.09138786792755127,
-0.06522683054208755,
0.030446600168943405,
-0.014591539278626442,
0.09795168787240982,
0.18375271558761597,
0.0878850519657135,
-0.08500175923109055,
0.009451521560549736,
0.1462952047586441,
0.022525358945131302,
-0.10790646821260452,
-0.1508222371339798,
0.02210976928472519,
-0.030173975974321365,
0.0072282301262021065,
0.006510575767606497,
-0.09310536086559296,
0.02085382491350174,
0.2075987458229065,
0.17798280715942383,
-0.03490644320845604,
0.021739481016993523,
-0.015713144093751907,
0.007379390299320221,
0.0201115645468235,
0.07889029383659363,
0.07808754593133926,
0.1650623381137848,
-0.004414756782352924,
0.05448701232671738,
-0.020045505836606026,
-0.09881018847227097,
-0.11518453806638718,
0.09840364754199982,
0.0019032652489840984,
-0.03486822172999382,
-0.0018056853441521525,
0.1814388483762741,
-0.10631854087114334,
-0.21959879994392395,
-0.1236504390835762,
-0.03702160343527794,
-0.11358164250850677,
0.024813205003738403,
-0.04042241349816322,
0.13682976365089417,
0.04928334057331085,
-0.007617220748215914,
0.013346463441848755,
0.17931637167930603,
0.03565698489546776,
0.035062648355960846,
-0.025863168761134148,
0.10986639559268951,
-0.09200188517570496,
0.11198829114437103,
-0.0015132217667996883,
0.04972650855779648,
0.030130045488476753,
0.035718727856874466,
-0.06892173737287521,
0.030796727165579796,
0.03306412324309349,
-0.006521363276988268,
0.048833176493644714,
0.17419591546058655,
-0.009037270210683346,
0.1023617535829544,
0.10483966022729874,
-0.07343778759241104,
0.023921983316540718,
-0.011081090196967125,
0.0031297695823013783,
-0.05514656379818916,
0.16038784384727478,
-0.15053820610046387,
0.12695764005184174,
0.10206453502178192,
-0.06836199760437012,
-0.04502846300601959,
-0.011061882600188255,
0.055834464728832245,
-0.0654432401061058,
0.09899186342954636,
-0.0029195595998317003,
-0.171800896525383,
0.02452654391527176,
-0.12078820168972015,
0.06613559275865555,
-0.2504405379295349,
-0.044398341327905655,
-0.04816592484712601,
-0.017361246049404144,
0.010878657922148705,
0.1101459413766861,
0.08913078904151917,
-0.05029457435011864,
-0.01089678518474102,
-0.040133118629455566,
0.013542598113417625,
0.09332577884197235,
-0.08116444200277328,
-0.027214430272579193
] |
null | null |
transformers
|
# Wav2Vec2-Base-VoxPopuli
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the sv unlabeled subset of [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/)
# Fine-Tuning
Please refer to [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) on how to fine-tune this model on a specific language. Note that you should replace `"facebook/wav2vec2-large-xlsr-53"` with this checkpoint for fine-tuning.
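For illustration, here is a minimal sketch of what that checkpoint swap could look like, following the general recipe from the blog linked above. The `vocab.json` (built from your own labeled Swedish text), the special tokens, and the feature-extractor settings are assumptions taken from that recipe, not files shipped with this repository:

```python
from transformers import (
    Wav2Vec2CTCTokenizer,
    Wav2Vec2FeatureExtractor,
    Wav2Vec2Processor,
    Wav2Vec2ForCTC,
)

# Tokenizer built from a vocab.json derived from your own labeled Swedish data (assumed)
tokenizer = Wav2Vec2CTCTokenizer(
    "vocab.json", unk_token="[UNK]", pad_token="[PAD]", word_delimiter_token="|"
)
feature_extractor = Wav2Vec2FeatureExtractor(
    feature_size=1, sampling_rate=16000, padding_value=0.0, do_normalize=True
)
processor = Wav2Vec2Processor(feature_extractor=feature_extractor, tokenizer=tokenizer)

# Use this checkpoint instead of "facebook/wav2vec2-large-xlsr-53"
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-base-sv-voxpopuli",
    ctc_loss_reduction="mean",
    pad_token_id=processor.tokenizer.pad_token_id,
    vocab_size=len(processor.tokenizer),
)
model.freeze_feature_extractor()  # optional: keep the convolutional encoder frozen
```

The rest of the training setup (data collator, `Trainer`, evaluation) proceeds exactly as in the blog; only the checkpoint name changes.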
|
{"language": "sv", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
|
automatic-speech-recognition
|
facebook/wav2vec2-base-sv-voxpopuli
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli",
"sv",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"sv"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #sv #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
|
# Wav2Vec2-Base-VoxPopuli
Facebook's Wav2Vec2 base model pretrained on the sv unlabeled subset of VoxPopuli corpus.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the official website for more information, here
# Fine-Tuning
Please refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '"facebook/wav2vec2-large-xlsr-53"' with this checkpoint for fine-tuning.
|
[
"# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the sv unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #sv #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n",
"# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the sv unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
69,
132,
57
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #sv #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the sv unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
-0.06179434806108475,
0.03996327519416809,
-0.0043082027696073055,
-0.009139150381088257,
0.12381065636873245,
-0.03662294149398804,
0.07258233428001404,
0.0006713981856592,
0.03911222517490387,
0.015690356492996216,
0.014482486061751842,
0.023442517966032028,
0.08897922188043594,
0.12472398579120636,
-0.014854127541184425,
-0.28450843691825867,
0.07200611382722855,
0.014759852550923824,
0.0667368471622467,
0.04547286778688431,
0.11100105196237564,
-0.07701800763607025,
0.047252729535102844,
0.05425161123275757,
-0.08344690501689911,
0.01976003497838974,
0.025072060525417328,
-0.08586905896663666,
0.1251402348279953,
0.11084316670894623,
0.08227374404668808,
0.05931656062602997,
0.036653462797403336,
-0.16008563339710236,
0.03349373862147331,
0.06207486614584923,
-0.05804918706417084,
-0.013857033103704453,
0.11957556009292603,
-0.022162986919283867,
0.20951884984970093,
-0.028690535575151443,
-0.03849398344755173,
0.0877411887049675,
-0.1282498836517334,
-0.12925004959106445,
-0.06983163952827454,
0.12724754214286804,
0.13884831964969635,
0.06577211618423462,
-0.07883503288030624,
0.03241252526640892,
-0.036831051111221313,
0.06756385415792465,
0.07066658139228821,
-0.2893196940422058,
-0.03612598776817322,
0.11232829093933105,
0.07199472934007645,
-0.02905353344976902,
-0.09923655539751053,
0.10110734403133392,
0.026130368933081627,
-0.010039829649031162,
-0.013163906522095203,
-0.09141267091035843,
0.029623843729496002,
-0.08107598125934601,
-0.1165105476975441,
0.011315119452774525,
0.19057710468769073,
0.03887122496962547,
-0.059757329523563385,
-0.06536204367876053,
-0.0291629396378994,
0.19070865213871002,
-0.061254262924194336,
-0.1699536144733429,
0.010178504511713982,
0.039314862340688705,
0.07741596549749374,
-0.1729905605316162,
-0.0657428428530693,
-0.006346452049911022,
-0.05000036954879761,
0.09478267282247543,
0.027204472571611404,
-0.01808883249759674,
-0.06552679091691971,
0.010784580372273922,
-0.10400683432817459,
-0.06999960541725159,
0.004525971598923206,
-0.10526823997497559,
-0.08950333297252655,
-0.027367226779460907,
-0.0843995213508606,
-0.08568271994590759,
-0.029569845646619797,
0.08469334244728088,
0.017440350726246834,
0.0614435188472271,
-0.08295509964227676,
0.03455561771988869,
0.00035277093411423266,
0.095076784491539,
-0.1299498826265335,
-0.016382576897740364,
0.009833660908043385,
-0.046493273228406906,
-0.002668471308425069,
-0.03564944863319397,
-0.08084800094366074,
-0.07207280397415161,
-0.028976690024137497,
0.06141156703233719,
0.0014659900916740298,
0.025986045598983765,
-0.06156882643699646,
-0.09135248512029648,
0.034690652042627335,
-0.07353811711072922,
0.019999079406261444,
0.031208042055368423,
0.010389933362603188,
0.20121677219867706,
0.005258334334939718,
0.06333661824464798,
-0.146773561835289,
0.019239231944084167,
-0.01961846463382244,
0.00426134280860424,
-0.010390639305114746,
-0.04610959067940712,
0.03112155571579933,
-0.035791896283626556,
-0.008852154947817326,
-0.13155698776245117,
-0.08988790214061737,
-0.08339085429906845,
0.0007559360819868743,
-0.03571872413158417,
-0.06550885736942291,
-0.04304899275302887,
-0.0011155714746564627,
-0.019039953127503395,
-0.027822837233543396,
-0.02277211658656597,
-0.019217312335968018,
0.001552195055410266,
-0.025555415078997612,
0.07762176543474197,
-0.05725065618753433,
0.08298633992671967,
0.008732516318559647,
-0.022395234555006027,
-0.14362332224845886,
0.12793591618537903,
-0.08221831172704697,
-0.057335544377565384,
-0.14726871252059937,
-0.036777690052986145,
-0.06976872682571411,
0.06547655910253525,
-0.006119429599493742,
0.16157396137714386,
-0.2015058398246765,
-0.11688166856765747,
0.252612441778183,
-0.11777989566326141,
0.016854261979460716,
0.17315886914730072,
0.018700087442994118,
0.03505776822566986,
0.16737769544124603,
0.1430673450231552,
0.03852285072207451,
-0.11447493731975555,
0.04590865969657898,
-0.043177444487810135,
-0.017138754948973656,
0.05642973631620407,
0.05082777887582779,
-0.010743926279246807,
-0.0013984141405671835,
-0.006721130106598139,
-0.07586400210857391,
-0.05355318263173103,
-0.00022647228615824133,
-0.06108522787690163,
0.03172553703188896,
-0.022629177197813988,
0.08712605386972427,
-0.007445338182151318,
-0.0038162334822118282,
0.023650439456105232,
-0.08964540809392929,
-0.022537371143698692,
0.06819877028465271,
-0.057235606014728546,
0.07630058377981186,
-0.10701004415750504,
0.05507490411400795,
0.11519504338502884,
0.06550069898366928,
-0.1367809772491455,
0.05514642968773842,
-0.022494064643979073,
0.08566036820411682,
0.09431692957878113,
0.19219644367694855,
-0.02616853266954422,
-0.04463281109929085,
-0.08605708181858063,
0.025868983939290047,
-0.023508530110120773,
-0.041366301476955414,
-0.02087533287703991,
-0.10004867613315582,
-0.029676664620637894,
-0.042725592851638794,
0.06273625791072845,
-0.17649860680103302,
0.009816222824156284,
0.07613009959459305,
0.07860057055950165,
0.015349569730460644,
0.015718581154942513,
0.005049039609730244,
0.09684687852859497,
0.03628598153591156,
0.00818649772554636,
0.07181112468242645,
-0.010043162852525711,
-0.06180895119905472,
0.1196322962641716,
-0.062131643295288086,
0.01962514966726303,
0.13866502046585083,
-0.10798593610525131,
-0.00389264733530581,
0.0055395206436514854,
0.030991313979029655,
0.00428650202229619,
0.0076279048807919025,
-0.01922176592051983,
0.20737692713737488,
0.024714546278119087,
0.08004062622785568,
-0.08131018280982971,
0.023544859141111374,
-0.015118292532861233,
-0.033745598047971725,
-0.06346103549003601,
0.06714452058076859,
0.03688494488596916,
-0.09099601209163666,
0.013638920150697231,
0.1150515228509903,
0.001565506448969245,
0.14992289245128632,
0.01820509508252144,
-0.021160133183002472,
0.014503216370940208,
-0.062035612761974335,
-0.025016706436872482,
-0.008464474231004715,
-0.16451138257980347,
-0.019198618829250336,
0.024849414825439453,
0.01408044807612896,
0.06521299481391907,
-0.05492410808801651,
-0.0006868426571600139,
0.009533973410725594,
-0.0782543420791626,
-0.041358768939971924,
0.042163703590631485,
-0.010836685076355934,
0.07197065651416779,
-0.050315409898757935,
0.0002804847899824381,
-0.01575574465095997,
-0.030820682644844055,
-0.1046258881688118,
0.11691232025623322,
-0.06261058896780014,
-0.36486440896987915,
-0.10217837989330292,
-0.09789207577705383,
-0.09157165139913559,
0.04493361711502075,
0.04390750452876091,
-0.1059206873178482,
-0.07752490043640137,
0.00407605804502964,
0.16661491990089417,
-0.029454175382852554,
-0.0793541893362999,
0.04794247820973396,
0.005418735556304455,
-0.006364335305988789,
-0.08927590399980545,
0.005944820120930672,
-0.035775233060121536,
-0.12461993098258972,
-0.012393694370985031,
-0.020620955154299736,
0.04010458290576935,
0.14020821452140808,
0.03485313430428505,
-0.023150308057665825,
-0.02365133725106716,
0.19992543756961823,
-0.1292596459388733,
0.08345373719930649,
0.2633099853992462,
-0.024703001603484154,
0.01848616451025009,
0.14239785075187683,
-0.013551624491810799,
-0.07973383367061615,
-0.006907838396728039,
0.06655535846948624,
-0.005819088313728571,
-0.264954537153244,
-0.13093537092208862,
-0.06406327337026596,
-0.031224288046360016,
0.025124024599790573,
-0.010852737352252007,
0.01062453631311655,
0.04349752515554428,
-0.10738543421030045,
-0.02049100212752819,
0.05483182147145271,
0.025797875598073006,
0.2190435379743576,
-0.044033635407686234,
0.14217816293239594,
-0.025208834558725357,
-0.02410687692463398,
0.06904784590005875,
0.02689504623413086,
0.08162939548492432,
0.1154368594288826,
0.058620087802410126,
0.08440490812063217,
0.02215983159840107,
0.02453424409031868,
-0.003129539778456092,
0.0038491669110953808,
-0.029209544882178307,
-0.0520298033952713,
-0.021734323352575302,
-0.047778014093637466,
0.025967616587877274,
0.10751231759786606,
-0.17116358876228333,
-0.12154679745435715,
0.021295960992574692,
0.03518267720937729,
0.1391676813364029,
0.0572279654443264,
-0.07937600463628769,
-0.04036632180213928,
0.052066776901483536,
-0.08286093920469284,
-0.04788755625486374,
0.06436818838119507,
0.08469387888908386,
-0.14681969583034515,
0.15115581452846527,
0.031552769243717194,
0.09780685603618622,
-0.014492114074528217,
0.05877535045146942,
-0.16080738604068756,
-0.016129976138472557,
0.03461004048585892,
0.07106746733188629,
-0.25071659684181213,
0.21148723363876343,
0.020368287339806557,
0.06520972400903702,
-0.071263886988163,
0.006743656937032938,
0.047552306205034256,
0.14408838748931885,
0.13406555354595184,
-0.004216398578137159,
-0.0717221349477768,
0.01886940561234951,
-0.011039279401302338,
0.03912459686398506,
0.03596406430006027,
-0.02210277132689953,
0.04494328796863556,
-0.004253572318702936,
0.014022257179021835,
-0.011804047040641308,
0.11495146155357361,
-0.2280343621969223,
-0.14883175492286682,
0.027091875672340393,
0.012713721953332424,
0.11804980039596558,
-0.005579853896051645,
-0.0658949464559555,
-0.11537125706672668,
0.10444137454032898,
-0.005441755056381226,
-0.01648411527276039,
-0.10523565858602524,
0.036594416946172714,
0.02471271902322769,
-0.09933392703533173,
0.035296518355607986,
0.05880654603242874,
0.1336965709924698,
-0.10233950614929199,
-0.06312032043933868,
0.04626571014523506,
-0.09128217399120331,
-0.061511725187301636,
0.04651074856519699,
0.17880339920520782,
0.10094574838876724,
0.03265858069062233,
0.1143966093659401,
-0.03881749510765076,
0.050909530371427536,
-0.11291981488466263,
0.0718795582652092,
0.01708325371146202,
-0.0193692147731781,
0.02104262448847294,
-0.05832511559128761,
-0.25483304262161255,
-0.10606768727302551,
-0.014666537754237652,
0.17170584201812744,
0.18945856392383575,
0.018893668428063393,
0.15502358973026276,
0.24258673191070557,
-0.09778174012899399,
-0.25820648670196533,
-0.04322918877005577,
-0.02382262609899044,
0.04171862453222275,
0.024329621344804764,
-0.2662629187107086,
0.061466217041015625,
0.05583306401968002,
0.006979852449148893,
-0.08445757627487183,
-0.19877910614013672,
-0.13125093281269073,
0.19655084609985352,
0.02100900001823902,
0.15262457728385925,
-0.08966142684221268,
-0.049087926745414734,
-0.09065131843090057,
-0.07796245813369751,
0.07643769681453705,
-0.1303165704011917,
0.08399571478366852,
0.056674543768167496,
-0.00598931172862649,
0.0067956699058413506,
0.055039238184690475,
0.12003177404403687,
0.06584974378347397,
0.012952052988111973,
-0.0283814687281847,
0.03603416308760643,
0.03630106896162033,
0.02878211997449398,
0.02982143498957157,
0.026708584278821945,
-0.03212331235408783,
-0.05431138724088669,
-0.10870585590600967,
-0.10159614682197571,
0.0969541147351265,
-0.058545224368572235,
-0.005025274120271206,
-0.018959561362862587,
0.10151554644107819,
0.017825305461883545,
0.01439967192709446,
-0.05405552685260773,
-0.14886099100112915,
0.03192521631717682,
0.1268152892589569,
0.2358742505311966,
-0.13312044739723206,
0.01151171512901783,
-0.036207716912031174,
-0.043754942715168,
0.08891801536083221,
-0.0048967688344419,
0.07100417464971542,
0.03755726292729378,
-0.010435695759952068,
0.09350454807281494,
0.01756322756409645,
-0.06977955251932144,
0.007426216267049313,
0.04456450417637825,
-0.0693909227848053,
-0.24852941930294037,
-0.06779420375823975,
-0.011809130199253559,
0.02047630399465561,
0.016528550535440445,
0.18905894458293915,
-0.011310473084449768,
-0.05833154916763306,
-0.018988007679581642,
0.04472725838422775,
-0.04311206191778183,
0.05170781537890434,
0.032084137201309204,
0.03327083960175514,
-0.12078427523374557,
0.06388461589813232,
0.08657888323068619,
-0.14660145342350006,
0.06750266999006271,
0.03663124144077301,
-0.05018092691898346,
-0.09386743605136871,
-0.131724014878273,
0.037983931601047516,
-0.009744621813297272,
-0.08411664515733719,
0.028129637241363525,
-0.16521915793418884,
0.036905527114868164,
0.11742748320102692,
0.030787067487835884,
-0.016755059361457825,
-0.058954477310180664,
-0.04927758872509003,
-0.01971280761063099,
0.0015574059216305614,
0.12162929773330688,
-0.06303184479475021,
-0.13969242572784424,
0.14639562368392944,
0.007356461603194475,
0.07310932874679565,
-0.04804948344826698,
-0.038157958537340164,
-0.1398201286792755,
0.010391242802143097,
-0.16385841369628906,
0.014118012972176075,
-0.1234702318906784,
0.0004408723034430295,
-0.04480865225195885,
-0.02020399458706379,
-0.04547927528619766,
0.02718373015522957,
-0.10224324464797974,
0.01825099065899849,
-0.007725273258984089,
0.08207281678915024,
-0.10753823071718216,
0.07943937182426453,
0.07969068735837936,
-0.02257338911294937,
0.06853468716144562,
0.019325431436300278,
-0.032864876091480255,
0.11176741123199463,
-0.16641278564929962,
-0.054537076503038406,
0.04157958924770355,
0.03462973237037659,
-0.004297242965549231,
-0.14150388538837433,
0.011587793007493019,
0.03130141273140907,
0.037591077387332916,
-0.005605524871498346,
0.07670280337333679,
-0.05383184552192688,
-0.008100478909909725,
-0.043074753135442734,
-0.09427163749933243,
-0.008162164129316807,
0.07771768420934677,
0.1065593734383583,
0.004513921216130257,
0.11365141719579697,
-0.08236024528741837,
0.04690235108137131,
-0.1018810123205185,
0.0675291121006012,
-0.040067411959171295,
-0.028436223044991493,
0.051351554691791534,
-0.14263951778411865,
0.05546186491847038,
0.001898025395348668,
0.09503590315580368,
-0.006208283826708794,
0.013829740695655346,
0.024518288671970367,
-0.09555453807115555,
-0.11511413753032684,
0.041450757533311844,
0.13063819706439972,
0.08144412934780121,
-0.0077616567723453045,
0.02049202285706997,
0.0024422593414783478,
0.034797485917806625,
0.20539043843746185,
0.20687080919742584,
0.19610396027565002,
0.03261766955256462,
0.09679467231035233,
0.020128564909100533,
-0.054101455956697464,
-0.03312279284000397,
0.018630126491189003,
-0.0903792604804039,
0.030619937926530838,
-0.055997367948293686,
-0.05422724783420563,
0.08547580242156982,
-0.15021398663520813,
0.12243618816137314,
0.021683726459741592,
-0.08600505441427231,
-0.1603238582611084,
-0.17367713153362274,
-0.06777310371398926,
-0.09817729145288467,
-0.01284749899059534,
-0.13129031658172607,
-0.01586964540183544,
0.014006204903125763,
0.020796921104192734,
-0.12015601247549057,
0.09914299100637436,
-0.14171960949897766,
-0.16504065692424774,
0.17928370833396912,
-0.03681565448641777,
0.005586243234574795,
-0.008292093873023987,
0.0018250587163493037,
0.008882624097168446,
0.08735349029302597,
0.01595919206738472,
0.037743330001831055,
-0.011552529409527779,
0.052037209272384644,
-0.07394885271787643,
-0.04999975115060806,
-0.0002482907730154693,
0.026981189846992493,
0.1180984303355217,
0.19360485672950745,
0.037287309765815735,
-0.05828341469168663,
0.007621511351317167,
0.14733275771141052,
0.023434696719050407,
-0.08611082285642624,
-0.14730794727802277,
0.05807487294077873,
0.03740786388516426,
0.031375207006931305,
-0.038886625319719315,
-0.06462722271680832,
0.010533999651670456,
0.28747931122779846,
0.1597435176372528,
-0.05658479034900665,
0.027919728308916092,
0.0016865715151652694,
0.03801447153091431,
0.0794300064444542,
0.10301730036735535,
0.0815613865852356,
0.14679454267024994,
-0.0315307192504406,
-0.010828944854438305,
0.013467751443386078,
-0.06863845139741898,
-0.07578497380018234,
0.15724997222423553,
0.03709278255701065,
-0.08332160860300064,
0.018274864181876183,
0.15070092678070068,
-0.15927056968212128,
-0.08971108496189117,
-0.08281507343053818,
-0.05170900374650955,
-0.10290191322565079,
-0.024262821301817894,
-0.04722978174686432,
0.10832173377275467,
0.1017414927482605,
-0.01100770104676485,
-0.029135124757885933,
0.19200590252876282,
0.050560351461172104,
-0.011709155514836311,
-0.05108541250228882,
0.1235961765050888,
-0.03825337812304497,
0.06100554019212723,
-0.018544550985097885,
0.037458937615156174,
0.04829386994242668,
0.048296552151441574,
-0.004055526107549667,
0.050094425678253174,
-0.007794790435582399,
0.02951684594154358,
0.07274965196847916,
0.13237348198890686,
0.0021270618308335543,
0.05183246731758118,
0.09585288166999817,
-0.14482556283473969,
0.03195944428443909,
0.04673362895846367,
-0.04147500544786453,
0.0005360132199712098,
0.17353367805480957,
-0.19807511568069458,
0.05765528231859207,
0.15372981131076813,
-0.02926260232925415,
-0.016370953992009163,
-0.0431395061314106,
0.06775499880313873,
-0.019670572131872177,
0.05402030795812607,
-0.03241217881441116,
-0.13992278277873993,
0.002889519790187478,
-0.060787662863731384,
0.03002523072063923,
-0.18278729915618896,
0.00597395421937108,
-0.041767019778490067,
0.00029780753538943827,
-0.050340741872787476,
0.09744861721992493,
0.016511093825101852,
-0.054389044642448425,
0.01822894997894764,
-0.10246046632528305,
0.036032095551490784,
0.09640417993068695,
-0.0888821929693222,
-0.04190683364868164
] |
null | null |
transformers
|
# Wav2Vec2-Base
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/)
The base model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for a more detailed explanation of how to fine-tune the model.
[Paper](https://arxiv.org/abs/2006.11477)
Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli
**Abstract**
We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
See [this notebook](https://colab.research.google.com/drive/1FjTsqbYKphl9kL-eILgUc-bl4zVThL8F?usp=sharing) for more information on how to fine-tune the model.
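As a small illustration that is not part of the original card, the pretrained checkpoint can also be used as-is to extract latent speech representations; the dummy LibriSpeech split below is simply a convenient 16kHz example input:

```python
import torch
from datasets import load_dataset
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained("facebook/wav2vec2-base")
model = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-base")

# load a dummy 16kHz speech sample
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
inputs = feature_extractor(ds[0]["audio"]["array"], sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    hidden_states = model(**inputs).last_hidden_state  # shape: (batch, time, 768)
```

For speech recognition, a tokenizer and fine-tuning on labeled data are still required, as noted above.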
|
{"language": "en", "license": "apache-2.0", "tags": ["speech"], "datasets": ["librispeech_asr"]}
| null |
facebook/wav2vec2-base
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"speech",
"en",
"dataset:librispeech_asr",
"arxiv:2006.11477",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2006.11477"
] |
[
"en"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #speech #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-Base
Facebook's Wav2Vec2
The base model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper
Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli
Abstract
We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.
The original model can be found under URL
# Usage
See this notebook for more information on how to fine-tune the model.
|
[
"# Wav2Vec2-Base \n\nFacebook's Wav2Vec2\n\nThe base model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. \n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for more in-detail explanation of how to fine-tune the model.\n\nPaper\n\nAuthors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli\n\nAbstract\nWe show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.\nThe original model can be found under URL",
"# Usage\n\nSee this notebook for more information on how to fine-tune the model."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #speech #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-Base \n\nFacebook's Wav2Vec2\n\nThe base model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. \n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for more in-detail explanation of how to fine-tune the model.\n\nPaper\n\nAuthors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli\n\nAbstract\nWe show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.\nThe original model can be found under URL",
"# Usage\n\nSee this notebook for more information on how to fine-tune the model."
] |
[
66,
358,
18
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #speech #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# Wav2Vec2-Base \n\nFacebook's Wav2Vec2\n\nThe base model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. \n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for more in-detail explanation of how to fine-tune the model.\n\nPaper\n\nAuthors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli\n\nAbstract\nWe show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.\nThe original model can be found under URL# Usage\n\nSee this notebook for more information on how to fine-tune the model."
] |
[
-0.1358424723148346,
0.02790866047143936,
-0.0017438423819839954,
-0.013750601559877396,
0.0063188523054122925,
-0.02787906862795353,
0.10246994346380234,
0.06723356246948242,
-0.043961599469184875,
0.10252230614423752,
-0.015905767679214478,
-0.09395447373390198,
0.06000400707125664,
0.091766357421875,
0.06941279768943787,
-0.2529228925704956,
0.03629458695650101,
-0.03926122933626175,
0.21959984302520752,
0.04890015348792076,
0.08994443714618683,
-0.11502085626125336,
-0.02345331944525242,
0.03544192761182785,
-0.0668688714504242,
-0.04064323008060455,
-0.020190570503473282,
-0.06379770487546921,
0.0857364684343338,
-0.0037585007958114147,
0.027002030983567238,
0.08293315023183823,
0.06452257186174393,
-0.13722263276576996,
0.009025120176374912,
0.06555096805095673,
0.06285345554351807,
0.03649862855672836,
0.09263994544744492,
0.014116945676505566,
0.10534540563821793,
-0.047208793461322784,
-0.02196883037686348,
0.08302007615566254,
-0.09369900822639465,
-0.11926770955324173,
-0.11535859853029251,
0.05398879945278168,
0.05101568251848221,
0.06506592035293579,
-0.07474595308303833,
0.013832426629960537,
0.012857363559305668,
0.06375129520893097,
0.08572866022586823,
-0.2605689465999603,
0.0051624043844640255,
-0.02685793861746788,
0.04621460288763046,
0.045796096324920654,
-0.059630244970321655,
0.056242894381284714,
0.041538678109645844,
0.016306638717651367,
0.04585003852844238,
-0.011221286840736866,
0.005738711450248957,
-0.08531000465154648,
-0.17506614327430725,
0.004587188828736544,
0.08588951826095581,
0.0064900340512394905,
-0.12217932939529419,
-0.19256164133548737,
-0.06253734976053238,
-0.037560876458883286,
-0.0322762206196785,
-0.03641224652528763,
0.02545253187417984,
0.03031950071454048,
0.041635315865278244,
-0.04431193321943283,
-0.09539466351270676,
-0.001464672852307558,
-0.05809708684682846,
0.08135753124952316,
0.02757122926414013,
0.015539733693003654,
-0.007149750366806984,
0.03268295153975487,
-0.19232146441936493,
-0.06726258993148804,
-0.04838845133781433,
-0.017431410029530525,
-0.08568790555000305,
-0.08573751151561737,
-0.038621000945568085,
-0.22116287052631378,
-0.03261388838291168,
0.05907190591096878,
-0.042105868458747864,
0.04217573627829552,
-0.03750498220324516,
0.05373673886060715,
0.08352214097976685,
0.20259836316108704,
-0.06973028928041458,
-0.09014740586280823,
0.040231943130493164,
0.01107503566890955,
0.03522568568587303,
-0.0231629591435194,
-0.03644205257296562,
-0.003922389354556799,
0.03978511318564415,
0.014728671871125698,
-0.019346021115779877,
0.001048624748364091,
-0.07718373090028763,
-0.012192530557513237,
-0.008254945278167725,
-0.1322263777256012,
0.008950312621891499,
0.01661960408091545,
-0.008873207494616508,
0.07305983453989029,
0.08124881237745285,
0.007892733439803123,
-0.10114553570747375,
0.00765165826305747,
0.003981207963079214,
0.003516335738822818,
-0.028695425018668175,
-0.09953568875789642,
0.02971799671649933,
-0.024975258857011795,
-0.07010910660028458,
-0.11750208586454391,
-0.06092095747590065,
-0.04984322935342789,
0.01221723947674036,
-0.026001079007983208,
0.015256992541253567,
-0.05132211744785309,
0.008385098539292812,
-0.0014723180793225765,
-0.02575714886188507,
-0.06657266616821289,
-0.020798832178115845,
-0.001763034611940384,
0.0743236392736435,
0.08809810876846313,
0.006251298822462559,
0.059295687824487686,
-0.05093575268983841,
0.029896197840571404,
-0.1749337762594223,
0.13912983238697052,
-0.03660465031862259,
-0.062286555767059326,
-0.10757887363433838,
-0.02119172178208828,
-0.07523094117641449,
0.04940452799201012,
0.06496614962816238,
0.05996290221810341,
-0.22944733500480652,
-0.062331393361091614,
0.16840417683124542,
-0.11006487905979156,
-0.01450533326715231,
0.15762294828891754,
0.024582093581557274,
0.14511774480342865,
0.1120743677020073,
0.1550646722316742,
0.09555859118700027,
-0.18249669671058655,
-0.0802341029047966,
-0.008118311874568462,
0.0004495008906815201,
0.09925999492406845,
0.0622713677585125,
-0.0577181838452816,
0.06741060316562653,
-0.0024856997188180685,
0.05209369584918022,
0.03981998562812805,
-0.027078934013843536,
-0.03419820964336395,
0.012652380391955376,
-0.08272656798362732,
0.04211919754743576,
-0.004913186654448509,
0.010032447054982185,
-0.016650617122650146,
-0.10745612531900406,
0.0001789630769053474,
0.08176005631685257,
-0.08426576107740402,
0.061715733259916306,
-0.11443191766738892,
0.033470701426267624,
-0.0726025179028511,
0.006074120756238699,
-0.19871310889720917,
0.1282084882259369,
0.052114080637693405,
0.031153995543718338,
0.08392029255628586,
0.025068731978535652,
0.010511201806366444,
0.0627862736582756,
-0.012530656531453133,
-0.012395871803164482,
-0.012233803980052471,
0.012120888568460941,
-0.09089987725019455,
-0.06394390016794205,
-0.02988114021718502,
-0.05430883541703224,
0.03249996900558472,
-0.07070305943489075,
-0.016670987010002136,
0.07680613547563553,
0.02313442900776863,
0.021306563168764114,
-0.05619790777564049,
0.05619540438055992,
0.05621905252337456,
0.015004976652562618,
-0.04141613468527794,
0.03803199157118797,
0.013767075724899769,
0.020977096632122993,
0.10105109214782715,
-0.21119165420532227,
-0.1983327716588974,
0.06400279700756073,
-0.008109143003821373,
-0.06632009148597717,
0.1230265274643898,
-0.029627392068505287,
-0.042686380445957184,
-0.1127065122127533,
-0.08057411015033722,
0.22409778833389282,
0.010001875460147858,
0.11822204291820526,
-0.06124839559197426,
-0.01559279765933752,
0.029792815446853638,
0.0012405820889398456,
-0.035643719136714935,
0.054472509771585464,
0.001368168625049293,
-0.09089832007884979,
-0.028155459091067314,
0.05254315212368965,
0.03631994128227234,
0.15464365482330322,
-0.029961371794342995,
-0.14137935638427734,
-0.011377235874533653,
-0.025872817263007164,
-0.016545163467526436,
0.11191649734973907,
-0.04998942092061043,
-0.09009423851966858,
-0.00006654724711552262,
0.04619462415575981,
0.059050608426332474,
-0.11739443242549896,
0.12306763976812363,
0.053168121725320816,
-0.050452228635549545,
-0.06016809865832329,
-0.0314786322414875,
-0.03059975616633892,
0.05737273767590523,
0.0010225627338513732,
-0.012108436785638332,
-0.039610959589481354,
-0.03402753546833992,
-0.15332625806331635,
0.07066306471824646,
-0.08164909482002258,
-0.2620503306388855,
-0.06849201768636703,
0.016622979193925858,
0.028764819726347923,
0.016484778374433517,
0.02513132244348526,
-0.005324257537722588,
-0.06358546763658524,
-0.11224813759326935,
0.17045947909355164,
-0.05521649867296219,
0.027189895510673523,
0.06662913411855698,
-0.023598846048116684,
0.03106175921857357,
-0.10995358228683472,
0.006947295740246773,
-0.049226272851228714,
-0.0060044145211577415,
0.01993444934487343,
0.00475718080997467,
0.016210338100790977,
0.1692318469285965,
-0.02014979161322117,
-0.009915529750287533,
-0.05587819963693619,
0.19252903759479523,
-0.07172226905822754,
0.09464213997125626,
0.14954514801502228,
-0.06417428702116013,
0.022736061364412308,
0.07333963364362717,
-0.0042625307105481625,
-0.06917433440685272,
0.02152312360703945,
0.004224007483571768,
-0.03918254002928734,
-0.13401056826114655,
-0.0864729955792427,
-0.06537773460149765,
0.0680404007434845,
0.059396132826805115,
-0.0012712286552414298,
-0.08662094175815582,
-0.0332597978413105,
-0.030582237988710403,
-0.008914316073060036,
0.08652399480342865,
0.027997329831123352,
0.04506535455584526,
-0.06588312238454819,
0.06664872914552689,
-0.06828746199607849,
-0.01594146154820919,
0.06127970665693283,
0.007072244305163622,
0.15238724648952484,
0.05967600643634796,
0.1344686597585678,
0.07831671088933945,
0.037569284439086914,
0.06162085384130478,
0.06247924640774727,
-0.05249958112835884,
0.030589863657951355,
-0.010017162188887596,
-0.07844892144203186,
-0.018981732428073883,
0.024695023894309998,
0.10121922940015793,
-0.04650333896279335,
-0.09048988670110703,
0.03380772843956947,
0.04734443128108978,
0.2593173384666443,
0.11016354709863663,
-0.11815422773361206,
-0.09331516921520233,
-0.014606345444917679,
-0.11977147310972214,
-0.021727463230490685,
0.047828253358602524,
0.12493386119604111,
-0.037787824869155884,
0.029267778620123863,
0.03228847682476044,
0.09451153874397278,
-0.07696036994457245,
0.018161358311772346,
-0.022776290774345398,
0.05425909161567688,
0.022047702223062515,
0.017251785844564438,
-0.17349644005298615,
0.07356914132833481,
0.0443117655813694,
0.14950765669345856,
-0.016678478568792343,
0.019633648917078972,
0.007091067265719175,
-0.018075622618198395,
0.10904712229967117,
0.005741809960454702,
-0.056677598506212234,
-0.05327831581234932,
-0.09627251327037811,
-0.024565119296312332,
0.13188739120960236,
0.020665835589170456,
0.07968799769878387,
-0.004234310705214739,
-0.008390857838094234,
0.022857580333948135,
0.024851158261299133,
-0.17992110550403595,
-0.10406825691461563,
0.08026882261037827,
0.0656365156173706,
-0.026393335312604904,
-0.028955930843949318,
-0.04393766447901726,
-0.17746751010417938,
0.12775804102420807,
-0.13530565798282623,
-0.03342171758413315,
-0.0724450871348381,
-0.0599447563290596,
0.12252147495746613,
-0.02735963463783264,
0.05334857851266861,
0.06528843939304352,
0.13753673434257507,
-0.11918903887271881,
-0.0843704491853714,
0.03420061990618706,
-0.07992947846651077,
-0.10691606253385544,
-0.025341592729091644,
0.1737562119960785,
0.08379969000816345,
0.0642395094037056,
0.028791828081011772,
0.06441004574298859,
0.00036064497544430196,
-0.03940030187368393,
0.054686471819877625,
0.12142942100763321,
-0.04430127516388893,
-0.03164103999733925,
-0.015236557461321354,
-0.10986011475324631,
-0.08269437402486801,
-0.031085951253771782,
0.130753293633461,
0.20602355897426605,
-0.08820481598377228,
0.17633487284183502,
0.10861067473888397,
-0.10850214958190918,
-0.2685745358467102,
0.03538178652524948,
0.04984322935342789,
0.10288885235786438,
-0.008412470109760761,
-0.20392364263534546,
0.040734026581048965,
-0.03458956629037857,
-0.02850291319191456,
-0.03337738290429115,
-0.2206289917230606,
-0.14186257123947144,
0.16449934244155884,
-0.029388032853603363,
0.1381395012140274,
-0.03359519690275192,
-0.009595850482583046,
0.0008038848754949868,
0.03523355349898338,
0.089096799492836,
-0.13581354916095734,
0.1239049956202507,
0.08966071158647537,
0.004466160666197538,
0.04477256163954735,
-0.011305251158773899,
0.043966565281152725,
0.02767420932650566,
-0.017554493620991707,
0.0321248397231102,
0.08414492756128311,
0.05478285998106003,
-0.03442816063761711,
0.13176514208316803,
0.09444913268089294,
-0.0006533382693305612,
-0.04711456969380379,
-0.04551086947321892,
-0.026334503665566444,
0.03185503929853439,
-0.00925673171877861,
-0.014596070162951946,
-0.039676837623119354,
0.04317012429237366,
0.04630206152796745,
-0.013253811746835709,
-0.019456319510936737,
-0.07709536701440811,
-0.12373123317956924,
0.08334871381521225,
0.19332273304462433,
-0.03857073932886124,
-0.03192569687962532,
0.00943265575915575,
-0.05296047776937485,
0.06882378458976746,
-0.07213317602872849,
0.04977043718099594,
0.07877858728170395,
0.03540793061256409,
0.08912663906812668,
0.010009879246354103,
-0.14098452031612396,
-0.0014760165940970182,
0.05418327450752258,
-0.08078430593013763,
-0.1808505654335022,
-0.03238636255264282,
0.021349210292100906,
-0.03697134181857109,
0.006253400351852179,
0.13620050251483917,
-0.09191130101680756,
-0.014828776009380817,
0.001240805140696466,
0.06048819050192833,
-0.08275221288204193,
0.10707037895917892,
-0.019693128764629364,
0.04129888862371445,
-0.04439010098576546,
0.16974592208862305,
0.10960716754198074,
-0.10471329092979431,
0.05164707452058792,
0.06929705291986465,
-0.053279586136341095,
-0.01920405589044094,
-0.12686650454998016,
-0.009833233430981636,
0.06710750609636307,
-0.0545165129005909,
-0.013372947461903095,
-0.07443685829639435,
-0.016130436211824417,
0.17248062789440155,
-0.01576758548617363,
0.06723780930042267,
-0.06685550510883331,
0.049950193613767624,
-0.09560924768447876,
0.05665060505270958,
0.03736261650919914,
0.007826179265975952,
-0.03469807654619217,
0.2481524497270584,
-0.0028092225547879934,
0.06486457586288452,
-0.05749966576695442,
-0.045373521745204926,
-0.04671969264745712,
0.03699585050344467,
-0.05763406306505203,
0.034282386302948,
-0.06335967779159546,
-0.03637143597006798,
0.0034149230923503637,
0.0265136007219553,
0.01313459500670433,
0.061559710651636124,
-0.04016772657632828,
0.018649032339453697,
-0.016177114099264145,
0.01839951053261757,
-0.06136082485318184,
0.028524287045001984,
0.008793476969003677,
-0.06354902684688568,
0.10331554710865021,
0.0019828008953481913,
-0.05411737412214279,
0.02409372292459011,
-0.10540595650672913,
-0.0811755433678627,
0.013436812907457352,
0.007977688685059547,
-0.03757569193840027,
-0.13370691239833832,
0.002263203263282776,
0.034630607813596725,
-0.00565580278635025,
-0.021793745458126068,
0.045478206127882004,
-0.03869566321372986,
-0.010504714213311672,
-0.07242007553577423,
0.046854954212903976,
-0.082193523645401,
0.03870616853237152,
-0.00008772066212259233,
0.10383814573287964,
0.055827412754297256,
-0.10532976686954498,
0.0417298749089241,
-0.09164068102836609,
0.017522506415843964,
-0.016552163287997246,
0.006378674414008856,
-0.06351162493228912,
-0.06288118660449982,
0.0691763386130333,
-0.04483222961425781,
0.11430459469556808,
0.0062248483300209045,
-0.035477831959724426,
0.024698201566934586,
-0.06077737733721733,
-0.11876329779624939,
0.043308112770318985,
0.16234023869037628,
-0.00827130489051342,
-0.007294083014130592,
-0.05802067369222641,
0.040000129491090775,
0.024808622896671295,
0.1205332949757576,
0.10931442677974701,
0.1375279277563095,
0.10894405096769333,
0.08424719423055649,
-0.007223485503345728,
-0.0550372451543808,
-0.10457176715135574,
0.049504395574331284,
-0.045625727623701096,
0.024723468348383904,
-0.06647678464651108,
0.09242431819438934,
0.07150471210479736,
-0.06145349144935608,
0.11137015372514725,
-0.03597996011376381,
-0.07784249633550644,
-0.108773373067379,
-0.09532782435417175,
-0.011554276570677757,
0.010256483219563961,
-0.008559271693229675,
-0.0982079952955246,
0.013728288002312183,
-0.05466718599200249,
0.02601437084376812,
-0.027970844879746437,
0.12587013840675354,
-0.17307184636592865,
-0.11636102199554443,
0.16898669302463531,
-0.023776184767484665,
0.039298783987760544,
0.0030624147038906813,
0.007328100968152285,
0.1151747778058052,
0.057740017771720886,
0.09825938194990158,
0.04491075500845909,
0.014988848008215427,
0.03992222249507904,
-0.04478137195110321,
-0.0813288688659668,
-0.005923817865550518,
-0.030130906030535698,
0.09202048182487488,
0.1245080754160881,
0.08079777657985687,
-0.10797618329524994,
0.0023743112105876207,
0.11346179991960526,
-0.06265439838171005,
-0.1107468381524086,
-0.15346293151378632,
0.17061102390289307,
-0.034293681383132935,
0.02154800109565258,
-0.00934670027345419,
-0.11285945773124695,
0.010366536676883698,
0.21074475347995758,
0.11471372842788696,
0.00958776380866766,
0.006562482565641403,
-0.03570614010095596,
0.004434062633663416,
-0.011983214877545834,
0.05031862482428551,
0.03458311781287193,
0.25837501883506775,
0.00471865339204669,
0.03996741771697998,
-0.03049810230731964,
-0.05963096395134926,
-0.05552699789404869,
0.1407659649848938,
-0.042013801634311676,
0.0300179123878479,
-0.06093534082174301,
0.07756656408309937,
-0.06517963111400604,
-0.20597952604293823,
-0.07851654291152954,
-0.07348743081092834,
-0.060090478509664536,
0.011339830234646797,
-0.02349773608148098,
0.03815783932805061,
0.02484247274696827,
-0.021180903539061546,
0.043953776359558105,
0.12569287419319153,
0.014388619922101498,
-0.01463390327990055,
0.06508863717317581,
-0.028983663767576218,
-0.1027354747056961,
0.09476014226675034,
0.011154627427458763,
0.0754915103316307,
0.03568508103489876,
0.05842408910393715,
-0.07513171434402466,
0.07515965402126312,
-0.03814060986042023,
-0.09657419472932816,
0.01655474305152893,
0.23143333196640015,
-0.015748627483844757,
0.16002632677555084,
0.060307856649160385,
-0.06490092724561691,
0.030068356543779373,
-0.005813858937472105,
-0.04850385710597038,
-0.07327928394079208,
0.02308201976120472,
-0.03011275641620159,
0.15129715204238892,
0.05817457661032677,
-0.03873258829116821,
-0.0037020284216850996,
-0.043055400252342224,
-0.0020280464086681604,
-0.004826739896088839,
0.11335251480340958,
-0.020616015419363976,
-0.16106285154819489,
0.008736217394471169,
-0.07693125307559967,
0.01819964498281479,
-0.22727471590042114,
-0.06327322125434875,
-0.009127690456807613,
-0.03073905222117901,
-0.04399280622601509,
0.09179480373859406,
0.09895450621843338,
-0.029716026037931442,
-0.04241929575800896,
-0.08233664929866791,
0.06195851415395737,
0.09181404113769531,
-0.11054535955190659,
-0.05447934940457344
] |
null | null |
transformers
|
# Wav2Vec2-Large-VoxPopuli
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) large model pretrained on the 100k unlabeled subset of [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/)
# Fine-Tuning
Please refer to [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) on how to fine-tune this model on a specific language. Note that you should replace `"facebook/wav2vec2-large-xlsr-53"` with this checkpoint for fine-tuning.
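A quick sanity-check sketch (an assumed usage pattern, not taken from the card): load the checkpoint without a CTC head to confirm it is the large architecture before starting the fine-tuning recipe from the blog, in which only the checkpoint name changes:

```python
from transformers import Wav2Vec2Model

model = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-large-100k-voxpopuli")
# the "large" variant is expected to report hidden_size=1024 and 24 transformer layers
print(model.config.hidden_size, model.config.num_hidden_layers)
```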
|
{"language": "multilingual", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-100k-voxpopuli
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli",
"multilingual",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"multilingual"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #multilingual #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
|
# Wav2Vec2-Large-VoxPopuli
Facebook's Wav2Vec2 large model pretrained on the 100k unlabeled subset of VoxPopuli corpus.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the official website for more information, here
# Fine-Tuning
Please refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '"facebook/wav2vec2-large-xlsr-53"' with this checkpoint for fine-tuning.
|
[
"# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the 100k unlabeled subset of VoxPopuli corpus.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for more in-detail explanation of how to fine-tune the model.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #multilingual #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the 100k unlabeled subset of VoxPopuli corpus.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for more in-detail explanation of how to fine-tune the model.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
74,
204,
57
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #multilingual #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the 100k unlabeled subset of VoxPopuli corpus.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for more in-detail explanation of how to fine-tune the model.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
-0.04591205343604088,
0.07065430283546448,
-0.003031386062502861,
0.008902686648070812,
0.1254868358373642,
-0.023936623707413673,
0.10880502313375473,
0.012232017703354359,
0.01130619179457426,
0.07034638524055481,
-0.01621941290795803,
0.0013438023161143064,
0.09525950253009796,
0.12010228633880615,
0.050465699285268784,
-0.2716038227081299,
0.05463917553424835,
-0.05588378384709358,
0.11878745257854462,
0.04710531607270241,
0.12227240949869156,
-0.09742152690887451,
0.041586585342884064,
0.03816518932580948,
-0.1119982972741127,
0.004708698019385338,
0.009534638375043869,
-0.07626448571681976,
0.07008151710033417,
0.06458345055580139,
0.04100477322936058,
0.0335574708878994,
0.0919012799859047,
-0.16810005903244019,
0.027572542428970337,
0.0632554143667221,
0.004847004543989897,
0.00968682486563921,
0.09032047539949417,
0.004083693027496338,
0.16511321067810059,
-0.06207488104701042,
0.01169819850474596,
0.09708627313375473,
-0.1091712936758995,
-0.1080494299530983,
-0.10233653336763382,
0.17250709235668182,
0.10995928198099136,
0.08177367597818375,
-0.07681732624769211,
0.049573712050914764,
-0.00030944973696023226,
0.05409232899546623,
0.08132748305797577,
-0.2449171096086502,
-0.037458837032318115,
0.12238334119319916,
0.06353648006916046,
-0.004688784945756197,
-0.09409604966640472,
0.09139055758714676,
0.028586916625499725,
-0.0010320324217900634,
-0.017998170107603073,
-0.051379408687353134,
0.07600337266921997,
-0.08102993667125702,
-0.10111704468727112,
-0.0016928345430642366,
0.1548120677471161,
0.03884873911738396,
-0.07831788063049316,
-0.14853762090206146,
-0.019879251718521118,
0.18977589905261993,
-0.05650748312473297,
-0.12369232624769211,
0.017105283215641975,
0.034904200583696365,
0.0332009457051754,
-0.154865562915802,
-0.06512892246246338,
-0.009072105400264263,
-0.03827379643917084,
0.09989824891090393,
-0.005524041596800089,
-0.014366338029503822,
-0.04543634131550789,
0.01968199573457241,
-0.12981146574020386,
-0.09320216625928879,
-0.003674060571938753,
-0.08312840759754181,
-0.07842669636011124,
-0.03746800869703293,
-0.04147043079137802,
-0.1065841019153595,
-0.0032556585501879454,
0.09246755391359329,
0.012728709727525711,
0.05963250622153282,
-0.07817307114601135,
0.006347024813294411,
0.06498907506465912,
0.148100808262825,
-0.0950610414147377,
-0.001518661854788661,
0.014303840696811676,
-0.021033506840467453,
0.027004053816199303,
-0.04460824653506279,
-0.05692543461918831,
-0.017951389774680138,
-0.0098757054656744,
0.03770514950156212,
0.04722641780972481,
0.003009181469678879,
-0.047526467591524124,
-0.07238273322582245,
0.13878275454044342,
-0.10400782525539398,
0.052143216133117676,
0.04694512113928795,
0.005753441713750362,
0.10586801171302795,
-0.03340831398963928,
0.0866582989692688,
-0.12298610806465149,
0.016914626583456993,
-0.01951589062809944,
0.007823705673217773,
-0.00542947743088007,
-0.04848499223589897,
0.004171916749328375,
-0.028433803468942642,
-0.029932955279946327,
-0.11648138612508774,
-0.05926740914583206,
-0.09079573303461075,
-0.014380562119185925,
-0.05264456570148468,
-0.05898968130350113,
-0.042470693588256836,
0.024216916412115097,
-0.005826219916343689,
0.004926770925521851,
-0.040130238980054855,
-0.004773575812578201,
-0.010885350406169891,
-0.005888016428798437,
0.05142930895090103,
-0.005146955139935017,
0.06253264099359512,
-0.0064730034209787846,
-0.02589910291135311,
-0.1997738927602768,
0.15090157091617584,
-0.04423429071903229,
0.0033166971988976,
-0.13284777104854584,
-0.0019673386123031378,
-0.042056936770677567,
0.026906471699476242,
0.011037216521799564,
0.12136584520339966,
-0.1803906112909317,
-0.09090197086334229,
0.2282644659280777,
-0.12202417850494385,
0.017760751768946648,
0.17794471979141235,
0.020291896536946297,
0.04710124060511589,
0.14603187143802643,
0.1590988039970398,
-0.001653309096582234,
-0.12606985867023468,
0.010592899285256863,
-0.05768030509352684,
0.001076210173778236,
0.13127313554286957,
0.032445840537548065,
-0.04893702268600464,
0.030676057562232018,
-0.008604484610259533,
-0.035464096814394,
-0.07265373319387436,
0.0045781503431499004,
-0.054493248462677,
-0.00044197248644195497,
-0.03787350654602051,
0.03421184793114662,
-0.0014021489769220352,
-0.033920057117938995,
-0.027731988579034805,
-0.10636714845895767,
-0.08713378012180328,
0.07098669558763504,
-0.049535684287548065,
0.06321781128644943,
-0.10914931446313858,
0.025827297940850258,
0.12347562611103058,
0.04122673347592354,
-0.1545415222644806,
0.060761045664548874,
0.030092541128396988,
0.0398273840546608,
0.12939143180847168,
0.05423956736922264,
-0.043101731687784195,
-0.005540275946259499,
-0.04168716445565224,
0.007916909642517567,
-0.018730366602540016,
-0.019877329468727112,
-0.04273161292076111,
-0.08794531971216202,
-0.01994984783232212,
-0.05349336564540863,
0.0907285138964653,
-0.1730394810438156,
0.02356974221765995,
0.04947880655527115,
0.08723993599414825,
0.007042716722935438,
-0.023934263736009598,
0.023806102573871613,
0.056012943387031555,
0.03421716392040253,
-0.026006659492850304,
0.06331530213356018,
0.0014877929352223873,
-0.037247221916913986,
0.07077373564243317,
-0.12406384944915771,
-0.12166623771190643,
0.12246100604534149,
-0.027258194983005524,
-0.034345194697380066,
0.014455070719122887,
0.017756467685103416,
0.005655791610479355,
-0.054309237748384476,
-0.025919489562511444,
0.23858332633972168,
0.021372133865952492,
0.06226824223995209,
-0.0701431930065155,
-0.008889397606253624,
-0.006335196550935507,
-0.038635581731796265,
-0.08004674315452576,
0.05439962074160576,
-0.015873033553361893,
-0.1694573312997818,
-0.012424789369106293,
0.07883007079362869,
0.0662212148308754,
0.11794695258140564,
0.016258904710412025,
-0.039192527532577515,
0.0035288811195641756,
-0.0754956528544426,
-0.016616467386484146,
-0.00964482594281435,
-0.09791679680347443,
-0.05012200400233269,
0.015174462459981441,
0.014634793624281883,
0.033358383923769,
-0.06361385434865952,
0.04012950137257576,
0.015880057588219643,
-0.06506910920143127,
-0.0264724288135767,
-0.00899085495620966,
-0.026975389569997787,
0.06121361255645752,
-0.004576145205646753,
0.047031205147504807,
-0.04360493645071983,
-0.04554789885878563,
-0.13910140097141266,
0.09850381314754486,
-0.06916042417287827,
-0.3325287699699402,
-0.0878758579492569,
-0.06447360664606094,
-0.061827365309000015,
0.025525430217385292,
0.03904515877366066,
-0.09376159310340881,
-0.09298911690711975,
-0.06575493514537811,
0.10920748859643936,
0.008619576692581177,
-0.08956734091043472,
0.08940330147743225,
-0.005237503442913294,
-0.013596940785646439,
-0.09598005563020706,
0.0041188341565430164,
-0.0179356187582016,
-0.11315536499023438,
-0.021679146215319633,
-0.00651669641956687,
0.047724056988954544,
0.1372983157634735,
0.018561119213700294,
-0.019054053351283073,
-0.01916658692061901,
0.23829765617847443,
-0.12532959878444672,
0.09578664600849152,
0.24356494843959808,
-0.02397238463163376,
0.03502167388796806,
0.1515538990497589,
-0.013256370089948177,
-0.05428680032491684,
0.0282958522439003,
0.0221991129219532,
-0.0074071758426725864,
-0.24477632343769073,
-0.10372208058834076,
-0.038841430097818375,
-0.03939979150891304,
0.039793793112039566,
0.02775404416024685,
0.022791670635342598,
0.0257477518171072,
-0.0941305086016655,
-0.05325377732515335,
0.08693475276231766,
0.05225643888115883,
0.16174785792827606,
-0.006869347766041756,
0.11316127330064774,
-0.04879145696759224,
0.006361860781908035,
0.07973133027553558,
0.01393046323210001,
0.05918467789888382,
0.09766214340925217,
0.12158925831317902,
0.04253353178501129,
0.045300696045160294,
0.018558094277977943,
0.004223121330142021,
-0.015015726909041405,
0.0028628255240619183,
-0.04347296059131622,
-0.04922861233353615,
-0.030895430594682693,
0.03510931134223938,
0.11741109192371368,
-0.12895996868610382,
-0.12038251757621765,
0.0369366854429245,
0.0504666231572628,
0.14852283895015717,
0.08267728239297867,
-0.06693269312381744,
-0.08681944012641907,
0.0451032929122448,
-0.0717608630657196,
-0.04742952808737755,
0.04062720015645027,
0.10209739953279495,
-0.167504221200943,
0.11751300096511841,
0.07555417716503143,
0.11690908670425415,
-0.03566465154290199,
0.034620944410562515,
-0.08740362524986267,
0.02580472268164158,
0.017774561420083046,
0.05699584260582924,
-0.20521029829978943,
0.11443964391946793,
0.03238721936941147,
0.10027333348989487,
-0.05209184065461159,
0.022143732756376266,
0.04264320060610771,
0.12570609152317047,
0.13110141456127167,
-0.010589958168566227,
-0.10418646037578583,
0.006648000329732895,
-0.056992702186107635,
0.023761849850416183,
0.0655544102191925,
-0.050023723393678665,
0.044672176241874695,
-0.02119819074869156,
-0.0032667643390595913,
-0.02169898897409439,
0.03574042394757271,
-0.2739708125591278,
-0.16213251650333405,
0.02234961837530136,
0.017487986013293266,
0.0516093485057354,
-0.019907960668206215,
-0.04424149915575981,
-0.0954693853855133,
0.12950117886066437,
0.008318345993757248,
-0.022742414847016335,
-0.11084669828414917,
0.013400397263467312,
0.03210005164146423,
-0.08581078052520752,
0.009107355959713459,
0.02864808402955532,
0.16336973011493683,
-0.08259478211402893,
-0.05586012080311775,
0.03800776228308678,
-0.07920697331428528,
-0.11500703543424606,
0.019646180793642998,
0.1462683379650116,
0.11530709266662598,
0.04725714027881622,
0.10353923588991165,
0.00908939354121685,
0.03267472982406616,
-0.1097114235162735,
0.06321967393159866,
0.06225314736366272,
-0.002988806227222085,
0.01441879291087389,
-0.033620335161685944,
-0.270671546459198,
-0.16053250432014465,
-0.024776751175522804,
0.1600174754858017,
0.15687447786331177,
-0.00842483900487423,
0.16198322176933289,
0.2704675495624542,
-0.11255063116550446,
-0.25549501180648804,
-0.04073826223611832,
-0.029356442391872406,
0.02935345470905304,
0.02749216929078102,
-0.24956674873828888,
0.0943191647529602,
0.011028594337403774,
0.008394278585910797,
-0.07501258701086044,
-0.1962694674730301,
-0.14048485457897186,
0.1596951186656952,
0.005162380635738373,
-0.0031420376617461443,
-0.07502579689025879,
-0.06645852327346802,
-0.057333286851644516,
-0.10904237627983093,
0.07906263321638107,
-0.12281891703605652,
0.07774773985147476,
0.0896381363272667,
0.019267737865447998,
0.01246833335608244,
0.040242668241262436,
0.09650593250989914,
0.05119007080793381,
0.0027205480728298426,
-0.05057385563850403,
0.010476162657141685,
0.03925824537873268,
-0.014603380113840103,
0.07409492135047913,
0.02922363393008709,
0.003734052646905184,
-0.026079464703798294,
-0.08560459315776825,
-0.07887956500053406,
0.07178151607513428,
-0.0712796077132225,
-0.021232107654213905,
-0.050372347235679626,
0.08925746381282806,
0.04238510876893997,
0.0013341325102373958,
-0.050622474402189255,
-0.11332486569881439,
0.0168522410094738,
0.12028496712446213,
0.23472441732883453,
-0.10752968490123749,
0.023588530719280243,
-0.03687214478850365,
-0.04743041470646858,
0.047692809253931046,
-0.01848790980875492,
0.06225117668509483,
0.04960492625832558,
0.009221161715686321,
0.10079598426818848,
-0.00526711530983448,
-0.10934412479400635,
0.004680072423070669,
0.017307080328464508,
-0.05699032172560692,
-0.2157566249370575,
-0.04999222233891487,
0.05129604414105415,
-0.006931718438863754,
-0.0031014741398394108,
0.16595600545406342,
-0.007398273330181837,
-0.07671593874692917,
-0.01135189738124609,
0.058424923568964005,
-0.026480432599782944,
0.045763395726680756,
0.02052624709904194,
0.041673727333545685,
-0.08937491476535797,
0.06448957324028015,
0.13588936626911163,
-0.10603487491607666,
0.043817874044179916,
0.10543925315141678,
-0.060567185282707214,
-0.05704145506024361,
-0.12102481722831726,
0.060430072247982025,
0.012710083276033401,
-0.06745392829179764,
0.04512588307261467,
-0.13729552924633026,
0.006830405909568071,
0.06203991174697876,
0.013552485965192318,
-0.025140369310975075,
-0.038272518664598465,
-0.005179325584322214,
-0.06709973514080048,
0.03486970439553261,
0.060127586126327515,
-0.0356813445687294,
-0.11814840883016586,
0.12650790810585022,
0.022818975150585175,
0.07890042662620544,
-0.030963551253080368,
-0.053176987916231155,
-0.10999105870723724,
-0.00502412486821413,
-0.08416774123907089,
0.01757197640836239,
-0.12812848389148712,
-0.003733797464519739,
-0.04449060186743736,
-0.028503093868494034,
-0.011981609277427197,
0.03868011385202408,
-0.046170517802238464,
-0.0328899510204792,
-0.029830757528543472,
0.062420785427093506,
-0.14151814579963684,
0.07323376089334488,
0.06340949237346649,
-0.04201528802514076,
0.10194361209869385,
0.00809490866959095,
-0.049953337758779526,
0.04943709075450897,
-0.19852152466773987,
-0.036106254905462265,
-0.009836197830736637,
0.035874299705028534,
0.00026216881815344095,
-0.15442630648612976,
0.001861361786723137,
0.0008802927331998944,
0.01683770678937435,
-0.00438370555639267,
0.09312283992767334,
-0.04143610969185829,
0.010583031922578812,
-0.0556679293513298,
-0.0985594317317009,
-0.041308604180812836,
0.06606347858905792,
0.08927872031927109,
0.016695499420166016,
0.08704128861427307,
-0.08323538303375244,
0.05890659987926483,
-0.10591773688793182,
0.03975033387541771,
-0.01854703761637211,
0.001360874273814261,
-0.01977028325200081,
-0.09184344857931137,
0.07010206580162048,
0.0010754000395536423,
0.08491156250238419,
0.03468118607997894,
-0.030987244099378586,
0.061679817736148834,
-0.0832090973854065,
-0.12003297358751297,
0.04035506770014763,
0.11375696957111359,
0.06612592190504074,
-0.009897438809275627,
0.04642974212765694,
-0.026895606890320778,
0.003383822739124298,
0.14244337379932404,
0.19978706538677216,
0.16619986295700073,
0.09268175810575485,
0.06910140812397003,
0.03254490718245506,
-0.04311702400445938,
-0.0732976421713829,
0.06362741440534592,
-0.08811669051647186,
0.011530415154993534,
-0.07551225274801254,
-0.026967203244566917,
0.09522315114736557,
-0.1523337960243225,
0.09527947753667831,
-0.03897795081138611,
-0.09567014127969742,
-0.13611279428005219,
-0.1583729237318039,
-0.06441672146320343,
-0.037727344781160355,
-0.02021590992808342,
-0.11260221153497696,
0.014367877505719662,
-0.007798714563250542,
-0.005290558561682701,
-0.09627003222703934,
0.06628824025392532,
-0.13639885187149048,
-0.1016697809100151,
0.17499256134033203,
-0.02081052027642727,
0.00964992307126522,
0.006954851560294628,
0.023177288472652435,
0.019179755821824074,
0.12673713266849518,
0.06693422794342041,
0.07786641269922256,
0.015042327344417572,
0.06518291682004929,
-0.07611709833145142,
-0.07387525588274002,
0.02045663818717003,
0.007862999103963375,
0.08489759266376495,
0.13215596973896027,
0.0607294961810112,
-0.051705390214920044,
0.0066198487766087055,
0.1683422327041626,
0.0033386112190783024,
-0.1097140908241272,
-0.15136615931987762,
0.06447409838438034,
-0.02029639482498169,
0.013003244996070862,
-0.012667757458984852,
-0.10730132460594177,
0.0018287569982931018,
0.24438989162445068,
0.1784360706806183,
-0.055305324494838715,
0.016584962606430054,
-0.017473453655838966,
0.023735828697681427,
0.04341447725892067,
0.10896294564008713,
0.0953427329659462,
0.17130082845687866,
-0.024449286982417107,
0.04376722499728203,
-0.018216770142316818,
-0.09231369942426682,
-0.11115860193967819,
0.11129198223352432,
-0.0076118456199765205,
-0.007462929934263229,
-0.012597700580954552,
0.15155285596847534,
-0.11192361265420914,
-0.1304398477077484,
-0.05462164804339409,
-0.031704336404800415,
-0.08522617816925049,
0.02751029096543789,
-0.008798765018582344,
0.09116679430007935,
0.0811857283115387,
0.009997248649597168,
0.004293573088943958,
0.18832749128341675,
0.02664230205118656,
0.027573397383093834,
0.014730647206306458,
0.09804525971412659,
-0.08980414271354675,
0.15510189533233643,
0.0033770552836358547,
0.04593478888273239,
0.05115741491317749,
0.01914650946855545,
-0.07613063603639603,
0.04252702742815018,
0.005601412151008844,
0.03372225910425186,
0.061843112111091614,
0.16282953321933746,
0.0032730121165513992,
0.052089471369981766,
0.10740198194980621,
-0.12758886814117432,
0.04803430661559105,
0.034063659608364105,
-0.0006106252549216151,
-0.045335475355386734,
0.15518639981746674,
-0.18074269592761993,
0.1092953085899353,
0.11416854709386826,
-0.042073242366313934,
-0.012590914964675903,
-0.02456270344555378,
0.019982485100626945,
-0.0462893582880497,
0.08836239576339722,
-0.02210244908928871,
-0.15016241371631622,
0.016575701534748077,
-0.047504402697086334,
0.03737820312380791,
-0.23857565224170685,
-0.01757810451090336,
-0.013220276683568954,
-0.0029137125238776207,
-0.034404657781124115,
0.10569454729557037,
0.05006280913949013,
-0.06280817836523056,
-0.015523075126111507,
-0.10313594341278076,
0.018455058336257935,
0.11800406128168106,
-0.07288340479135513,
-0.0360538475215435
] |
null | null |
transformers
|
# Wav2Vec2-Large-VoxPopuli
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) large model pretrained on the 10k unlabeled subset of [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the official website for more information [here](https://github.com/facebookresearch/voxpopuli/).
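Like the other pretrained-only VoxPopuli checkpoints, this model ships without a tokenizer or CTC head, but it can still be used as a speech-representation encoder before any fine-tuning. A minimal sketch (assuming the checkpoint provides a feature-extractor config; otherwise instantiate `Wav2Vec2FeatureExtractor` with `sampling_rate=16000` directly):
```python
import torch
from datasets import load_dataset
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained("facebook/wav2vec2-large-10k-voxpopuli")
model = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-large-10k-voxpopuli")

# 16 kHz example clip from a small dummy dataset
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
inputs = feature_extractor(ds[0]["audio"]["array"], sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    hidden_states = model(**inputs).last_hidden_state  # (batch, frames, hidden_size)
```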
# Fine-Tuning
Please refer to [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for details on how to fine-tune this model on a specific language. Note that you should replace `"facebook/wav2vec2-large-xlsr-53"` with this checkpoint for fine-tuning.
|
{"language": "multilingual", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-10k-voxpopuli
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli",
"multilingual",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"multilingual"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #multilingual #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
|
# Wav2Vec2-Large-VoxPopuli
Facebook's Wav2Vec2 large model pretrained on the 10k unlabeled subset of VoxPopuli corpus.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the official website for more information, here
# Fine-Tuning
Please refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '"facebook/wav2vec2-large-xlsr-53"' with this checkpoint for fine-tuning.
|
[
"# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the 10k unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #multilingual #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the 10k unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
74,
133,
57
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #multilingual #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the 10k unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
-0.06708840280771255,
0.043392859399318695,
-0.005330626852810383,
0.008585494942963123,
0.12068687379360199,
-0.02305225096642971,
0.09583000093698502,
0.011271252296864986,
0.034554723650217056,
0.007918449118733406,
0.005260889418423176,
0.036674994975328445,
0.07358381897211075,
0.08131810277700424,
0.021542280912399292,
-0.3101080060005188,
0.039836566895246506,
0.008163714781403542,
0.07002418488264084,
0.05930987372994423,
0.12635892629623413,
-0.08140452951192856,
0.01621788553893566,
0.06675752252340317,
-0.07651479542255402,
0.006193866487592459,
0.01953054405748844,
-0.09087013453245163,
0.12041646987199783,
0.07563610374927521,
0.07869300991296768,
0.04377134144306183,
0.036996323615312576,
-0.16465085744857788,
0.03196890279650688,
0.044321831315755844,
-0.03919536992907524,
0.001810456975363195,
0.12545515596866608,
-0.03157855197787285,
0.20491352677345276,
-0.031253110617399216,
-0.036654990166425705,
0.09116902202367783,
-0.11761020123958588,
-0.15403088927268982,
-0.05561310797929764,
0.1522568017244339,
0.11890888959169388,
0.08970434963703156,
-0.06873996555805206,
0.03498277813196182,
-0.04951368272304535,
0.07370522618293762,
0.10424041002988815,
-0.2967343032360077,
-0.03891737014055252,
0.11891863495111465,
0.08888895809650421,
-0.03503626585006714,
-0.10088556259870529,
0.07473450899124146,
0.005544803570955992,
0.016249004751443863,
-0.025993969291448593,
-0.09449055790901184,
0.03912518918514252,
-0.08415687829256058,
-0.10231877863407135,
-0.0076826587319374084,
0.1981867551803589,
0.02625659666955471,
-0.06078967824578285,
-0.08655717968940735,
-0.01313144899904728,
0.18909002840518951,
-0.06691358983516693,
-0.1361217349767685,
0.012819702737033367,
0.03818375617265701,
0.05123675987124443,
-0.15065337717533112,
-0.09378372132778168,
-0.010653657838702202,
-0.04716135561466217,
0.10837734490633011,
0.03747257590293884,
-0.014644969254732132,
-0.0773039162158966,
0.029217097908258438,
-0.05195033177733421,
-0.07178176939487457,
0.01054059062153101,
-0.09426494687795639,
-0.05483854189515114,
0.002140818629413843,
-0.0664716437458992,
-0.08709977567195892,
-0.014400064945220947,
0.07596516609191895,
0.02962695062160492,
0.047684088349342346,
-0.025949988514184952,
0.036196768283843994,
0.024101359769701958,
0.07123172283172607,
-0.1202026754617691,
0.009554117918014526,
0.015311489813029766,
-0.04022471234202385,
-0.006677977740764618,
-0.02234545536339283,
-0.06412079930305481,
-0.06529514491558075,
0.0034846002236008644,
0.060149818658828735,
0.03264392539858818,
0.019062494859099388,
-0.07634948939085007,
-0.09309300780296326,
0.044328104704618454,
-0.0677289366722107,
0.01411023922264576,
0.03354722633957863,
0.002229314064607024,
0.18530744314193726,
-0.0038326524663716555,
0.07091237604618073,
-0.15659283101558685,
0.01689419522881508,
-0.00934185367077589,
0.023964898660779,
-0.011966315098106861,
-0.04973052442073822,
0.02378460019826889,
-0.014500655233860016,
-0.015902576968073845,
-0.14104488492012024,
-0.05724354460835457,
-0.08174411207437515,
-0.004233763087540865,
-0.026775071397423744,
-0.08294860273599625,
-0.03432021662592888,
0.007560862693935633,
-0.019407598301768303,
-0.023698780685663223,
0.003354463493451476,
-0.016103750094771385,
0.021852541714906693,
-0.019595863297581673,
0.06921866536140442,
-0.029042420908808708,
0.0805111974477768,
0.012964469380676746,
-0.033492568880319595,
-0.11937210708856583,
0.11946620047092438,
-0.06788501888513565,
-0.05128512904047966,
-0.14639584720134735,
-0.06059445068240166,
-0.05178065598011017,
0.05424352362751961,
-0.00029652772354893386,
0.1353934407234192,
-0.17559723556041718,
-0.10353754460811615,
0.2756160795688629,
-0.09216360747814178,
0.0392083115875721,
0.18543927371501923,
0.0284824650734663,
0.044637616723775864,
0.15549249947071075,
0.1220623105764389,
0.04345095157623291,
-0.13904613256454468,
0.05957011505961418,
-0.040059637278318405,
-0.0062127732671797276,
0.04136306419968605,
0.060418661683797836,
-0.020024068653583527,
0.010582574643194675,
0.003911925479769707,
-0.05070791393518448,
-0.04768088087439537,
-0.012975603342056274,
-0.05904504284262657,
0.03138716146349907,
-0.024150701239705086,
0.07460929453372955,
0.020583124831318855,
0.011141495779156685,
0.00325829372741282,
-0.08189002424478531,
-0.04995002970099449,
0.0787515938282013,
-0.04482737183570862,
0.06949512660503387,
-0.09448667615652084,
0.03565366193652153,
0.10925725102424622,
0.04877765476703644,
-0.15670804679393768,
0.0690564289689064,
-0.012410719878971577,
0.10052338987588882,
0.10711407661437988,
0.21227501332759857,
-0.026439525187015533,
-0.05130549520254135,
-0.07699506729841232,
0.02660440281033516,
-0.03318234905600548,
-0.04148472845554352,
-0.032933469861745834,
-0.08681845664978027,
-0.018146967515349388,
-0.038707632571458817,
0.07007807493209839,
-0.1549556702375412,
0.002959298901259899,
0.03919121250510216,
0.06090648099780083,
0.012022835202515125,
0.008149294182658195,
0.028386656194925308,
0.10744425654411316,
0.038397230207920074,
0.01847829855978489,
0.10486654192209244,
-0.0062127746641635895,
-0.035681966692209244,
0.09617244452238083,
-0.05321275815367699,
0.009830424562096596,
0.12554088234901428,
-0.10758425295352936,
0.0022004188504070044,
0.006035835947841406,
0.017872216179966927,
-0.004109731409698725,
0.010928335599601269,
-0.012471741996705532,
0.2378547191619873,
0.017860209569334984,
0.08099108934402466,
-0.08133718371391296,
0.012676768936216831,
-0.013247746974229813,
-0.0543966107070446,
-0.04673538729548454,
0.10332221537828445,
0.019747674465179443,
-0.09904635697603226,
-0.0037141696084290743,
0.09725456684827805,
-0.006433282513171434,
0.13968880474567413,
0.014616252854466438,
-0.01808655820786953,
0.012705708853900433,
-0.05961497873067856,
-0.01857214979827404,
-0.005798812489956617,
-0.15561775863170624,
-0.03052946738898754,
0.029467200860381126,
0.028278332203626633,
0.06335745751857758,
-0.07954956591129303,
-0.012009195052087307,
-0.004137967247515917,
-0.07887933403253555,
-0.05045916512608528,
0.04417138919234276,
-0.008334399200975895,
0.07548907399177551,
-0.042303234338760376,
0.006248962599784136,
-0.011396219953894615,
-0.04048449173569679,
-0.1072198897600174,
0.11242381483316422,
-0.07493709772825241,
-0.37210437655448914,
-0.0798795074224472,
-0.0917484313249588,
-0.0721578374505043,
0.01550260093063116,
0.05171220377087593,
-0.08895565569400787,
-0.06215886399149895,
0.003524128347635269,
0.14592769742012024,
-0.0413561575114727,
-0.09598648548126221,
0.04066317901015282,
0.022699102759361267,
-0.008191846311092377,
-0.10782873630523682,
0.007848545908927917,
-0.03995576128363609,
-0.13602539896965027,
0.017332782968878746,
-0.017196934670209885,
0.029854988679289818,
0.12679731845855713,
0.05493269860744476,
-0.025153104215860367,
-0.023341218009591103,
0.20304252207279205,
-0.10739412158727646,
0.06281089037656784,
0.28441762924194336,
0.015426835045218468,
0.026188485324382782,
0.11581221967935562,
0.005678780842572451,
-0.04763016104698181,
0.0026454487815499306,
0.05305828899145126,
-0.004510991275310516,
-0.26874852180480957,
-0.12715032696723938,
-0.06928825378417969,
-0.02231665328145027,
0.02624625153839588,
0.012003481388092041,
-0.0014111375203356147,
0.03302714601159096,
-0.09169325977563858,
-0.048478372395038605,
0.05888451635837555,
0.026151809841394424,
0.19866999983787537,
-0.04027695953845978,
0.13095314800739288,
-0.03724251314997673,
-0.021976187825202942,
0.06541142612695694,
0.056026577949523926,
0.0719199925661087,
0.09118741750717163,
0.07772095501422882,
0.0935494676232338,
0.06338559091091156,
0.02886386215686798,
0.007079183589667082,
-0.01910024881362915,
-0.014414263889193535,
-0.05114574730396271,
-0.027294853702187538,
-0.03133876994252205,
-0.004258370958268642,
0.14210115373134613,
-0.1506327986717224,
-0.1268455982208252,
-0.005692000035196543,
0.02743748016655445,
0.15450482070446014,
0.06648560613393784,
-0.05141540244221687,
-0.05268813669681549,
0.04123532772064209,
-0.09075717628002167,
-0.04371226951479912,
0.061292704194784164,
0.08099445700645447,
-0.17815209925174713,
0.12115728855133057,
0.03913402184844017,
0.10555751621723175,
-0.03382682427763939,
0.0468079037964344,
-0.14227236807346344,
0.004656347446143627,
0.04747036471962929,
0.0810370072722435,
-0.2646194398403168,
0.2062041014432907,
0.013130566105246544,
0.06747040152549744,
-0.08216200768947601,
-0.006114539690315723,
0.03879741579294205,
0.080374576151371,
0.1198277547955513,
-0.00211891601793468,
-0.006161990109831095,
-0.004712365567684174,
-0.03027643822133541,
0.035773541778326035,
0.03177445009350777,
-0.026024315506219864,
0.03772095590829849,
-0.004274254664778709,
0.009271148592233658,
-0.015484575182199478,
0.0777825117111206,
-0.2516535222530365,
-0.1414235234260559,
0.042102620005607605,
0.029368596151471138,
0.06557388603687286,
-0.018639929592609406,
-0.08017256110906601,
-0.12594467401504517,
0.1171126738190651,
-0.016548078507184982,
-0.03198571130633354,
-0.09468947350978851,
0.0270992461591959,
0.020450744777917862,
-0.108237624168396,
0.02419239841401577,
0.037603456526994705,
0.09937110543251038,
-0.09359811246395111,
-0.051077842712402344,
0.04405538737773895,
-0.09301872551441193,
-0.07735936343669891,
0.03490472212433815,
0.1988794505596161,
0.10702326148748398,
0.03568478301167488,
0.10006958246231079,
-0.046292513608932495,
0.013760675676167011,
-0.1121668741106987,
0.06060618907213211,
0.009745159186422825,
0.01260108221322298,
0.028312409296631813,
-0.06011525169014931,
-0.2640461027622223,
-0.11570840328931808,
-0.025542430579662323,
0.18944567441940308,
0.17171631753444672,
0.004268107004463673,
0.1669265180826187,
0.24375781416893005,
-0.08426240086555481,
-0.24849621951580048,
-0.07665335386991501,
-0.013995937071740627,
0.04525967314839363,
0.02999257668852806,
-0.2667728066444397,
0.06627901643514633,
0.04976801201701164,
0.00010647213639458641,
-0.07822515070438385,
-0.225739985704422,
-0.12752462923526764,
0.16380839049816132,
0.040574971586465836,
0.13300934433937073,
-0.09429576247930527,
-0.04634402319788933,
-0.06185956671833992,
-0.13396091759204865,
0.07884445041418076,
-0.0904432162642479,
0.10239630937576294,
0.04045505076646805,
0.01459322590380907,
0.0022737502586096525,
0.039639245718717575,
0.11394467949867249,
0.06271777302026749,
-0.00784554798156023,
-0.0366949625313282,
0.007141684647649527,
0.02615731954574585,
0.02979562245309353,
0.03010907769203186,
0.006150608882308006,
-0.014556986279785633,
-0.06977837532758713,
-0.09966079890727997,
-0.1213112622499466,
0.08422783017158508,
-0.06818793714046478,
-0.014101318083703518,
-0.0171602014452219,
0.10087335854768753,
0.003852413734421134,
0.012895628809928894,
-0.06282169371843338,
-0.13126008212566376,
0.028666846454143524,
0.08848171681165695,
0.24308720231056213,
-0.130880668759346,
-0.031674958765506744,
-0.06911332160234451,
-0.04863904044032097,
0.07597526907920837,
0.015878383070230484,
0.051850516349077225,
0.04356280341744423,
0.004139276687055826,
0.08769477158784866,
0.018537867814302444,
-0.0777660459280014,
0.022882044315338135,
0.03801662102341652,
-0.0702834352850914,
-0.23572538793087006,
-0.06053084135055542,
0.008834821172058582,
0.013060116209089756,
0.030196037143468857,
0.17141671478748322,
-0.0016233382048085332,
-0.07531307637691498,
-0.012006097473204136,
0.03717793896794319,
-0.035699907690286636,
0.057131655514240265,
0.04310370609164238,
0.044595565646886826,
-0.09957201033830643,
0.04657813906669617,
0.0909726545214653,
-0.12857449054718018,
0.035447895526885986,
0.07849345356225967,
-0.05302048102021217,
-0.09816625714302063,
-0.12608350813388824,
0.012959897518157959,
0.012970121577382088,
-0.06910517811775208,
0.031228361651301384,
-0.1539510190486908,
0.031341303139925,
0.07155612111091614,
0.0344955250620842,
-0.022924577817320824,
-0.06089542806148529,
-0.036770813167095184,
-0.02797684073448181,
0.009902588091790676,
0.12261462956666946,
-0.07409945875406265,
-0.12165708839893341,
0.14736062288284302,
0.024633878841996193,
0.10273212194442749,
-0.034639012068510056,
-0.05709037557244301,
-0.13139215111732483,
0.02034205198287964,
-0.11235915124416351,
0.011825978755950928,
-0.14698922634124756,
0.005724036134779453,
-0.05095108970999718,
-0.025384729728102684,
-0.02351592294871807,
0.03579799830913544,
-0.0898902490735054,
0.011371451430022717,
0.0024681203067302704,
0.08476560562849045,
-0.0996183454990387,
0.06151707470417023,
0.06919340044260025,
-0.016970332711935043,
0.09598658233880997,
0.026015086099505424,
-0.04963283613324165,
0.08185725659132004,
-0.1928095668554306,
-0.0352201946079731,
0.032828569412231445,
0.030246030539274216,
-0.014786029234528542,
-0.15762634575366974,
0.007010359317064285,
0.025016574189066887,
0.0577317476272583,
-0.001898499089293182,
0.0807785838842392,
-0.0688985139131546,
-0.0053245751187205315,
-0.04934367164969444,
-0.09666548669338226,
-0.03695337101817131,
0.06380320340394974,
0.09279496222734451,
0.020791828632354736,
0.11203350126743317,
-0.08294197916984558,
0.07019048929214478,
-0.10015099495649338,
0.07064370065927505,
-0.04305776581168175,
-0.03203604742884636,
0.0046301670372486115,
-0.11719045788049698,
0.06583742052316666,
-0.000690135988406837,
0.06869687139987946,
0.0076487259939312935,
-0.03440821170806885,
0.008631501346826553,
-0.08810805529356003,
-0.09932977706193924,
0.022035742178559303,
0.11446638405323029,
0.067212775349617,
-0.011374044232070446,
0.03505493327975273,
-0.0010069621494039893,
0.0013779208529740572,
0.20310473442077637,
0.197438046336174,
0.21586208045482635,
0.08417767286300659,
0.07451098412275314,
0.006493489257991314,
-0.04505126178264618,
-0.053835123777389526,
-0.005507264751940966,
-0.06509231775999069,
0.031014829874038696,
-0.08189790695905685,
-0.03264208137989044,
0.08049675822257996,
-0.13436582684516907,
0.1072649285197258,
0.003743590787053108,
-0.07753258943557739,
-0.1418343037366867,
-0.1903231292963028,
-0.048403188586235046,
-0.056030917912721634,
-0.027443308383226395,
-0.12244882434606552,
-0.03048066981136799,
0.022723974660038948,
0.02799203060567379,
-0.1166078969836235,
0.06140958145260811,
-0.11184445768594742,
-0.1517496258020401,
0.18682220578193665,
-0.033603865653276443,
0.0271346103399992,
-0.025310007855296135,
0.025721749290823936,
-0.013780985958874226,
0.08947199583053589,
0.028813906013965607,
0.03603868931531906,
-0.035266224294900894,
0.04360443353652954,
-0.08807779848575592,
-0.06433551758527756,
-0.0008655466372147202,
0.037096984684467316,
0.1394045203924179,
0.2168247550725937,
0.045385923236608505,
-0.06326194852590561,
0.009055636823177338,
0.1552225798368454,
0.039446871727705,
-0.11717178672552109,
-0.12694025039672852,
0.08186019957065582,
-0.001520430902019143,
-0.005603441037237644,
-0.023017261177301407,
-0.054267942905426025,
0.014188837260007858,
0.2862967252731323,
0.1998349279165268,
-0.06618703901767731,
0.03421814367175102,
0.006469716317951679,
0.029734091833233833,
0.08801298588514328,
0.11339742690324783,
0.09971366077661514,
0.2171560674905777,
-0.036638423800468445,
-0.0039014273788779974,
-0.008085199631750584,
-0.048906370997428894,
-0.11064456403255463,
0.11696945130825043,
0.027327733114361763,
-0.07014178484678268,
0.011342320591211319,
0.15214163064956665,
-0.13538706302642822,
-0.057012200355529785,
-0.07738788425922394,
-0.043775759637355804,
-0.10766056180000305,
-0.0007830215618014336,
-0.01788443513214588,
0.1142595037817955,
0.09405118227005005,
-0.017177460715174675,
-0.005214938893914223,
0.17726720869541168,
0.04617765173316002,
0.008819125592708588,
-0.015898946672677994,
0.12437348067760468,
-0.01767600141465664,
0.06643855571746826,
-0.015473010018467903,
0.07636985927820206,
0.06165638938546181,
0.05654699727892876,
-0.03422066196799278,
0.04960345849394798,
0.0040255580097436905,
0.029174944385886192,
0.07321492582559586,
0.12158288806676865,
0.01760151796042919,
0.020240161567926407,
0.09679217636585236,
-0.1241675391793251,
0.03530125319957733,
0.05817429721355438,
-0.012701954692602158,
-0.017863614484667778,
0.18947088718414307,
-0.1906825602054596,
0.05960969999432564,
0.14065125584602356,
-0.040498025715351105,
-0.048776306211948395,
-0.041689902544021606,
0.0372796431183815,
-0.03321455791592598,
0.04006635770201683,
-0.05746817961335182,
-0.13189679384231567,
0.01683729514479637,
-0.09179108589887619,
0.026779932901263237,
-0.2020334154367447,
-0.014335574582219124,
-0.03125960752367973,
-0.015532993711531162,
-0.059441905468702316,
0.08084455877542496,
0.011396403424441814,
-0.05318634584546089,
0.02271883748471737,
-0.07265491783618927,
0.011120927520096302,
0.10964861512184143,
-0.10461492091417313,
-0.049302831292152405
] |
null | null |
transformers
|
# Wav2Vec2-Large-960h-Lv60 + Self-Training
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/)
The large model pretrained and fine-tuned on 960 hours of Libri-Light and Librispeech on 16kHz sampled speech audio. The model was trained with a [Self-Training objective](https://arxiv.org/abs/2010.11430). When using the model, make sure that your speech input is also sampled at 16 kHz.
[Paper](https://arxiv.org/abs/2006.11477)
Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli
**Abstract**
We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
To transcribe audio files the model can be used as a standalone acoustic model as follows:
```python
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torch
# load model and processor
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-large-960h-lv60-self")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-large-960h-lv60-self")
# load dummy dataset and read soundfiles
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
# tokenize
input_values = processor(ds[0]["audio"]["array"], return_tensors="pt", padding="longest").input_values
# retrieve logits
logits = model(input_values).logits
# take argmax and decode
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
```
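The dummy dataset above is already stored at 16 kHz. If your own recordings use a different sampling rate, resample them before calling the processor; a minimal sketch using the `datasets` `Audio` feature (the file name is a hypothetical placeholder):
```python
from datasets import Audio, Dataset
from transformers import Wav2Vec2Processor

processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-large-960h-lv60-self")

# Hypothetical local recording at an arbitrary sampling rate; casting the column to
# Audio(sampling_rate=16_000) decodes and resamples it to 16 kHz on access.
ds = Dataset.from_dict({"audio": ["my_recording.wav"]}).cast_column(
    "audio", Audio(sampling_rate=16_000)
)
input_values = processor(
    ds[0]["audio"]["array"], sampling_rate=16_000, return_tensors="pt"
).input_values
```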
## Evaluation
This code snippet shows how to evaluate **facebook/wav2vec2-large-960h-lv60-self** on LibriSpeech's "clean" and "other" test data.
```python
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import torch
from jiwer import wer
librispeech_eval = load_dataset("librispeech_asr", "clean", split="test")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-large-960h-lv60-self").to("cuda")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-large-960h-lv60-self")
def map_to_pred(batch):
inputs = processor(batch["audio"]["array"], return_tensors="pt", padding="longest")
input_values = inputs.input_values.to("cuda")
attention_mask = inputs.attention_mask.to("cuda")
with torch.no_grad():
logits = model(input_values, attention_mask=attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
batch["transcription"] = transcription
return batch
result = librispeech_eval.map(map_to_pred, remove_columns=["audio"])
print("WER:", wer(result["text"], result["transcription"]))
```
*Result (WER)*:
| "clean" | "other" |
|---|---|
| 1.9 | 3.9 |
|
{"language": "en", "license": "apache-2.0", "tags": ["speech", "audio", "automatic-speech-recognition", "hf-asr-leaderboard"], "datasets": ["librispeech_asr"], "model-index": [{"name": "wav2vec2-large-960h-lv60", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "LibriSpeech (clean)", "type": "librispeech_asr", "config": "clean", "split": "test", "args": {"language": "en"}}, "metrics": [{"type": "wer", "value": 1.9, "name": "Test WER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "LibriSpeech (other)", "type": "librispeech_asr", "config": "other", "split": "test", "args": {"language": "en"}}, "metrics": [{"type": "wer", "value": 3.9, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-960h-lv60-self
|
[
"transformers",
"pytorch",
"tf",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"speech",
"audio",
"hf-asr-leaderboard",
"en",
"dataset:librispeech_asr",
"arxiv:2010.11430",
"arxiv:2006.11477",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.11430",
"2006.11477"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #wav2vec2 #automatic-speech-recognition #speech #audio #hf-asr-leaderboard #en #dataset-librispeech_asr #arxiv-2010.11430 #arxiv-2006.11477 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us
|
Wav2Vec2-Large-960h-Lv60 + Self-Training
========================================
Facebook's Wav2Vec2
The large model pretrained and fine-tuned on 960 hours of Libri-Light and Librispeech on 16kHz sampled speech audio. Model was trained with Self-Training objective. When using the model make sure that your speech input is also sampled at 16Khz.
Paper
Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli
Abstract
We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.
The original model can be found under URL
Usage
=====
To transcribe audio files the model can be used as a standalone acoustic model as follows:
Evaluation
----------
This code snippet shows how to evaluate facebook/wav2vec2-large-960h-lv60-self on LibriSpeech's "clean" and "other" test data.
*Result (WER)*:
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #wav2vec2 #automatic-speech-recognition #speech #audio #hf-asr-leaderboard #en #dataset-librispeech_asr #arxiv-2010.11430 #arxiv-2006.11477 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n"
] |
[
104
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #wav2vec2 #automatic-speech-recognition #speech #audio #hf-asr-leaderboard #en #dataset-librispeech_asr #arxiv-2010.11430 #arxiv-2006.11477 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n"
] |
[
-0.12863503396511078,
0.1433687061071396,
-0.003889550222083926,
0.02292335219681263,
0.0620010606944561,
-0.036779534071683884,
0.07662758231163025,
0.11929243057966232,
0.05535826459527016,
0.006087116431444883,
0.07834972441196442,
0.1293887346982956,
0.01783042959868908,
0.059855490922927856,
-0.05044364184141159,
-0.15576408803462982,
0.06968189030885696,
-0.002704123966395855,
0.020362963899970055,
0.07756810635328293,
0.11281051486730576,
-0.04274824634194374,
0.054833464324474335,
0.037760455161333084,
-0.04963633790612221,
0.04766876623034477,
0.04592757299542427,
-0.13541458547115326,
0.10348395258188248,
0.04893769323825836,
0.0036214629653841257,
0.0635637566447258,
0.04251599684357643,
-0.09799522161483765,
0.03056308440864086,
0.011390157975256443,
-0.025482650846242905,
0.0648922547698021,
-0.0033906344324350357,
-0.03512600436806679,
0.04157900810241699,
0.056772902607917786,
-0.041997987776994705,
0.07977572083473206,
-0.07849545776844025,
-0.26889604330062866,
-0.0673796534538269,
0.10855435580015182,
0.0033079481218010187,
0.06982803344726562,
-0.017503898590803146,
0.12698952853679657,
-0.08174652606248856,
0.08725455403327942,
0.12153630703687668,
-0.25924763083457947,
0.032336968928575516,
-0.013992201536893845,
0.00798481609672308,
0.043839819729328156,
-0.03575035184621811,
0.02681872993707657,
0.02796444110572338,
0.012858418747782707,
0.03884877637028694,
-0.07038018107414246,
-0.22039222717285156,
0.010550254955887794,
-0.11059416085481644,
-0.040817294269800186,
0.2891804277896881,
0.043133921921253204,
0.017192615196108818,
-0.024894392117857933,
-0.05512302741408348,
0.03624974191188812,
-0.022201647982001305,
0.013355148956179619,
0.00113069755025208,
0.035005588084459305,
0.05855751782655716,
-0.029845330864191055,
-0.10394985228776932,
-0.07071404904127121,
-0.14428937435150146,
0.10037253797054291,
-0.02663816511631012,
0.052507344633340836,
-0.1160341128706932,
0.01777794025838375,
-0.003375700209289789,
-0.11610336601734161,
0.000052418283303268254,
-0.01995866186916828,
-0.008070386946201324,
0.06867706030607224,
-0.014714549295604229,
-0.018944498151540756,
0.13156743347644806,
0.04476828873157501,
0.033973194658756256,
0.00629922840744257,
-0.04628056660294533,
0.09803728759288788,
-0.027310343459248543,
0.11882070451974869,
-0.09277878701686859,
-0.019415782764554024,
0.0633355900645256,
0.045797012746334076,
0.05826732516288757,
-0.041879620403051376,
-0.11234865337610245,
-0.03253583610057831,
0.033447783440351486,
0.041751958429813385,
0.06463044881820679,
0.030928030610084534,
-0.029588868841528893,
0.0022672037594020367,
0.09768928587436676,
-0.14073340594768524,
-0.007391930557787418,
0.052670907229185104,
0.030662264674901962,
0.0503053218126297,
0.060985710471868515,
0.03231913223862648,
-0.07652074843645096,
-0.02391689084470272,
-0.02909751422703266,
0.007600334472954273,
0.04556699097156525,
-0.05704773962497711,
0.053859926760196686,
-0.05132155492901802,
0.01800655573606491,
-0.15885399281978607,
-0.03942127525806427,
-0.0024965526536107063,
-0.005200658459216356,
0.008109889924526215,
-0.07059301435947418,
0.0050285314209759235,
-0.050129834562540054,
0.054131537675857544,
-0.10388907045125961,
0.038895245641469955,
-0.07354944199323654,
0.06754844635725021,
0.015980036929249763,
0.10413297265768051,
-0.14329928159713745,
0.08579394221305847,
-0.06693670153617859,
-0.024773526936769485,
-0.03894130513072014,
0.05948366969823837,
-0.11282801628112793,
0.07519911974668503,
-0.10513835400342941,
-0.01961708813905716,
-0.11334571242332458,
0.03901809826493263,
-0.012563244439661503,
0.08316779136657715,
-0.184972882270813,
-0.10350551456212997,
0.1389147937297821,
-0.09784574061632156,
-0.13781362771987915,
0.11877142637968063,
0.04124932736158371,
0.012141908518970013,
0.06294276565313339,
0.30803343653678894,
-0.008279753848910332,
-0.11674173921346664,
-0.016429001465439796,
0.08956199884414673,
-0.06144719198346138,
-0.1226302981376648,
0.0820692852139473,
-0.07369127124547958,
0.02482907846570015,
0.010742240585386753,
-0.014460956677794456,
0.08085063844919205,
0.02812126651406288,
-0.07911625504493713,
-0.047548625618219376,
-0.09619217365980148,
-0.0284389890730381,
-0.001965030562132597,
0.028263000771403313,
-0.014285105280578136,
-0.03223622962832451,
-0.03374974802136421,
0.07577204704284668,
-0.01963046006858349,
0.05099916085600853,
-0.11087027192115784,
0.11843327432870865,
-0.057006921619176865,
0.02065146155655384,
-0.16016319394111633,
0.15184316039085388,
-0.052735112607479095,
-0.011309121735394001,
0.08802958577871323,
0.06893786787986755,
0.06639915704727173,
-0.08098936080932617,
-0.01578941009938717,
-0.02243582159280777,
0.10969623178243637,
0.07500191032886505,
0.009164908900856972,
-0.17956341803073883,
0.03208022937178612,
-0.0606469064950943,
0.08895932137966156,
-0.048837810754776,
-0.022473149001598358,
0.08638781309127808,
0.09080188721418381,
-0.004344158805906773,
0.04681685194373131,
0.06675977259874344,
-0.019893702119588852,
0.006534758489578962,
-0.006818287540227175,
0.05325218662619591,
0.006060659419745207,
-0.05848899483680725,
0.2241034358739853,
-0.16363525390625,
0.21462808549404144,
0.211879700422287,
-0.06323179602622986,
0.06317301094532013,
0.1192149966955185,
0.0044820611365139484,
-0.008864021860063076,
0.06694836169481277,
-0.05568001791834831,
0.13802941143512726,
-0.029488002881407738,
0.1416003555059433,
-0.07676715403795242,
0.003254657844081521,
0.01673257350921631,
-0.033643919974565506,
0.0042803636752069,
0.11477922648191452,
-0.00579628674313426,
-0.14532263576984406,
0.11118603497743607,
0.21331508457660675,
-0.07891448587179184,
0.16610734164714813,
-0.059772636741399765,
-0.07279156893491745,
0.057085584849119186,
-0.04500988498330116,
-0.036421846598386765,
0.1109275296330452,
-0.1685122400522232,
-0.052317772060632706,
0.08908487856388092,
-0.004368141759186983,
0.046014778316020966,
-0.1437448412179947,
-0.0006774357752874494,
-0.021414734423160553,
-0.06461542844772339,
-0.1471797376871109,
0.09412422776222229,
-0.02289571426808834,
0.09989064186811447,
-0.05837377533316612,
-0.18863384425640106,
0.03133125230669975,
-0.03602392226457596,
-0.1054474413394928,
0.0714225172996521,
-0.09516089409589767,
-0.25189149379730225,
-0.0783655047416687,
-0.042857296764850616,
-0.015779411420226097,
0.0049413940869271755,
0.10137449204921722,
-0.09554282575845718,
-0.028785744681954384,
-0.05004720389842987,
-0.0213873703032732,
-0.030823204666376114,
-0.006870431825518608,
0.0668221116065979,
0.002951911184936762,
0.076850526034832,
-0.14842960238456726,
-0.022107653319835663,
-0.04203655198216438,
0.059264399111270905,
0.05569460988044739,
0.005272441543638706,
0.07920562475919724,
0.15703575313091278,
0.05603104084730148,
0.04813219606876373,
-0.005226167384535074,
0.16125357151031494,
-0.08643978089094162,
-0.004793667700141668,
0.16648055613040924,
-0.03464614227414131,
0.007729620672762394,
0.16187307238578796,
0.03342961519956589,
-0.017331521958112717,
-0.037004027515649796,
-0.040172964334487915,
-0.06021704524755478,
-0.2026337832212448,
-0.10550955682992935,
-0.11470641195774078,
-0.035330601036548615,
0.013162868097424507,
0.08492878079414368,
0.052183620631694794,
-0.02491912618279457,
0.015285909175872803,
-0.05145924165844917,
-0.005534518975764513,
-0.01471050176769495,
0.22149096429347992,
-0.03345748409628868,
0.10803166776895523,
-0.10451068729162216,
-0.04716338962316513,
0.08531897515058517,
0.07699116319417953,
0.04116198420524597,
0.10080857574939728,
0.0685868188738823,
0.028126144781708717,
0.14289605617523193,
0.09165310114622116,
0.08490823209285736,
0.006761410739272833,
-0.015199960209429264,
-0.015772584825754166,
-0.08724931627511978,
-0.022439192980527878,
0.08234035223722458,
0.1317434012889862,
-0.06102427840232849,
0.0035257444251328707,
-0.09603997319936752,
0.035307154059410095,
0.1711442768573761,
0.10233656316995621,
-0.17992763221263885,
0.0032420274801552296,
0.02803053706884384,
-0.052073732018470764,
-0.015934493392705917,
0.06419017910957336,
0.0030508770141750574,
-0.021213900297880173,
0.07343299686908722,
0.052530739456415176,
0.07912726700305939,
-0.019200311973690987,
0.045144762843847275,
-0.10385683178901672,
-0.031164001673460007,
0.031126689165830612,
0.05042242258787155,
-0.2468707263469696,
0.2660231292247772,
0.03356432914733887,
0.013512437231838703,
-0.00607326440513134,
-0.002696594223380089,
0.09168056398630142,
0.08525892347097397,
0.15779882669448853,
0.026470568031072617,
-0.06137015298008919,
-0.03972890228033066,
-0.08656482398509979,
0.06532445549964905,
0.01007450744509697,
0.06019384041428566,
-0.05230531096458435,
-0.022684918716549873,
-0.037795282900333405,
0.04500607028603554,
-0.005053041502833366,
-0.13034191727638245,
-0.08660488575696945,
0.054965633898973465,
0.2532023787498474,
0.04908604919910431,
-0.012602792121469975,
-0.07693000137805939,
-0.16220322251319885,
0.00009125815995503217,
-0.10018311440944672,
-0.00047046897816471756,
-0.08150306344032288,
-0.1402784138917923,
0.12137997895479202,
-0.046577971428632736,
0.023006631061434746,
-0.021242626011371613,
-0.021022887900471687,
-0.02044815942645073,
-0.14162537455558777,
0.12477412074804306,
-0.10168913006782532,
-0.03976665437221527,
-0.004903820343315601,
0.18127165734767914,
-0.04117517173290253,
0.07829117774963379,
0.026308568194508553,
0.056412599980831146,
-0.08794382214546204,
-0.048565641045570374,
0.13509969413280487,
0.05499546602368355,
-0.058732014149427414,
0.006518421228975058,
-0.025326242670416832,
-0.16394422948360443,
-0.01289502251893282,
-0.010713472031056881,
0.21231959760189056,
0.15507467091083527,
-0.07550301402807236,
0.17807620763778687,
0.22971397638320923,
-0.02593526616692543,
-0.27904394268989563,
-0.1414455771446228,
-0.07644586265087128,
0.009341094642877579,
-0.028507180511951447,
-0.13571211695671082,
0.0818297490477562,
-0.05134477838873863,
-0.09621048718690872,
0.0585845522582531,
-0.1378009170293808,
-0.09786026924848557,
0.2953853905200958,
-0.10310579836368561,
0.22312399744987488,
-0.12469521909952164,
-0.06037432700395584,
-0.06344114243984222,
-0.16677246987819672,
0.07670135051012039,
-0.1484929919242859,
0.09211522340774536,
-0.0018695011967793107,
0.036806195974349976,
-0.002749038627371192,
-0.027015376836061478,
0.09613314270973206,
0.05783212557435036,
-0.040568385273218155,
-0.04335014522075653,
-0.01919328235089779,
0.06131167709827423,
-0.0010248300386592746,
0.1289045363664627,
-0.1064741313457489,
0.03291730210185051,
-0.10510991513729095,
-0.006011888850480318,
-0.11328107118606567,
0.08321983367204666,
0.06860902905464172,
-0.010432761162519455,
0.006892541889101267,
-0.037863753736019135,
0.012953697703778744,
-0.0015015885001048446,
0.15684136748313904,
-0.09538203477859497,
0.02422172762453556,
0.19170890748500824,
0.14737433195114136,
-0.18817077577114105,
-0.07255113869905472,
-0.0394575260579586,
-0.05946134775876999,
0.10210113227367401,
-0.15041957795619965,
0.10134974122047424,
0.05007770285010338,
0.04853314906358719,
0.07728814333677292,
0.049451105296611786,
-0.06890090554952621,
-0.009952574037015438,
0.09971841424703598,
-0.10476411134004593,
-0.08762507140636444,
0.001065323711372912,
0.021552924066781998,
0.03226619213819504,
0.0702362209558487,
0.15398980677127838,
-0.03854670748114586,
-0.004704691935330629,
0.007816516794264317,
0.014211469329893589,
-0.11603892594575882,
0.10901674628257751,
0.14454935491085052,
0.0357067845761776,
-0.15707935392856598,
0.10667508095502853,
0.00035887982812710106,
-0.0730748251080513,
0.04053732752799988,
0.03381374478340149,
-0.07173780351877213,
-0.12866701185703278,
-0.10952749103307724,
0.028751956298947334,
-0.029911121353507042,
-0.11348605901002884,
-0.04670732468366623,
-0.09699834138154984,
0.030397025868296623,
0.13105672597885132,
0.05067489296197891,
0.03495467081665993,
-0.06269754469394684,
-0.08886212855577469,
0.02689141407608986,
0.021272625774145126,
-0.04499489441514015,
-0.007709659170359373,
-0.14986425638198853,
-0.0512227937579155,
0.006936005782335997,
0.07337385416030884,
-0.06843864172697067,
-0.03677130118012428,
-0.0641331747174263,
0.0452328622341156,
-0.08673572540283203,
-0.018266988918185234,
-0.05820188298821449,
0.018015461042523384,
0.014888346195220947,
-0.11026767641305923,
-0.04741234332323074,
0.05522526055574417,
-0.12965916097164154,
-0.00622152304276824,
0.011446114629507065,
0.09498736262321472,
-0.12674830853939056,
-0.012535412795841694,
0.019318219274282455,
-0.006598239298909903,
0.11430873721837997,
0.1346781849861145,
-0.13926123082637787,
0.09561137855052948,
-0.20142866671085358,
-0.17536583542823792,
0.12899957597255707,
0.047235991805791855,
0.03979220986366272,
-0.0728563442826271,
-0.02913622558116913,
0.11306175589561462,
0.045580312609672546,
0.010593020357191563,
0.09819541126489639,
-0.060963019728660583,
-0.026408974081277847,
-0.1064816415309906,
-0.04962663725018501,
-0.013670045882463455,
-0.0039727590046823025,
0.16056033968925476,
0.1049533486366272,
0.139187753200531,
-0.032817691564559937,
-0.021793877705931664,
-0.10843578726053238,
0.03649025410413742,
-0.04922841116786003,
-0.1376054883003235,
-0.1318567842245102,
-0.014638138003647327,
0.037019889801740646,
-0.049960050731897354,
0.19084975123405457,
-0.0052359262481331825,
-0.08948326855897903,
0.04187179356813431,
0.018355917185544968,
-0.047355134040117264,
0.003432190278545022,
0.2540024220943451,
0.04178653284907341,
-0.004351835232228041,
-0.005046091508120298,
-0.023181870579719543,
0.028774861246347427,
0.14064238965511322,
0.001334987347945571,
0.16056758165359497,
0.09802443534135818,
0.10578706860542297,
0.12957793474197388,
-0.061414603143930435,
-0.032692715525627136,
0.02639099210500717,
-0.05384235084056854,
0.09835685789585114,
-0.061031993478536606,
0.11372395604848862,
0.17559267580509186,
0.030962742865085602,
0.054521992802619934,
-0.08158332854509354,
-0.01077605877071619,
-0.1529514193534851,
-0.08836986124515533,
-0.07253040373325348,
-0.14181244373321533,
-0.003227401291951537,
-0.0311697106808424,
0.04684918746352196,
0.10020443052053452,
0.029295681044459343,
0.005740863736718893,
0.03443756327033043,
0.021160967648029327,
-0.065836600959301,
0.08059900999069214,
-0.052344102412462234,
-0.043747399002313614,
-0.07597222179174423,
0.013134505599737167,
0.10529393702745438,
0.010718030855059624,
-0.013167896308004856,
0.0034836509730666876,
-0.08306365460157394,
0.044571276754140854,
-0.123865507543087,
-0.08329086005687714,
-0.024602200835943222,
0.02521423064172268,
0.03229662403464317,
0.11070632934570312,
0.08962772786617279,
-0.03915626183152199,
0.06552234292030334,
0.17554351687431335,
-0.08119147270917892,
-0.13441017270088196,
-0.07151100784540176,
0.1319790780544281,
-0.030623260885477066,
0.04582921788096428,
-0.03645573556423187,
-0.05873038247227669,
-0.02098764479160309,
0.20040905475616455,
0.2674155533313751,
-0.0770091563463211,
0.05607936903834343,
-0.0697324201464653,
0.02818276174366474,
-0.04449952393770218,
0.001745874178595841,
0.15643565356731415,
0.16702868044376373,
-0.011340446770191193,
-0.037348248064517975,
-0.06940081715583801,
-0.046641748398542404,
-0.043524280190467834,
0.08016831427812576,
-0.01140284352004528,
-0.10214146971702576,
-0.018071437254548073,
0.08651729673147202,
-0.06875824183225632,
-0.06490954756736755,
-0.16780221462249756,
-0.13813473284244537,
-0.0712367594242096,
-0.021302690729498863,
0.11043204367160797,
0.12806583940982819,
-0.012414369732141495,
-0.07128677517175674,
-0.02403559349477291,
0.04152464121580124,
-0.009597677737474442,
-0.1871153563261032,
-0.009598834440112114,
0.04378046467900276,
-0.1661711037158966,
0.034785423427820206,
-0.012952619232237339,
0.10662230104207993,
0.04098910465836525,
0.0980340763926506,
-0.05131399631500244,
0.13353751599788666,
0.0003508392837829888,
-0.08861357718706131,
0.013216870836913586,
0.08857044577598572,
0.02160196378827095,
0.01978193037211895,
0.050129249691963196,
-0.04643833637237549,
0.04947609081864357,
-0.03940914198756218,
-0.07976388931274414,
-0.06453723460435867,
0.007716465275734663,
-0.04714548960328102,
0.07098744809627533,
0.011041144840419292,
-0.04424671456217766,
-0.031408462673425674,
-0.05584976449608803,
-0.012904556468129158,
0.05068685859441757,
-0.16855183243751526,
-0.09322504699230194,
-0.057772763073444366,
-0.01094722654670477,
-0.09958959370851517,
-0.009615112096071243,
-0.12453150749206543,
-0.036508459597826004,
-0.10635997354984283,
-0.032855015248060226,
-0.03590938448905945,
0.028209500014781952,
0.08899704366922379,
0.020099489018321037,
0.010726084001362324,
-0.008378217928111553,
0.09164778143167496,
0.07643423229455948,
-0.13963548839092255,
-0.07318765670061111
] |
null | null |
transformers
|
# Wav2Vec2-Large-960h-Lv60
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/)
The large model pretrained on Libri-Light and fine-tuned on 960 hours of Librispeech, operating on 16kHz sampled speech audio. When using the model
make sure that your speech input is also sampled at 16kHz.
[Paper](https://arxiv.org/abs/2006.11477)
Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli
**Abstract**
We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
To transcribe audio files the model can be used as a standalone acoustic model as follows:
```python
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torch
# load model and processor
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-large-960h-lv60")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-large-960h-lv60")
# load dummy dataset and read soundfiles
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
# tokenize
input_values = processor(ds[0]["audio"]["array"], return_tensors="pt", padding="longest").input_values # Batch size 1
# retrieve logits
logits = model(input_values).logits
# take argmax and decode
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
```
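If your recordings are not already sampled at 16 kHz, resample them before calling the processor. Below is a minimal sketch using `torchaudio`; the file name `sample.wav` is a placeholder and not part of this repository.
```python
import torch
import torchaudio
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-large-960h-lv60")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-large-960h-lv60")

# load an arbitrary audio file (placeholder path) and resample it to 16 kHz
waveform, sample_rate = torchaudio.load("sample.wav")
waveform = torchaudio.functional.resample(waveform, orig_freq=sample_rate, new_freq=16_000)

# the processor expects a 1-D float array at 16 kHz; take the first channel
input_values = processor(waveform[0].numpy(), sampling_rate=16_000, return_tensors="pt").input_values

with torch.no_grad():
    logits = model(input_values).logits
transcription = processor.batch_decode(torch.argmax(logits, dim=-1))
```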
## Evaluation
This code snippet shows how to evaluate **facebook/wav2vec2-large-960h-lv60** on LibriSpeech's "clean" and "other" test data.
```python
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import torch
from jiwer import wer
librispeech_eval = load_dataset("librispeech_asr", "clean", split="test")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-large-960h-lv60").to("cuda")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-large-960h-lv60")
def map_to_pred(batch):
    inputs = processor([audio["array"] for audio in batch["audio"]], sampling_rate=16_000, return_tensors="pt", padding="longest")
input_values = inputs.input_values.to("cuda")
attention_mask = inputs.attention_mask.to("cuda")
with torch.no_grad():
logits = model(input_values, attention_mask=attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
batch["transcription"] = transcription
return batch
result = librispeech_eval.map(map_to_pred, batched=True, batch_size=16, remove_columns=["audio"])
print("WER:", wer(result["text"], result["transcription"]))
```
*Result (WER)*:
| "clean" | "other" |
|---|---|
| 2.2 | 4.5 |
|
{"language": "en", "license": "apache-2.0", "tags": ["speech"], "datasets": ["librispeech_asr"], "model-index": [{"name": "wav2vec2-large-960h-lv60", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Librispeech (clean)", "type": "librispeech_asr", "args": "en"}, "metrics": [{"type": "wer", "value": 2.2, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-960h-lv60
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"speech",
"en",
"dataset:librispeech_asr",
"arxiv:2006.11477",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2006.11477"
] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us
|
Wav2Vec2-Large-960h-Lv60
========================
Facebook's Wav2Vec2
The large model pretrained and fine-tuned on 960 hours of Libri-Light and Librispeech on 16kHz sampled speech audio. When using the model
make sure that your speech input is also sampled at 16Khz.
Paper
Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli
Abstract
We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.
The original model can be found under URL
Usage
=====
To transcribe audio files the model can be used as a standalone acoustic model as follows:
Evaluation
----------
This code snippet shows how to evaluate facebook/wav2vec2-large-960h-lv60 on LibriSpeech's "clean" and "other" test data.
*Result (WER)*:
|
[] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n"
] |
[
80
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n"
] |
[
-0.09994034469127655,
0.15672197937965393,
-0.003979680594056845,
-0.010504922829568386,
0.04814675822854042,
-0.05179327353835106,
0.09652528166770935,
0.13292744755744934,
0.0550529845058918,
-0.01746097020804882,
0.13607122004032135,
0.16761989891529083,
0.018877452239394188,
0.017432715743780136,
-0.04829683154821396,
-0.1936899870634079,
0.11105281114578247,
-0.0070680854842066765,
0.05246560275554657,
0.08940525352954865,
0.1252441704273224,
-0.06294722855091095,
0.03709675371646881,
0.07407060265541077,
-0.02951635606586933,
0.02919841930270195,
0.03839854151010513,
-0.17014452815055847,
0.11699375510215759,
0.0446942076086998,
-0.00309010106138885,
0.04979639872908592,
0.04100799560546875,
-0.19004181027412415,
0.008905432187020779,
-0.0007880099583417177,
-0.00756528414785862,
0.05321062356233597,
0.011110739782452583,
-0.013898410834372044,
0.05526064708828926,
0.049790360033512115,
-0.019848786294460297,
0.07900265604257584,
-0.061661772429943085,
-0.27664104104042053,
-0.05436552315950394,
0.06343358755111694,
0.028194472193717957,
0.10023218393325806,
-0.021047526970505714,
0.1525554209947586,
-0.08808038383722305,
0.06833010911941528,
0.11563674360513687,
-0.3469681143760681,
0.0339442677795887,
0.006021938286721706,
0.05323352292180061,
-0.0021699354983866215,
-0.04232701659202576,
0.04634016379714012,
0.03370596840977669,
0.008670153096318245,
0.018560944125056267,
-0.04160216450691223,
-0.19577844440937042,
0.040361445397138596,
-0.08970966190099716,
-0.061439815908670425,
0.26980048418045044,
0.0307624489068985,
0.03404213488101959,
-0.060870904475450516,
-0.025812415406107903,
0.06430500745773315,
-0.01047444436699152,
-0.017468536272644997,
0.028292834758758545,
0.08239815384149551,
0.03071722202003002,
-0.04935169219970703,
-0.13477586209774017,
-0.042988914996385574,
-0.15986371040344238,
0.09925023466348648,
-0.0015256828628480434,
0.07444337755441666,
-0.13799741864204407,
-0.002559780376031995,
0.014119709841907024,
-0.09803885221481323,
-0.02236226573586464,
-0.02719508670270443,
0.05254628136754036,
0.07801006734371185,
-0.07074040919542313,
0.028128471225500107,
0.13477690517902374,
0.07958557456731796,
0.057755425572395325,
-0.0042318012565374374,
-0.00923868827521801,
0.09986437112092972,
-0.022932853549718857,
0.14784683287143707,
-0.07857492566108704,
-0.018652353435754776,
0.04512561857700348,
-0.013172734528779984,
0.0552462637424469,
-0.028927728533744812,
-0.10076373815536499,
-0.05639120563864708,
-0.0006135691655799747,
0.07801991701126099,
0.08783907443284988,
0.03245329484343529,
-0.0021086204797029495,
0.02408134937286377,
0.05020323768258095,
-0.10658980906009674,
0.004000387154519558,
0.026824094355106354,
0.020526902750134468,
0.08161667734384537,
0.062047816812992096,
0.0636315867304802,
-0.07764013856649399,
-0.04666902869939804,
-0.024252967908978462,
0.018044278025627136,
0.037981778383255005,
-0.049395106732845306,
0.046787697821855545,
-0.06898139417171478,
0.019334878772497177,
-0.16363656520843506,
-0.06790751218795776,
-0.013496829196810722,
-0.004180805291980505,
0.0034283711574971676,
-0.08054778724908829,
-0.022861022502183914,
-0.05058922618627548,
0.04811149090528488,
-0.10824684798717499,
0.004229494370520115,
-0.07602601498365402,
0.0426856204867363,
-0.0023607860784977674,
0.0868322029709816,
-0.14914382994174957,
0.06924305856227875,
-0.09316651523113251,
-0.03759767860174179,
0.07548276335000992,
0.04597236216068268,
-0.07670766115188599,
0.09744828939437866,
-0.12821482121944427,
-0.022949861362576485,
-0.06388860940933228,
0.032882802188396454,
-0.005879181902855635,
0.10404790192842484,
-0.1771034151315689,
-0.06348178535699844,
0.15290313959121704,
-0.08886250108480453,
-0.1466882824897766,
0.10450183600187302,
0.0331830270588398,
0.016925057396292686,
0.07307018339633942,
0.279561847448349,
-0.0515255406498909,
-0.11974363029003143,
-0.015398507006466389,
0.10942471772432327,
-0.05080229416489601,
-0.14858025312423706,
0.10404349863529205,
-0.10589085519313812,
-0.0012664333917200565,
0.04285683110356331,
-0.03772968798875809,
0.06427165120840073,
0.01152300275862217,
-0.10087178647518158,
-0.05981237068772316,
-0.09183458238840103,
-0.010614477097988129,
-0.030328180640935898,
0.0439443401992321,
-0.03437130153179169,
-0.02786169946193695,
-0.04993163421750069,
0.07540389895439148,
-0.005334680434316397,
0.059648070484399796,
-0.10968195647001266,
0.09309215843677521,
-0.05579632893204689,
0.014956208877265453,
-0.12836235761642456,
0.15431202948093414,
-0.04999496415257454,
0.008682242594659328,
0.08381158113479614,
0.057719502598047256,
0.05493738129734993,
-0.10546298325061798,
0.017779294401407242,
0.005223066546022892,
0.12472978979349136,
0.07742223143577576,
-0.0216740220785141,
-0.1391879767179489,
0.018780004233121872,
-0.07041581720113754,
0.03616955876350403,
-0.023048045113682747,
-0.01988968811929226,
0.005118479486554861,
0.07758163660764694,
-0.04478932544589043,
0.055151019245386124,
0.028173144906759262,
-0.028765715658664703,
-0.013619834557175636,
0.011851613409817219,
0.07036595046520233,
0.012072943150997162,
-0.04447196051478386,
0.27187275886535645,
-0.09883427619934082,
0.18962249159812927,
0.21973593533039093,
-0.1180383712053299,
0.09298653155565262,
0.07698873430490494,
0.006587753538042307,
0.0070977238938212395,
0.050289325416088104,
-0.024059196934103966,
0.15248818695545197,
-0.03653047978878021,
0.11946730315685272,
-0.07376935333013535,
0.017888495698571205,
-0.001372376922518015,
-0.06127865985035896,
-0.038349397480487823,
0.06750839203596115,
0.04597640037536621,
-0.108768031001091,
0.11392222344875336,
0.24821113049983978,
-0.08700080215930939,
0.07470157742500305,
-0.046958256512880325,
-0.03439876809716225,
0.03620253875851631,
-0.04647192358970642,
-0.06049192696809769,
0.06643301993608475,
-0.21009331941604614,
-0.07053490728139877,
0.10170692205429077,
-0.006067706737667322,
0.070982426404953,
-0.13820703327655792,
-0.012909258715808392,
-0.00021398386161308736,
-0.056440506130456924,
-0.11455456167459488,
0.06811633706092834,
-0.015741318464279175,
0.0904674157500267,
-0.04846362769603729,
-0.17396977543830872,
0.04012364149093628,
-0.030996855348348618,
-0.08908029645681381,
0.08952771127223969,
-0.11502201110124588,
-0.21758247911930084,
-0.08013749867677689,
-0.04924774169921875,
-0.028631562367081642,
0.021552635356783867,
0.12098759412765503,
-0.10586168617010117,
-0.04026675969362259,
-0.04432997852563858,
-0.045440807938575745,
-0.07335563749074936,
0.01803252473473549,
0.021337632089853287,
0.01727166958153248,
0.03829893469810486,
-0.16000905632972717,
-0.028966840356588364,
-0.04907345771789551,
0.023330092430114746,
0.02127969078719616,
0.023693712428212166,
0.07051624357700348,
0.14969976246356964,
0.05323303863406181,
0.03314851224422455,
-0.017796341329813004,
0.16039161384105682,
-0.028308525681495667,
-0.05558767169713974,
0.195232555270195,
0.005195188336074352,
0.012513349764049053,
0.18252383172512054,
0.01826612278819084,
-0.03444193676114082,
-0.03960515931248665,
-0.07401548326015472,
-0.04646586254239082,
-0.2188819944858551,
-0.10636724531650543,
-0.09986929595470428,
-0.0500376857817173,
0.02757975645363331,
0.07071875035762787,
0.06236191466450691,
0.005305442027747631,
0.009854964911937714,
-0.07603465020656586,
-0.012817973271012306,
0.024288197979331017,
0.26190319657325745,
-0.050243183970451355,
0.134974867105484,
-0.09510327130556107,
-0.059521157294511795,
0.06349806487560272,
0.07933349907398224,
0.08396197855472565,
0.13022615015506744,
0.07928697764873505,
0.049098070710897446,
0.20189246535301208,
0.08049531280994415,
0.06912904232740402,
0.038491975516080856,
0.012523780576884747,
-0.026332547888159752,
-0.06441373378038406,
-0.03667750209569931,
0.11407212167978287,
0.1734594851732254,
-0.08854156732559204,
-0.004553665407001972,
-0.14633995294570923,
0.05054014176130295,
0.2004224956035614,
0.1069064810872078,
-0.19038641452789307,
0.013418018817901611,
0.046192239969968796,
-0.07037939131259918,
-0.007100182585418224,
0.10256596654653549,
-0.01953952945768833,
-0.04880101978778839,
0.06945372372865677,
0.04306718334555626,
0.07251489162445068,
0.010881857015192509,
0.05534839630126953,
-0.11997528374195099,
-0.1035163626074791,
0.06753060966730118,
0.08038312941789627,
-0.252866268157959,
0.2553052604198456,
-0.008265177719295025,
0.004707979504019022,
-0.04565448313951492,
0.021799420937895775,
0.06996733695268631,
0.04317484050989151,
0.14612562954425812,
0.007522329688072205,
-0.09845900535583496,
0.0032774724531918764,
-0.06495580822229385,
0.08403877168893814,
0.010265855118632317,
0.06869877874851227,
-0.03763652592897415,
-0.03557247668504715,
-0.021753938868641853,
0.04972285404801369,
0.060322824865579605,
-0.13156087696552277,
-0.11970002204179764,
0.03282460570335388,
0.2611459791660309,
0.034568000584840775,
-0.012090081349015236,
-0.05131208151578903,
-0.1900172233581543,
0.09021332859992981,
-0.07895183563232422,
-0.0051175677217543125,
-0.06069634482264519,
-0.14186114072799683,
0.12854620814323425,
-0.047294676303863525,
0.033504679799079895,
-0.04839537292718887,
-0.016179224476218224,
-0.04091415926814079,
-0.12896950542926788,
0.1135995015501976,
-0.11565989255905151,
-0.028824226930737495,
-0.018273333087563515,
0.15243059396743774,
-0.08301622420549393,
0.051477544009685516,
0.07664838433265686,
0.07398044317960739,
-0.14120881259441376,
-0.07262075692415237,
0.0789695531129837,
0.10261105000972748,
-0.04176008701324463,
-0.01668832078576088,
0.0037598840426653624,
-0.20764215290546417,
-0.03775281459093094,
-0.006899429950863123,
0.26076892018318176,
0.14530591666698456,
-0.09979395568370819,
0.17776042222976685,
0.24644635617733002,
-0.05451468750834465,
-0.25898754596710205,
-0.17771661281585693,
-0.1011369377374649,
-0.01726383902132511,
-0.030077775940299034,
-0.1101495549082756,
0.09943443536758423,
-0.04419746994972229,
-0.1271216869354248,
-0.002746659331023693,
-0.17761337757110596,
-0.10279858857393265,
0.28469493985176086,
-0.09617377817630768,
0.28288936614990234,
-0.12127405405044556,
-0.08423254638910294,
-0.07050242274999619,
-0.13495901226997375,
0.10035592317581177,
-0.10248948633670807,
0.07166191190481186,
0.006786031182855368,
0.03077247552573681,
0.009338365867733955,
-0.04066146910190582,
0.14647351205348969,
0.037557557225227356,
-0.05363530293107033,
-0.06313575059175491,
-0.03451986983418465,
0.03056071512401104,
-0.01711619459092617,
0.14030559360980988,
-0.13151761889457703,
0.04926973953843117,
-0.10324045270681381,
-0.027063939720392227,
-0.12432371079921722,
0.09752646833658218,
0.058626286685466766,
-0.015897240489721298,
-0.016369741410017014,
-0.07452412694692612,
0.03721873462200165,
0.02153068035840988,
0.17677882313728333,
-0.06668791174888611,
0.04087761044502258,
0.2228117287158966,
0.09857417643070221,
-0.15382839739322662,
-0.0375385619699955,
-0.047561902552843094,
-0.08563750982284546,
0.11239806562662125,
-0.13429303467273712,
0.04388841241598129,
0.055609725415706635,
0.02195378951728344,
0.07028993219137192,
0.04446094483137131,
-0.041502147912979126,
0.005277352873235941,
0.0958406925201416,
-0.07541544735431671,
-0.09101897478103638,
0.0195827204734087,
0.08270826935768127,
0.02607560157775879,
0.0753965824842453,
0.1598633974790573,
-0.04491923376917839,
-0.018234319984912872,
-0.030534494668245316,
0.02992910146713257,
-0.12286841124296188,
0.08727570623159409,
0.10637083649635315,
0.017075706273317337,
-0.15404629707336426,
0.1040857657790184,
0.022567734122276306,
-0.08403640240430832,
0.03293527290225029,
-0.00028104265220463276,
-0.06141887977719307,
-0.13681161403656006,
-0.12356788665056229,
-0.028350893408060074,
-0.059156544506549835,
-0.11527425795793533,
-0.013421024195849895,
-0.12736965715885162,
0.05070468410849571,
0.07936841994524002,
0.06673160195350647,
0.042172662913799286,
-0.06017174571752548,
-0.09561259299516678,
0.04182331636548042,
-0.019235020503401756,
-0.04358500614762306,
-0.03202911093831062,
-0.11127009987831116,
-0.023524031043052673,
0.02229996770620346,
0.0795380026102066,
-0.04788128659129143,
-0.05519980937242508,
-0.05851202830672264,
0.04984111338853836,
-0.12376198172569275,
0.00026479116058908403,
-0.1073765978217125,
0.0003746830334421247,
0.046231988817453384,
-0.11866550147533417,
-0.04737360030412674,
0.05281816050410271,
-0.13287042081356049,
-0.024489160627126694,
-0.004360380582511425,
0.11020372062921524,
-0.15758764743804932,
-0.013264758512377739,
0.032617468386888504,
-0.0010703562293201685,
0.11671829223632812,
0.1392361968755722,
-0.1321217566728592,
0.10043837130069733,
-0.1626366227865219,
-0.15928351879119873,
0.1274590790271759,
0.039655882865190506,
0.009726326912641525,
-0.05402383953332901,
-0.018658282235264778,
0.1491529643535614,
0.04348176345229149,
0.0067198993638157845,
0.09285944700241089,
-0.09136798232793808,
-0.06043810397386551,
-0.061068739742040634,
-0.062419090420007706,
0.012022892944514751,
-0.07540429383516312,
0.14735932648181915,
0.07318072766065598,
0.15767629444599152,
-0.02903539314866066,
-0.014042360708117485,
-0.0817103311419487,
0.04753676801919937,
-0.0967460498213768,
-0.11190531402826309,
-0.13888590037822723,
-0.01712406612932682,
0.01787114515900612,
-0.047228358685970306,
0.23953519761562347,
-0.012149062938988209,
-0.09108127653598785,
0.04124186560511589,
0.04029403254389763,
-0.046677082777023315,
0.02087046392261982,
0.2836177349090576,
0.08105667680501938,
-0.001320624491199851,
-0.026096608489751816,
-0.002030407078564167,
0.017900733277201653,
0.07412723451852798,
-0.039400432258844376,
0.18081031739711761,
0.15125557780265808,
0.16123037040233612,
0.12130798399448395,
-0.04079233855009079,
-0.018175944685935974,
-0.028701571747660637,
-0.0342695415019989,
0.09401341527700424,
-0.06271035224199295,
0.10216014832258224,
0.17965178191661835,
0.012341567315161228,
0.019517146050930023,
-0.0584084689617157,
0.006166511215269566,
-0.16539542376995087,
-0.09282619506120682,
-0.06966891139745712,
-0.11401939392089844,
-0.0018072525272145867,
-0.030370060354471207,
0.056253831833601,
0.10960518568754196,
0.040101949125528336,
-0.01588345132768154,
-0.034650105983018875,
0.007264168467372656,
-0.08578988164663315,
0.06302496045827866,
-0.051879797130823135,
-0.007549778558313847,
-0.076396644115448,
-0.0051241847686469555,
0.04942398518323898,
0.017503198236227036,
-0.008082402870059013,
0.03152940794825554,
-0.08733076602220535,
0.038619205355644226,
-0.1239638552069664,
-0.06662701070308685,
-0.04676193743944168,
0.027064308524131775,
0.03252095356583595,
0.16544276475906372,
0.09301453828811646,
-0.03592591732740402,
0.08136165887117386,
0.17872585356235504,
-0.11263114213943481,
-0.15932634472846985,
-0.0669865608215332,
0.15000267326831818,
0.007232898846268654,
0.050992824137210846,
-0.04799136519432068,
-0.02804284170269966,
-0.060036443173885345,
0.2170005738735199,
0.2967696189880371,
-0.06068136543035507,
0.04607541114091873,
-0.06578423082828522,
0.017822200432419777,
-0.015297786332666874,
0.006137613207101822,
0.1484336107969284,
0.19888943433761597,
-0.03909170627593994,
-0.0005248523084446788,
-0.04453912004828453,
-0.02221318706870079,
-0.08645308017730713,
0.06782625615596771,
-0.06546512991189957,
-0.12751372158527374,
-0.005689588841050863,
0.08489672094583511,
-0.09371194988489151,
0.024492761120200157,
-0.12456309050321579,
-0.13485243916511536,
-0.06741083413362503,
0.032057568430900574,
0.12129109352827072,
0.12949351966381073,
0.017646174877882004,
-0.058875612914562225,
-0.0157656017690897,
0.06817207485437393,
-0.02155560627579689,
-0.2101009488105774,
-0.01228750217705965,
0.07321957498788834,
-0.09762775152921677,
0.07456152886152267,
0.013931624591350555,
0.1155531033873558,
0.04879867285490036,
0.12346813827753067,
-0.0824529305100441,
0.13286498188972473,
0.00755893113091588,
-0.07871504127979279,
0.04786676913499832,
-0.020996764302253723,
0.011807238683104515,
0.016751499846577644,
0.058055706322193146,
-0.0563122034072876,
0.07318748533725739,
-0.021352184936404228,
-0.05687317997217178,
-0.08230530470609665,
0.012317581102252007,
-0.055127404630184174,
0.06482180207967758,
-0.015269387513399124,
-0.06575974076986313,
-0.06031917780637741,
-0.011251391842961311,
-0.006497659720480442,
0.020124301314353943,
-0.19800183176994324,
-0.11331914365291595,
-0.06207713112235069,
-0.009986904449760914,
-0.058937788009643555,
0.031326547265052795,
-0.1088988184928894,
-0.035319097340106964,
-0.05803821235895157,
-0.021448170766234398,
-0.014937607571482658,
0.015654990449547768,
0.11381268501281738,
-0.018599359318614006,
0.01061638817191124,
-0.05142391473054886,
0.09537690877914429,
0.07583265006542206,
-0.12515899538993835,
-0.09507746249437332
] |
null | null |
transformers
|
# Wav2Vec2-Large-960h
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/)
The large model pretrained and fine-tuned on 960 hours of Librispeech on 16kHz sampled speech audio. When using the model
make sure that your speech input is also sampled at 16kHz.
[Paper](https://arxiv.org/abs/2006.11477)
Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli
**Abstract**
We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
To transcribe audio files the model can be used as a standalone acoustic model as follows:
```python
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torch
# load model and processor
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-large-960h")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-large-960h")
# load dummy dataset and read soundfiles
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
# tokenize
input_values = processor(ds[0]["audio"]["array"], return_tensors="pt", padding="longest").input_values # Batch size 1
# retrieve logits
logits = model(input_values).logits
# take argmax and decode
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
```
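For quick experiments, the same checkpoint can also be used through the `pipeline` API, which wraps the feature extraction, inference, and CTC decoding shown above. This is only a convenience sketch and not part of the original card:
```python
from datasets import load_dataset
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="facebook/wav2vec2-large-960h")

ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
sample = ds[0]["audio"]

# the pipeline accepts a dict with the raw waveform and its sampling rate
print(asr({"raw": sample["array"], "sampling_rate": sample["sampling_rate"]}))
```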
## Evaluation
This code snippet shows how to evaluate **facebook/wav2vec2-large-960h** on LibriSpeech's "clean" and "other" test data.
```python
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import torch
from jiwer import wer
librispeech_eval = load_dataset("librispeech_asr", "clean", split="test")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-large-960h").to("cuda")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-large-960h")
def map_to_pred(batch):
    input_values = processor([audio["array"] for audio in batch["audio"]], sampling_rate=16_000, return_tensors="pt", padding="longest").input_values
with torch.no_grad():
logits = model(input_values.to("cuda")).logits
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
batch["transcription"] = transcription
return batch
result = librispeech_eval.map(map_to_pred, batched=True, batch_size=1, remove_columns=["audio"])
print("WER:", wer(result["text"], result["transcription"]))
```
*Result (WER)*:
| "clean" | "other" |
|---|---|
| 2.8 | 6.3 |
|
{"language": "en", "license": "apache-2.0", "tags": ["speech"], "datasets": ["librispeech_asr"]}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-960h
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"speech",
"en",
"dataset:librispeech_asr",
"arxiv:2006.11477",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2006.11477"
] |
[
"en"
] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #speech #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
Wav2Vec2-Large-960h
===================
Facebook's Wav2Vec2
The large model pretrained and fine-tuned on 960 hours of Librispeech on 16kHz sampled speech audio. When using the model
make sure that your speech input is also sampled at 16Khz.
Paper
Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli
Abstract
We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.
The original model can be found under URL
Usage
=====
To transcribe audio files the model can be used as a standalone acoustic model as follows:
Evaluation
----------
This code snippet shows how to evaluate facebook/wav2vec2-large-960h on LibriSpeech's "clean" and "other" test data.
*Result (WER)*:
|
[] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #speech #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n"
] |
[
73
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #speech #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n"
] |
[
-0.09031634777784348,
0.14857545495033264,
-0.004531687591224909,
-0.014756518416106701,
0.04771598428487778,
-0.04473524168133736,
0.07745806872844696,
0.12226777523756027,
0.027487272396683693,
-0.005848783068358898,
0.13694196939468384,
0.16170616447925568,
0.01668962650001049,
-0.0037815850228071213,
-0.05206944793462753,
-0.16524513065814972,
0.10034625232219696,
-0.008567515760660172,
0.07399213314056396,
0.08372723311185837,
0.1110319271683693,
-0.05795639753341675,
0.037375397980213165,
0.07136240601539612,
-0.027637936174869537,
0.036146268248558044,
0.0384465828537941,
-0.15488599240779877,
0.10758469253778458,
0.04297652468085289,
0.010023706592619419,
0.03924300894141197,
0.03718795254826546,
-0.1869017332792282,
0.0017832351149991155,
0.00028398787253536284,
-0.017596188932657242,
0.04464493319392204,
0.0052396091632544994,
-0.012124894186854362,
0.056408852338790894,
0.011164148338139057,
-0.0339726023375988,
0.07317192852497101,
-0.04567161947488785,
-0.2575424611568451,
-0.06672409921884537,
0.05945422872900963,
0.027677476406097412,
0.10480888187885284,
-0.009139390662312508,
0.1452101320028305,
-0.0798720046877861,
0.06851924955844879,
0.13113053143024445,
-0.3667662441730499,
0.033357441425323486,
-0.00892574992030859,
0.035764772444963455,
-0.0132681205868721,
-0.0374474972486496,
0.04316205158829689,
0.037636127322912216,
0.012491535395383835,
-0.007609959691762924,
-0.03432047739624977,
-0.22201527655124664,
0.04798053950071335,
-0.08536040037870407,
-0.07343116402626038,
0.2521573007106781,
0.016361024230718613,
0.036125779151916504,
-0.05395912379026413,
-0.033153459429740906,
0.0428299643099308,
0.004968371707946062,
0.008207987993955612,
0.013556436635553837,
0.08407101035118103,
0.029983488842844963,
-0.04440164193511009,
-0.14357611536979675,
-0.04880291968584061,
-0.18770234286785126,
0.105166956782341,
-0.005526596214622259,
0.07959524542093277,
-0.11829142272472382,
0.010031497105956078,
0.03448423370718956,
-0.09199652820825577,
-0.03861376643180847,
-0.01334307249635458,
0.06721137464046478,
0.08468401432037354,
-0.08854351937770844,
0.04080674424767494,
0.13921408355236053,
0.08974366635084152,
0.06225164979696274,
-0.0008574665989726782,
-0.035597048699855804,
0.10176469385623932,
-0.04007560387253761,
0.13967622816562653,
-0.05434228852391243,
-0.013072091154754162,
0.05585892125964165,
-0.04918678477406502,
0.07918502390384674,
-0.023748749867081642,
-0.10090292990207672,
-0.07290409505367279,
-0.016897710040211678,
0.08729512989521027,
0.08429725468158722,
0.025975938886404037,
-0.006032274104654789,
0.04509413614869118,
0.07761162519454956,
-0.11154957860708237,
0.01201039832085371,
0.04062292352318764,
0.02097341977059841,
0.0690983384847641,
0.0818072259426117,
0.05415695905685425,
-0.08916867524385452,
-0.014617270790040493,
-0.007996447384357452,
0.01843307539820671,
0.03779559209942818,
-0.03865467384457588,
0.062129028141498566,
-0.07480764389038086,
0.00919124111533165,
-0.17650608718395233,
-0.05741318687796593,
-0.0024881029967218637,
-0.008822109550237656,
0.0021871598437428474,
-0.06014712527394295,
-0.01621047966182232,
-0.04824742302298546,
0.0642128586769104,
-0.10911151021718979,
0.02404651790857315,
-0.07317544519901276,
0.0489642359316349,
-0.009573391638696194,
0.09379766881465912,
-0.16694536805152893,
0.060116056352853775,
-0.09560445696115494,
-0.04012765362858772,
0.10338306427001953,
0.04247148334980011,
-0.08670956641435623,
0.09033659100532532,
-0.11894451826810837,
-0.022790227085351944,
-0.07794207334518433,
0.013580424711108208,
0.00785969290882349,
0.09565908461809158,
-0.15434548258781433,
-0.052931301295757294,
0.14288654923439026,
-0.08945775032043457,
-0.1356806606054306,
0.09540395438671112,
0.048929065465927124,
-0.013922711834311485,
0.054702676832675934,
0.2734815776348114,
-0.03982324153184891,
-0.13131451606750488,
-0.0411432646214962,
0.12747783958911896,
-0.06899119913578033,
-0.15394113957881927,
0.09324654936790466,
-0.11252683401107788,
0.01875506527721882,
0.031373538076877594,
-0.024209870025515556,
0.07321290671825409,
0.018701594322919846,
-0.10436542332172394,
-0.07025767862796783,
-0.09497580677270889,
-0.005976270418614149,
-0.039772920310497284,
0.029099857434630394,
-0.04270362854003906,
-0.016358567401766777,
-0.06688953936100006,
0.06466682255268097,
0.0006668801652267575,
0.0800933688879013,
-0.10950131714344025,
0.06901723891496658,
-0.041550129652023315,
0.02295033633708954,
-0.12715423107147217,
0.14129330217838287,
-0.05108470469713211,
0.013307133689522743,
0.07719806581735611,
0.042289186269044876,
0.06111166253685951,
-0.10399843007326126,
0.01725769229233265,
0.0036079357378184795,
0.11496564745903015,
0.07701446115970612,
-0.012337502092123032,
-0.1231144368648529,
0.022841911762952805,
-0.060305435210466385,
0.017115550115704536,
-0.00009453925304114819,
-0.023441661149263382,
0.05459072068333626,
0.07794006168842316,
-0.04855845123529434,
0.046810973435640335,
0.012273190543055534,
-0.021160855889320374,
-0.017344770953059196,
0.0069001587107777596,
0.07076966762542725,
0.023417199030518532,
-0.04632067307829857,
0.2656203806400299,
-0.09666606038808823,
0.15826618671417236,
0.23124238848686218,
-0.12876132130622864,
0.10508950799703598,
0.08470456302165985,
-0.0059108007699251175,
-0.0062645780853927135,
0.05953311175107956,
-0.03447931632399559,
0.1346752643585205,
-0.03813225403428078,
0.11799780279397964,
-0.06012864410877228,
0.012152092531323433,
-0.0034023430198431015,
-0.056346409022808075,
-0.03861306607723236,
0.06157978996634483,
0.044377949088811874,
-0.0983240157365799,
0.10785607993602753,
0.2587341070175171,
-0.09312135726213455,
0.06788387894630432,
-0.048041921108961105,
-0.038352224975824356,
0.04849463328719139,
-0.02948988787829876,
-0.04558461904525757,
0.05441654473543167,
-0.21447545289993286,
-0.05496220290660858,
0.1056574285030365,
-0.0015202172799035907,
0.07343195378780365,
-0.14697541296482086,
-0.0033763344399631023,
-0.006574939005076885,
-0.06068513169884682,
-0.11221998184919357,
0.06800282746553421,
-0.01721225120127201,
0.08757282793521881,
-0.04755191504955292,
-0.15728338062763214,
0.05582689493894577,
-0.03336501121520996,
-0.09157612919807434,
0.07864554971456528,
-0.12659266591072083,
-0.20887288451194763,
-0.10193093121051788,
-0.07301755994558334,
-0.02909296192228794,
0.030135471373796463,
0.13551995158195496,
-0.0999024361371994,
-0.03767961263656616,
-0.0331827774643898,
-0.047445688396692276,
-0.050721678882837296,
0.026091234758496284,
0.04867665842175484,
0.021421611309051514,
0.04620003700256348,
-0.16031701862812042,
-0.021802036091685295,
-0.03924982249736786,
0.008062650449573994,
0.02445640228688717,
0.036735594272613525,
0.08501585572957993,
0.13026124238967896,
0.04332128167152405,
0.030903540551662445,
-0.016611343249678612,
0.14199616014957428,
-0.020975645631551743,
-0.06626064330339432,
0.1826094537973404,
-0.014187644235789776,
0.014344710856676102,
0.1512245088815689,
0.013598565943539143,
-0.03895338624715805,
-0.028481511399149895,
-0.07248055189847946,
-0.06304837763309479,
-0.22919347882270813,
-0.12453297525644302,
-0.10304421186447144,
-0.034683551639318466,
0.02663177065551281,
0.0738879069685936,
0.040746912360191345,
-0.0032008313573896885,
0.0035165115259587765,
-0.06476723402738571,
-0.014764867722988129,
0.021955201402306557,
0.284697949886322,
-0.055672068148851395,
0.11750483512878418,
-0.11742556095123291,
-0.04966988414525986,
0.06744375824928284,
0.09004393219947815,
0.09221484512090683,
0.12852948904037476,
0.07097127288579941,
0.043249763548374176,
0.20058904588222504,
0.06378302723169327,
0.0916951447725296,
0.04697131738066673,
0.008230539970099926,
-0.009014013223350048,
-0.060811325907707214,
-0.038661010563373566,
0.11299067735671997,
0.147914856672287,
-0.09800677746534348,
0.009928733110427856,
-0.1461716592311859,
0.05610281601548195,
0.19981645047664642,
0.1014191135764122,
-0.1671595722436905,
0.020421918481588364,
0.03968507796525955,
-0.057966992259025574,
-0.005108271725475788,
0.1020175889134407,
-0.0036093059461563826,
-0.03313459828495979,
0.08554141968488693,
0.043884627521038055,
0.06717097014188766,
0.010493628680706024,
0.04827994108200073,
-0.11075033247470856,
-0.12836138904094696,
0.07533733546733856,
0.08833696693181992,
-0.24564218521118164,
0.2382490485906601,
-0.007249655667692423,
0.01416007149964571,
-0.052550818771123886,
0.017876189202070236,
0.05964424088597298,
0.04082052409648895,
0.14463330805301666,
0.0076553234830498695,
-0.0919092521071434,
0.010077611543238163,
-0.07333758473396301,
0.08147985488176346,
0.041985124349594116,
0.09165751934051514,
-0.03992098569869995,
-0.026768449693918228,
-0.022054072469472885,
0.04779413715004921,
0.06560839712619781,
-0.12312762439250946,
-0.1271212249994278,
0.027956116944551468,
0.29066723585128784,
0.040985945612192154,
-0.0112947141751647,
-0.03935139253735542,
-0.1847560852766037,
0.11819738149642944,
-0.1401299089193344,
-0.0016835718415677547,
-0.04928947985172272,
-0.15599152445793152,
0.11424199491739273,
-0.04070957750082016,
0.041906777769327164,
-0.03686719760298729,
-0.028663616627454758,
-0.04691332206130028,
-0.13659366965293884,
0.11215750873088837,
-0.12037701159715652,
-0.03536057099699974,
-0.008385518565773964,
0.15144231915473938,
-0.07978580892086029,
0.052533142268657684,
0.06961449235677719,
0.06927289068698883,
-0.13383929431438446,
-0.08067211508750916,
0.09572027623653412,
0.09494119882583618,
-0.025134222581982613,
-0.005056946538388729,
-0.002374023664742708,
-0.21369560062885284,
-0.01735031232237816,
-0.019232243299484253,
0.2734804153442383,
0.12833747267723083,
-0.10746797174215317,
0.17885766923427582,
0.22079980373382568,
-0.05025222897529602,
-0.28079918026924133,
-0.1854899525642395,
-0.10340411961078644,
-0.02675672248005867,
-0.04516066983342171,
-0.11114074289798737,
0.09261789172887802,
-0.03189542889595032,
-0.13236238062381744,
0.01881435327231884,
-0.16951288282871246,
-0.10204093158245087,
0.2648472785949707,
-0.11500945687294006,
0.2787877917289734,
-0.11721283942461014,
-0.08693823963403702,
-0.08055523782968521,
-0.14450092613697052,
0.10521857440471649,
-0.10973028838634491,
0.0745033547282219,
0.005141061265021563,
0.03682814538478851,
0.002749866107478738,
-0.04830733686685562,
0.145588681101799,
0.045172255486249924,
-0.05368414893746376,
-0.06838606297969818,
-0.01769942417740822,
0.03944951295852661,
-0.008601740933954716,
0.13525079190731049,
-0.12587562203407288,
0.04390562325716019,
-0.09384477883577347,
-0.01628955267369747,
-0.1218361034989357,
0.11082632839679718,
0.06760382652282715,
-0.00737131480127573,
-0.02580111287534237,
-0.10374780744314194,
0.03224140405654907,
0.026513049378991127,
0.17420002818107605,
-0.05255154147744179,
0.008959989063441753,
0.22852444648742676,
0.10125966370105743,
-0.15347979962825775,
-0.004175141453742981,
-0.04716639593243599,
-0.09016860276460648,
0.11172474175691605,
-0.12103795260190964,
0.0527208112180233,
0.06230293586850166,
0.020259784534573555,
0.04088180139660835,
0.04971466585993767,
-0.03189045563340187,
0.00306355906650424,
0.11019132286310196,
-0.09731577336788177,
-0.06253574043512344,
0.029684636741876602,
0.08228877931833267,
0.04060540720820427,
0.08608624339103699,
0.15373271703720093,
-0.03490900248289108,
-0.005439590662717819,
-0.026255987584590912,
0.028125837445259094,
-0.1373659372329712,
0.09124615788459778,
0.09417863935232162,
0.009823260828852654,
-0.15280896425247192,
0.11217225342988968,
0.011181692592799664,
-0.11018233001232147,
0.038885448127985,
-0.003826528787612915,
-0.07213433086872101,
-0.13087522983551025,
-0.12193501740694046,
-0.0593748465180397,
-0.06070772930979729,
-0.13411730527877808,
0.00662463903427124,
-0.11666929721832275,
0.05863112211227417,
0.08408375829458237,
0.07098647207021713,
0.06469471007585526,
-0.05307772755622864,
-0.08551435172557831,
0.042709723114967346,
-0.02528214640915394,
-0.07110203057527542,
-0.02269287221133709,
-0.10089746862649918,
-0.01560609694570303,
0.015102538280189037,
0.07217737287282944,
-0.04565655067563057,
-0.03939120098948479,
-0.03809356689453125,
0.052875369787216187,
-0.12235283851623535,
-0.0020527089945971966,
-0.08989664167165756,
0.0016082633519545197,
0.0459260456264019,
-0.11813150346279144,
-0.04554387554526329,
0.060538459569215775,
-0.11814676970243454,
-0.03522698953747749,
-0.016468502581119537,
0.09158715605735779,
-0.17537501454353333,
-0.019461622461676598,
0.03624837473034859,
0.000774183077737689,
0.11767879128456116,
0.1553523987531662,
-0.12706398963928223,
0.09292513132095337,
-0.17257040739059448,
-0.1743219792842865,
0.1442112922668457,
0.04538218677043915,
-0.0005807451088912785,
-0.030577214434742928,
-0.024853529408574104,
0.16433611512184143,
0.039232317358255386,
0.013794571161270142,
0.12254021316766739,
-0.09261889010667801,
-0.04256097227334976,
-0.07869752496480942,
-0.059852443635463715,
0.007766589522361755,
-0.09170243889093399,
0.15142223238945007,
0.07714954763650894,
0.15721853077411652,
-0.023751523345708847,
-0.025826888158917427,
-0.05604732036590576,
0.045060865581035614,
-0.0859953761100769,
-0.1013469323515892,
-0.1466383934020996,
-0.008892239071428776,
0.012838403694331646,
-0.03833816945552826,
0.2379293590784073,
-0.02379567176103592,
-0.11038142442703247,
0.050109513103961945,
0.04876043274998665,
-0.06889704614877701,
0.016406137496232986,
0.292982816696167,
0.0794212818145752,
-0.009678169153630733,
-0.03350931033492088,
-0.01741098426282406,
0.015201037749648094,
0.052289459854364395,
-0.026324449107050896,
0.16509470343589783,
0.14471490681171417,
0.15921436250209808,
0.11302463710308075,
-0.05704193562269211,
-0.06785415858030319,
-0.04457812383770943,
-0.05044052377343178,
0.08916351944208145,
-0.06391620635986328,
0.11075230687856674,
0.16017110645771027,
0.012092138640582561,
0.0009444564930163324,
-0.05779986083507538,
0.009843805804848671,
-0.16216252744197845,
-0.0789385437965393,
-0.059491392225027084,
-0.10767177492380142,
0.007705820724368095,
-0.016728658229112625,
0.06885350495576859,
0.11171542853116989,
0.030702751129865646,
-0.008143343031406403,
-0.004016263876110315,
-0.018346471711993217,
-0.07485835999250412,
0.051208123564720154,
-0.052898067981004715,
-0.0015294435434043407,
-0.06552533060312271,
-0.01586708053946495,
0.05051576718688011,
0.004195530898869038,
-0.0008958101971074939,
0.04922081530094147,
-0.07673953473567963,
0.0456746369600296,
-0.11884327977895737,
-0.052676647901535034,
-0.05652282014489174,
0.040068119764328,
0.04545960947871208,
0.17223738133907318,
0.09184618294239044,
-0.02925807796418667,
0.08191955834627151,
0.1565815806388855,
-0.12317577004432678,
-0.17359189689159393,
-0.05184962972998619,
0.1206590086221695,
0.012817362323403358,
0.0718122348189354,
-0.05257472023367882,
-0.02366875484585762,
-0.06682063639163971,
0.2242550253868103,
0.2639330327510834,
-0.04590554162859917,
0.049554675817489624,
-0.05848689377307892,
0.02886630967259407,
-0.006937581114470959,
0.005972187500447035,
0.17364677786827087,
0.21751976013183594,
-0.04195050895214081,
-0.038255177438259125,
-0.055902399122714996,
-0.009789410047233105,
-0.1086556538939476,
0.06426124274730682,
-0.09155557304620743,
-0.13686080276966095,
-0.013826608657836914,
0.07070416957139969,
-0.07935269176959991,
0.03732909634709358,
-0.0955512523651123,
-0.1334538757801056,
-0.044706691056489944,
0.050499606877565384,
0.1495683789253235,
0.11940927803516388,
0.017551781609654427,
-0.058230213820934296,
-0.02962612174451351,
0.07016672939062119,
-0.023418080061674118,
-0.21781007945537567,
0.014251860789954662,
0.04395117983222008,
-0.11489909142255783,
0.08001494407653809,
0.013218727894127369,
0.1300700157880783,
0.047269754111766815,
0.13434304296970367,
-0.08569158613681793,
0.13364233076572418,
0.005536850541830063,
-0.09641194343566895,
0.04351728782057762,
-0.02944675274193287,
-0.003640119219198823,
0.014461977407336235,
0.05217735469341278,
-0.040403980761766434,
0.07420974224805832,
0.03393707051873207,
-0.035409554839134216,
-0.08054004609584808,
-0.0016557659255340695,
-0.05379807949066162,
0.060413435101509094,
-0.02891155704855919,
-0.05924079939723015,
-0.04757669195532799,
-0.007095259614288807,
-0.01762210950255394,
0.012537525966763496,
-0.19422699511051178,
-0.10400482267141342,
-0.056902896612882614,
-0.010407010093331337,
-0.04848446696996689,
0.036487922072410583,
-0.08762049674987793,
-0.036695465445518494,
-0.05874020606279373,
-0.011583789251744747,
-0.03600155934691429,
0.0024736335035413504,
0.09040094912052155,
-0.033406082540750504,
0.011273873038589954,
-0.06229902803897858,
0.10041220486164093,
0.06949527561664581,
-0.11580048501491547,
-0.1048194095492363
] |
null | null |
transformers
|
# Wav2Vec2-large-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) large model pretrained only on **baltic** speech, using **27.5k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **baltic**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
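Since the checkpoint has no tokenizer, it cannot transcribe speech out of the box, but it can still be loaded as a feature encoder to extract contextual speech representations (for example as a starting point for fine-tuning). The sketch below assumes the repository ships a feature-extractor config; if it does not, a `Wav2Vec2FeatureExtractor(sampling_rate=16000)` can be instantiated directly.
```python
import numpy as np
import torch
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

# assumes a preprocessor config is available in the repository
feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained("facebook/wav2vec2-large-baltic-voxpopuli-v2")
model = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-large-baltic-voxpopuli-v2")

# dummy 1-second clip at 16 kHz; replace with real 16 kHz audio
speech = np.zeros(16_000, dtype=np.float32)
inputs = feature_extractor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    hidden_states = model(inputs.input_values).last_hidden_state  # shape: (batch, frames, hidden_size)
```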
|
{"language": "baltic", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-baltic-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"baltic"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-large-VoxPopuli-V2
Facebook's Wav2Vec2 large model pretrained only in baltic on 27.5 unlabeled data of the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in baltic. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in baltic on 27.5 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in baltic. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in baltic on 27.5 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in baltic. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
70,
253
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in baltic on 27.5 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in baltic. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07390192896127701,
0.10022685676813126,
-0.0028725548181682825,
0.003917866852134466,
0.07107856124639511,
-0.047084990888834,
0.1438121795654297,
0.042899101972579956,
0.014824462123215199,
0.0988798588514328,
-0.010813485831022263,
-0.043453067541122437,
0.07308393716812134,
0.1408415287733078,
0.06249137967824936,
-0.2665056884288788,
0.04020357131958008,
-0.060418080538511276,
0.059174180030822754,
0.04749320074915886,
0.12441141903400421,
-0.07987562566995621,
0.03247574344277382,
0.0567401684820652,
-0.038370754569768906,
0.028180260211229324,
-0.050801828503608704,
-0.08265311270952225,
0.0547078438103199,
0.046645719558000565,
-0.0346444696187973,
0.024609306827187538,
0.08331460505723953,
-0.1825844943523407,
0.035933446139097214,
0.03508583828806877,
0.02586505375802517,
0.014818098396062851,
0.0971083715558052,
0.02154652774333954,
0.17780545353889465,
-0.031214343383908272,
0.0003583183279260993,
0.07578925043344498,
-0.054229527711868286,
-0.08728727698326111,
-0.063182033598423,
0.16251802444458008,
0.09895050525665283,
0.10041805356740952,
-0.07732363045215607,
0.08891002088785172,
-0.02412322908639908,
0.04493458941578865,
0.06493653357028961,
-0.1853475719690323,
-0.054019927978515625,
0.06294363737106323,
0.11192096769809723,
0.02598109468817711,
-0.08742860704660416,
0.07085166126489639,
0.0483333021402359,
-0.016265060752630234,
-0.06918153166770935,
-0.035727616399526596,
0.13475343585014343,
-0.11044522374868393,
-0.1174650490283966,
0.004067992325872183,
0.1675107330083847,
0.05746285617351532,
-0.07699498534202576,
-0.14918707311153412,
0.01155211590230465,
0.22338701784610748,
-0.053961947560310364,
-0.09362814575433731,
0.011673960834741592,
0.01590130291879177,
0.05849963054060936,
-0.06938581168651581,
-0.07154656946659088,
-0.004399327095597982,
0.01563316397368908,
0.10568175464868546,
0.020284883677959442,
-0.015352671965956688,
-0.06846465170383453,
-0.0065057841129601,
-0.09737063199281693,
-0.11678352952003479,
-0.006536383181810379,
-0.06740068644285202,
-0.0686715692281723,
-0.038489725440740585,
-0.003188100177794695,
-0.10123353451490402,
0.021721158176660538,
0.0998508557677269,
0.07386115938425064,
0.051314644515514374,
-0.04668894782662392,
-0.033687785267829895,
0.12646973133087158,
0.07091359794139862,
-0.12253418564796448,
-0.014728767797350883,
0.015660081058740616,
-0.01683332584798336,
0.015237854793667793,
-0.036136120557785034,
-0.04438871145248413,
0.003066020319238305,
-0.026259396225214005,
0.049572937190532684,
0.0567346028983593,
-0.03239011391997337,
-0.035264380276203156,
-0.09770160913467407,
0.10902974009513855,
-0.07496107369661331,
0.024676885455846786,
0.044141001999378204,
-0.009189185686409473,
0.09774862974882126,
-0.06279990077018738,
0.07739495486021042,
-0.10790273547172546,
0.005496257450431585,
-0.0335720032453537,
-0.00436827028170228,
0.024766478687524796,
-0.031480640172958374,
0.037524767220020294,
-0.008490280248224735,
0.006403535138815641,
-0.11424845457077026,
-0.009944701567292213,
-0.0972977876663208,
-0.02335648611187935,
-0.08515431731939316,
-0.04298807308077812,
-0.041147951036691666,
0.016198959201574326,
-0.007400610484182835,
-0.006035332102328539,
0.022860383614897728,
-0.017579874023795128,
-0.008449104614555836,
-0.0004065769608132541,
0.05245309695601463,
0.04655546322464943,
0.07701805233955383,
-0.022465521469712257,
-0.01733241230249405,
-0.09974956512451172,
0.11664837598800659,
-0.07421301305294037,
-0.015923596918582916,
-0.140951007604599,
-0.03783532977104187,
-0.03209373354911804,
0.03236299753189087,
0.011113081127405167,
0.12448567897081375,
-0.18338024616241455,
-0.07151948660612106,
0.11198707669973373,
-0.12367134541273117,
0.011903911828994751,
0.17959657311439514,
-0.000005969168341835029,
0.08023141324520111,
0.10536142438650131,
0.22922244668006897,
0.020398646593093872,
-0.17411407828330994,
-0.01218748651444912,
-0.04647481441497803,
0.038008712232112885,
0.12054595351219177,
0.06408968567848206,
-0.05975988134741783,
0.06423228979110718,
-0.01685926504433155,
-0.024134604260325432,
-0.07141949236392975,
-0.003579463344067335,
-0.044111885130405426,
0.020584795624017715,
-0.04952036961913109,
0.020591404289007187,
-0.007253056392073631,
-0.014328165911138058,
-0.012553388252854347,
-0.08783986419439316,
-0.0593264177441597,
0.11965585500001907,
-0.05768497660756111,
0.025164950639009476,
-0.09758125245571136,
0.06960874050855637,
0.06233461946249008,
0.004940371960401535,
-0.12193416059017181,
0.11641796678304672,
0.03682573512196541,
-0.05741160362958908,
0.140068918466568,
0.07563084363937378,
-0.034149885177612305,
0.010325111448764801,
-0.014909051358699799,
0.02424641139805317,
-0.019890472292900085,
0.012762508355081081,
-0.03265397623181343,
-0.10470845550298691,
-0.009344576857984066,
-0.06931410729885101,
0.11023524403572083,
-0.1324775665998459,
-0.016424644738435745,
0.04039892554283142,
0.11002042144536972,
-0.017335552722215652,
-0.038940493017435074,
0.09300227463245392,
0.0436650849878788,
0.03244892135262489,
-0.0185621976852417,
0.024541199207305908,
-0.018312819302082062,
-0.004747713916003704,
0.05333463102579117,
-0.14881442487239838,
-0.1646169126033783,
0.09717007726430893,
0.02460848167538643,
-0.015973033383488655,
0.05706596374511719,
0.02327325940132141,
-0.01800622045993805,
-0.034734006971120834,
0.002429274143651128,
0.22189311683177948,
-0.011907740496098995,
0.06347300112247467,
-0.0843733549118042,
-0.012279868125915527,
0.012894880026578903,
-0.056400734931230545,
-0.09038142114877701,
0.08018677681684494,
-0.006231373641639948,
-0.08773277699947357,
-0.039095439016819,
0.04901053011417389,
0.06843960285186768,
0.1624116152524948,
0.008516230620443821,
-0.08857116848230362,
-0.03273172676563263,
-0.06104767322540283,
-0.013395179994404316,
0.04182228073477745,
-0.14096571505069733,
-0.026740048080682755,
0.026880228891968727,
0.00474745687097311,
0.04775211960077286,
-0.0208174679428339,
0.04174208641052246,
0.008297517895698547,
-0.04931985214352608,
-0.08053044229745865,
0.04521072283387184,
-0.03295430168509483,
0.04081388935446739,
-0.009831302799284458,
0.007112535182386637,
-0.04826197028160095,
-0.05878831073641777,
-0.14130324125289917,
0.09099531173706055,
-0.06532051414251328,
-0.3031482398509979,
-0.08954867720603943,
-0.05676649883389473,
-0.030621744692325592,
0.014059973880648613,
0.04402616620063782,
-0.1123470813035965,
-0.11043781042098999,
-0.06873941421508789,
0.12903441488742828,
-0.021955635398626328,
-0.06148089841008186,
0.11247999966144562,
-0.008806895464658737,
0.02454737015068531,
-0.09758459031581879,
0.015303888358175755,
-0.045987389981746674,
-0.031470250338315964,
-0.028921687975525856,
0.008998221717774868,
0.05670975521206856,
0.12074808031320572,
0.021524537354707718,
-0.006335735321044922,
0.007034280803054571,
0.2237161099910736,
-0.12845954298973083,
0.08215459436178207,
0.24015243351459503,
-0.05283643677830696,
-0.007888365536928177,
0.14113789796829224,
-0.009834562428295612,
-0.05393916741013527,
0.03998343273997307,
-0.00019863872148562223,
-0.020457569509744644,
-0.22142186760902405,
-0.1293344348669052,
-0.04291025176644325,
-0.02611192688345909,
0.04341816529631615,
0.014200262725353241,
-0.0014814682072028518,
0.02123260125517845,
-0.08582334220409393,
-0.03809192031621933,
0.055231478065252304,
0.03536215052008629,
0.1517796814441681,
0.007606341503560543,
0.05430831015110016,
-0.04572691768407822,
-0.018867962062358856,
0.10317344218492508,
-0.031271930783987045,
0.039818186312913895,
0.07132861018180847,
0.10407961905002594,
0.05924930050969124,
0.03659672290086746,
0.05959432199597359,
-0.013858241029083729,
-0.026015950366854668,
-0.005487038288265467,
-0.03311247006058693,
-0.0631461888551712,
0.013940650969743729,
0.048462316393852234,
0.15261968970298767,
-0.13363231718540192,
-0.12000314146280289,
0.024198397994041443,
0.015440994873642921,
0.11021412909030914,
0.09811601042747498,
-0.03274831920862198,
-0.098392054438591,
0.03720833733677864,
-0.09369370341300964,
-0.03418201953172684,
0.04251449182629585,
0.08574219793081284,
-0.15766914188861847,
0.09592229872941971,
0.07663454860448837,
0.08344700187444687,
-0.03760023042559624,
0.03364523872733116,
-0.06320220232009888,
0.057467855513095856,
0.0048667555674910545,
0.07295849174261093,
-0.16872195899486542,
0.11354947090148926,
0.016857236623764038,
0.0844825953245163,
-0.0581744946539402,
0.02551368437707424,
0.044467296451330185,
0.005037768743932247,
0.1264551877975464,
-0.007704906165599823,
-0.0670810341835022,
-0.0030261948704719543,
-0.11868846416473389,
0.014955713413655758,
0.053644489496946335,
-0.05815574899315834,
0.05491835996508598,
-0.0021777672227472067,
-0.005915546789765358,
-0.03417355194687843,
-0.0021203835494816303,
-0.24356479942798615,
-0.1403375267982483,
0.04894418269395828,
-0.010948405601084232,
0.03916684538125992,
-0.03949955105781555,
-0.07858793437480927,
-0.11663585156202316,
0.09764347970485687,
0.011260295286774635,
-0.022825133055448532,
-0.07225793600082397,
0.02779829315841198,
0.11192803829908371,
-0.06616658717393875,
0.017486808821558952,
0.039861761033535004,
0.14048808813095093,
-0.06834761053323746,
-0.04054373875260353,
0.01883363351225853,
-0.09845220297574997,
-0.11831819266080856,
0.011040118522942066,
0.17520283162593842,
0.1065172329545021,
0.06129308044910431,
0.09515208750963211,
0.019697073847055435,
-0.009334662929177284,
-0.09737538546323776,
0.019711660221219063,
0.021952155977487564,
-0.0797150731086731,
0.04570761322975159,
-0.006535982247442007,
-0.26904329657554626,
-0.14152328670024872,
-0.06450646370649338,
0.07669748365879059,
0.20159104466438293,
-0.02456505224108696,
0.16967856884002686,
0.2829643487930298,
-0.08781170845031738,
-0.22178979218006134,
-0.04380153492093086,
0.004208676517009735,
0.029456928372383118,
0.04466197267174721,
-0.20058193802833557,
0.09790030866861343,
-0.000711211352609098,
0.013729271478950977,
-0.07399071007966995,
-0.21431799232959747,
-0.13659733533859253,
0.1711796075105667,
-0.025199539959430695,
0.0592569001019001,
-0.028621865436434746,
-0.07025764882564545,
-0.0357128269970417,
-0.04038254916667938,
0.01874311827123165,
-0.078504279255867,
0.07830927520990372,
0.04862424358725548,
0.020104706287384033,
0.02547854743897915,
0.014989667572081089,
0.1193942278623581,
0.08817265927791595,
-0.027056772261857986,
-0.08021309226751328,
0.013677967712283134,
0.01021396741271019,
-0.012609641067683697,
0.10618129372596741,
0.046329084783792496,
0.016390612348914146,
-0.04284762591123581,
-0.08612662553787231,
-0.07188903540372849,
0.06086499243974686,
-0.07109465450048447,
-0.015283244661986828,
-0.056498050689697266,
0.09124867618083954,
0.019216835498809814,
-0.0009368384489789605,
-0.0854717344045639,
-0.0909884050488472,
-0.02237439528107643,
0.11676876991987228,
0.21538902819156647,
-0.06400932371616364,
0.0017035751370713115,
-0.03980877622961998,
-0.04470663517713547,
0.047591861337423325,
-0.0012707207351922989,
0.03563433513045311,
0.051271647214889526,
0.02205316163599491,
0.08447336405515671,
-0.03586495667695999,
-0.13059309124946594,
0.02690209075808525,
0.03916315361857414,
-0.0643000528216362,
-0.17986541986465454,
-0.04028432443737984,
-0.006314459722489119,
-0.021414952352643013,
-0.04336593300104141,
0.19161608815193176,
-0.01313282735645771,
-0.05685940384864807,
0.0047355229035019875,
0.05615660175681114,
-0.0074557107873260975,
0.12232597917318344,
0.04591619595885277,
0.03664364293217659,
-0.09092724323272705,
0.04866452142596245,
0.11670660227537155,
-0.030386770144104958,
0.04686843231320381,
0.09180644154548645,
-0.05040830373764038,
-0.05403931066393852,
-0.10906681418418884,
-0.0020027728751301765,
0.043511807918548584,
-0.05219119414687157,
0.0006478959112428129,
-0.10995377600193024,
0.004940119106322527,
0.022827113047242165,
0.015323258005082607,
-0.04460231587290764,
-0.05739230290055275,
-0.0012553015258163214,
-0.09377239644527435,
0.063229039311409,
0.10436354577541351,
-0.027845706790685654,
-0.10573899745941162,
0.11440396308898926,
0.012898792512714863,
0.07655227929353714,
-0.03871393948793411,
-0.060022078454494476,
-0.09117454290390015,
-0.007983770221471786,
-0.10433945804834366,
0.03346286341547966,
-0.14092500507831573,
-0.013800574466586113,
-0.04838133975863457,
-0.035369016230106354,
-0.007200291380286217,
0.06538711488246918,
-0.03379353880882263,
0.003918851725757122,
-0.03180120512843132,
0.09097511321306229,
-0.1232132762670517,
0.06802207231521606,
0.05904870107769966,
-0.04470273107290268,
0.11044111102819443,
0.020802056416869164,
-0.047132816165685654,
0.03435128927230835,
-0.21958349645137787,
-0.056867145001888275,
-0.027323979884386063,
0.045050911605358124,
-0.007663966156542301,
-0.17485564947128296,
0.0004712208465207368,
0.020980603992938995,
0.013771670870482922,
-0.01573248580098152,
0.06315083801746368,
-0.02725912444293499,
-0.01852750964462757,
-0.06722694635391235,
-0.06269577890634537,
-0.038156114518642426,
0.06645616143941879,
0.06534165143966675,
0.007617991417646408,
0.10067114233970642,
-0.08713265508413315,
0.0737575963139534,
-0.08220163732767105,
0.024084731936454773,
-0.03334106504917145,
0.020399203523993492,
-0.07384227961301804,
-0.07738904654979706,
0.07881896942853928,
-0.014613303355872631,
0.08354143798351288,
0.026204677298665047,
-0.03977440297603607,
0.042074963450431824,
-0.036325693130493164,
-0.04653618112206459,
0.036595821380615234,
0.14808212220668793,
0.05480799078941345,
0.02380516566336155,
-0.0014079294633120298,
-0.04086370766162872,
0.004338731989264488,
0.1462078094482422,
0.13891829550266266,
0.16348984837532043,
0.09713950753211975,
0.03523970767855644,
0.07206833362579346,
-0.04071445018053055,
-0.08002518862485886,
0.07960450649261475,
-0.06855025142431259,
0.034452956169843674,
-0.04633210971951485,
-0.06821512430906296,
0.068169005215168,
-0.1461372673511505,
0.07044648379087448,
-0.016840949654579163,
-0.08526016026735306,
-0.10697629302740097,
-0.14405271410942078,
-0.06351388990879059,
-0.03616754338145256,
0.0027288864366710186,
-0.10952995717525482,
0.02865929901599884,
0.009161625057458878,
0.02922923117876053,
-0.08709681034088135,
0.11561858654022217,
-0.1281355768442154,
-0.12606388330459595,
0.15250249207019806,
-0.03179101645946503,
-0.014153781346976757,
-0.001754437224008143,
0.04213158041238785,
0.01655063033103943,
0.09496579319238663,
0.045028943568468094,
0.0472194142639637,
0.02291201613843441,
0.02895171567797661,
-0.09850472211837769,
-0.06655874848365784,
0.027386890724301338,
-0.013632574118673801,
0.10150459408760071,
0.19122342765331268,
0.08698831498622894,
-0.08383461833000183,
0.009343062527477741,
0.13449496030807495,
0.0323493666946888,
-0.11826328188180923,
-0.14177349209785461,
0.0413612462580204,
-0.0396890789270401,
-0.0023669409565627575,
0.0029622118454426527,
-0.08706777542829514,
0.017907358705997467,
0.20583651959896088,
0.1792653650045395,
-0.04588276520371437,
0.01698748581111431,
-0.011373909190297127,
0.00828041136264801,
0.025032280012965202,
0.08733083307743073,
0.08298482745885849,
0.18779610097408295,
-0.011826573871076107,
0.05953732132911682,
-0.020633593201637268,
-0.09381845593452454,
-0.1200147494673729,
0.1001945286989212,
0.008616437204182148,
-0.037157244980335236,
-0.004614102188497782,
0.18234141170978546,
-0.10342320799827576,
-0.21102361381053925,
-0.11742304265499115,
-0.05404168367385864,
-0.11591333150863647,
0.023984044790267944,
-0.047333456575870514,
0.13504989445209503,
0.05428232625126839,
-0.007362947333604097,
0.010566155426204205,
0.1665426641702652,
0.03328883647918701,
0.026751013472676277,
-0.03505732864141464,
0.11854390054941177,
-0.08872769773006439,
0.1161656305193901,
-0.0032762116752564907,
0.05318369343876839,
0.034941479563713074,
0.03625469654798508,
-0.06896606832742691,
0.030539274215698242,
0.035171449184417725,
0.0005988819757476449,
0.05254661664366722,
0.16340163350105286,
-0.004209036473184824,
0.09564897418022156,
0.10594566911458969,
-0.06375103443861008,
0.0282818041741848,
-0.018531767651438713,
0.0006631161668337882,
-0.05802319198846817,
0.15562359988689423,
-0.1528501659631729,
0.1310293972492218,
0.10704007744789124,
-0.0731310248374939,
-0.04987644404172897,
-0.008661150932312012,
0.049052391201257706,
-0.05034370347857475,
0.08608592301607132,
-0.0091820377856493,
-0.16857708990573883,
0.02633655071258545,
-0.12247516959905624,
0.07351148873567581,
-0.2517169117927551,
-0.04486644268035889,
-0.059640079736709595,
-0.014049331657588482,
0.0004930131835862994,
0.11135280877351761,
0.07898423820734024,
-0.043728943914175034,
-0.008121219463646412,
-0.04481964185833931,
0.008812429383397102,
0.08745624870061874,
-0.08824395388364792,
-0.03162897005677223
] |
null | null |
transformers
|
# Wav2Vec2-large-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) large model pretrained only in **el** on **17.7** unlabeled data of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **el**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
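To give a concrete feel for the fine-tuning setup described above, the sketch below builds a CTC tokenizer and processor and attaches a fresh CTC head to this checkpoint. `vocab.json` is a hypothetical character-to-id mapping you would first derive from your labeled Greek transcripts, and the remaining arguments reflect common practice rather than an exact recipe:

```python
from transformers import (Wav2Vec2CTCTokenizer, Wav2Vec2FeatureExtractor,
                          Wav2Vec2Processor, Wav2Vec2ForCTC)

# "vocab.json" is a hypothetical character vocabulary built from your labeled text
tokenizer = Wav2Vec2CTCTokenizer("vocab.json", unk_token="[UNK]",
                                 pad_token="[PAD]", word_delimiter_token="|")
feature_extractor = Wav2Vec2FeatureExtractor(feature_size=1, sampling_rate=16000,
                                             padding_value=0.0, do_normalize=True,
                                             return_attention_mask=True)
processor = Wav2Vec2Processor(feature_extractor=feature_extractor, tokenizer=tokenizer)

# load the pretrained encoder and attach a randomly initialised CTC head sized to the new vocabulary
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-large-el-voxpopuli-v2",
    ctc_loss_reduction="mean",
    pad_token_id=processor.tokenizer.pad_token_id,
    vocab_size=len(processor.tokenizer),
)
model.freeze_feature_encoder()  # commonly the convolutional feature encoder is kept frozen during fine-tuning
```

From here the model would be trained on labeled audio/transcript pairs, for example with the `Trainer` setup shown in the linked blog.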
|
{"language": "el", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-el-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"el",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"el"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #el #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-large-VoxPopuli-V2
Facebook's Wav2Vec2 large model pretrained only in el on 17.7 unlabeled data of the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in el. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in el on 17.7 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in el. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #el #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in el on 17.7 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in el. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
72,
251
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #el #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in el on 17.7 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in el. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07247543334960938,
0.1064872294664383,
-0.0030007008463144302,
0.008627601899206638,
0.07801418751478195,
-0.04700743407011032,
0.14186398684978485,
0.04514572396874428,
0.007593248970806599,
0.09394586831331253,
-0.018532784655690193,
-0.03861324489116669,
0.07183146476745605,
0.12576264142990112,
0.05621090531349182,
-0.25868186354637146,
0.03092627041041851,
-0.06373457610607147,
0.0622544139623642,
0.04392357915639877,
0.1257205307483673,
-0.08807660639286041,
0.03294457867741585,
0.05486825481057167,
-0.02624664269387722,
0.036802176386117935,
-0.04887082055211067,
-0.08895236253738403,
0.055469464510679245,
0.054116904735565186,
-0.03106577694416046,
0.022202810272574425,
0.09066329151391983,
-0.1934913843870163,
0.03505656495690346,
0.03865141421556473,
0.026528699323534966,
0.007643706165254116,
0.10062702745199203,
0.019347328692674637,
0.16923333704471588,
-0.028990494087338448,
-0.0030506085604429245,
0.07935836911201477,
-0.05568407103419304,
-0.09306355565786362,
-0.06201055645942688,
0.15339550375938416,
0.10113327950239182,
0.10772983729839325,
-0.07661345601081848,
0.08446423709392548,
-0.01755956932902336,
0.0455685593187809,
0.08131631463766098,
-0.18122096359729767,
-0.05022178590297699,
0.05742761865258217,
0.11301537603139877,
0.023873236030340195,
-0.08599995821714401,
0.07525569945573807,
0.05582607164978981,
-0.014367750845849514,
-0.06663059443235397,
-0.03543460741639137,
0.12556199729442596,
-0.10566740483045578,
-0.1162618026137352,
0.0008805250399746001,
0.17183150351047516,
0.058139149099588394,
-0.07408704608678818,
-0.14966633915901184,
0.01588715799152851,
0.21636533737182617,
-0.05026478320360184,
-0.09392419457435608,
0.015417627990245819,
0.013760038651525974,
0.050846293568611145,
-0.07008598744869232,
-0.07236745208501816,
0.0005073124193586409,
0.01593373902142048,
0.1147351861000061,
0.022503307089209557,
-0.01786906085908413,
-0.0772634744644165,
-0.01068375539034605,
-0.10342071205377579,
-0.11589167267084122,
-0.00859822891652584,
-0.06574686616659164,
-0.060610104352235794,
-0.035657383501529694,
0.001447829301469028,
-0.10261064022779465,
0.027814045548439026,
0.09674668312072754,
0.06554876267910004,
0.04980277270078659,
-0.04773745313286781,
-0.030417468398809433,
0.12691007554531097,
0.07679271697998047,
-0.11279203742742538,
-0.018178176134824753,
0.011429711245000362,
-0.02306157350540161,
0.012349030934274197,
-0.03836280480027199,
-0.04096321016550064,
0.015325546264648438,
-0.02833055518567562,
0.04694778844714165,
0.05528989061713219,
-0.03954974561929703,
-0.03554433956742287,
-0.09523407369852066,
0.09372309595346451,
-0.07990763336420059,
0.031395599246025085,
0.04389749839901924,
-0.010374041274189949,
0.08799054473638535,
-0.06359578669071198,
0.08595013618469238,
-0.10811938345432281,
0.003623319324105978,
-0.025377238169312477,
-0.00411949260160327,
0.0181905385106802,
-0.03323781490325928,
0.0302912387996912,
-0.0038593930657953024,
0.01413845643401146,
-0.122928187251091,
0.003478395054116845,
-0.1029539629817009,
-0.026510722935199738,
-0.07677608728408813,
-0.044635992497205734,
-0.047280315309762955,
0.019956694915890694,
-0.0057277679443359375,
-0.006489317864179611,
0.009177181869745255,
-0.017526431009173393,
-0.008665909059345722,
0.01256296131759882,
0.043491847813129425,
0.062092628329992294,
0.08087128400802612,
-0.015241246670484543,
-0.02255781553685665,
-0.098303884267807,
0.11436217278242111,
-0.0720326155424118,
-0.019026169553399086,
-0.13810354471206665,
-0.03910159692168236,
-0.021561458706855774,
0.039841972291469574,
0.01255411934107542,
0.11977621167898178,
-0.17088082432746887,
-0.07107413560152054,
0.10739397257566452,
-0.11888770014047623,
0.0034857296850532293,
0.1800859123468399,
-0.0004998773219995201,
0.06988664716482162,
0.10300169885158539,
0.21865268051624298,
0.021974006667733192,
-0.180173859000206,
-0.021826397627592087,
-0.04886038973927498,
0.03778988867998123,
0.1260662078857422,
0.06812187284231186,
-0.05731772258877754,
0.06268979609012604,
-0.022715063765645027,
-0.02049592137336731,
-0.08101799339056015,
-0.007047757506370544,
-0.04416747763752937,
0.020441574975848198,
-0.04725015163421631,
0.023219965398311615,
-0.006138836499303579,
-0.022364895790815353,
-0.018228936940431595,
-0.08968948572874069,
-0.0595109760761261,
0.12115561217069626,
-0.06191820651292801,
0.022440187633037567,
-0.10504540055990219,
0.05037863180041313,
0.061644650995731354,
-0.0004381375329103321,
-0.11919291317462921,
0.11555998027324677,
0.02986888773739338,
-0.04209813475608826,
0.14439111948013306,
0.0829271599650383,
-0.03460622578859329,
0.008770394138991833,
-0.012623423710465431,
0.01592017523944378,
-0.03554793819785118,
0.009869284927845001,
-0.032995060086250305,
-0.10008684545755386,
-0.0038212048821151257,
-0.0654885470867157,
0.08555757254362106,
-0.1381780356168747,
-0.01515481248497963,
0.027600489556789398,
0.1054568737745285,
-0.013776438310742378,
-0.03264084830880165,
0.0874299556016922,
0.045852694660425186,
0.026166219264268875,
-0.017796466127038002,
0.018328314647078514,
-0.019574934616684914,
-0.008901957422494888,
0.05358056724071503,
-0.1520908623933792,
-0.15427009761333466,
0.09356454014778137,
0.013520472683012486,
-0.006115391384810209,
0.07156869769096375,
0.021548012271523476,
-0.020075801759958267,
-0.04960313439369202,
0.0002720800694078207,
0.24766568839550018,
-0.01619989238679409,
0.0645517110824585,
-0.08379185199737549,
-0.01000936422497034,
0.013519800268113613,
-0.05306902527809143,
-0.09522810578346252,
0.08650869131088257,
0.0032080437522381544,
-0.07060857117176056,
-0.04107585921883583,
0.049461450427770615,
0.06789709627628326,
0.15213626623153687,
0.01402323879301548,
-0.08593929558992386,
-0.024920258671045303,
-0.06901909410953522,
-0.012190073728561401,
0.03328434005379677,
-0.13305485248565674,
-0.023118620738387108,
0.02709285542368889,
0.007718004751950502,
0.04895666241645813,
-0.02526022307574749,
0.04191816970705986,
0.01636965200304985,
-0.048926763236522675,
-0.07967334240674973,
0.03924383223056793,
-0.0322745144367218,
0.037682030349969864,
-0.01520678773522377,
-0.004038878716528416,
-0.04776965454220772,
-0.056931622326374054,
-0.13929139077663422,
0.08904944360256195,
-0.06304318457841873,
-0.3138517439365387,
-0.08951027691364288,
-0.0573287270963192,
-0.03219559043645859,
0.01606241799890995,
0.059482645243406296,
-0.11242453008890152,
-0.11428876966238022,
-0.06857311725616455,
0.13489879667758942,
-0.03582306206226349,
-0.0632212832570076,
0.12460953742265701,
-0.006145707797259092,
0.02189773879945278,
-0.10333726555109024,
0.014761433005332947,
-0.03912395238876343,
-0.03378481790423393,
-0.026078583672642708,
0.023305073380470276,
0.061651866883039474,
0.1214076429605484,
0.019129402935504913,
-0.009582586586475372,
0.006361803971230984,
0.22471357882022858,
-0.13729622960090637,
0.07246919721364975,
0.25231003761291504,
-0.05487872660160065,
-0.005966758355498314,
0.1386970728635788,
-0.008957195095717907,
-0.05492975562810898,
0.04770378768444061,
0.007507879752665758,
-0.017806587740778923,
-0.2372056394815445,
-0.11819616705179214,
-0.03986145555973053,
-0.026330653578042984,
0.04426148161292076,
0.01725735515356064,
-0.007170332130044699,
0.01674698106944561,
-0.08626189082860947,
-0.03241317346692085,
0.05525656044483185,
0.03440598025918007,
0.141756072640419,
0.011159084737300873,
0.052173346281051636,
-0.04276115447282791,
-0.026802891865372658,
0.10434745997190475,
-0.030204977840185165,
0.043512891978025436,
0.07998334616422653,
0.10185012221336365,
0.06652785837650299,
0.01978744938969612,
0.05082975700497627,
-0.020308606326580048,
-0.024790504947304726,
-0.0041996692307293415,
-0.03175259381532669,
-0.062130797654390335,
0.021135088056325912,
0.04632704332470894,
0.1424945443868637,
-0.1269337236881256,
-0.12659071385860443,
0.04043896868824959,
0.016655027866363525,
0.11161930114030838,
0.09429113566875458,
-0.03224920108914375,
-0.090423084795475,
0.04344059154391289,
-0.09525354951620102,
-0.034216418862342834,
0.052362553775310516,
0.08701629936695099,
-0.15485385060310364,
0.09344639629125595,
0.08176551759243011,
0.093662329018116,
-0.04632506147027016,
0.034097034484148026,
-0.06117860972881317,
0.05690549686551094,
0.003685382194817066,
0.07320954650640488,
-0.1822391003370285,
0.10174793750047684,
0.01765633188188076,
0.08878890424966812,
-0.05366690084338188,
0.02790687046945095,
0.04092667996883392,
0.012398384511470795,
0.12696418166160583,
-0.011848942376673222,
-0.09952796995639801,
-0.0007510741124860942,
-0.11855318397283554,
0.020089872181415558,
0.05573407560586929,
-0.06532493978738785,
0.056193653494119644,
-0.0019312670920044184,
-0.0037139675114303827,
-0.03677782416343689,
-0.0049256449565291405,
-0.2614422142505646,
-0.1350056231021881,
0.044264063239097595,
0.003016364760696888,
0.05182647705078125,
-0.03382928669452667,
-0.07924505323171616,
-0.13926680386066437,
0.10655844956636429,
-0.006432422436773777,
-0.02227531373500824,
-0.0717996284365654,
0.01978573016822338,
0.09552180767059326,
-0.06612762808799744,
0.010067535564303398,
0.05640316382050514,
0.1464378535747528,
-0.07105020433664322,
-0.0373682826757431,
0.020095141604542732,
-0.10550856590270996,
-0.12361378222703934,
0.012958644889295101,
0.16731703281402588,
0.1079670637845993,
0.06529135257005692,
0.09286569058895111,
0.01459127850830555,
-0.009092271327972412,
-0.09811075031757355,
0.022320114076137543,
0.03268900886178017,
-0.060362543910741806,
0.04258235916495323,
0.003110830206423998,
-0.2637172043323517,
-0.1517879068851471,
-0.06927160173654556,
0.07790026813745499,
0.19149114191532135,
-0.022097937762737274,
0.1587192565202713,
0.2686142325401306,
-0.09420481324195862,
-0.21522459387779236,
-0.0479869470000267,
0.002636614954099059,
0.030536750331521034,
0.05517014116048813,
-0.20065447688102722,
0.09789258986711502,
-0.0014747293898835778,
0.012751692906022072,
-0.06091728433966637,
-0.2290789932012558,
-0.13687491416931152,
0.1675446778535843,
-0.025140611454844475,
0.05343502014875412,
-0.02712746150791645,
-0.06570734083652496,
-0.03783350810408592,
-0.04806683212518692,
0.011064259335398674,
-0.08591250330209732,
0.07586447149515152,
0.05133280158042908,
0.020818622782826424,
0.025622615590691566,
0.017345547676086426,
0.11494681239128113,
0.09031272679567337,
-0.017957845702767372,
-0.08008412271738052,
0.013879706151783466,
-0.008100113831460476,
-0.01913575269281864,
0.10641749948263168,
0.0399700365960598,
0.015210858546197414,
-0.06312599033117294,
-0.08769921958446503,
-0.05651568993926048,
0.06073473021388054,
-0.07170437276363373,
-0.015173875726759434,
-0.05482708662748337,
0.08189388364553452,
0.014559353701770306,
0.00044672933290712535,
-0.0521322600543499,
-0.09414129704236984,
-0.022466180846095085,
0.10436151921749115,
0.22125111520290375,
-0.05322158709168434,
-0.01219188142567873,
-0.045169271528720856,
-0.04150693491101265,
0.05185465142130852,
-0.01224820502102375,
0.04664848372340202,
0.054753419011831284,
0.016593553125858307,
0.09185691177845001,
-0.03242684528231621,
-0.12900055944919586,
0.034552689641714096,
0.038513440638780594,
-0.05996815487742424,
-0.18303050100803375,
-0.050420161336660385,
0.010408819653093815,
-0.01647065207362175,
-0.033605240285396576,
0.19432418048381805,
-0.01629011705517769,
-0.05957656353712082,
0.0012569412356242537,
0.061164721846580505,
0.0021088826470077038,
0.11395164579153061,
0.04422852024435997,
0.03967684879899025,
-0.08729918301105499,
0.054055746644735336,
0.1229664757847786,
-0.04141008481383324,
0.04607478901743889,
0.09098771959543228,
-0.03713923692703247,
-0.05392301082611084,
-0.08968726545572281,
0.011132505722343922,
0.044042158871889114,
-0.0637567937374115,
-0.0063973828218877316,
-0.10001688450574875,
0.009912566281855106,
0.006860761437565088,
0.01256417017430067,
-0.04762960597872734,
-0.04300615191459656,
-0.0009884856408461928,
-0.09106346219778061,
0.06359612196683884,
0.0956680104136467,
-0.02821219153702259,
-0.11827743053436279,
0.10827038437128067,
0.017926769331097603,
0.0854659453034401,
-0.03397618606686592,
-0.0564548522233963,
-0.09081731736660004,
-0.0033667022362351418,
-0.0825783759355545,
0.045990560203790665,
-0.1301243156194687,
-0.007508218754082918,
-0.04456692561507225,
-0.03685648366808891,
-0.013439618982374668,
0.07319868355989456,
-0.029802357777953148,
0.0022213305346667767,
-0.02506023459136486,
0.09359128773212433,
-0.12329967319965363,
0.0646514892578125,
0.054148051887750626,
-0.04275541752576828,
0.10715679079294205,
0.00839977990835905,
-0.04952708259224892,
0.03261347860097885,
-0.2111649364233017,
-0.05012284591794014,
-0.02729048766195774,
0.04581952095031738,
-0.007682591211050749,
-0.17278996109962463,
0.002990582026541233,
0.014263289980590343,
0.010102690197527409,
-0.023575637489557266,
0.03323378786444664,
-0.034783463925123215,
-0.01590895839035511,
-0.06293214857578278,
-0.06495875120162964,
-0.036118097603321075,
0.060454681515693665,
0.0652785673737526,
0.004503787495195866,
0.10308211296796799,
-0.08742903172969818,
0.07796429097652435,
-0.07591331750154495,
0.029841583222150803,
-0.02597203478217125,
0.022638853639364243,
-0.07254156470298767,
-0.07469173520803452,
0.08003257215023041,
-0.02086464874446392,
0.0685368999838829,
0.03917386755347252,
-0.021680278703570366,
0.04084866866469383,
-0.04880492389202118,
-0.062160223722457886,
0.03444814682006836,
0.1378328949213028,
0.04462800174951553,
0.024790475144982338,
-0.0001701612345641479,
-0.043563392013311386,
0.016106868162751198,
0.1474824845790863,
0.1463238149881363,
0.16477075219154358,
0.10610194504261017,
0.03608420118689537,
0.06734903156757355,
-0.04670269787311554,
-0.10330072045326233,
0.07073629647493362,
-0.058227621018886566,
0.031625669449567795,
-0.046365346759557724,
-0.08462034910917282,
0.08263728022575378,
-0.13552212715148926,
0.07483378797769547,
-0.025125620886683464,
-0.07799318432807922,
-0.11523913592100143,
-0.13435648381710052,
-0.06824759393930435,
-0.03967222571372986,
0.005496650002896786,
-0.11291846632957458,
0.03025006502866745,
0.007472545839846134,
0.029046332463622093,
-0.0913471132516861,
0.09849383682012558,
-0.11792563647031784,
-0.1236715018749237,
0.14173901081085205,
-0.03687591850757599,
-0.005870096385478973,
-0.0024320895317941904,
0.044236231595277786,
0.023177439346909523,
0.09562647342681885,
0.049943894147872925,
0.04650327190756798,
0.022755421698093414,
0.028469109907746315,
-0.09874679893255234,
-0.06765493750572205,
0.0262533538043499,
-0.016665562987327576,
0.10463219881057739,
0.18434172868728638,
0.08904381841421127,
-0.07857941836118698,
0.012100127525627613,
0.15068379044532776,
0.02587728202342987,
-0.11422477662563324,
-0.1402222216129303,
0.03998899459838867,
-0.029564647004008293,
0.0022515885066241026,
0.0005722978967241943,
-0.09620917588472366,
0.01081115286797285,
0.20572414994239807,
0.16637039184570312,
-0.03557480499148369,
0.019027670845389366,
-0.01717294752597809,
0.00996758509427309,
0.02588285133242607,
0.07515660673379898,
0.08806975930929184,
0.18061231076717377,
-0.010601871646940708,
0.048394057899713516,
-0.024135572835803032,
-0.08229268342256546,
-0.11410177499055862,
0.10188790410757065,
0.00245025847107172,
-0.031548649072647095,
-0.00013498993939720094,
0.18324461579322815,
-0.10240014642477036,
-0.21028368175029755,
-0.12217017263174057,
-0.0428447388112545,
-0.1121281236410141,
0.029684264212846756,
-0.029107382521033287,
0.1361999213695526,
0.05429757386445999,
-0.002319134073331952,
0.0016917806351557374,
0.16617393493652344,
0.03645772859454155,
0.024698469787836075,
-0.025939075276255608,
0.10485392808914185,
-0.08754171431064606,
0.12374059855937958,
-0.0006532507832162082,
0.05656736344099045,
0.03218773007392883,
0.03353171423077583,
-0.06621982157230377,
0.03428124636411667,
0.03394206985831261,
0.003989254590123892,
0.057673800736665726,
0.1686038076877594,
-0.009110077284276485,
0.09862173348665237,
0.10829779505729675,
-0.0713823139667511,
0.021058231592178345,
-0.012667606584727764,
0.0030184544157236814,
-0.05728791654109955,
0.15799109637737274,
-0.15226414799690247,
0.12519867718219757,
0.10064639151096344,
-0.07050257176160812,
-0.0496567077934742,
-0.0038124604616314173,
0.05275309830904007,
-0.059721365571022034,
0.09691421687602997,
-0.008306291885674,
-0.1653435081243515,
0.02794581837952137,
-0.12194159626960754,
0.07254209369421005,
-0.2556878328323364,
-0.040899161249399185,
-0.04682905599474907,
-0.02003520354628563,
-0.0011866475688293576,
0.10281801223754883,
0.09018772840499878,
-0.05218304693698883,
-0.013848653063178062,
-0.05523911491036415,
0.009539827704429626,
0.09213627874851227,
-0.08330892771482468,
-0.030969126150012016
] |
null | null |
transformers
|
# Wav2Vec2-Large-VoxPopuli
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) large model pretrained on the es unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/)
# Fine-Tuning
Please refer to [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for details on how to fine-tune this model on a specific language. Note that you should replace `"facebook/wav2vec2-large-xlsr-53"` with this checkpoint for fine-tuning.
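In practice that swap is a one-line change when loading the model. The sketch below is illustrative only; the dropout and CTC arguments are typical fine-tuning settings rather than requirements of this checkpoint:

```python
from transformers import Wav2Vec2ForCTC

# same call as in the fine-tuning blog, with the checkpoint name swapped in
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-large-es-voxpopuli",   # instead of "facebook/wav2vec2-large-xlsr-53"
    attention_dropout=0.1,
    hidden_dropout=0.1,
    mask_time_prob=0.05,
    ctc_loss_reduction="mean",
)
```

The rest of the fine-tuning pipeline (tokenizer creation, data collator, trainer) stays exactly as described in the blog.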
|
{"language": "es", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-es-voxpopuli
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli",
"es",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"es"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #es #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-Large-VoxPopuli
Facebook's Wav2Vec2 large model pretrained on the es unlabeled subset of the VoxPopuli corpus.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the official website for more information, here
# Fine-Tuning
Please refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '"facebook/wav2vec2-large-xlsr-53"' with this checkpoint for fine-tuning.
|
[
"# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the es unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #es #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the es unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
76,
132,
57
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #es #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #has_space #region-us \n# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the es unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
-0.061285216361284256,
0.05108685418963432,
-0.004989580251276493,
0.007587021682411432,
0.11394798010587692,
-0.023283999413251877,
0.08821678906679153,
0.02349439449608326,
0.026119887828826904,
0.01009565033018589,
0.010107815265655518,
0.027987005189061165,
0.06581263244152069,
0.07939658313989639,
0.02427898533642292,
-0.3029329478740692,
0.049908947199583054,
0.010442620143294334,
0.06368988007307053,
0.05902760848402977,
0.13083280622959137,
-0.08663859963417053,
0.01095911581069231,
0.05915439501404762,
-0.07296834141016006,
0.004977443255484104,
0.015153516083955765,
-0.09434208273887634,
0.11744771152734756,
0.0816597267985344,
0.07923629879951477,
0.05027500540018082,
0.0335976667702198,
-0.18009886145591736,
0.03376387059688568,
0.04621054604649544,
-0.04203519970178604,
0.008721833117306232,
0.12017066776752472,
-0.03263898193836212,
0.19802482426166534,
-0.03710779547691345,
-0.0442277230322361,
0.08845604956150055,
-0.12018845975399017,
-0.163773313164711,
-0.054212458431720734,
0.14617004990577698,
0.11452130973339081,
0.09555359929800034,
-0.07246529310941696,
0.02722206898033619,
-0.05864271894097328,
0.06972123682498932,
0.11695647239685059,
-0.2905411124229431,
-0.03241593763232231,
0.12212340533733368,
0.09206950664520264,
-0.035873495042324066,
-0.10772069543600082,
0.07716873288154602,
0.003745146794244647,
0.00860417727380991,
-0.00688890228047967,
-0.08980365842580795,
0.03847142681479454,
-0.06915120780467987,
-0.10317960381507874,
-0.02208751067519188,
0.20780819654464722,
0.027269508689641953,
-0.05247649922966957,
-0.09227167069911957,
-0.020823845639824867,
0.17861683666706085,
-0.06422720104455948,
-0.12974612414836884,
0.022340739145874977,
0.04223569855093956,
0.04859093576669693,
-0.1464875042438507,
-0.09515795111656189,
-0.013477816246449947,
-0.050979796797037125,
0.10285057872533798,
0.028471635654568672,
-0.029486633837223053,
-0.06940628588199615,
0.037810590118169785,
-0.05380241200327873,
-0.07046208530664444,
-0.003715045051649213,
-0.08247244358062744,
-0.05307929962873459,
0.0018153453711420298,
-0.07038622349500656,
-0.08888783305883408,
-0.015863345935940742,
0.07927507907152176,
0.01969810202717781,
0.03633172810077667,
-0.007267292123287916,
0.042820755392313004,
0.026341497898101807,
0.07566371560096741,
-0.12085536867380142,
0.00573838222771883,
0.00745241716504097,
-0.037799619138240814,
-0.0027456984389573336,
-0.027216434478759766,
-0.06888231635093689,
-0.0658549889922142,
0.01467056293040514,
0.05087045207619667,
0.04607092961668968,
0.02031831443309784,
-0.08672892302274704,
-0.08612481504678726,
0.05834829807281494,
-0.0669560357928276,
0.01919628493487835,
0.03735814988613129,
0.0021858299151062965,
0.1676585078239441,
0.00620978744700551,
0.07053600251674652,
-0.1492672562599182,
0.03142978623509407,
-0.014465673826634884,
0.012831251136958599,
-0.02050229161977768,
-0.05958642065525055,
0.02643766812980175,
-0.0004418970784172416,
-0.010743965394794941,
-0.14108775556087494,
-0.05229935050010681,
-0.08075283467769623,
-0.0024059477727860212,
-0.030264783650636673,
-0.08108393847942352,
-0.0402296744287014,
-0.0028849176596850157,
-0.010604210197925568,
-0.03211429715156555,
-0.003864467376843095,
-0.0229934211820364,
0.02446763776242733,
-0.02450287714600563,
0.0706796646118164,
-0.0329195037484169,
0.08375930041074753,
0.005289251450449228,
-0.03279240056872368,
-0.11555720865726471,
0.11835268139839172,
-0.06813478469848633,
-0.06158718094229698,
-0.14838826656341553,
-0.06101376563310623,
-0.058655623346567154,
0.0595514290034771,
0.0019099623896181583,
0.13912421464920044,
-0.159526526927948,
-0.10591372102499008,
0.2739206552505493,
-0.09020090848207474,
0.037286315113306046,
0.17639021575450897,
0.02952006459236145,
0.04275807738304138,
0.15190361440181732,
0.12130717188119888,
0.03785724565386772,
-0.1331152766942978,
0.038735002279281616,
-0.03939776122570038,
0.0009435348911210895,
0.04618176072835922,
0.06530468165874481,
-0.025393418967723846,
0.02160235308110714,
-0.002335156546905637,
-0.06764635443687439,
-0.050101716071367264,
-0.008359365165233612,
-0.05291423946619034,
0.03128598630428314,
-0.012424729764461517,
0.07699703425168991,
0.01335885375738144,
0.005803429055958986,
0.007505461107939482,
-0.100331149995327,
-0.033130943775177,
0.07959127426147461,
-0.05316299572587013,
0.07448282092809677,
-0.09406992048025131,
0.04287261888384819,
0.10727948695421219,
0.05435427650809288,
-0.1582961529493332,
0.06939225643873215,
-0.01702546514570713,
0.08867601305246353,
0.10074157267808914,
0.22000367939472198,
-0.025812922045588493,
-0.057456813752651215,
-0.07239268720149994,
0.01901230588555336,
-0.04494225233793259,
-0.035311199724674225,
-0.032745152711868286,
-0.08981166779994965,
-0.022817566990852356,
-0.035878729075193405,
0.07571527361869812,
-0.164979949593544,
0.007231251802295446,
0.06752492487430573,
0.06346908956766129,
0.01648150570690632,
0.007845070213079453,
0.027392830699682236,
0.1037280410528183,
0.03108178824186325,
0.02001184970140457,
0.10340991616249084,
-0.006451077293604612,
-0.03723886236548424,
0.10979416966438293,
-0.07237409800291061,
0.004215416964143515,
0.13664716482162476,
-0.10807675123214722,
0.0027299001812934875,
0.004191277548670769,
0.018002277240157127,
-0.00016761427104938775,
0.008786747232079506,
-0.026330463588237762,
0.22450579702854156,
0.014008588157594204,
0.08324509859085083,
-0.0836576297879219,
0.009915664792060852,
-0.01573466882109642,
-0.05251424014568329,
-0.04137583449482918,
0.10236600786447525,
0.008946936577558517,
-0.07067947834730148,
0.006990717723965645,
0.11554618179798126,
-0.007647207472473383,
0.12977677583694458,
0.024147430434823036,
-0.022913580760359764,
0.015245999209582806,
-0.06342758238315582,
-0.017946135252714157,
0.00661123264580965,
-0.16076941788196564,
-0.03573610633611679,
0.03097439557313919,
0.03437988832592964,
0.06262741982936859,
-0.08652729541063309,
-0.007473524659872055,
0.012010051868855953,
-0.08228244632482529,
-0.04547148197889328,
0.03850936144590378,
-0.014532960020005703,
0.07312677055597305,
-0.03277633339166641,
0.008658114820718765,
-0.014779548160731792,
-0.039921436458826065,
-0.10877675563097,
0.1131136566400528,
-0.07432626932859421,
-0.37394073605537415,
-0.08209865540266037,
-0.08227472752332687,
-0.057411398738622665,
0.02256009541451931,
0.06006627157330513,
-0.08808717876672745,
-0.06873169541358948,
-0.009755702689290047,
0.1532159000635147,
-0.03596988692879677,
-0.09056556969881058,
0.03145189210772514,
0.01794619672000408,
-0.004614281468093395,
-0.11053206026554108,
0.011645201593637466,
-0.03324020653963089,
-0.1336655169725418,
0.022563282400369644,
-0.016942502930760384,
0.04293661564588547,
0.13310609757900238,
0.053354647010564804,
-0.03290623053908348,
-0.02943219430744648,
0.20176786184310913,
-0.11530502885580063,
0.06455665826797485,
0.27376675605773926,
0.014108935371041298,
0.021588420495390892,
0.12617897987365723,
0.003546367399394512,
-0.047633446753025055,
0.006648611277341843,
0.04989214614033699,
-0.0033191628754138947,
-0.2599703371524811,
-0.12963968515396118,
-0.07109662145376205,
-0.021764444187283516,
0.03295673429965973,
0.021450061351060867,
0.000294794823275879,
0.0353495217859745,
-0.09441950917243958,
-0.05346476286649704,
0.06352832168340683,
0.029578249901533127,
0.20585955679416656,
-0.04540501534938812,
0.13734517991542816,
-0.03314537554979324,
-0.016140932217240334,
0.07344091683626175,
0.05525122582912445,
0.07346947491168976,
0.0858062133193016,
0.09479548037052155,
0.0932551920413971,
0.06639403104782104,
0.028752001002430916,
-0.003997368272393942,
-0.01901918835937977,
-0.008601497858762741,
-0.057316381484270096,
-0.031779322773218155,
-0.03054928034543991,
-0.002440324053168297,
0.13170522451400757,
-0.14833855628967285,
-0.13081584870815277,
-0.026974622160196304,
0.03179069235920906,
0.15163299441337585,
0.07369621843099594,
-0.06048944219946861,
-0.05943450704216957,
0.04397844895720482,
-0.09027962386608124,
-0.03481355309486389,
0.054548464715480804,
0.0931750237941742,
-0.1713247001171112,
0.09782419353723526,
0.04211806505918503,
0.1075020581483841,
-0.03878149762749672,
0.046141888946294785,
-0.1333858072757721,
0.001020394149236381,
0.04466726630926132,
0.08028892427682877,
-0.2573842704296112,
0.20751236379146576,
0.01078538317233324,
0.06663929671049118,
-0.07348088920116425,
-0.006126784719526768,
0.036827452480793,
0.08551773428916931,
0.12378565222024918,
-0.003422312205657363,
-0.01276279054582119,
-0.010357482358813286,
-0.03241736814379692,
0.03612731024622917,
0.030562903732061386,
-0.04020697623491287,
0.04674742743372917,
-0.013497204519808292,
0.011070914566516876,
-0.00563290948048234,
0.09324249625205994,
-0.23699527978897095,
-0.14683254063129425,
0.04255123436450958,
0.04581510275602341,
0.09381665289402008,
-0.02524237520992756,
-0.07789058983325958,
-0.1218971312046051,
0.11952129006385803,
-0.03783538192510605,
-0.024403011426329613,
-0.09239965677261353,
0.030898112803697586,
0.013414593413472176,
-0.10522204637527466,
0.027814403176307678,
0.03846220672130585,
0.10355126112699509,
-0.09041624516248703,
-0.046209365129470825,
0.05668555200099945,
-0.0916408896446228,
-0.08267483115196228,
0.034095946699380875,
0.1819586455821991,
0.10599124431610107,
0.048553213477134705,
0.09663001447916031,
-0.030900586396455765,
0.003584187477827072,
-0.11284739524126053,
0.05254191532731056,
0.021984469145536423,
0.013062977232038975,
0.024615345522761345,
-0.06055646017193794,
-0.25726795196533203,
-0.10786861181259155,
-0.024365592747926712,
0.184173583984375,
0.18139873445034027,
-0.001149064744822681,
0.166737899184227,
0.23613472282886505,
-0.09248346090316772,
-0.27091535925865173,
-0.07670482248067856,
-0.022113483399152756,
0.04574493318796158,
0.036865949630737305,
-0.26704955101013184,
0.06005677208304405,
0.05504696071147919,
-0.003759313141927123,
-0.0820896103978157,
-0.2352512627840042,
-0.12844285368919373,
0.16279596090316772,
0.03338507562875748,
0.12747496366500854,
-0.09912362694740295,
-0.044728606939315796,
-0.05089256539940834,
-0.13399967551231384,
0.08282488584518433,
-0.09673269838094711,
0.10568205267190933,
0.04779576137661934,
0.007512182928621769,
0.003668669145554304,
0.03984612971544266,
0.11745849251747131,
0.056413523852825165,
-0.011543027125298977,
-0.03898896649479866,
0.014318601228296757,
0.019183464348316193,
0.020121635869145393,
0.025970831513404846,
-0.0063398852944374084,
-0.01568315550684929,
-0.08117912709712982,
-0.09729234874248505,
-0.1098649650812149,
0.0868978500366211,
-0.0673411414027214,
-0.01598803885281086,
-0.03189193457365036,
0.09771589189767838,
0.014081972651183605,
0.014927394688129425,
-0.03628309816122055,
-0.14263539016246796,
0.02766503393650055,
0.09369061142206192,
0.23377251625061035,
-0.13567620515823364,
-0.04182237386703491,
-0.05400891974568367,
-0.0453374944627285,
0.08406662195920944,
-0.0061929128132760525,
0.05745144188404083,
0.043599631637334824,
0.002880105283111334,
0.08894003927707672,
0.02401047572493553,
-0.08040767908096313,
0.025991590693593025,
0.04214547201991081,
-0.07565680146217346,
-0.2327137440443039,
-0.056607939302921295,
0.018484946340322495,
0.002767035737633705,
0.010266471654176712,
0.1761884093284607,
-0.009270633570849895,
-0.06786370277404785,
-0.00046200663200579584,
0.03774029389023781,
-0.034185539931058884,
0.05830123648047447,
0.04044428840279579,
0.043050505220890045,
-0.09296118468046188,
0.047426726669073105,
0.09687069058418274,
-0.15303122997283936,
0.036585815250873566,
0.06970639526844025,
-0.053515903651714325,
-0.09751538932323456,
-0.1289728432893753,
-0.0036223067436367273,
0.0034769035410135984,
-0.06966801732778549,
0.027372140437364578,
-0.14927557110786438,
0.03231341019272804,
0.06130826100707054,
0.041812457144260406,
-0.011399350129067898,
-0.050183843821287155,
-0.026230990886688232,
-0.029507912695407867,
0.014032776467502117,
0.11576133966445923,
-0.06852594017982483,
-0.11474388092756271,
0.13317306339740753,
0.02761855535209179,
0.08977041393518448,
-0.03827732056379318,
-0.05607189983129501,
-0.13148649036884308,
0.014442660845816135,
-0.09716513007879257,
0.012797019444406033,
-0.15017741918563843,
-0.004496841691434383,
-0.04886810854077339,
-0.022634148597717285,
-0.019848158583045006,
0.03497284650802612,
-0.0862588882446289,
0.004600548651069403,
-0.0011837079655379057,
0.08729586750268936,
-0.11039751023054123,
0.0589585117995739,
0.07221081107854843,
-0.020937463268637657,
0.0964687243103981,
0.016246212646365166,
-0.04782548174262047,
0.07165131717920303,
-0.19905991852283478,
-0.04137331619858742,
0.03274913877248764,
0.030682779848575592,
-0.014182385057210922,
-0.16139525175094604,
0.003090451704338193,
0.023802274838089943,
0.050793636590242386,
-0.003130620112642646,
0.08033318817615509,
-0.072624072432518,
-0.009186244569718838,
-0.05787590146064758,
-0.08504510670900345,
-0.03789263591170311,
0.07056111097335815,
0.09273245185613632,
0.0214413832873106,
0.11739611625671387,
-0.0853099599480629,
0.06340944766998291,
-0.10264303535223007,
0.07469228655099869,
-0.04291384294629097,
-0.031032675877213478,
-0.020356666296720505,
-0.11108265072107315,
0.06181548908352852,
-0.004476806148886681,
0.06952375918626785,
0.008264041505753994,
-0.039942994713783264,
0.02422139421105385,
-0.08905696123838425,
-0.08919220417737961,
0.020153293386101723,
0.12193750590085983,
0.07623165100812912,
-0.01094538252800703,
0.027774859219789505,
0.009442397393286228,
0.0032520941458642483,
0.17675670981407166,
0.18946880102157593,
0.2257024049758911,
0.08978332579135895,
0.08305362612009048,
-0.008727279491722584,
-0.05041630566120148,
-0.06652458757162094,
0.016112785786390305,
-0.061747271567583084,
0.036273494362831116,
-0.08267589658498764,
-0.024059541523456573,
0.09244818985462189,
-0.13527248799800873,
0.10972368717193604,
0.0006087673828005791,
-0.07207104563713074,
-0.14217233657836914,
-0.1853588968515396,
-0.05033831298351288,
-0.07421179115772247,
-0.035494256764650345,
-0.12675850093364716,
-0.030158815905451775,
0.031978946179151535,
0.021297374740242958,
-0.10773342847824097,
0.06534504145383835,
-0.11085569858551025,
-0.1538793444633484,
0.17498862743377686,
-0.03565414249897003,
0.03021121397614479,
-0.04188632220029831,
0.021724166348576546,
-0.0004423144564498216,
0.09981893748044968,
0.03196815401315689,
0.048706162720918655,
-0.02470303513109684,
0.04162099212408066,
-0.09017816931009293,
-0.06791092455387115,
-0.003582869190722704,
0.050268810242414474,
0.1279764175415039,
0.21463006734848022,
0.04265948757529259,
-0.06761272251605988,
0.014535228721797466,
0.17645163834095,
0.034494075924158096,
-0.10666199028491974,
-0.12667955458164215,
0.07948179543018341,
0.001123405178077519,
0.002327985828742385,
-0.024530991911888123,
-0.06601886451244354,
0.009711861610412598,
0.2895457148551941,
0.19415608048439026,
-0.06888803839683533,
0.03702341020107269,
0.007368016056716442,
0.03188812732696533,
0.08685708791017532,
0.09980368614196777,
0.09948042035102844,
0.22332440316677094,
-0.03455062955617905,
-0.005421842448413372,
-0.012829342857003212,
-0.04699821025133133,
-0.0990210548043251,
0.11514890938997269,
0.022339947521686554,
-0.07171895354986191,
0.002289907541126013,
0.14739041030406952,
-0.1276712417602539,
-0.074176125228405,
-0.06756381690502167,
-0.05059749633073807,
-0.09755135327577591,
0.004093681462109089,
-0.002315123099833727,
0.11105426400899887,
0.0974467471241951,
-0.015134875662624836,
-0.0050535365007817745,
0.1836279332637787,
0.04713955521583557,
0.006808385718613863,
-0.008336990140378475,
0.11644937843084335,
-0.02055402658879757,
0.08246854692697525,
-0.024836404249072075,
0.06679105013608932,
0.06351961195468903,
0.056568291038274765,
-0.04862383380532265,
0.04140836000442505,
0.010425140149891376,
0.011117412708699703,
0.0723523497581482,
0.10902611166238785,
0.017383670434355736,
0.038580264896154404,
0.10571981966495514,
-0.13605304062366486,
0.04159868508577347,
0.05287638679146767,
-0.020115645602345467,
-0.01973499171435833,
0.17225559055805206,
-0.18189270794391632,
0.057531341910362244,
0.13271120190620422,
-0.041719816625118256,
-0.05072968825697899,
-0.037031389772892,
0.029002217575907707,
-0.04318637773394585,
0.044859085232019424,
-0.04722319543361664,
-0.1251981109380722,
0.01988714188337326,
-0.08633571863174438,
0.02796618454158306,
-0.21423199772834778,
-0.014393235556781292,
-0.02430199459195137,
-0.010274858213961124,
-0.051036279648542404,
0.08407269418239594,
0.01928219012916088,
-0.054995812475681305,
0.01264915056526661,
-0.06991210579872131,
0.012506352737545967,
0.11347757279872894,
-0.10084129869937897,
-0.051810070872306824
] |
null | null |
transformers
|
# Wav2Vec2-Large-VoxPopuli
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) large model pretrained on the French (fr) unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the [official website](https://github.com/facebookresearch/voxpopuli/) for more information.
# Fine-Tuning
Please refer to [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) on how to fine-tune this model on a specific language. Note that you should replace `"facebook/wav2vec2-large-xlsr-53"` with this checkpoint for fine-tuning.
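As a rough sketch only (the fine-tuning blog remains the authoritative recipe), the checkpoint swap might look like the snippet below; `vocab_size`, `pad_token_id`, and `ctc_loss_reduction` are assumptions that depend on the tokenizer you build for your target language.

```python
from transformers import Wav2Vec2ForCTC

# Hedged sketch: load this VoxPopuli checkpoint in place of
# "facebook/wav2vec2-large-xlsr-53" before fine-tuning for French ASR.
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-large-fr-voxpopuli",
    ctc_loss_reduction="mean",  # assumption: a common choice when fine-tuning with CTC
    pad_token_id=0,             # assumption: must match your tokenizer's [PAD] id
    vocab_size=36,              # placeholder: use len(your_tokenizer) in practice
)
```

The CTC head is newly initialized in this setup, so the model still has to be fine-tuned on labeled French speech before it can transcribe audio.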
|
{"language": "fr", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-fr-voxpopuli
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli",
"fr",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"fr"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #fr #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
|
# Wav2Vec2-Large-VoxPopuli
Facebook's Wav2Vec2 large model pretrained on the fr unlabeled subset of VoxPopuli corpus.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the official website for more information, here
# Fine-Tuning
Please refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '"facebook/wav2vec2-large-xlsr-53"' with this checkpoint for fine-tuning.
|
[
"# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the fr unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #fr #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the fr unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
72,
132,
57
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #fr #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the fr unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
-0.06317892670631409,
0.02854769490659237,
-0.0046255760826170444,
0.01687576062977314,
0.12562936544418335,
-0.02963569015264511,
0.08030200004577637,
0.014878171496093273,
0.042962636798620224,
0.013314172625541687,
0.011756358668208122,
0.028872326016426086,
0.07123751193284988,
0.09373615682125092,
-0.001999838277697563,
-0.29905080795288086,
0.05429496988654137,
0.01529831811785698,
0.07605507969856262,
0.06046218052506447,
0.12638354301452637,
-0.08304943144321442,
0.022133000195026398,
0.0542292594909668,
-0.10348863154649734,
-0.005125453695654869,
0.025372374802827835,
-0.08739444613456726,
0.12900400161743164,
0.08838660269975662,
0.0820532739162445,
0.0608082115650177,
0.039333175867795944,
-0.1563401073217392,
0.02975282445549965,
0.04833680018782616,
-0.04014843329787254,
-0.0004089073045179248,
0.12155614793300629,
-0.02531815692782402,
0.19443589448928833,
-0.024585118517279625,
-0.03419174998998642,
0.08834909647703171,
-0.12823012471199036,
-0.16821402311325073,
-0.062314268201589584,
0.13651211559772491,
0.13611578941345215,
0.08359038084745407,
-0.07270114868879318,
0.042462434619665146,
-0.05552401766180992,
0.06862831115722656,
0.08826755732297897,
-0.2930174469947815,
-0.03686831146478653,
0.12396352738142014,
0.07167846709489822,
-0.029345573857426643,
-0.10318494588136673,
0.08076903223991394,
0.0180409736931324,
0.005615793168544769,
0.0005032350309193134,
-0.08731799572706223,
0.037293512374162674,
-0.08084243535995483,
-0.12011032551527023,
-0.0064316364005208015,
0.18393094837665558,
0.041183799505233765,
-0.06173935532569885,
-0.09116996824741364,
-0.024398406967520714,
0.1729515641927719,
-0.06101243197917938,
-0.1655544936656952,
0.011078869923949242,
0.03157829865813255,
0.06635170429944992,
-0.15491802990436554,
-0.08181503415107727,
-0.02384890615940094,
-0.04876292869448662,
0.12547054886817932,
0.036111991852521896,
-0.025264667347073555,
-0.06447167694568634,
0.028818808495998383,
-0.08130906522274017,
-0.0645468682050705,
0.007048369385302067,
-0.09630599617958069,
-0.074009969830513,
-0.013391723856329918,
-0.07063426077365875,
-0.08951494097709656,
-0.028238553553819656,
0.07252910733222961,
0.009967291727662086,
0.04161151126027107,
-0.04777582362294197,
0.036099355667829514,
0.01489886362105608,
0.0873517170548439,
-0.12884236872196198,
0.010558643378317356,
0.006692039780318737,
-0.033539775758981705,
-0.012366724200546741,
-0.03351325914263725,
-0.07357975095510483,
-0.08056093007326126,
0.011532377451658249,
0.06644520163536072,
0.020825572311878204,
0.021461566910147667,
-0.07127265632152557,
-0.09553440660238266,
0.04216770455241203,
-0.06699555367231369,
0.01576825976371765,
0.033080682158470154,
-0.005800322163850069,
0.18524064123630524,
-0.0013726509641855955,
0.05946694314479828,
-0.15422700345516205,
0.025031350553035736,
-0.015444557182490826,
0.014287637546658516,
-0.004618095234036446,
-0.05893896520137787,
0.021912405267357826,
-0.027191082015633583,
-0.00955771654844284,
-0.146327942609787,
-0.06177656725049019,
-0.07727876305580139,
-0.004314269870519638,
-0.03938886523246765,
-0.08215659856796265,
-0.039617571979761124,
-0.00819803774356842,
-0.022748364135622978,
-0.0298093743622303,
-0.01894574984908104,
-0.022705182433128357,
0.02295689843595028,
-0.014421675354242325,
0.07072973251342773,
-0.04382433742284775,
0.08430451899766922,
0.009534471668303013,
-0.02963944897055626,
-0.1283402442932129,
0.1303340643644333,
-0.07085313647985458,
-0.0685487762093544,
-0.15673601627349854,
-0.06722210347652435,
-0.054596785455942154,
0.058375515043735504,
0.0019668133463710546,
0.15101732313632965,
-0.1730111986398697,
-0.10845691710710526,
0.2656265199184418,
-0.09697771817445755,
0.03219941258430481,
0.1929660588502884,
0.01952367089688778,
0.04573189839720726,
0.16101008653640747,
0.13591498136520386,
0.03567463159561157,
-0.11986798048019409,
0.058065369725227356,
-0.0408288799226284,
-0.017663167789578438,
0.053203511983156204,
0.06287288665771484,
-0.012382328510284424,
-0.002850820543244481,
-0.0027328040450811386,
-0.07721757143735886,
-0.05143456906080246,
-0.018552687019109726,
-0.06084570288658142,
0.029240215197205544,
-0.022543465718626976,
0.10084018856287003,
0.006258281413465738,
0.0010847965022549033,
0.020131710916757584,
-0.09255149215459824,
-0.040332771837711334,
0.07444848120212555,
-0.05473767966032028,
0.07011162489652634,
-0.10691831260919571,
0.05413691699504852,
0.10443924367427826,
0.051196496933698654,
-0.15505215525627136,
0.05995539203286171,
-0.012948542833328247,
0.08569659292697906,
0.10432766377925873,
0.20162859559059143,
-0.026107829064130783,
-0.04027131199836731,
-0.07935658097267151,
0.02200777269899845,
-0.03799052536487579,
-0.03920049965381622,
-0.03289821743965149,
-0.08694321662187576,
-0.03226836398243904,
-0.03684612363576889,
0.07352691143751144,
-0.1662137657403946,
0.0038003164809197187,
0.058862172067165375,
0.049737606197595596,
0.010920614935457706,
0.013057742267847061,
0.020620832219719887,
0.09919960796833038,
0.043141238391399384,
0.015386450104415417,
0.095120869576931,
-0.006788194179534912,
-0.05384489521384239,
0.09891071170568466,
-0.059138163924217224,
0.013582704588770866,
0.13612012565135956,
-0.10617010295391083,
-0.012274452485144138,
0.009204797446727753,
0.01716076023876667,
-0.0021330511663109064,
0.002653502393513918,
-0.01525084301829338,
0.22075945138931274,
0.01806342974305153,
0.08639257401227951,
-0.08167125284671783,
0.021857628598809242,
-0.008607283234596252,
-0.049776237457990646,
-0.04265948757529259,
0.08735514432191849,
0.017190584912896156,
-0.08260981738567352,
0.0039027200546115637,
0.09737781435251236,
-0.014605662785470486,
0.1400114744901657,
0.02171373926103115,
-0.01772846095263958,
0.012032246217131615,
-0.04659624397754669,
-0.02045770362019539,
-0.013705769553780556,
-0.16101905703544617,
-0.02296050824224949,
0.03554294630885124,
0.023569542914628983,
0.05920543149113655,
-0.07624369859695435,
-0.00487216841429472,
0.009209013544023037,
-0.08627413958311081,
-0.04117017984390259,
0.04889640212059021,
-0.01111532375216484,
0.08321564644575119,
-0.037234727293252945,
-0.0217196773737669,
-0.010037913918495178,
-0.03440876677632332,
-0.1076865866780281,
0.11667033284902573,
-0.05316035449504852,
-0.3576412498950958,
-0.08741042762994766,
-0.0892137885093689,
-0.08709779381752014,
0.02505657635629177,
0.04989132285118103,
-0.09299418330192566,
-0.06558644771575928,
0.008206157013773918,
0.16418378055095673,
-0.040978554636240005,
-0.08309172093868256,
0.03112223371863365,
0.007853525690734386,
-0.01121190283447504,
-0.09972219169139862,
0.004908757749944925,
-0.04122591018676758,
-0.1361357718706131,
0.010863738134503365,
-0.01917516440153122,
0.040749356150627136,
0.14466385543346405,
0.03726060688495636,
-0.017390059307217598,
-0.025736717507243156,
0.21431902050971985,
-0.11654960364103317,
0.07072479277849197,
0.2864986062049866,
-0.0007061295327730477,
0.019559739157557487,
0.12952624261379242,
-0.0018402879359200597,
-0.06312401592731476,
-0.002543922746554017,
0.05401153489947319,
-0.010147247463464737,
-0.26022547483444214,
-0.1288956105709076,
-0.06485500186681747,
-0.013819986023008823,
0.034701935946941376,
-0.0007052283035591245,
0.02571023814380169,
0.04057702049612999,
-0.09381839632987976,
-0.041396696120500565,
0.06488791108131409,
0.0342063345015049,
0.20987878739833832,
-0.0332297682762146,
0.14612619578838348,
-0.02073119580745697,
-0.032389719039201736,
0.06393945217132568,
0.038367025554180145,
0.0800417885184288,
0.09500046819448471,
0.09316636621952057,
0.08887895941734314,
0.05667759105563164,
0.02498534880578518,
0.0018305466510355473,
-0.0027184912469238043,
-0.01940941996872425,
-0.05302491411566734,
-0.021374275907874107,
-0.03871570900082588,
0.003426431678235531,
0.12656795978546143,
-0.14738790690898895,
-0.13662900030612946,
-0.0030817457009106874,
0.027858538553118706,
0.1642443984746933,
0.05329272523522377,
-0.08816982060670853,
-0.05563463270664215,
0.0316104032099247,
-0.09107570350170135,
-0.03871640935540199,
0.05607268586754799,
0.07946107536554337,
-0.16525273025035858,
0.11677759885787964,
0.028878824785351753,
0.09889627993106842,
-0.022909509018063545,
0.05592245236039162,
-0.15582087635993958,
-0.004728161729872227,
0.04338297247886658,
0.08257284015417099,
-0.25985872745513916,
0.21398252248764038,
0.00549066998064518,
0.06393681466579437,
-0.0736868605017662,
-0.004145832732319832,
0.04082604497671127,
0.10817988216876984,
0.14759452641010284,
-0.01179233193397522,
-0.003279786789789796,
0.002389986999332905,
-0.018972134217619896,
0.030114220455288887,
0.03209839016199112,
-0.03603427857160568,
0.04697005823254585,
-0.013253401033580303,
0.012120579369366169,
-0.003492661053314805,
0.10412657260894775,
-0.21677236258983612,
-0.12965597212314606,
0.0326184518635273,
0.018371257930994034,
0.09042468667030334,
-0.0059040323831140995,
-0.0758349671959877,
-0.1275603026151657,
0.11072033643722534,
-0.003868337022140622,
-0.030722791329026222,
-0.09795001149177551,
0.051304418593645096,
0.020672975108027458,
-0.1094018816947937,
0.027127904817461967,
0.04384046792984009,
0.11033453047275543,
-0.09086456149816513,
-0.04975484311580658,
0.04973363131284714,
-0.08984281122684479,
-0.07921170443296432,
0.048750657588243484,
0.1899934858083725,
0.099190354347229,
0.03850553184747696,
0.11018028110265732,
-0.03854331374168396,
0.030202016234397888,
-0.11379183083772659,
0.06796297430992126,
0.008192296139895916,
-0.012318698689341545,
0.023567887023091316,
-0.05095210671424866,
-0.2565556764602661,
-0.1161588504910469,
-0.01918920874595642,
0.1930583417415619,
0.1939498335123062,
0.0007631285116076469,
0.15158525109291077,
0.24783720076084137,
-0.07925029844045639,
-0.2541348934173584,
-0.06510446220636368,
-0.013191008009016514,
0.04028860852122307,
0.027246979996562004,
-0.2629857063293457,
0.06060447543859482,
0.060858506709337234,
0.002753201173618436,
-0.08769884705543518,
-0.23859842121601105,
-0.12484671920537949,
0.18259136378765106,
0.051143746823072433,
0.14580899477005005,
-0.08969146758317947,
-0.04247937723994255,
-0.06445533037185669,
-0.10696311295032501,
0.09212201088666916,
-0.10835962742567062,
0.09654761105775833,
0.04746774584054947,
-0.007195153273642063,
0.01257998961955309,
0.04497239738702774,
0.12248358130455017,
0.06851094216108322,
0.0036060705315321684,
-0.031872931867837906,
0.012887699529528618,
0.015635591000318527,
0.0289929062128067,
0.04182979464530945,
0.021338945254683495,
-0.019294871017336845,
-0.052084267139434814,
-0.1170998215675354,
-0.12044938653707504,
0.08396302163600922,
-0.05786097049713135,
-0.0179954431951046,
-0.020325975492596626,
0.09749674052000046,
0.016492251306772232,
0.014738251455128193,
-0.07128126919269562,
-0.14431317150592804,
0.034291163086891174,
0.12465301901102066,
0.24025535583496094,
-0.12967917323112488,
-0.0283490177243948,
-0.05209777131676674,
-0.04744919016957283,
0.0880470797419548,
0.005516011733561754,
0.06445056945085526,
0.0446251817047596,
0.0064246924594044685,
0.09395860135555267,
0.02568085305392742,
-0.07206396758556366,
0.013104300014674664,
0.02930871769785881,
-0.0695623829960823,
-0.26380598545074463,
-0.06498408317565918,
-0.016099663451313972,
0.017269710078835487,
0.016248859465122223,
0.18185098469257355,
-0.020139921456575394,
-0.06172686815261841,
-0.009342147968709469,
0.040379978716373444,
-0.041136205196380615,
0.06490422040224075,
0.04329070821404457,
0.04740959033370018,
-0.1094943955540657,
0.04758188873529434,
0.10121843218803406,
-0.12479782104492188,
0.04414122924208641,
0.06136633828282356,
-0.05796421691775322,
-0.09965874999761581,
-0.13896900415420532,
0.008957627229392529,
-0.013591490685939789,
-0.0759645402431488,
0.02626717835664749,
-0.16575603187084198,
0.02799280732870102,
0.08390697836875916,
0.04617762938141823,
-0.014774410985410213,
-0.06330971419811249,
-0.04010462388396263,
-0.039835382252931595,
0.011061836034059525,
0.12671974301338196,
-0.06742659211158752,
-0.13290748000144958,
0.15819698572158813,
0.01738016866147518,
0.10062756389379501,
-0.04340886324644089,
-0.052816566079854965,
-0.12853620946407318,
0.022042131051421165,
-0.1269271969795227,
0.017735743895173073,
-0.1224503144621849,
-0.00006683952233288437,
-0.050586242228746414,
-0.01634027436375618,
-0.024744616821408272,
0.03255243971943855,
-0.10345669835805893,
0.011026625521481037,
-0.0012785348808392882,
0.07895883172750473,
-0.10138919204473495,
0.07027051597833633,
0.07033471763134003,
-0.01942790299654007,
0.09012948721647263,
0.020702581852674484,
-0.0435207225382328,
0.0986710637807846,
-0.17700131237506866,
-0.044125206768512726,
0.03512537106871605,
0.028211204335093498,
-0.008525855839252472,
-0.16185680031776428,
0.01192108727991581,
0.028255674988031387,
0.04320265352725983,
-0.0012530313106253743,
0.0765567347407341,
-0.07041804492473602,
-0.012423151172697544,
-0.05386194959282875,
-0.08578299731016159,
-0.028308365494012833,
0.07219935953617096,
0.09555370360612869,
0.024114910513162613,
0.11274851858615875,
-0.07745421677827835,
0.06378056854009628,
-0.1055883839726448,
0.06708040088415146,
-0.04760323092341423,
-0.028516007587313652,
0.011938213370740414,
-0.1274145096540451,
0.06181212514638901,
-0.006671996787190437,
0.06189598888158798,
0.005613863002508879,
-0.010432897135615349,
0.0111915934830904,
-0.09761224687099457,
-0.10116661339998245,
0.025684691965579987,
0.12582933902740479,
0.07563456147909164,
-0.01125818770378828,
0.036868803203105927,
0.008207147009670734,
0.011300114914774895,
0.21089237928390503,
0.20651867985725403,
0.21423788368701935,
0.048058394342660904,
0.07485632598400116,
0.012608981691300869,
-0.05695078521966934,
-0.041745081543922424,
0.016462532803416252,
-0.06932247430086136,
0.029672039672732353,
-0.06830146163702011,
-0.029375478625297546,
0.08343005180358887,
-0.1380305290222168,
0.12329670041799545,
0.013450770638883114,
-0.08478683233261108,
-0.1463557928800583,
-0.17990551888942719,
-0.061287157237529755,
-0.08571264892816544,
-0.023484166711568832,
-0.12101030349731445,
-0.03013560362160206,
0.03089916706085205,
0.0295436792075634,
-0.12161906808614731,
0.07814487814903259,
-0.13726049661636353,
-0.15989835560321808,
0.1910056620836258,
-0.04367736726999283,
0.010356617160141468,
-0.02362317591905594,
0.011830261908471584,
0.002505665412172675,
0.10156133770942688,
0.02835656888782978,
0.03825320675969124,
-0.0249809380620718,
0.0390012264251709,
-0.07947088778018951,
-0.062053851783275604,
-0.0035688765347003937,
0.03895382955670357,
0.1336767077445984,
0.20708854496479034,
0.03647429123520851,
-0.06799126416444778,
0.006160508841276169,
0.16523297131061554,
0.03242654353380203,
-0.11373817175626755,
-0.13492046296596527,
0.08071698993444443,
0.018963459879159927,
0.01121131144464016,
-0.021288424730300903,
-0.06018628552556038,
0.011743990704417229,
0.27479007840156555,
0.18085746467113495,
-0.0660768374800682,
0.031040972098708153,
0.006339925806969404,
0.03363583981990814,
0.08374262601137161,
0.11230812221765518,
0.09264172613620758,
0.18690788745880127,
-0.04489900544285774,
0.01180903147906065,
-0.011719231493771076,
-0.05528159812092781,
-0.08476033806800842,
0.14078734815120697,
0.03890029340982437,
-0.07195857167243958,
-0.0016656328225508332,
0.15080051124095917,
-0.14398735761642456,
-0.07404425740242004,
-0.078471340239048,
-0.05464896187186241,
-0.11034879088401794,
-0.013993732631206512,
-0.050196364521980286,
0.1077805757522583,
0.10497969388961792,
-0.02108626998960972,
-0.009579659439623356,
0.17486967146396637,
0.050861939787864685,
-0.0013619940727949142,
-0.029274558648467064,
0.1270827054977417,
0.0019613855984061956,
0.06940505653619766,
-0.014864692464470863,
0.0765693336725235,
0.057965368032455444,
0.04572713375091553,
-0.01974315568804741,
0.0620214082300663,
0.002009802730754018,
0.03334204852581024,
0.08741369098424911,
0.1323573738336563,
0.012667221948504448,
0.027047760784626007,
0.08531893044710159,
-0.15494441986083984,
0.02421773225069046,
0.04670422151684761,
-0.042617637664079666,
-0.018457744270563126,
0.1702324002981186,
-0.18873655796051025,
0.05064557492733002,
0.15381373465061188,
-0.03761270269751549,
-0.03656311333179474,
-0.04176101088523865,
0.058852776885032654,
-0.025515425950288773,
0.049317002296447754,
-0.038290198892354965,
-0.1406015008687973,
0.009255304001271725,
-0.07889437675476074,
0.029138097539544106,
-0.18978999555110931,
-0.008546625263988972,
-0.03284815698862076,
-0.007100983057171106,
-0.04506630823016167,
0.08566021174192429,
0.010112454183399677,
-0.050020985305309296,
0.019485505297780037,
-0.07240902632474899,
0.010788952000439167,
0.1070505827665329,
-0.0967620387673378,
-0.04705754667520523
] |
null | null |
transformers
|
# Wav2Vec2-Large-VoxPopuli
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) large model pretrained on the Italian (it) unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the [official website](https://github.com/facebookresearch/voxpopuli/) for more information.
# Fine-Tuning
Please refer to [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) on how to fine-tune this model on a specific language. Note that you should replace `"facebook/wav2vec2-large-xlsr-53"` with this checkpoint for fine-tuning.
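For illustration only, the sketch below pairs this checkpoint with a newly built processor before fine-tuning; `vocab.json` and the special tokens are assumptions standing in for the character vocabulary you would derive from your own Italian transcripts.

```python
from transformers import (
    Wav2Vec2CTCTokenizer,
    Wav2Vec2FeatureExtractor,
    Wav2Vec2ForCTC,
    Wav2Vec2Processor,
)

# "vocab.json" is a hypothetical character-level vocabulary built from your
# Italian training transcripts, as described in the fine-tuning blog.
tokenizer = Wav2Vec2CTCTokenizer(
    "vocab.json", unk_token="[UNK]", pad_token="[PAD]", word_delimiter_token="|"
)
feature_extractor = Wav2Vec2FeatureExtractor(
    feature_size=1, sampling_rate=16000, padding_value=0.0,
    do_normalize=True, return_attention_mask=True,
)
processor = Wav2Vec2Processor(feature_extractor=feature_extractor, tokenizer=tokenizer)

# Swap in this checkpoint instead of "facebook/wav2vec2-large-xlsr-53".
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-large-it-voxpopuli",
    vocab_size=len(processor.tokenizer),
)
```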
|
{"language": "it", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-it-voxpopuli
|
[
"transformers",
"pytorch",
"jax",
"safetensors",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli",
"it",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"it"
] |
TAGS
#transformers #pytorch #jax #safetensors #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #it #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
|
# Wav2Vec2-Large-VoxPopuli
Facebook's Wav2Vec2 large model pretrained on the it unlabeled subset of VoxPopuli corpus.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the official website for more information, here
# Fine-Tuning
Please refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '"facebook/wav2vec2-large-xlsr-53"' with this checkpoint for fine-tuning.
|
[
"# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the it unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
"TAGS\n#transformers #pytorch #jax #safetensors #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #it #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the it unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
77,
132,
57
] |
[
"passage: TAGS\n#transformers #pytorch #jax #safetensors #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #it #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the it unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
-0.07195164263248444,
0.05838979035615921,
-0.0047736624255776405,
0.004916141275316477,
0.11391480267047882,
-0.027032919228076935,
0.09912022948265076,
0.01159925851970911,
0.022193174809217453,
0.0199003666639328,
0.010929740034043789,
0.04334107041358948,
0.06804466992616653,
0.09228279441595078,
0.006142390426248312,
-0.27372193336486816,
0.058721575886011124,
-0.00802068691700697,
0.0834059864282608,
0.06567596644163132,
0.12575238943099976,
-0.09394001960754395,
0.010161525569856167,
0.047176554799079895,
-0.04900767654180527,
0.002825811505317688,
0.04021187871694565,
-0.09565041214227676,
0.11999990791082382,
0.07278742641210556,
0.08362525701522827,
0.05016227066516876,
0.03389465808868408,
-0.18028341233730316,
0.03878801688551903,
0.04358183220028877,
-0.04225652292370796,
0.00527028227224946,
0.1227181926369667,
-0.042103733867406845,
0.17928123474121094,
-0.04674049839377403,
-0.0383901409804821,
0.09322293847799301,
-0.12278036773204803,
-0.16700045764446259,
-0.05679135397076607,
0.155289426445961,
0.11733497679233551,
0.09648093581199646,
-0.0741352066397667,
0.05061783269047737,
-0.051521461457014084,
0.07468465715646744,
0.10342149436473846,
-0.29881277680397034,
-0.03773903846740723,
0.11395169049501419,
0.07627051323652267,
-0.044344305992126465,
-0.10992928594350815,
0.0733494684100151,
0.007521705236285925,
-0.002307215938344598,
-0.0039739892818033695,
-0.09014538675546646,
0.04260227456688881,
-0.08847430348396301,
-0.09848403930664062,
-0.015439992770552635,
0.20225967466831207,
0.026860948652029037,
-0.062461089342832565,
-0.09661564230918884,
-0.02521420456469059,
0.19746379554271698,
-0.06543751806020737,
-0.1296529620885849,
0.010021692141890526,
0.03527356684207916,
0.04737069085240364,
-0.13985253870487213,
-0.0992356538772583,
-0.014975261874496937,
-0.038378529250621796,
0.11173329502344131,
0.027958808466792107,
-0.026812341064214706,
-0.06171621382236481,
0.03242940083146095,
-0.05294356867671013,
-0.07921247184276581,
0.004456701688468456,
-0.08661579340696335,
-0.050529222935438156,
-0.008784415200352669,
-0.05748721957206726,
-0.0926460325717926,
0.012589439749717712,
0.09711356461048126,
-0.0011011185124516487,
0.04012339189648628,
-0.0311790332198143,
0.031293224543333054,
0.014521291479468346,
0.06356281787157059,
-0.11576411128044128,
0.02046973630785942,
0.03849530220031738,
-0.016950886696577072,
0.020848775282502174,
-0.01738090254366398,
-0.046305689960718155,
-0.0517582967877388,
0.03403465822339058,
0.07462076842784882,
0.04254067689180374,
0.027294015511870384,
-0.08748498558998108,
-0.07144982367753983,
0.07662516087293625,
-0.07454924285411835,
0.014725920744240284,
0.055446598678827286,
0.009909291751682758,
0.12226574122905731,
-0.008164997212588787,
0.07117028534412384,
-0.144408717751503,
0.033316660672426224,
-0.010811141692101955,
0.01201690174639225,
-0.0041320729069411755,
-0.0604238361120224,
0.02757371962070465,
0.010088488459587097,
-0.01799711212515831,
-0.15665146708488464,
-0.05774030461907387,
-0.0802173912525177,
-0.01583274081349373,
-0.02201364003121853,
-0.06347735971212387,
-0.03825259953737259,
-0.003826090134680271,
-0.01669301651418209,
-0.02843203768134117,
-0.03433462604880333,
-0.025941502302885056,
0.03711140155792236,
-0.02103034034371376,
0.06283997744321823,
-0.02437940612435341,
0.06423904001712799,
-0.015321399085223675,
-0.03537657484412193,
-0.1347527951002121,
0.09370677173137665,
-0.06854096055030823,
-0.047878775745630264,
-0.1434004008769989,
-0.048694491386413574,
-0.05743337422609329,
0.05035475268959999,
0.0011963300639763474,
0.15513715147972107,
-0.16228409111499786,
-0.09093174338340759,
0.29082754254341125,
-0.1163688451051712,
0.015308033674955368,
0.19010283052921295,
0.032791342586278915,
0.02967572771012783,
0.1633881777524948,
0.12698078155517578,
0.02979409322142601,
-0.13684210181236267,
0.02788812480866909,
-0.054290853440761566,
-0.011608613654971123,
0.05753627419471741,
0.053776271641254425,
-0.037907082587480545,
-0.001431280397810042,
-0.007152073550969362,
-0.05468243360519409,
-0.051765743643045425,
-0.004626439884305,
-0.055341094732284546,
0.029073024168610573,
-0.026857582852244377,
0.0707656666636467,
0.015724262222647667,
-0.01236032322049141,
-0.016468292102217674,
-0.08628115057945251,
-0.07395859062671661,
0.08124560117721558,
-0.043877117335796356,
0.06242428347468376,
-0.09626073390245438,
0.049272321164608,
0.1054377406835556,
0.05634479597210884,
-0.14567263424396515,
0.04713885486125946,
-0.013362813740968704,
0.09054224193096161,
0.09167581051588058,
0.16155439615249634,
-0.022262951359152794,
-0.03643007203936577,
-0.07721907645463943,
-0.003816504729911685,
-0.017645522952079773,
-0.03685130551457405,
-0.02870471216738224,
-0.09320986270904541,
-0.005690119229257107,
-0.038009319454431534,
0.09590484201908112,
-0.18980735540390015,
0.013645821250975132,
0.05838486924767494,
0.06010909751057625,
0.022356251254677773,
0.004371558781713247,
0.04036891460418701,
0.08845214545726776,
0.029908480122685432,
0.013965154066681862,
0.09752450883388519,
-0.006578906439244747,
-0.042628634721040726,
0.10029587149620056,
-0.0999692976474762,
0.033223703503608704,
0.1362248808145523,
-0.08759541809558868,
-0.010082556866109371,
-0.006045394577085972,
0.020688503980636597,
0.004979672841727734,
-0.0039534131065011024,
-0.019284123554825783,
0.1942155808210373,
0.01721581071615219,
0.08509742468595505,
-0.08711716532707214,
0.006153229158371687,
-0.015512526035308838,
-0.06189441680908203,
-0.047752298414707184,
0.10928267985582352,
-0.020959988236427307,
-0.11035704612731934,
0.003611259162425995,
0.1176537349820137,
-0.005504563916474581,
0.12187584489583969,
0.017333216965198517,
-0.009166589006781578,
0.013183868490159512,
-0.03454234078526497,
0.003156256163492799,
0.004756701644510031,
-0.13247579336166382,
-0.024653246626257896,
0.02248045802116394,
0.026018086820840836,
0.04937375709414482,
-0.08549214899539948,
-0.008248823694884777,
-0.004097966011613607,
-0.08753550797700882,
-0.02719031274318695,
0.03303796797990799,
-0.00986374169588089,
0.08541302382946014,
-0.04682769253849983,
0.008419610559940338,
-0.013162516057491302,
-0.04390816390514374,
-0.11763696372509003,
0.12408006191253662,
-0.07678116858005524,
-0.3819122314453125,
-0.06973586231470108,
-0.06363773345947266,
-0.06422871351242065,
0.03269749879837036,
0.06685568392276764,
-0.08729369193315506,
-0.07209539413452148,
-0.021750885993242264,
0.12246496230363846,
-0.0010243345750495791,
-0.08808980882167816,
0.008446714840829372,
0.024896658957004547,
0.012323747389018536,
-0.10390555113554001,
0.009369045495986938,
-0.03376186639070511,
-0.14681068062782288,
0.03339648246765137,
-0.0017292723059654236,
0.0487603135406971,
0.1127171590924263,
0.046489130705595016,
-0.03979360684752464,
-0.029991846531629562,
0.1867227852344513,
-0.11032819747924805,
0.061641376465559006,
0.27442505955696106,
0.0074324537999928,
0.02411516383290291,
0.13369162380695343,
0.0012325247516855597,
-0.05660899728536606,
0.02401428297162056,
0.04661927744746208,
0.004868644289672375,
-0.25734031200408936,
-0.1257663518190384,
-0.050568338483572006,
-0.00570794939994812,
0.03214491531252861,
0.028316063806414604,
-0.0006364444852806628,
0.037063177675008774,
-0.11370235681533813,
-0.08628984540700912,
0.09293768554925919,
0.038688186556100845,
0.18775150179862976,
-0.0336301214993,
0.1321636289358139,
-0.0365489162504673,
-0.023197785019874573,
0.07292738556861877,
0.030429020524024963,
0.044666897505521774,
0.07925353199243546,
0.06888848543167114,
0.088477224111557,
0.0775773674249649,
0.025020740926265717,
0.002776733133941889,
-0.01733747124671936,
-0.012301496230065823,
-0.04509502649307251,
-0.04556481912732124,
-0.045616038143634796,
-0.003482070518657565,
0.11012045294046402,
-0.11728142946958542,
-0.13899505138397217,
-0.003492808435112238,
0.04947138950228691,
0.13542231917381287,
0.08819583803415298,
-0.061929088085889816,
-0.07000461220741272,
0.04210485517978668,
-0.07860545068979263,
-0.0378158874809742,
0.06357482820749283,
0.09358341246843338,
-0.17025664448738098,
0.1065485030412674,
0.04065369814634323,
0.09646499156951904,
-0.04253540560603142,
0.046137209981679916,
-0.14848360419273376,
0.005832748021930456,
0.029873354360461235,
0.07657630741596222,
-0.2601895034313202,
0.20262648165225983,
0.024091338738799095,
0.08875461667776108,
-0.07085791975259781,
-0.016819361597299576,
0.049920208752155304,
0.08652620762586594,
0.1374170184135437,
-0.016557106748223305,
-0.01099186111241579,
-0.030547169968485832,
-0.049712806940078735,
0.0389743447303772,
0.03299002721905708,
-0.03318922221660614,
0.044402677565813065,
-0.016296247020363808,
0.0015569201204925776,
-0.011198085732758045,
0.06813377887010574,
-0.2668392062187195,
-0.12851949036121368,
0.018037661910057068,
0.05628516897559166,
0.10279185324907303,
-0.04168061167001724,
-0.0688304677605629,
-0.10944905877113342,
0.11473777890205383,
-0.03415452688932419,
-0.02041800133883953,
-0.0923163965344429,
0.026545654982328415,
-0.0059429737739264965,
-0.10790124535560608,
0.025878919288516045,
0.030970498919487,
0.10750264674425125,
-0.08802201598882675,
-0.037256572395563126,
0.05850747227668762,
-0.09999486058950424,
-0.09748055040836334,
0.025121405720710754,
0.19437608122825623,
0.10970347374677658,
0.038224730640649796,
0.10808627307415009,
-0.031079532578587532,
0.023644475266337395,
-0.1132960319519043,
0.049591463059186935,
0.012251541949808598,
0.03074955753982067,
0.04215768352150917,
-0.06664153188467026,
-0.30971860885620117,
-0.1184905394911766,
-0.022738764062523842,
0.1872813105583191,
0.20886407792568207,
0.0074625625275075436,
0.14732544124126434,
0.26297029852867126,
-0.07919841259717941,
-0.2752301096916199,
-0.08011740446090698,
-0.045056797564029694,
0.02817639335989952,
0.0318741500377655,
-0.22686819732189178,
0.07502470910549164,
0.0675269290804863,
-0.009353012777864933,
-0.08587097376585007,
-0.21171338856220245,
-0.13288508355617523,
0.18045586347579956,
0.059099502861499786,
0.1140533983707428,
-0.11607415974140167,
-0.056514374911785126,
-0.06563341617584229,
-0.10704038292169571,
0.08077654242515564,
-0.12830999493598938,
0.09951570630073547,
0.05130688101053238,
-0.033341988921165466,
0.007679373025894165,
0.03130832314491272,
0.10369087755680084,
0.03888295963406563,
0.013497908599674702,
-0.04611457139253616,
0.02505430206656456,
0.0169663168489933,
0.017113229259848595,
0.03834099322557449,
-0.0016622194088995457,
-0.0050673699006438255,
-0.05406099930405617,
-0.09084257483482361,
-0.10829522460699081,
0.09991331398487091,
-0.06946741789579391,
-0.013506566174328327,
-0.017263496294617653,
0.0989302471280098,
0.011994138360023499,
0.018305344507098198,
-0.037344157695770264,
-0.13995122909545898,
0.05617832764983177,
0.08916210383176804,
0.23923155665397644,
-0.12515521049499512,
-0.0019391888054087758,
-0.053790368139743805,
-0.0545664057135582,
0.07339108735322952,
0.02730674296617508,
0.05942579731345177,
0.0363764725625515,
0.003051351523026824,
0.09245043247938156,
0.01699555665254593,
-0.07013310492038727,
0.023011157289147377,
0.04453219845890999,
-0.0971638560295105,
-0.22563907504081726,
-0.05407913401722908,
0.0273782629519701,
0.002092202892526984,
0.028248567134141922,
0.17257989943027496,
-0.0072918119840323925,
-0.06361714750528336,
-0.013711590319871902,
0.04372115433216095,
-0.011152654886245728,
0.0641556829214096,
0.045530419796705246,
0.04302573204040527,
-0.09543588012456894,
0.05798928812146187,
0.09387243539094925,
-0.14785191416740417,
0.04094093665480614,
0.07783009856939316,
-0.06290599703788757,
-0.10344135016202927,
-0.12058590352535248,
0.01696719601750374,
0.02730562351644039,
-0.08310052007436752,
0.02383030205965042,
-0.14664742350578308,
0.019697371870279312,
0.0874682143330574,
0.03376004099845886,
-0.011659997515380383,
-0.026447277516126633,
-0.03241663798689842,
-0.04086998850107193,
0.036275312304496765,
0.11371557414531708,
-0.06572108715772629,
-0.12471194565296173,
0.13185462355613708,
0.022160828113555908,
0.07435836642980576,
-0.03511497750878334,
-0.047157470136880875,
-0.14796698093414307,
0.011618414893746376,
-0.09890304505825043,
0.031660862267017365,
-0.1537293791770935,
0.0033548257779330015,
-0.04737007990479469,
-0.02063017711043358,
-0.016552098095417023,
0.028742153197526932,
-0.06880397349596024,
0.00124550168402493,
0.003874481189996004,
0.08822695910930634,
-0.13796623051166534,
0.04509774222970009,
0.061181146651506424,
-0.021932901814579964,
0.10496704280376434,
0.018323222175240517,
-0.04749182611703873,
0.07124591618776321,
-0.23557469248771667,
-0.039756905287504196,
0.03880821913480759,
0.023022431880235672,
-0.01785285584628582,
-0.166466623544693,
0.00528378551825881,
0.026990696787834167,
0.040603891015052795,
0.006392563693225384,
0.10155332833528519,
-0.06880734860897064,
-0.0014655080158263445,
-0.05311853811144829,
-0.0809381902217865,
-0.03376416489481926,
0.053308140486478806,
0.08868007361888885,
-0.004024895839393139,
0.12749330699443817,
-0.11087147891521454,
0.06036857143044472,
-0.11045605689287186,
0.07004058361053467,
-0.04244290664792061,
-0.04870135337114334,
-0.04751032590866089,
-0.08567460626363754,
0.06828479468822479,
-0.0010450093541294336,
0.04202818498015404,
0.0012976392172276974,
-0.02969273552298546,
0.03009365312755108,
-0.10287529230117798,
-0.0784195140004158,
0.026291606947779655,
0.1243712529540062,
0.06725668907165527,
-0.026841258630156517,
0.033503785729408264,
0.0019277742831036448,
0.011903739534318447,
0.14905990660190582,
0.1938183307647705,
0.23149321973323822,
0.07455023378133774,
0.07189730554819107,
-0.005153573118150234,
-0.045685477554798126,
-0.03911054506897926,
-0.00562346400693059,
-0.07481132447719574,
0.02161521650850773,
-0.06847770512104034,
-0.030542317777872086,
0.11147250980138779,
-0.12264798581600189,
0.08330285549163818,
-0.014977630227804184,
-0.07263467460870743,
-0.14650656282901764,
-0.1859438568353653,
-0.06254101544618607,
-0.04242416098713875,
-0.0296370480209589,
-0.12550795078277588,
-0.041755311191082,
0.027194146066904068,
0.01857484132051468,
-0.10578935593366623,
0.09512397646903992,
-0.09913942217826843,
-0.12979571521282196,
0.17596949636936188,
-0.03235515207052231,
0.021421272307634354,
-0.014708931557834148,
0.011372948996722698,
-0.00006827037577750161,
0.08478940278291702,
0.034531205892562866,
0.05132240056991577,
-0.020738335326313972,
0.05566300079226494,
-0.09427158534526825,
-0.077121302485466,
0.011431953869760036,
0.04878891631960869,
0.13513138890266418,
0.21360713243484497,
0.05469093844294548,
-0.06990709900856018,
0.028346648439764977,
0.19104601442813873,
0.03730834275484085,
-0.11941124498844147,
-0.11662601679563522,
0.10662548989057541,
-0.0008478161180391908,
0.0028552599251270294,
-0.025845663622021675,
-0.07672055065631866,
0.03902655094861984,
0.27087563276290894,
0.1939740628004074,
-0.05912245810031891,
0.055079322308301926,
-0.016185356304049492,
0.02781682461500168,
0.0744764432311058,
0.10417360067367554,
0.10244928300380707,
0.21293863654136658,
-0.03154318779706955,
0.010697592981159687,
-0.000027683017833624035,
-0.041137099266052246,
-0.11764895915985107,
0.10518165677785873,
0.02010325714945793,
-0.05138445273041725,
0.005294652655720711,
0.15341436862945557,
-0.1146911233663559,
-0.07185790687799454,
-0.08273518830537796,
-0.031046679243445396,
-0.09171409904956818,
0.0016817388823255897,
-0.011844268068671227,
0.11052349209785461,
0.08791085332632065,
-0.011568734422326088,
0.0005468666786327958,
0.1457437425851822,
0.037868682295084,
0.017894353717565536,
-0.009992947801947594,
0.09809163212776184,
-0.010497220791876316,
0.11846870183944702,
-0.014181270264089108,
0.06250558793544769,
0.070968396961689,
0.03389120101928711,
-0.0547720342874527,
0.07039031386375427,
0.009425003081560135,
0.015276250429451466,
0.08691518753767014,
0.10765697807073593,
0.021784014999866486,
0.04724104702472687,
0.11446366459131241,
-0.1391659677028656,
0.04527262598276138,
0.051343884319067,
-0.025764308869838715,
-0.031167037785053253,
0.1956465095281601,
-0.17609606683254242,
0.06420326232910156,
0.12352779507637024,
-0.044577304273843765,
-0.032970041036605835,
-0.03702111169695854,
0.03359126299619675,
-0.036213457584381104,
0.03968705236911774,
-0.031051378697156906,
-0.13556775450706482,
0.032310958951711655,
-0.08103287220001221,
0.022760510444641113,
-0.2507588565349579,
-0.009499474428594112,
-0.040693413466215134,
-0.011842749081552029,
-0.06893050670623779,
0.07820110023021698,
0.019791075959801674,
-0.04588577151298523,
0.012417925521731377,
-0.07424045354127884,
0.005724545102566481,
0.11897817254066467,
-0.09326750785112381,
-0.060714855790138245
] |
null | null |
transformers
|
# Wav2Vec2-Large-LV60
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/)
The large model pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model **for speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for a more detailed explanation of how to fine-tune the model.
[Paper](https://arxiv.org/abs/2006.11477)
Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli
**Abstract**
We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
See [this notebook](https://colab.research.google.com/drive/1FjTsqbYKphl9kL-eILgUc-bl4zVThL8F?usp=sharing) for more information on how to fine-tune the model.
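Independently of fine-tuning, the pretrained checkpoint can be used as-is to extract latent speech representations, since that requires no tokenizer. The snippet below is a minimal, hedged sketch; the zero-filled waveform is a placeholder for real 16kHz audio.

```python
import numpy as np
import torch
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

# Hedged sketch: feature extraction with the pretraining-only checkpoint.
feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained("facebook/wav2vec2-large-lv60")
model = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-large-lv60")

speech = np.zeros(16000, dtype=np.float32)  # placeholder: one second of 16kHz audio
inputs = feature_extractor(speech, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    hidden_states = model(**inputs).last_hidden_state  # (batch, frames, hidden_size)
```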
|
{"language": "en", "license": "apache-2.0", "tags": ["speech"], "datasets": ["librispeech_asr"]}
| null |
facebook/wav2vec2-large-lv60
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"pretraining",
"speech",
"en",
"dataset:librispeech_asr",
"arxiv:2006.11477",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2006.11477"
] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #pretraining #speech #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #region-us
|
# Wav2Vec2-Large-LV60
Facebook's Wav2Vec2
The base model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for more in-detail explanation of how to fine-tune the model.
Paper
Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli
Abstract
We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.
The original model can be found under URL
# Usage
See this notebook for more information on how to fine-tune the model.
|
[
"# Wav2Vec2-Large-LV60 \n\nFacebook's Wav2Vec2\n\nThe base model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for more in-detail explanation of how to fine-tune the model.\n\nPaper\n\nAuthors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli\n\nAbstract\nWe show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.\nThe original model can be found under URL",
"# Usage\n\nSee this notebook for more information on how to fine-tune the model."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #pretraining #speech #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-LV60 \n\nFacebook's Wav2Vec2\n\nThe base model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for more in-detail explanation of how to fine-tune the model.\n\nPaper\n\nAuthors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli\n\nAbstract\nWe show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.\nThe original model can be found under URL",
"# Usage\n\nSee this notebook for more information on how to fine-tune the model."
] |
[
65,
361,
18
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #pretraining #speech #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #region-us \n# Wav2Vec2-Large-LV60 \n\nFacebook's Wav2Vec2\n\nThe base model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for more in-detail explanation of how to fine-tune the model.\n\nPaper\n\nAuthors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli\n\nAbstract\nWe show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.\nThe original model can be found under URL# Usage\n\nSee this notebook for more information on how to fine-tune the model."
] |
[
-0.1370934545993805,
0.03054356947541237,
-0.0020283511839807034,
-0.007016628980636597,
0.00704759219661355,
-0.028596768155694008,
0.09637892246246338,
0.07201363891363144,
-0.04291727766394615,
0.09185770899057388,
-0.00873942207545042,
-0.09867365658283234,
0.06170785427093506,
0.10888813436031342,
0.06735781580209732,
-0.24723240733146667,
0.031037066131830215,
-0.03826996311545372,
0.2206706553697586,
0.05113884061574936,
0.101468525826931,
-0.11617925018072128,
-0.020784365013241768,
0.03673499822616577,
-0.06659980118274689,
-0.02974359132349491,
-0.018088141456246376,
-0.07322359085083008,
0.08948143571615219,
-0.0027687977999448776,
0.021207910031080246,
0.08402515202760696,
0.06939729303121567,
-0.14070042967796326,
0.009146100841462612,
0.06172781437635422,
0.07089003920555115,
0.04035765305161476,
0.09089535474777222,
0.030207451432943344,
0.10074515640735626,
-0.05065183714032173,
-0.019534537568688393,
0.07736559957265854,
-0.09426990896463394,
-0.13941623270511627,
-0.11204319447278976,
0.052275922149419785,
0.05508243292570114,
0.06997624784708023,
-0.07190567255020142,
0.019972337409853935,
0.013004906475543976,
0.06181776523590088,
0.0856345072388649,
-0.26502805948257446,
0.0023626829497516155,
-0.05477950721979141,
0.03667672351002693,
0.04057076945900917,
-0.05219678208231926,
0.05566314980387688,
0.043115098029375076,
0.010775689966976643,
0.061793308705091476,
-0.010116009041666985,
-0.007291181944310665,
-0.08021601289510727,
-0.17482735216617584,
0.0016108729178085923,
0.08578745275735855,
0.0055917734280228615,
-0.11999239772558212,
-0.19852140545845032,
-0.060527194291353226,
-0.030192706733942032,
-0.03229347616434097,
-0.05054106190800667,
0.028406530618667603,
0.02709967829287052,
0.05376685783267021,
-0.04671664908528328,
-0.098683662712574,
0.001415742328390479,
-0.047404564917087555,
0.07207482308149338,
0.028906013816595078,
0.012401130981743336,
-0.009363795630633831,
0.03101491555571556,
-0.17492200434207916,
-0.07466448098421097,
-0.0434347502887249,
-0.014980015344917774,
-0.09412144124507904,
-0.0879385843873024,
-0.036034103482961655,
-0.22344540059566498,
-0.03487829864025116,
0.04788462445139885,
-0.023970073089003563,
0.04051247239112854,
-0.035371098667383194,
0.04695087671279907,
0.0804307833313942,
0.21601979434490204,
-0.06221423298120499,
-0.10071803629398346,
0.04652830585837364,
0.009538762271404266,
0.028819123283028603,
-0.021820679306983948,
-0.03939627856016159,
-0.01480842474848032,
0.051481518894433975,
0.018369290977716446,
-0.02566223405301571,
-0.0008416698547080159,
-0.07493328303098679,
-0.013355234637856483,
-0.002914148848503828,
-0.13826148211956024,
0.016437487676739693,
0.01615292765200138,
-0.017553305253386497,
0.06300078332424164,
0.07620250433683395,
0.007445844821631908,
-0.1021820679306984,
-0.0001872821885626763,
0.007309506647288799,
0.0010040083434432745,
-0.02990582585334778,
-0.09578923135995865,
0.023802833631634712,
-0.021076997742056847,
-0.07245874404907227,
-0.11863779276609421,
-0.0625336542725563,
-0.044339511543512344,
0.015372810885310173,
-0.028524545952677727,
0.011085380800068378,
-0.05255122855305672,
0.011489318683743477,
0.003939250949770212,
-0.026656916365027428,
-0.05151473730802536,
-0.01629437319934368,
0.0024958874564617872,
0.0942314937710762,
0.08855225145816803,
0.02082562819123268,
0.056613121181726456,
-0.05365007370710373,
0.01980437897145748,
-0.16605250537395477,
0.14064234495162964,
-0.029676467180252075,
-0.0641472265124321,
-0.11496087163686752,
-0.020016882568597794,
-0.0677165612578392,
0.04751477390527725,
0.06296088546514511,
0.06396054476499557,
-0.22557184100151062,
-0.05789065361022949,
0.15351806581020355,
-0.10135645419359207,
-0.020591232925653458,
0.1600109189748764,
0.01997949369251728,
0.14824248850345612,
0.114107146859169,
0.14710405468940735,
0.08840825408697128,
-0.18771614134311676,
-0.0703921690583229,
-0.011645476333796978,
0.0034090715926140547,
0.0898158922791481,
0.06647147238254547,
-0.05064442381262779,
0.06206706538796425,
-0.00581024494022131,
0.03183921054005623,
0.04692729935050011,
-0.022273460403084755,
-0.04084146022796631,
0.010303850285708904,
-0.0813252404332161,
0.03646669164299965,
-0.0008135744137689471,
0.007515460718423128,
-0.01101645827293396,
-0.10926200449466705,
-0.007226402405649424,
0.07916102558374405,
-0.07712437957525253,
0.057819120585918427,
-0.12259893864393234,
0.018951939418911934,
-0.07621190696954727,
0.0011638359865173697,
-0.18848951160907745,
0.12925410270690918,
0.059747252613306046,
0.03875604644417763,
0.08297115564346313,
0.034624870866537094,
0.010557095520198345,
0.05045774579048157,
-0.010638991370797157,
-0.011817642487585545,
-0.004647564142942429,
0.009942688047885895,
-0.08833350241184235,
-0.06068666651844978,
-0.02608785405755043,
-0.04793441295623779,
0.033863991498947144,
-0.07212196290493011,
-0.022176245227456093,
0.06840627640485764,
0.024226125329732895,
0.021165059879422188,
-0.05296379327774048,
0.05395715683698654,
0.044576652348041534,
0.01636543497443199,
-0.04324416071176529,
0.034296322613954544,
0.009367038495838642,
0.021449895575642586,
0.09204206615686417,
-0.19577829539775848,
-0.17674343287944794,
0.056319933384656906,
0.0039267102256417274,
-0.0747642070055008,
0.1303510218858719,
-0.030827553942799568,
-0.04179837927222252,
-0.11351398378610611,
-0.06966516375541687,
0.2421737015247345,
0.007008797489106655,
0.122907355427742,
-0.06340914219617844,
-0.0245713721960783,
0.02719014324247837,
0.001808459055610001,
-0.04153301194310188,
0.050190269947052,
-0.006200376432389021,
-0.10742420703172684,
-0.03462586924433708,
0.044126205146312714,
0.03751014918088913,
0.13751576840877533,
-0.020870612934231758,
-0.14316460490226746,
-0.012524758465588093,
-0.022166714072227478,
-0.011411617510020733,
0.11295150220394135,
-0.0448734425008297,
-0.09144710749387741,
0.0023588810581713915,
0.03981349244713783,
0.060182102024555206,
-0.11315131187438965,
0.12530039250850677,
0.05801413580775261,
-0.052487023174762726,
-0.03992091864347458,
-0.030865222215652466,
-0.03128344938158989,
0.06027254834771156,
0.011673429049551487,
-0.01592334173619747,
-0.03700539842247963,
-0.033759962767362595,
-0.15241169929504395,
0.07045310735702515,
-0.07646654546260834,
-0.2734326124191284,
-0.06554599851369858,
0.016799502074718475,
0.04105065390467644,
0.014223126694560051,
0.017242345958948135,
-0.011925791390240192,
-0.06956858932971954,
-0.10808724910020828,
0.1646425426006317,
-0.055319976061582565,
0.03224711865186691,
0.07979362457990646,
-0.024352988228201866,
0.03676522523164749,
-0.10186159610748291,
0.004827938973903656,
-0.04668588191270828,
0.0033526543993502855,
0.013217108324170113,
0.005231618415564299,
0.02350739575922489,
0.1648540198802948,
-0.028475597500801086,
-0.002752210246399045,
-0.05563391372561455,
0.19478915631771088,
-0.07837451249361038,
0.09466664493083954,
0.1568000614643097,
-0.06744880229234695,
0.01489526778459549,
0.07170140743255615,
-0.005492271389812231,
-0.06943310797214508,
0.02297232858836651,
-0.0023302582558244467,
-0.03031677007675171,
-0.13448569178581238,
-0.08157670497894287,
-0.06370653957128525,
0.04141915589570999,
0.06223408505320549,
0.0044503421522676945,
-0.0796523168683052,
-0.037390582263469696,
-0.0283330250531435,
-0.005734461359679699,
0.08776899427175522,
0.02870945818722248,
0.05347971245646477,
-0.05973300710320473,
0.05818014591932297,
-0.06556978076696396,
-0.024912625551223755,
0.06239909678697586,
0.010113733820617199,
0.13908328115940094,
0.05647728592157364,
0.1373942345380783,
0.0662604346871376,
0.0448894128203392,
0.054608818143606186,
0.048402395099401474,
-0.056312136352062225,
0.03008287586271763,
-0.011642858386039734,
-0.07594023644924164,
-0.02260502241551876,
0.025565432384610176,
0.09941679984331131,
-0.03909951075911522,
-0.08670737594366074,
0.031171172857284546,
0.045076966285705566,
0.2619283199310303,
0.10125469416379929,
-0.12944084405899048,
-0.08787218481302261,
-0.012394015677273273,
-0.1206272691488266,
-0.0276534091681242,
0.042418770492076874,
0.11049572378396988,
-0.0423220656812191,
0.03373284265398979,
0.026085402816534042,
0.09135951101779938,
-0.08034185320138931,
0.019087867811322212,
-0.011800779029726982,
0.05249439552426338,
0.0276938546448946,
0.02859659679234028,
-0.18966051936149597,
0.07412929832935333,
0.044514987617731094,
0.15918873250484467,
-0.021525029093027115,
0.026366695761680603,
-0.0020295470021665096,
-0.018574880436062813,
0.117253378033638,
-0.0008739041513763368,
-0.05102994665503502,
-0.04433222487568855,
-0.09338915348052979,
-0.02209359034895897,
0.12421673536300659,
0.029685446992516518,
0.07995982468128204,
-0.011684038676321507,
-0.010644651018083096,
0.024091077968478203,
0.018169034272432327,
-0.188767671585083,
-0.10514482110738754,
0.08125577867031097,
0.06739652156829834,
-0.038433901965618134,
-0.026351971551775932,
-0.04259234294295311,
-0.17883238196372986,
0.15297028422355652,
-0.14352144300937653,
-0.03355187550187111,
-0.07629605382680893,
-0.05483400076627731,
0.11946594715118408,
-0.029535233974456787,
0.0407746247947216,
0.06801806390285492,
0.15134553611278534,
-0.11880034953355789,
-0.0835515707731247,
0.03915395587682724,
-0.07703863829374313,
-0.10503212362527847,
-0.026306208223104477,
0.1661658138036728,
0.08844772726297379,
0.06081514433026314,
0.033088840544223785,
0.06362121552228928,
-0.005014847498387098,
-0.03238634392619133,
0.05394640192389488,
0.12065674364566803,
-0.053023580461740494,
-0.0476934090256691,
-0.010588640347123146,
-0.12413879483938217,
-0.08693566173315048,
-0.02475583180785179,
0.12974458932876587,
0.20368970930576324,
-0.08998829871416092,
0.17398034036159515,
0.10642191022634506,
-0.10858716070652008,
-0.25866031646728516,
0.045542359352111816,
0.05416587367653847,
0.10117077827453613,
-0.014820517040789127,
-0.20744507014751434,
0.033124517649412155,
-0.039074551314115524,
-0.03355520963668823,
-0.04202258586883545,
-0.22382386028766632,
-0.1371002495288849,
0.1687658131122589,
-0.029151814058423042,
0.12474565953016281,
-0.030496472492814064,
-0.007513039279729128,
-0.003947870340198278,
0.04551845043897629,
0.07975649833679199,
-0.10185001790523529,
0.11825355142354965,
0.0832541212439537,
-0.0019567746203392744,
0.04599742218852043,
-0.01200818456709385,
0.041589949280023575,
0.02021050825715065,
-0.014774063602089882,
0.03189796954393387,
0.0739140585064888,
0.05675424262881279,
-0.038822777569293976,
0.14011874794960022,
0.10527586936950684,
0.003135759849101305,
-0.057889144867658615,
-0.051828332245349884,
-0.024378512054681778,
0.04190649464726448,
-0.007341557182371616,
-0.014626383781433105,
-0.04032682999968529,
0.04505698010325432,
0.04542190581560135,
-0.01710321195423603,
-0.02060461975634098,
-0.07722138613462448,
-0.13171294331550598,
0.07206171751022339,
0.20171310007572174,
-0.029680071398615837,
-0.019956830888986588,
0.012521467171609402,
-0.046413276344537735,
0.0674576610326767,
-0.06673257052898407,
0.04771577939391136,
0.07539194822311401,
0.029234856367111206,
0.08528758585453033,
0.0026761554181575775,
-0.14081574976444244,
-0.005494709592312574,
0.053225327283144,
-0.08268924057483673,
-0.1734669953584671,
-0.03790677711367607,
0.018504682928323746,
-0.04271724820137024,
0.000041303090256405994,
0.13496246933937073,
-0.09943323582410812,
-0.02137701027095318,
0.006434074603021145,
0.06320836395025253,
-0.07831499725580215,
0.10805808007717133,
-0.015693465247750282,
0.03903621807694435,
-0.04220620170235634,
0.17310915887355804,
0.11565130203962326,
-0.1004183441400528,
0.05059986934065819,
0.07889274507761002,
-0.052215371280908585,
-0.015817180275917053,
-0.11251796782016754,
0.008777598850429058,
0.06442146003246307,
-0.056888360530138016,
-0.014823357574641705,
-0.07459892332553864,
-0.010426370427012444,
0.16271641850471497,
-0.01689537614583969,
0.06006971746683121,
-0.0649360790848732,
0.04828952997922897,
-0.0996830090880394,
0.058458395302295685,
0.03886326774954796,
0.004593301098793745,
-0.036189138889312744,
0.2430819720029831,
-0.005442793946713209,
0.06858464330434799,
-0.05741998180747032,
-0.041162289679050446,
-0.038836896419525146,
0.03556418418884277,
-0.045505791902542114,
0.037957292050123215,
-0.052183181047439575,
-0.034852445125579834,
0.0018982812762260437,
0.021062131971120834,
0.01365500409156084,
0.06598867475986481,
-0.036578018218278885,
0.020823761820793152,
-0.015557806007564068,
0.01972995512187481,
-0.06971806287765503,
0.020808693021535873,
0.0035422411747276783,
-0.061722271144390106,
0.11008833348751068,
-0.003898934694007039,
-0.0584605410695076,
0.024022504687309265,
-0.10781405866146088,
-0.08215410262346268,
0.015870820730924606,
0.011611862108111382,
-0.028650542721152306,
-0.14033415913581848,
0.007902784273028374,
0.041015006601810455,
-0.0079638147726655,
-0.024745402857661247,
0.053636591881513596,
-0.038513872772455215,
-0.017509697005152702,
-0.07114522904157639,
0.049537163227796555,
-0.08022002130746841,
0.04739970713853836,
0.002381635596975684,
0.0986228734254837,
0.053383827209472656,
-0.10627715289592743,
0.040804196149110794,
-0.09438841789960861,
0.01841641031205654,
-0.020177196711301804,
0.005501820705831051,
-0.0824701115489006,
-0.05418437719345093,
0.0765312984585762,
-0.04732056334614754,
0.11350910365581512,
0.010493619367480278,
-0.03196689486503601,
0.02956254966557026,
-0.07603555917739868,
-0.11673227697610855,
0.04285417124629021,
0.16570129990577698,
-0.012217523530125618,
-0.004420584067702293,
-0.04461844637989998,
0.037235260009765625,
0.015989521518349648,
0.1255948841571808,
0.10935235023498535,
0.14786052703857422,
0.10605527460575104,
0.0879654511809349,
-0.01138284895569086,
-0.059945445507764816,
-0.1062854528427124,
0.06394848972558975,
-0.049392540007829666,
0.03162778541445732,
-0.06575727462768555,
0.09522467106580734,
0.08338037878274918,
-0.06593381613492966,
0.10692515224218369,
-0.03922221437096596,
-0.0801953598856926,
-0.10756780207157135,
-0.09810398519039154,
-0.017721282318234444,
0.014754058793187141,
-0.006075069308280945,
-0.10094549506902695,
0.014443089254200459,
-0.07408882677555084,
0.026079347357153893,
-0.03510875254869461,
0.12170251458883286,
-0.1613927185535431,
-0.11316374689340591,
0.17414650321006775,
-0.0252835750579834,
0.04122704640030861,
-0.00024103958276100457,
0.015088207088410854,
0.1113448292016983,
0.056605082005262375,
0.10271564871072769,
0.040294747799634933,
0.016715092584490776,
0.03784406930208206,
-0.044945333153009415,
-0.08086790889501572,
-0.005080442409962416,
-0.03163702040910721,
0.10046707093715668,
0.11664620786905289,
0.0792718306183815,
-0.11056268215179443,
-0.00046395245590247214,
0.11730636656284332,
-0.0663285106420517,
-0.10973496735095978,
-0.14758098125457764,
0.1786588728427887,
-0.030204031616449356,
0.0212741456925869,
-0.0086204307153821,
-0.1018727496266365,
0.011393551714718342,
0.21807122230529785,
0.10702906548976898,
0.005365739576518536,
0.005307222716510296,
-0.04236771538853645,
0.004503136035054922,
-0.013446026481688023,
0.058657195419073105,
0.027954069897532463,
0.2586480975151062,
0.0053509254939854145,
0.03187297657132149,
-0.034531909972429276,
-0.05437901243567467,
-0.0499168336391449,
0.15605518221855164,
-0.041905391961336136,
0.03530781343579292,
-0.06341078132390976,
0.08161281049251556,
-0.05077837035059929,
-0.22625653445720673,
-0.08451604843139648,
-0.07216756790876389,
-0.06958503276109695,
0.017436806112527847,
-0.02501254342496395,
0.03226041793823242,
0.022156760096549988,
-0.017004111781716347,
0.05278679355978966,
0.12546800076961517,
0.018401771783828735,
-0.013940851204097271,
0.07151436805725098,
-0.024958938360214233,
-0.11475806683301926,
0.09941332787275314,
0.012876923196017742,
0.07600508630275726,
0.034237053245306015,
0.054444413632154465,
-0.08150814473628998,
0.07431863993406296,
-0.037105292081832886,
-0.08516407012939453,
0.02996646985411644,
0.23042163252830505,
-0.014069800265133381,
0.1588500440120697,
0.06699897348880768,
-0.05637112632393837,
0.03431194648146629,
-0.022385617718100548,
-0.049059510231018066,
-0.08208005875349045,
0.029695698991417885,
-0.025324875488877296,
0.1504947990179062,
0.0626218393445015,
-0.03777981922030449,
-0.004679740406572819,
-0.04845114424824715,
-0.0030771715100854635,
-0.0019046374363824725,
0.13018199801445007,
-0.025178175419569016,
-0.15936629474163055,
0.0032685159239917994,
-0.07558625191450119,
0.022884679958224297,
-0.2144881635904312,
-0.06217935308814049,
-0.01772426813840866,
-0.03547835722565651,
-0.03887499123811722,
0.09672975540161133,
0.09458561986684799,
-0.024623973295092583,
-0.04081762582063675,
-0.07552622258663177,
0.06393773853778839,
0.09357117116451263,
-0.11875241249799728,
-0.04447734355926514
] |
null | null |
transformers
|
# Wav2Vec2-large-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) large model pretrained only in **mt** on **9.1** unlabeled data of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **mt**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
For more information, see the official website [here](https://github.com/facebookresearch/voxpopuli/).
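Until such a tokenizer is created and the model is fine-tuned, the checkpoint can only be used as a speech encoder. A minimal sketch (assuming a default 16kHz feature extractor and a dummy one-second clip) of extracting contextual representations might look like this:

```python
import torch
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

# default Wav2Vec2 feature extractor settings (16kHz mono input)
feature_extractor = Wav2Vec2FeatureExtractor(sampling_rate=16000)
model = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-large-mt-voxpopuli-v2")

# dummy one-second clip; replace with real mt speech sampled at 16kHz
speech = torch.randn(16000).numpy()
inputs = feature_extractor(speech, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    hidden_states = model(**inputs).last_hidden_state  # (batch, frames, hidden_size)
```

These are contextual speech representations, not transcriptions; transcription requires the fine-tuning step described above.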
|
{"language": "mt", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-mt-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"mt",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"mt"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #mt #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-large-VoxPopuli-V2
Facebook's Wav2Vec2 large model pretrained only in mt on 9.1 unlabeled data of the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in mt. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
For more information, see the official website here.
|
[
"# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in mt on 9.1 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in mt. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #mt #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in mt on 9.1 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in mt. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
73,
253
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #mt #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in mt on 9.1 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in mt. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.07870951294898987,
0.0967971533536911,
-0.002635761396959424,
0.008764657191932201,
0.0834311917424202,
-0.06366321444511414,
0.12696661055088043,
0.04360297694802284,
0.02072225511074066,
0.09913680702447891,
-0.022794844582676888,
-0.03866958990693092,
0.06247826665639877,
0.13845734298229218,
0.05632511526346207,
-0.2451595664024353,
0.0318632498383522,
-0.07860597968101501,
0.03720109164714813,
0.05352204665541649,
0.11619200557470322,
-0.08753762394189835,
0.03171197324991226,
0.04799838364124298,
-0.030070224776864052,
0.035734351724386215,
-0.04587021842598915,
-0.09071039408445358,
0.050028447061777115,
0.04864801466464996,
-0.04432828351855278,
0.031022589653730392,
0.07017844915390015,
-0.1636778563261032,
0.037973422557115555,
0.04237959533929825,
0.027498332783579826,
0.002985098399221897,
0.10829642415046692,
0.015417602844536304,
0.15918618440628052,
-0.03373776748776436,
-0.00396325858309865,
0.08889376372098923,
-0.06092923507094383,
-0.08160059154033661,
-0.0689510777592659,
0.14252030849456787,
0.10392021387815475,
0.10837422311306,
-0.08022087067365646,
0.1031489148736,
-0.03140407055616379,
0.04692568629980087,
0.07770324498414993,
-0.17139141261577606,
-0.044094134122133255,
0.07098250836133957,
0.09658340364694595,
0.03426467254757881,
-0.08720101416110992,
0.07381541281938553,
0.04270763322710991,
-0.018969053402543068,
-0.055962517857551575,
-0.03267283737659454,
0.12170326709747314,
-0.09967189282178879,
-0.12438622862100601,
0.010640201158821583,
0.19804419577121735,
0.05246623605489731,
-0.06687074154615402,
-0.13534802198410034,
0.0116544459015131,
0.21028095483779907,
-0.04104384034872055,
-0.08591736108064651,
0.007739225402474403,
0.021157830953598022,
0.0222103763371706,
-0.052752673625946045,
-0.06366922706365585,
0.004258646164089441,
-0.0020457373466342688,
0.07319954037666321,
0.0018323420081287622,
-0.022438639774918556,
-0.07589757442474365,
-0.007214273791760206,
-0.1044432520866394,
-0.12534020841121674,
-0.01917516440153122,
-0.0713314414024353,
-0.06561695784330368,
-0.05257297307252884,
-0.007235977333039045,
-0.12009844928979874,
0.025151975452899933,
0.08094143122434616,
0.0811045691370964,
0.0531122200191021,
-0.06359516829252243,
-0.032154668122529984,
0.12454552203416824,
0.05263941362500191,
-0.11785132437944412,
-0.03980536758899689,
0.016051821410655975,
-0.02800564281642437,
0.008933034725487232,
-0.03636014461517334,
-0.028947876766324043,
0.02082948386669159,
-0.036400534212589264,
0.05620954558253288,
0.051854927092790604,
-0.02666635252535343,
-0.03064749762415886,
-0.09736838936805725,
0.10597744584083557,
-0.08474502712488174,
0.029516344889998436,
0.050837624818086624,
-0.009251296520233154,
0.10454504191875458,
-0.06055881455540657,
0.07597596198320389,
-0.11159487813711166,
0.019161278381943703,
-0.02989622764289379,
0.000770685903262347,
0.0210183784365654,
-0.02301868051290512,
0.045405104756355286,
0.02678200975060463,
0.0016765753971412778,
-0.12614081799983978,
0.0015352752525359392,
-0.09956706315279007,
-0.035561058670282364,
-0.07336628437042236,
-0.030728871002793312,
-0.04324278235435486,
0.015491677448153496,
0.0028920795302838087,
-0.009068753570318222,
0.01842387393116951,
-0.024738075211644173,
-0.0052796415984630585,
0.0016373529797419906,
0.05204739421606064,
0.07034274190664291,
0.0859128013253212,
-0.029881957918405533,
-0.023756839334964752,
-0.10665208101272583,
0.12043384462594986,
-0.07770457863807678,
-0.005110585130751133,
-0.1419653296470642,
-0.025198623538017273,
-0.042321886867284775,
0.026978570967912674,
0.012459718622267246,
0.1374390870332718,
-0.17849738895893097,
-0.0794309452176094,
0.12409650534391403,
-0.11770594865083694,
-0.007658679503947496,
0.17928187549114227,
-0.0007005391526035964,
0.07553526014089584,
0.10615388303995132,
0.19723749160766602,
0.02792084962129593,
-0.1666700392961502,
-0.02907080575823784,
-0.06418801099061966,
0.03906523436307907,
0.11496436595916748,
0.07045269757509232,
-0.05226266011595726,
0.067845419049263,
-0.02716546691954136,
-0.017569901421666145,
-0.0713801383972168,
0.00639177905395627,
-0.04722532629966736,
0.015147214755415916,
-0.03727778419852257,
0.01573007181286812,
-0.015299320220947266,
-0.013981417752802372,
-0.02283271960914135,
-0.07799262553453445,
-0.03718356043100357,
0.12321153283119202,
-0.048634789884090424,
0.02130969800055027,
-0.0937759205698967,
0.05162873864173889,
0.049544062465429306,
0.010215139016509056,
-0.12530240416526794,
0.1211504265666008,
0.023462234064936638,
-0.05941968411207199,
0.1417805254459381,
0.0724831148982048,
-0.025997323915362358,
-0.004444614984095097,
-0.01784234680235386,
0.02971162460744381,
-0.010819156654179096,
0.008833339437842369,
-0.029108643531799316,
-0.10904167592525482,
0.011405904777348042,
-0.0640575960278511,
0.10552116483449936,
-0.12741900980472565,
-0.014749477617442608,
0.06166813150048256,
0.11574586480855942,
-0.0056836786679923534,
-0.04146658629179001,
0.09349227696657181,
0.04022593051195145,
0.025477634742856026,
-0.013719415292143822,
0.01435053814202547,
-0.01693260669708252,
-0.0088338702917099,
0.07998581230640411,
-0.15062755346298218,
-0.15726161003112793,
0.10971687734127045,
0.012858031317591667,
-0.009899623692035675,
0.04873894900083542,
0.037029068917036057,
-0.024077342823147774,
-0.04727163538336754,
-0.006533977575600147,
0.21266768872737885,
-0.013044008053839207,
0.06155906990170479,
-0.08593206852674484,
-0.028160452842712402,
0.009807885624468327,
-0.04047541320323944,
-0.09268473088741302,
0.08505188673734665,
0.006705162581056356,
-0.1107194647192955,
-0.02237614057958126,
0.08546200394630432,
0.07264775037765503,
0.17466279864311218,
0.010382628999650478,
-0.0952252745628357,
-0.024966029450297356,
-0.06494912505149841,
-0.005630410276353359,
0.012085983529686928,
-0.15122781693935394,
-0.023004986345767975,
0.028636150062084198,
0.019220020622015,
0.03780127316713333,
-0.010585450567305088,
0.03811687231063843,
0.014347228221595287,
-0.035559672862291336,
-0.08725539594888687,
0.03953859210014343,
-0.02602498233318329,
0.03878391906619072,
-0.007031688001006842,
0.02960273250937462,
-0.041564732789993286,
-0.061965085566043854,
-0.14025406539440155,
0.08114640414714813,
-0.07641100138425827,
-0.31817519664764404,
-0.09491653740406036,
-0.06161307170987129,
-0.03800755739212036,
0.007150917313992977,
0.05504414439201355,
-0.11397306621074677,
-0.11095555871725082,
-0.07347314059734344,
0.1370936781167984,
-0.015845241025090218,
-0.06723669916391373,
0.13219672441482544,
0.0016069221310317516,
0.018063683062791824,
-0.09487242996692657,
0.021719496697187424,
-0.02385159209370613,
-0.03903445973992348,
-0.024146636947989464,
0.017968732863664627,
0.05348057672381401,
0.13292507827281952,
0.03460029140114784,
-0.011988273821771145,
0.013945835642516613,
0.21333737671375275,
-0.15011060237884521,
0.07458192110061646,
0.2342233806848526,
-0.046565305441617966,
-0.008424204774200916,
0.14616060256958008,
-0.004350391216576099,
-0.04874460771679878,
0.05397787317633629,
0.00812800694257021,
-0.022210748866200447,
-0.24091792106628418,
-0.12480252236127853,
-0.05679973214864731,
-0.0208810456097126,
0.048131681978702545,
0.022912805899977684,
0.00004438631367520429,
0.017627552151679993,
-0.09614823758602142,
-0.0492577888071537,
0.05477214604616165,
0.03987668454647064,
0.14135615527629852,
0.015292968600988388,
0.061130013316869736,
-0.0472223274409771,
-0.015438936650753021,
0.116806760430336,
-0.03780674934387207,
0.05028422549366951,
0.06992945820093155,
0.10512284934520721,
0.06199796125292778,
0.014611825346946716,
0.05539365112781525,
-0.021001439541578293,
-0.0149952108040452,
-0.008374427445232868,
-0.03223692625761032,
-0.07857764512300491,
0.025429856032133102,
0.04685032740235329,
0.13818849623203278,
-0.1260024756193161,
-0.11142580211162567,
0.02705891616642475,
0.03325595706701279,
0.117339588701725,
0.09671793133020401,
-0.026629069820046425,
-0.10548405349254608,
0.04444218799471855,
-0.09958237409591675,
-0.018307246267795563,
0.049592796713113785,
0.09817259013652802,
-0.1585904210805893,
0.10095279663801193,
0.0795760378241539,
0.09415341168642044,
-0.021122876554727554,
0.024846967309713364,
-0.05925580859184265,
0.05489132180809975,
-0.004959673620760441,
0.05992797017097473,
-0.18400971591472626,
0.10110291093587875,
0.02107810415327549,
0.07828286290168762,
-0.07246700674295425,
0.01610509492456913,
0.06436484307050705,
0.011176597326993942,
0.11748483777046204,
-0.0005539051489904523,
-0.11313184350728989,
0.01155125629156828,
-0.11405081301927567,
0.008876336738467216,
0.052338872104883194,
-0.05576949939131737,
0.05675293877720833,
-0.004427620675414801,
-0.01676751859486103,
-0.03712151199579239,
0.01116851344704628,
-0.22493354976177216,
-0.13739772140979767,
0.04598672688007355,
0.008414759300649166,
0.050873275846242905,
-0.03319689631462097,
-0.08350297063589096,
-0.11281895637512207,
0.09291553497314453,
0.006668785121291876,
-0.019557781517505646,
-0.07222431898117065,
0.0007761825108900666,
0.09430774301290512,
-0.06386087834835052,
0.014017021283507347,
0.047609973698854446,
0.14797016978263855,
-0.0557061992585659,
-0.04749505594372749,
0.036471206694841385,
-0.1025579646229744,
-0.1295960247516632,
0.003839684883132577,
0.1816554069519043,
0.10906078666448593,
0.06554792821407318,
0.07819760590791702,
0.023908676579594612,
0.00392010947689414,
-0.09019115567207336,
0.011991082690656185,
0.03450210765004158,
-0.07827342301607132,
0.05651146173477173,
0.0025844648480415344,
-0.2576461434364319,
-0.1489601731300354,
-0.08626627922058105,
0.08243712037801743,
0.18757668137550354,
-0.01642555184662342,
0.1741742491722107,
0.2798350155353546,
-0.08424494415521622,
-0.23270218074321747,
-0.024781154468655586,
-0.009399528615176678,
0.017792439088225365,
0.06674209237098694,
-0.1954420655965805,
0.10990040749311447,
0.013397297821938992,
0.01620473526418209,
-0.04005774110555649,
-0.22957874834537506,
-0.13963697850704193,
0.14135335385799408,
-0.03366005793213844,
0.05526846647262573,
-0.03921876847743988,
-0.06905843317508698,
-0.03172909840941429,
-0.054194752126932144,
0.012292067520320415,
-0.09614288061857224,
0.08644192665815353,
0.05589403212070465,
0.02719132974743843,
0.018362879753112793,
0.009570918045938015,
0.11000165343284607,
0.08609754592180252,
-0.026553405448794365,
-0.09095197916030884,
0.036517899483442307,
-0.0011161984875798225,
-0.020445775240659714,
0.08591067790985107,
0.03159427270293236,
0.004334527067840099,
-0.039914265275001526,
-0.08695151656866074,
-0.06944620609283447,
0.06188362464308739,
-0.06632687896490097,
-0.017461489886045456,
-0.05582146346569061,
0.09775122255086899,
0.02996313013136387,
-0.014062050729990005,
-0.058651380240917206,
-0.10326997190713882,
-0.010282034054398537,
0.12059379369020462,
0.2265464961528778,
-0.06387681514024734,
0.001465643523260951,
-0.03938945382833481,
-0.04720286279916763,
0.0525442510843277,
-0.032718148082494736,
0.061394378542900085,
0.05064805969595909,
0.029196932911872864,
0.08109357208013535,
-0.029502296820282936,
-0.12893728911876678,
0.03376276046037674,
0.04590969160199165,
-0.06044420227408409,
-0.15986192226409912,
-0.04762735217809677,
0.019526228308677673,
-0.012824486009776592,
-0.03499459847807884,
0.2016608715057373,
-0.021945975720882416,
-0.05502765625715256,
0.010497194714844227,
0.059339750558137894,
-0.0035809725522994995,
0.12890084087848663,
0.033855587244033813,
0.04020846262574196,
-0.09156060218811035,
0.06064701825380325,
0.12002676725387573,
-0.04910562187433243,
0.04211743548512459,
0.11786855012178421,
-0.04783552512526512,
-0.061557572335004807,
-0.10527163743972778,
-0.013186180032789707,
0.044403884559869766,
-0.04349268600344658,
0.008973708376288414,
-0.10269521921873093,
0.019247008487582207,
0.03357706591486931,
0.006180833093822002,
-0.04751797765493393,
-0.03953934460878372,
-0.004370414651930332,
-0.09075529873371124,
0.07025036960840225,
0.0925423800945282,
-0.02810550481081009,
-0.11159957945346832,
0.11054296791553497,
0.004700410645455122,
0.07205019891262054,
-0.038762252777814865,
-0.06860621273517609,
-0.09638861566781998,
-0.006142203696072102,
-0.09138567745685577,
0.02962612360715866,
-0.14755365252494812,
-0.009670954197645187,
-0.0488160215318203,
-0.031911611557006836,
-0.024111857637763023,
0.07362590730190277,
-0.035893045365810394,
0.001787498826161027,
-0.03672807291150093,
0.09757906943559647,
-0.12014330923557281,
0.06578519195318222,
0.059773385524749756,
-0.043675798922777176,
0.10426361858844757,
0.019565057009458542,
-0.05661700293421745,
0.030306298285722733,
-0.21378067135810852,
-0.059416405856609344,
-0.02225024811923504,
0.05456852912902832,
-0.003629365935921669,
-0.17283901572227478,
0.005134538747370243,
0.01879095658659935,
0.022661801427602768,
-0.016183115541934967,
0.03639386594295502,
-0.030449138954281807,
-0.019047170877456665,
-0.0661596804857254,
-0.05902010202407837,
-0.038125645369291306,
0.05303826928138733,
0.07264508306980133,
0.003814343363046646,
0.107208751142025,
-0.09263629466295242,
0.06966482102870941,
-0.07677636295557022,
0.026266250759363174,
-0.023440122604370117,
0.010173815302550793,
-0.0665300190448761,
-0.07101195305585861,
0.07875718921422958,
-0.014353753998875618,
0.0810355693101883,
0.011368202976882458,
-0.04969538375735283,
0.062241457402706146,
-0.049113523215055466,
-0.06172442436218262,
0.03047020733356476,
0.1345800906419754,
0.05437582731246948,
0.01338526513427496,
-0.00034184817923232913,
-0.03627646341919899,
0.008657547645270824,
0.1470872312784195,
0.14185817539691925,
0.1703260987997055,
0.10496390610933304,
0.028251953423023224,
0.07154227048158646,
-0.03726513683795929,
-0.08808637410402298,
0.05737048014998436,
-0.08099477738142014,
0.04163051396608353,
-0.05754033848643303,
-0.06834971904754639,
0.07857774943113327,
-0.13547258079051971,
0.07273662090301514,
-0.04252970591187477,
-0.06853469461202621,
-0.10540866106748581,
-0.13858647644519806,
-0.07048141211271286,
-0.05057436600327492,
-0.001696013263426721,
-0.1085224598646164,
0.046541571617126465,
0.016651198267936707,
0.05011891946196556,
-0.09412852674722672,
0.10835689306259155,
-0.13311846554279327,
-0.133632630109787,
0.1507987231016159,
-0.04719587787985802,
-0.011941912584006786,
-0.0008550349739380181,
0.04505699872970581,
0.02664017491042614,
0.09016099572181702,
0.04443405196070671,
0.05246293172240257,
0.017407163977622986,
0.013876601122319698,
-0.1073155403137207,
-0.0739164724946022,
0.02729830890893936,
-0.006582019384950399,
0.08908059448003769,
0.1953498125076294,
0.08831777423620224,
-0.07241034507751465,
0.01679840125143528,
0.14543630182743073,
0.017188785597682,
-0.10856613516807556,
-0.15529431402683258,
0.025819679722189903,
-0.03394092246890068,
-0.006116461008787155,
-0.008617542684078217,
-0.09675929695367813,
0.008059967309236526,
0.20544235408306122,
0.16422250866889954,
-0.045840371400117874,
0.02529282495379448,
-0.018705572932958603,
0.013839909806847572,
0.016468200832605362,
0.08544796705245972,
0.09196490794420242,
0.1695481240749359,
-0.004323988221585751,
0.05974873527884483,
-0.01797696389257908,
-0.08770827949047089,
-0.11373066902160645,
0.08485282212495804,
-0.007492457050830126,
-0.03793865442276001,
0.0007123128743842244,
0.18292896449565887,
-0.1101749911904335,
-0.20105412602424622,
-0.12822912633419037,
-0.050409361720085144,
-0.11875033378601074,
0.01645757630467415,
-0.044862743467092514,
0.1449548453092575,
0.04118754714727402,
0.011617436073720455,
0.009500646963715553,
0.16326579451560974,
0.04358387365937233,
0.029599331319332123,
-0.04147294536232948,
0.11561708152294159,
-0.08639343082904816,
0.10464245080947876,
-0.0006177923060022295,
0.03127589821815491,
0.03775271028280258,
0.03724516183137894,
-0.06910816580057144,
0.02580730803310871,
0.04533914104104042,
-0.026364807039499283,
0.045474618673324585,
0.18198665976524353,
-0.012456119060516357,
0.1173454150557518,
0.11523771286010742,
-0.06594651192426682,
0.018432362005114555,
0.0011291301343590021,
0.025118669494986534,
-0.06453783810138702,
0.1548101305961609,
-0.14646734297275543,
0.13314048945903778,
0.11161655187606812,
-0.0714254304766655,
-0.04613141715526581,
-0.012264628894627094,
0.051339853554964066,
-0.06008968502283096,
0.07138059288263321,
-0.018856167793273926,
-0.18023072183132172,
0.029697144404053688,
-0.11784490197896957,
0.07846857607364655,
-0.2692687511444092,
-0.04296594485640526,
-0.045205097645521164,
-0.01996680535376072,
-0.004100168123841286,
0.11970245838165283,
0.08367657661437988,
-0.051882900297641754,
-0.006692967843264341,
-0.07200587540864944,
0.019331667572259903,
0.09253499656915665,
-0.0847354307770729,
-0.020410459488630295
] |
null | null |
transformers
|
# Wav2Vec2-Large-VoxPopuli
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) large model pretrained on the nl unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
For more information, see the official website [here](https://github.com/facebookresearch/voxpopuli/).
# Fine-Tuning
Please refer to [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for details on how to fine-tune this model on a specific language. Note that you should replace `"facebook/wav2vec2-large-xlsr-53"` with this checkpoint for fine-tuning.
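As a hedged sketch of that substitution (values other than the checkpoint name are illustrative placeholders in the spirit of the blog, not verified settings for this model):

```python
from transformers import Wav2Vec2ForCTC

# Same fine-tuning setup as in the blog, with the checkpoint swapped in
# ("facebook/wav2vec2-large-nl-voxpopuli" instead of "facebook/wav2vec2-large-xlsr-53").
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-large-nl-voxpopuli",
    attention_dropout=0.1,      # assumption: illustrative regularization values
    hidden_dropout=0.1,
    mask_time_prob=0.05,
    ctc_loss_reduction="mean",
    pad_token_id=0,             # assumption: pad token id of the tokenizer you create
    vocab_size=40,              # assumption: size of your Dutch character vocabulary
)
model.freeze_feature_encoder()  # typically frozen when fine-tuning on limited labeled data
```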
|
{"language": "nl", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-nl-voxpopuli
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli",
"nl",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"nl"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #nl #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
|
# Wav2Vec2-Large-VoxPopuli
Facebook's Wav2Vec2 large model pretrained on the nl unlabeled subset of the VoxPopuli corpus.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
For more information, see the official website here.
# Fine-Tuning
Please refer to this blog for details on how to fine-tune this model on a specific language. Note that you should replace '"facebook/wav2vec2-large-xlsr-53"' with this checkpoint for fine-tuning.
|
[
"# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the nl unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #nl #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the nl unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
72,
133,
57
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #nl #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the nl unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
-0.0645478293299675,
0.026864802464842796,
-0.004930464085191488,
0.010616949759423733,
0.11867018043994904,
-0.028952257707715034,
0.08794040232896805,
0.012847415171563625,
0.032603465020656586,
0.010280214250087738,
0.011447528377175331,
0.037810176610946655,
0.07051992416381836,
0.08472040295600891,
0.0038160686381161213,
-0.3053262233734131,
0.04966700077056885,
0.01570974290370941,
0.07503652572631836,
0.06198843568563461,
0.1297679841518402,
-0.08215373009443283,
0.022274179384112358,
0.06670333445072174,
-0.09011264145374298,
-0.006497078575193882,
0.02403748407959938,
-0.08956415206193924,
0.12670530378818512,
0.08492248505353928,
0.08736703544855118,
0.054049380123615265,
0.039698198437690735,
-0.16139575839042664,
0.02926536835730076,
0.04231349751353264,
-0.03665585815906525,
0.0017267927760258317,
0.12408456206321716,
-0.029575766995549202,
0.20613372325897217,
-0.03500049561262131,
-0.03790980949997902,
0.08717303723096848,
-0.11991876363754272,
-0.17489579319953918,
-0.05975385010242462,
0.13847434520721436,
0.13233385980129242,
0.08554694801568985,
-0.06949329376220703,
0.0387614406645298,
-0.056079234927892685,
0.07258336246013641,
0.09778346866369247,
-0.29612958431243896,
-0.03481462225317955,
0.12125816941261292,
0.07596985995769501,
-0.03743704408407211,
-0.0986301377415657,
0.07763399183750153,
0.014765865169465542,
0.00965797808021307,
-0.01306236069649458,
-0.08517233282327652,
0.04325684905052185,
-0.07968846708536148,
-0.11563601344823837,
-0.014615400694310665,
0.19025766849517822,
0.03730928897857666,
-0.05978243425488472,
-0.09308945387601852,
-0.019707923755049706,
0.17651930451393127,
-0.058872438967227936,
-0.15190117061138153,
0.00802510417997837,
0.03277549147605896,
0.05906909331679344,
-0.14814212918281555,
-0.09169429540634155,
-0.01561096590012312,
-0.05438239499926567,
0.12285915017127991,
0.038203418254852295,
-0.017735200002789497,
-0.0740474984049797,
0.028520910069346428,
-0.06321587413549423,
-0.06575597077608109,
0.009830337017774582,
-0.09409265965223312,
-0.06327056139707565,
-0.004669601563364267,
-0.0668126568198204,
-0.09343595802783966,
-0.023944759741425514,
0.0685097724199295,
0.015097612515091896,
0.036170005798339844,
-0.03004002571105957,
0.03887756168842316,
0.016777778044342995,
0.0889003723859787,
-0.1269870549440384,
0.019316066056489944,
0.011408708058297634,
-0.04573013260960579,
-0.007900447584688663,
-0.027996966615319252,
-0.07354045659303665,
-0.07802782207727432,
0.01083192229270935,
0.062135085463523865,
0.027643561363220215,
0.0226691085845232,
-0.07399384677410126,
-0.09464563429355621,
0.04075004160404205,
-0.06506510078907013,
0.014339234679937363,
0.032264310866594315,
-0.0049863881431519985,
0.18489737808704376,
-0.0008933368953876197,
0.06162164360284805,
-0.1556389033794403,
0.022500963881611824,
-0.008932758122682571,
0.015498165041208267,
-0.007397630251944065,
-0.06061360239982605,
0.025769885629415512,
-0.015903092920780182,
-0.007384150288999081,
-0.15041233599185944,
-0.06218880042433739,
-0.07893853634595871,
-0.006408386398106813,
-0.027693921700119972,
-0.08212189376354218,
-0.036532770842313766,
-0.006108168046921492,
-0.017530491575598717,
-0.033467065542936325,
-0.012450321577489376,
-0.018368661403656006,
0.026486365124583244,
-0.025328202173113823,
0.06845252215862274,
-0.037198059260845184,
0.08429689705371857,
0.005074348766356707,
-0.02917342819273472,
-0.1163744330406189,
0.1288340836763382,
-0.06664133816957474,
-0.06219544634222984,
-0.15286333858966827,
-0.07140423357486725,
-0.046844128519296646,
0.055805038660764694,
0.0021564962808042765,
0.1381215900182724,
-0.17960204184055328,
-0.10871191322803497,
0.26055774092674255,
-0.09502431005239487,
0.034626711159944534,
0.1945847123861313,
0.021907951682806015,
0.04156339168548584,
0.15913590788841248,
0.13396838307380676,
0.040936242789030075,
-0.13359642028808594,
0.053553506731987,
-0.04061965271830559,
-0.01690598949790001,
0.04040812700986862,
0.06291351467370987,
-0.012077233754098415,
0.004700843244791031,
0.000712005712557584,
-0.06625834852457047,
-0.04190601035952568,
-0.016180412843823433,
-0.0625278428196907,
0.02814592234790325,
-0.024006539955735207,
0.09735672920942307,
0.013362804427742958,
0.007347323000431061,
0.011463423259556293,
-0.09089837968349457,
-0.03757646679878235,
0.07731310278177261,
-0.05135747417807579,
0.06733744591474533,
-0.10570120066404343,
0.0510985292494297,
0.1061200499534607,
0.050269197672605515,
-0.15279117226600647,
0.05304563418030739,
-0.016438128426671028,
0.09419947117567062,
0.1011967808008194,
0.20893679559230804,
-0.02434626966714859,
-0.04580603539943695,
-0.07793726027011871,
0.019306635484099388,
-0.03855276480317116,
-0.03826502710580826,
-0.03351885452866554,
-0.07773678749799728,
-0.029529832303524017,
-0.03425466641783714,
0.06575364619493484,
-0.16083382070064545,
0.0009457054547965527,
0.05208654701709747,
0.05325371026992798,
0.01227701548486948,
0.014429816044867039,
0.02571740187704563,
0.10582796484231949,
0.03982424736022949,
0.020607776939868927,
0.09767568856477737,
-0.010001908987760544,
-0.050625644624233246,
0.10189447551965714,
-0.056601203978061676,
0.0110818175598979,
0.13201001286506653,
-0.10840825736522675,
-0.00047422171337530017,
0.010912791825830936,
0.012146998196840286,
-0.0018719073850661516,
0.002606002613902092,
-0.015839681029319763,
0.22857269644737244,
0.01576782390475273,
0.08056644350290298,
-0.08390098065137863,
0.02166733704507351,
-0.014084676280617714,
-0.05265836790204048,
-0.0457015186548233,
0.10416511446237564,
0.01410408690571785,
-0.08014612644910812,
-0.0014268686063587666,
0.10093028843402863,
-0.018827475607395172,
0.13719357550144196,
0.01574624516069889,
-0.02193612791597843,
0.01523437537252903,
-0.04981730505824089,
-0.018526582047343254,
-0.01827302947640419,
-0.14780379831790924,
-0.02156941220164299,
0.04045001044869423,
0.028355350717902184,
0.058468159288167953,
-0.08244391530752182,
-0.008840885944664478,
0.006306142080575228,
-0.08429207652807236,
-0.04767031967639923,
0.05210084095597267,
-0.00563819520175457,
0.08121012896299362,
-0.03970532491803169,
-0.020153315737843513,
-0.007946948520839214,
-0.03480856120586395,
-0.10977792739868164,
0.11571724712848663,
-0.06353235244750977,
-0.36237072944641113,
-0.08563562482595444,
-0.08932286500930786,
-0.08466635644435883,
0.018972251564264297,
0.051655299961566925,
-0.09375680238008499,
-0.06391484290361404,
0.006032824050635099,
0.1639983057975769,
-0.040247172117233276,
-0.0860496386885643,
0.03256096690893173,
0.013124899938702583,
-0.016816051676869392,
-0.10573095083236694,
0.004140973091125488,
-0.04748815298080444,
-0.13280409574508667,
0.02046787366271019,
-0.022946927696466446,
0.03935561329126358,
0.1452156901359558,
0.03602501377463341,
-0.01954585127532482,
-0.025105129927396774,
0.21705234050750732,
-0.11128716915845871,
0.0692906305193901,
0.29568982124328613,
0.010838933289051056,
0.020174821838736534,
0.12587623298168182,
0.0018515887204557657,
-0.0543699748814106,
0.0010403975611552596,
0.04804651811718941,
-0.012962833978235722,
-0.26933491230010986,
-0.13618303835391998,
-0.06712391972541809,
-0.017532700672745705,
0.029197128489613533,
0.0021985811181366444,
0.01678968220949173,
0.036066699773073196,
-0.09232478588819504,
-0.04107902944087982,
0.0673002377152443,
0.027668818831443787,
0.21209275722503662,
-0.03359527513384819,
0.13578464090824127,
-0.029115166515111923,
-0.03144071623682976,
0.06340079009532928,
0.04742484539747238,
0.07683208584785461,
0.08825468271970749,
0.09751126915216446,
0.09377386420965195,
0.06305267661809921,
0.033073827624320984,
0.008762650191783905,
-0.009743358939886093,
-0.016879335045814514,
-0.04735688120126724,
-0.026735851541161537,
-0.04240307956933975,
0.0056722708977758884,
0.1374315619468689,
-0.14502452313899994,
-0.13571228086948395,
-0.0004285935719963163,
0.02353784814476967,
0.15833599865436554,
0.06386257708072662,
-0.08230768144130707,
-0.05893931910395622,
0.036217816174030304,
-0.08402729779481888,
-0.03937746211886406,
0.0580546110868454,
0.08227340877056122,
-0.16786737740039825,
0.11724147200584412,
0.040200211107730865,
0.1040361076593399,
-0.029571685940027237,
0.049878209829330444,
-0.15470489859580994,
0.00004176571383140981,
0.04811360687017441,
0.08655783534049988,
-0.2614523768424988,
0.21754680573940277,
0.006887371651828289,
0.06805329769849777,
-0.0772942453622818,
-0.008553140796720982,
0.039546750485897064,
0.09602946043014526,
0.1377229243516922,
-0.00656583346426487,
-0.0035384600050747395,
-0.00012394080113153905,
-0.031126471236348152,
0.030748382210731506,
0.02519836835563183,
-0.02133551612496376,
0.039420176297426224,
-0.007589624263346195,
0.012040759436786175,
-0.006642453838139772,
0.09410605579614639,
-0.22462572157382965,
-0.1289045363664627,
0.03382358327507973,
0.03535090386867523,
0.08559346944093704,
-0.01313697174191475,
-0.08221111446619034,
-0.1318805068731308,
0.1204347237944603,
-0.0034893678966909647,
-0.03936091065406799,
-0.09676310420036316,
0.03931078314781189,
0.02164394222199917,
-0.11458848416805267,
0.02004699967801571,
0.04259514808654785,
0.10936985909938812,
-0.09304533898830414,
-0.05512135103344917,
0.04404320567846298,
-0.08896227180957794,
-0.07902379333972931,
0.04683923348784447,
0.18891990184783936,
0.10091283172369003,
0.04421215131878853,
0.1089625358581543,
-0.04272129014134407,
0.015116953290998936,
-0.1122196689248085,
0.06345688551664352,
0.018521727994084358,
-0.004659803584218025,
0.02559596858918667,
-0.051695190370082855,
-0.2601912021636963,
-0.11655981093645096,
-0.02180081233382225,
0.19822359085083008,
0.17469799518585205,
-0.005942220333963633,
0.1545112580060959,
0.24079303443431854,
-0.07750038802623749,
-0.2583223581314087,
-0.07303391396999359,
-0.009454799816012383,
0.04964793100953102,
0.023365138098597527,
-0.2555757761001587,
0.06302125751972198,
0.05732293799519539,
0.0011398853966966271,
-0.09707945585250854,
-0.23149214684963226,
-0.13123637437820435,
0.1825464516878128,
0.049705635756254196,
0.1416042447090149,
-0.08593182265758514,
-0.043282400816679,
-0.06364882737398148,
-0.120974101126194,
0.0894736796617508,
-0.0912819504737854,
0.09493575990200043,
0.0442219078540802,
0.017144707962870598,
0.00950110424309969,
0.042607441544532776,
0.12107286602258682,
0.06960230320692062,
-0.0006272523896768689,
-0.03610658273100853,
0.016106750816106796,
0.011612249538302422,
0.025955215096473694,
0.04574894905090332,
0.023050984367728233,
-0.016171367838978767,
-0.07159379124641418,
-0.10794350504875183,
-0.12666308879852295,
0.0855972096323967,
-0.05926061049103737,
-0.01984056644141674,
-0.024288780987262726,
0.09557653218507767,
0.01578865759074688,
0.014338799752295017,
-0.06013157591223717,
-0.13375598192214966,
0.027125906199216843,
0.10388301312923431,
0.2443821281194687,
-0.13694079220294952,
-0.029561402276158333,
-0.05808708071708679,
-0.04727824404835701,
0.0832652598619461,
0.011141282506287098,
0.05325574055314064,
0.05247527360916138,
0.006234460975974798,
0.09004513174295425,
0.024279484525322914,
-0.0758649930357933,
0.018209002912044525,
0.031117256730794907,
-0.06484189629554749,
-0.25496721267700195,
-0.06631515175104141,
-0.0001131995304604061,
0.018100552260875702,
0.022383244708180428,
0.17584316432476044,
-0.016467690467834473,
-0.06408996880054474,
-0.011147051118314266,
0.03956248611211777,
-0.03884214535355568,
0.06327272206544876,
0.04506490379571915,
0.05052748695015907,
-0.1066761165857315,
0.039219532161951065,
0.09836488962173462,
-0.12306258082389832,
0.04084383696317673,
0.06009383127093315,
-0.057238057255744934,
-0.09795323014259338,
-0.1293502300977707,
0.0031009239610284567,
-0.007878398522734642,
-0.07789017260074615,
0.03275248780846596,
-0.1686335653066635,
0.031152112409472466,
0.07602297514677048,
0.046046528965234756,
-0.01485416665673256,
-0.06280194967985153,
-0.04579722881317139,
-0.03829270601272583,
0.014637107029557228,
0.12176448851823807,
-0.0702304095029831,
-0.12843839824199677,
0.15293502807617188,
0.017767665907740593,
0.10828101634979248,
-0.04081173986196518,
-0.05570102483034134,
-0.12462611496448517,
0.027537493035197258,
-0.12549445033073425,
0.016569962725043297,
-0.1324453055858612,
0.004097040742635727,
-0.05262293666601181,
-0.02168155089020729,
-0.021614914759993553,
0.03758510202169418,
-0.10006754100322723,
0.014308669604361057,
-0.0019473545253276825,
0.08335301280021667,
-0.10173025727272034,
0.0631551519036293,
0.06727378070354462,
-0.015863755717873573,
0.09467101097106934,
0.022654607892036438,
-0.047596681863069534,
0.09254322946071625,
-0.18920393288135529,
-0.044522710144519806,
0.03762160986661911,
0.03465038165450096,
-0.004432716406881809,
-0.1647617220878601,
0.010011947713792324,
0.03259151428937912,
0.040236689150333405,
-0.0026322759222239256,
0.06663347780704498,
-0.06765539944171906,
-0.007538817822933197,
-0.05106651037931442,
-0.08898352831602097,
-0.036040257662534714,
0.07259922474622726,
0.08780164271593094,
0.023516906425356865,
0.11630381643772125,
-0.07600605487823486,
0.06636344641447067,
-0.09862418472766876,
0.06695659458637238,
-0.048683639615774155,
-0.029667170718312263,
0.009551615454256535,
-0.12207352370023727,
0.0639001801609993,
-0.0034831564407795668,
0.058983612805604935,
0.01064690388739109,
-0.03450682759284973,
0.007409766316413879,
-0.09063340723514557,
-0.0975819081068039,
0.02153000794351101,
0.1239909827709198,
0.06958796083927155,
-0.0069646816700696945,
0.040675997734069824,
0.0019310432253405452,
0.008070680312812328,
0.220807284116745,
0.20281684398651123,
0.2112276703119278,
0.0657205805182457,
0.07531043142080307,
0.01258853729814291,
-0.04730701446533203,
-0.06341458112001419,
-0.009199579246342182,
-0.06588765978813171,
0.03211469575762749,
-0.07131586223840714,
-0.02572278492152691,
0.08489041030406952,
-0.13698557019233704,
0.11558474600315094,
0.010485940612852573,
-0.08276194334030151,
-0.14802826941013336,
-0.18802694976329803,
-0.05713054537773132,
-0.0736008733510971,
-0.02544579468667507,
-0.12277878820896149,
-0.03970981389284134,
0.02440570294857025,
0.03217370808124542,
-0.12042028456926346,
0.06581487506628036,
-0.13200876116752625,
-0.1528872549533844,
0.18632444739341736,
-0.04479967802762985,
0.017598653212189674,
-0.02598576247692108,
0.016537338495254517,
-0.0014231452951207757,
0.09485495835542679,
0.02929762192070484,
0.03894950449466705,
-0.028659654781222343,
0.04070236161351204,
-0.07941052317619324,
-0.06099952757358551,
-0.0055674291215837,
0.041325490921735764,
0.13882575929164886,
0.2104981392621994,
0.04277745634317398,
-0.06998644024133682,
0.007069960702210665,
0.15133067965507507,
0.04004860669374466,
-0.11951342225074768,
-0.12860849499702454,
0.0851653441786766,
0.009448782540857792,
0.0022420117165893316,
-0.01916414126753807,
-0.05551920458674431,
0.0114006157964468,
0.27985551953315735,
0.187519833445549,
-0.056906916201114655,
0.03119984269142151,
0.0009907388594001532,
0.03284212946891785,
0.0799061730504036,
0.11630623787641525,
0.09874117374420166,
0.21498066186904907,
-0.04423925653100014,
0.0028317715041339397,
-0.01605590432882309,
-0.05207795277237892,
-0.09911680966615677,
0.12312234938144684,
0.028485091403126717,
-0.07523532956838608,
0.0011513284407556057,
0.1514160931110382,
-0.14842841029167175,
-0.056277137249708176,
-0.07542669028043747,
-0.05064899101853371,
-0.10920003056526184,
-0.005546892061829567,
-0.029912440106272697,
0.10879631340503693,
0.10102516412734985,
-0.02447713352739811,
-0.007632817607372999,
0.17432042956352234,
0.047670722007751465,
0.0004230267077218741,
-0.011441770009696484,
0.12703992426395416,
0.011016963049769402,
0.06352180987596512,
-0.01190903689712286,
0.08099742978811264,
0.06121116876602173,
0.05457086116075516,
-0.02840198576450348,
0.06613178551197052,
0.004032537341117859,
0.032545819878578186,
0.08610205352306366,
0.12530522048473358,
0.014385990798473358,
0.026325786486268044,
0.0913064256310463,
-0.15063492953777313,
0.02645852044224739,
0.04978376254439354,
-0.029478685930371284,
-0.018547765910625458,
0.1730831265449524,
-0.18447715044021606,
0.050034694373607635,
0.14945849776268005,
-0.039060186594724655,
-0.045100197196006775,
-0.04215294122695923,
0.04895765706896782,
-0.0297088623046875,
0.03862253949046135,
-0.042648956179618835,
-0.13460515439510345,
0.012762639671564102,
-0.07869233191013336,
0.028413187712430954,
-0.18953470885753632,
-0.01179483998566866,
-0.036495838314294815,
-0.014309020712971687,
-0.04858582466840744,
0.08608260005712509,
0.010340217500925064,
-0.05241067707538605,
0.021258926019072533,
-0.06581549346446991,
0.01226024329662323,
0.10506035387516022,
-0.09878311306238174,
-0.052163269370794296
] |
null | null |
transformers
|
# Wav2Vec2-large-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) large model pretrained only in **north_germanic** on **29.9** unlabeled data of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **north_germanic**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
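Since the checkpoint ships without a tokenizer, it cannot transcribe speech out of the box, but it can already be loaded for feature extraction. The snippet below is a minimal sketch, not part of the original card: the zero waveform and the plain-default feature-extractor settings are assumptions for illustration, and full ASR fine-tuning would additionally require building a vocabulary and tokenizer for the target language as described in the blog post linked above.
```python
# Minimal sketch (illustrative only): load the pretrained checkpoint and extract
# hidden states from a 16 kHz waveform. The zero waveform is a placeholder; the
# feature-extractor settings are plain defaults, not values read from the Hub.
import numpy as np
import torch
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

model_id = "facebook/wav2vec2-large-north_germanic-voxpopuli-v2"
feature_extractor = Wav2Vec2FeatureExtractor(sampling_rate=16000, do_normalize=True)
model = Wav2Vec2Model.from_pretrained(model_id)

waveform = np.zeros(16000, dtype=np.float32)  # 1 second of "audio" at 16 kHz (placeholder)
inputs = feature_extractor(waveform, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    hidden_states = model(inputs.input_values).last_hidden_state  # (1, frames, 1024)
```
For actual speech recognition, a CTC head (e.g. `Wav2Vec2ForCTC`) would be added on top of these representations and trained on transcribed north_germanic audio.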
|
{"language": "north_germanic", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-north_germanic-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"north_germanic"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-large-VoxPopuli-V2
Facebook's Wav2Vec2 large model pretrained only in north_germanic on 29.9 unlabeled data of the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in north_germanic. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in north_germanic on 29.9 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in north_germanic. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in north_germanic on 29.9 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in north_germanic. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
70,
259
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in north_germanic on 29.9 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in north_germanic. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.08960765600204468,
0.10886476933956146,
-0.0028318550903350115,
0.009056149050593376,
0.07216256111860275,
-0.05408749356865883,
0.13058413565158844,
0.02403201349079609,
0.05348392203450203,
0.10715005546808243,
-0.044844672083854675,
-0.08265754580497742,
0.0783187747001648,
0.11378027498722076,
0.04937419295310974,
-0.23649753630161285,
0.047549277544021606,
-0.08405949920415878,
0.05658672749996185,
0.04449101909995079,
0.10164891928434372,
-0.09034260362386703,
0.03096834570169449,
0.048447441309690475,
-0.027958128601312637,
0.03869031369686127,
-0.03725166246294975,
-0.08015508949756622,
0.04744458198547363,
0.0413622185587883,
-0.01026113796979189,
0.02313680201768875,
0.07318804413080215,
-0.16595882177352905,
0.03343401104211807,
0.03377420827746391,
0.008446366526186466,
-0.006892108358442783,
0.1274343729019165,
-0.02467459812760353,
0.1977051943540573,
-0.019335396587848663,
-0.010908974334597588,
0.0974632203578949,
-0.06728560477495193,
-0.09955241531133652,
-0.08421795070171356,
0.1824202835559845,
0.09859234094619751,
0.10904210060834885,
-0.09026654809713364,
0.04795456677675247,
-0.039545152336359024,
0.04579993709921837,
0.032997239381074905,
-0.1928066462278366,
-0.034121401607990265,
0.06906251609325409,
0.09016473591327667,
0.040609631687402725,
-0.11228111386299133,
0.07269871979951859,
0.04012622684240341,
-0.005846264306455851,
-0.07204296439886093,
-0.02968885563313961,
0.09946458041667938,
-0.08729321509599686,
-0.12775826454162598,
-0.007370026782155037,
0.1887367218732834,
0.04776838421821594,
-0.07446794211864471,
-0.12729476392269135,
0.015220871195197105,
0.19408948719501495,
-0.043906837701797485,
-0.07343224436044693,
-0.004926059860736132,
0.04245566204190254,
0.04533353075385094,
-0.05704435706138611,
-0.07673664391040802,
-0.0086585758253932,
-0.00447807926684618,
0.10546933114528656,
0.009121259674429893,
-0.02050001360476017,
-0.058020930737257004,
0.004191659856587648,
-0.08660473674535751,
-0.13309262692928314,
-0.022284284234046936,
-0.07300721108913422,
-0.07465904206037521,
-0.039579879492521286,
-0.012728997506201267,
-0.12309940904378891,
0.0125933438539505,
0.0897391065955162,
0.05801423266530037,
0.04389485716819763,
-0.05924244970083237,
-0.022467665374279022,
0.11138014495372772,
0.06645558774471283,
-0.08904805034399033,
-0.016467468813061714,
-0.007894806563854218,
-0.0446753092110157,
0.015049022622406483,
-0.03150578960776329,
-0.02305593527853489,
0.007179960608482361,
-0.041742704808712006,
0.05220281332731247,
0.04412257298827171,
-0.03321582451462746,
-0.04955929145216942,
-0.08816947042942047,
0.09626586735248566,
-0.09033888578414917,
0.029794692993164062,
0.06944196671247482,
0.0041940524242818356,
0.10475312173366547,
-0.0484778992831707,
0.06836184859275818,
-0.11532851308584213,
0.028315089643001556,
-0.04147480055689812,
0.0007178432424552739,
0.01992148533463478,
-0.03376833721995354,
0.05123329907655716,
0.018216947093605995,
-0.008028128184378147,
-0.12531869113445282,
-0.0021674763411283493,
-0.09559972584247589,
-0.03613796830177307,
-0.06371388584375381,
-0.00023169319320004433,
-0.04803607612848282,
0.017070861533284187,
0.015965435653924942,
-0.0055417511612176895,
-0.02118971385061741,
-0.02895710989832878,
-0.002013946883380413,
-0.0014383295783773065,
0.051948077976703644,
0.0798649862408638,
0.08065272867679596,
-0.04201309382915497,
-0.02619466744363308,
-0.10901573300361633,
0.1248510554432869,
-0.10102558135986328,
-0.003502767765894532,
-0.13415800034999847,
-0.0032070602755993605,
-0.0334734283387661,
0.038830846548080444,
0.01614565961062908,
0.1545221209526062,
-0.16708716750144958,
-0.08900152891874313,
0.16400614380836487,
-0.14073564112186432,
-0.0017859716899693012,
0.17943428456783295,
0.012244699522852898,
0.037849292159080505,
0.11449588090181351,
0.2149442583322525,
0.030293218791484833,
-0.2023918479681015,
-0.054225433617830276,
-0.0726415365934372,
0.030178092420101166,
0.10503564029932022,
0.06368650496006012,
-0.06474902480840683,
0.058427147567272186,
-0.02679356373846531,
-0.025165654718875885,
-0.05114671587944031,
0.00853410828858614,
-0.04763682559132576,
0.011798999272286892,
-0.03446226567029953,
0.03931853920221329,
-0.009334193542599678,
-0.023214191198349,
-0.029283007606863976,
-0.0908963605761528,
-0.06439061462879181,
0.11107831448316574,
-0.06890891492366791,
0.025229565799236298,
-0.07690610736608505,
0.04462521895766258,
0.06325501203536987,
0.02159423753619194,
-0.12271508574485779,
0.07447139918804169,
0.028060544282197952,
-0.06011464446783066,
0.13874199986457825,
0.07209675759077072,
-0.01824127323925495,
0.004530999809503555,
-0.016513697803020477,
0.005267334170639515,
-0.032991036772727966,
-0.003028987441211939,
-0.02391689456999302,
-0.1170586571097374,
0.018444135785102844,
-0.05743250250816345,
0.08724065124988556,
-0.1583283692598343,
-0.00960008054971695,
0.08417563140392303,
0.12496951967477798,
-0.004190701059997082,
-0.04017388075590134,
0.08055602759122849,
0.04698985442519188,
0.02056475728750229,
-0.00005927261372562498,
0.012214123271405697,
-0.021413665264844894,
-0.02193613536655903,
0.12354082614183426,
-0.14376388490200043,
-0.18118451535701752,
0.11118794232606888,
0.018191903829574585,
-0.009707925841212273,
0.04032664746046066,
0.024914966896176338,
-0.029025835916399956,
-0.04303118959069252,
-0.020301049575209618,
0.21078316867351532,
-0.006797156762331724,
0.05813949182629585,
-0.07858334481716156,
-0.016309717670083046,
0.011790714226663113,
-0.041244495660066605,
-0.09565319865942001,
0.10120085626840591,
-0.008232694119215012,
-0.07420999556779861,
0.006014331243932247,
0.10186499357223511,
0.07164403796195984,
0.19722367823123932,
-0.0008797910413704813,
-0.10535392165184021,
-0.013608846813440323,
-0.05436563491821289,
-0.005532260984182358,
0.040944769978523254,
-0.1343967616558075,
-0.02457326650619507,
0.026219388470053673,
0.0320422500371933,
0.039932526648044586,
-0.028019171208143234,
0.02966720424592495,
-0.0007087629637680948,
-0.047061987221241,
-0.0865703672170639,
0.041128549724817276,
-0.031807057559490204,
0.03389642387628555,
-0.018296953290700912,
0.03516928479075432,
-0.04365217685699463,
-0.05586516484618187,
-0.14858752489089966,
0.09251835942268372,
-0.08416833728551865,
-0.2997904419898987,
-0.11446979641914368,
-0.06969710439443588,
-0.046080581843853,
0.01863122545182705,
0.06239292398095131,
-0.11453927308320999,
-0.1200394406914711,
-0.06374525278806686,
0.1529160439968109,
-0.017723767086863518,
-0.07250023633241653,
0.0895111933350563,
-0.0025554862804710865,
0.00858934037387371,
-0.08971164375543594,
0.014165966771543026,
-0.026866557076573372,
-0.03981068357825279,
-0.016636397689580917,
0.013029668480157852,
0.07499933987855911,
0.14144310355186462,
0.0518389530479908,
-0.023120548576116562,
0.012288612313568592,
0.1966313272714615,
-0.15146000683307648,
0.06708146631717682,
0.2224396914243698,
-0.04986780136823654,
0.001083879149518907,
0.1496298909187317,
-0.003877362934872508,
-0.04885514825582504,
0.05441694334149361,
0.012419458478689194,
-0.016492240130901337,
-0.2651613652706146,
-0.13447289168834686,
-0.07017292827367783,
-0.003996676299721003,
0.04436148330569267,
0.026040872558951378,
0.008604004047811031,
0.013309003785252571,
-0.10623107105493546,
-0.05602043494582176,
0.05142762511968613,
0.05265055596828461,
0.15971426665782928,
0.01478554867208004,
0.06251715123653412,
-0.0593574158847332,
-0.0012941861059516668,
0.11941065639257431,
-0.02223035879433155,
0.08183396607637405,
0.09169210493564606,
0.11986270546913147,
0.0741388276219368,
0.020949285477399826,
0.061381205916404724,
-0.024784784764051437,
0.000022232354240259156,
-0.011671052314341068,
-0.014358717948198318,
-0.0809357613325119,
0.022815028205513954,
0.06256575137376785,
0.11459097266197205,
-0.135614812374115,
-0.13214920461177826,
0.002996156457811594,
0.04161247983574867,
0.12032733857631683,
0.1009022444486618,
-0.010692066513001919,
-0.1319277584552765,
0.04883960261940956,
-0.09770311415195465,
-0.017868056893348694,
0.039147716015577316,
0.11456213146448135,
-0.14229202270507812,
0.08747140318155289,
0.0777619481086731,
0.10218217968940735,
-0.019370395690202713,
0.012926307506859303,
-0.090360626578331,
0.057268161326646805,
-0.0008203665493056178,
0.062445610761642456,
-0.1796107292175293,
0.11806552112102509,
0.025025490671396255,
0.0933869481086731,
-0.05523340776562691,
0.014602845534682274,
0.05862473323941231,
0.021552491933107376,
0.12262819707393646,
0.010703535750508308,
-0.15013931691646576,
0.0038686636835336685,
-0.10702589899301529,
0.01285350602120161,
0.06966555118560791,
-0.0718069076538086,
0.06352334469556808,
0.005836409982293844,
-0.006026621907949448,
-0.04049577936530113,
-0.01631559617817402,
-0.24294497072696686,
-0.13553394377231598,
0.03903697431087494,
0.013325526379048824,
0.06837543845176697,
-0.027798235416412354,
-0.08307616412639618,
-0.11630457639694214,
0.09610703587532043,
-0.058154668658971786,
-0.023125285282731056,
-0.09180870652198792,
-0.007841207087039948,
0.07376541942358017,
-0.07895022630691528,
0.016047896817326546,
0.04215826839208603,
0.14110618829727173,
-0.07516290247440338,
-0.05280785635113716,
0.04300384223461151,
-0.10584433376789093,
-0.13546720147132874,
0.0003882972232531756,
0.1832398623228073,
0.14909669756889343,
0.05892513692378998,
0.07312082499265671,
0.02884778380393982,
0.0002586024929769337,
-0.10069108754396439,
0.03684423491358757,
0.04110715910792351,
-0.03907671198248863,
0.05975010246038437,
-0.015913885086774826,
-0.2889886796474457,
-0.1439564824104309,
-0.0733700692653656,
0.0696297362446785,
0.16581809520721436,
-0.013825619593262672,
0.18176567554473877,
0.28057873249053955,
-0.09148901700973511,
-0.2527726888656616,
-0.022906746715307236,
0.0005062904674559832,
0.039070382714271545,
0.05366139113903046,
-0.22199684381484985,
0.12220872193574905,
0.013127668760716915,
0.006546834018081427,
-0.010931294411420822,
-0.19848966598510742,
-0.13758304715156555,
0.12568315863609314,
-0.021532148122787476,
0.039003465324640274,
-0.03923918306827545,
-0.07440279424190521,
0.003119362983852625,
-0.1056562215089798,
0.022361204028129578,
-0.07686281204223633,
0.08399714529514313,
0.0556727796792984,
0.04855571687221527,
0.020858993753790855,
0.016006119549274445,
0.11155982315540314,
0.10661190003156662,
-0.019790546968579292,
-0.08233784884214401,
0.0638987198472023,
0.027336223050951958,
-0.020298723131418228,
0.09185944497585297,
-0.00048489111941307783,
-0.0024400290567427874,
-0.05842100456357002,
-0.08437415957450867,
-0.06986038386821747,
0.07663354277610779,
-0.06625846028327942,
-0.01639588177204132,
-0.05103401839733124,
0.09370052069425583,
0.02029879204928875,
-0.010118436999619007,
-0.03650180622935295,
-0.10563381016254425,
-0.022234169766306877,
0.07922759652137756,
0.21271519362926483,
-0.027093034237623215,
-0.005114958621561527,
-0.0592784658074379,
-0.05560392513871193,
0.06592320650815964,
-0.07066596299409866,
0.05978105962276459,
0.0648047998547554,
0.025172511115670204,
0.08965331315994263,
-0.02879168465733528,
-0.13699617981910706,
0.03195500373840332,
0.05575445294380188,
-0.08109690248966217,
-0.17818506062030792,
-0.05139715224504471,
-0.006341077387332916,
-0.009517512284219265,
-0.01842358335852623,
0.1916196495294571,
-0.02181101217865944,
-0.0464216023683548,
0.0037051185499876738,
0.053401317447423935,
-0.006867688614875078,
0.0973658487200737,
0.03151831030845642,
0.039830807596445084,
-0.08361763507127762,
0.06602080166339874,
0.11278139799833298,
-0.09422960132360458,
0.05319712311029434,
0.1261654496192932,
-0.04888094589114189,
-0.06487082690000534,
-0.1367262303829193,
0.018750321120023727,
0.03040228970348835,
-0.05277242884039879,
0.01568884216248989,
-0.11411560326814651,
0.01752532832324505,
0.04044411703944206,
0.0061902450397610664,
-0.036644332110881805,
-0.03209497407078743,
0.002823621267452836,
-0.07580370455980301,
0.07775719463825226,
0.08333172649145126,
-0.035957638174295425,
-0.09709371626377106,
0.09546548873186111,
0.013182339258491993,
0.06332430988550186,
-0.034165266901254654,
-0.07435326278209686,
-0.09536054730415344,
-0.013010975904762745,
-0.10727491974830627,
0.017641253769397736,
-0.14450067281723022,
-0.011813470162451267,
-0.0610455647110939,
-0.03580950200557709,
-0.0171906016767025,
0.08081904798746109,
-0.04017068073153496,
-0.0008640146115794778,
-0.03548503667116165,
0.10872627049684525,
-0.14035959541797638,
0.06409984081983566,
0.06677678227424622,
-0.050684284418821335,
0.09877939522266388,
0.05238967761397362,
-0.037890225648880005,
0.05852953717112541,
-0.22237741947174072,
-0.03744351491332054,
-0.021879229694604874,
0.05209173634648323,
-0.0064432076178491116,
-0.1414303183555603,
-0.005562369711697102,
0.022171836346387863,
0.016872640699148178,
-0.01552265603095293,
0.026885831728577614,
-0.031159400939941406,
-0.02059432677924633,
-0.04204556345939636,
-0.07110103219747543,
-0.04442734271287918,
0.07714441418647766,
0.0742000862956047,
0.026158299297094345,
0.11086024343967438,
-0.09289098531007767,
0.07368350028991699,
-0.06612991541624069,
0.0260073971003294,
-0.015367559157311916,
0.014686270616948605,
-0.0685075893998146,
-0.07948744297027588,
0.0647527277469635,
-0.007990793325006962,
0.05993662029504776,
0.003968822304159403,
-0.034528426826000214,
0.06338758766651154,
-0.04196790233254433,
-0.05712651088833809,
0.037147440016269684,
0.1494234949350357,
0.04918063431978226,
0.009436925873160362,
0.004783587995916605,
-0.03888198360800743,
0.0065289088524878025,
0.13350293040275574,
0.1536816954612732,
0.16894666850566864,
0.08287841081619263,
0.037595026195049286,
0.06686683744192123,
-0.04310160130262375,
-0.12092475593090057,
0.0368439219892025,
-0.09995764493942261,
0.027958102524280548,
-0.05618127062916756,
-0.02764214389026165,
0.08149299025535583,
-0.14900930225849152,
0.07125907391309738,
-0.02045314945280552,
-0.0753316879272461,
-0.0949680432677269,
-0.1478905826807022,
-0.05574996769428253,
-0.04337751120328903,
0.0067125409841537476,
-0.10819410532712936,
0.04061784967780113,
-0.010002481751143932,
0.0402861051261425,
-0.10446200519800186,
0.10567233711481094,
-0.10737387835979462,
-0.1263207048177719,
0.14560575783252716,
-0.03984960913658142,
-0.004390050191432238,
0.018716994673013687,
0.04619382694363594,
0.021503817290067673,
0.08451680094003677,
0.03944728150963783,
0.05649123340845108,
0.02023935131728649,
0.019183970987796783,
-0.11425551027059555,
-0.07620193064212799,
0.02812192775309086,
0.00629211263731122,
0.08199873566627502,
0.19309118390083313,
0.09776964783668518,
-0.06322406232357025,
0.01763768307864666,
0.15091803669929504,
0.024233104661107063,
-0.08380767703056335,
-0.15544289350509644,
0.04524020850658417,
-0.033038120716810226,
0.0013860574690625072,
-0.009068187326192856,
-0.11371846497058868,
0.01222582720220089,
0.20485542714595795,
0.13165365159511566,
-0.0343472883105278,
0.031001416966319084,
-0.032778237015008926,
0.01317958440631628,
0.037755731493234634,
0.0746302381157875,
0.07231525331735611,
0.22090765833854675,
0.004827285651117563,
0.04640454426407814,
-0.03238554671406746,
-0.09868807345628738,
-0.1064513698220253,
0.10456375032663345,
-0.02787705697119236,
-0.05170174315571785,
-0.02162897028028965,
0.18491734564304352,
-0.11133961379528046,
-0.15522204339504242,
-0.10202959179878235,
-0.03298218920826912,
-0.10086256265640259,
0.007688394282013178,
-0.026522696018218994,
0.14293473958969116,
0.039520565420389175,
0.00843606237322092,
-0.004170733969658613,
0.1937900334596634,
0.044421009719371796,
0.01945430040359497,
-0.02842811495065689,
0.09501682221889496,
-0.08280562609434128,
0.10971308499574661,
0.005012452602386475,
0.03669844567775726,
0.04184667393565178,
0.05884132906794548,
-0.07211126387119293,
0.016570108011364937,
0.036902111023664474,
-0.042094435542821884,
0.027584517374634743,
0.18132193386554718,
-0.020404992625117302,
0.09047945588827133,
0.11674689501523972,
-0.07963298261165619,
0.023303816094994545,
-0.01019138190895319,
0.00614890456199646,
-0.06257631629705429,
0.16471244394779205,
-0.13906461000442505,
0.12351931631565094,
0.1136302575469017,
-0.058856986463069916,
-0.047409556806087494,
-0.0169170331209898,
0.050392355769872665,
-0.05809897929430008,
0.08523837476968765,
-0.0028320762794464827,
-0.17059588432312012,
0.028730710968375206,
-0.06953597068786621,
0.0651598572731018,
-0.2659505009651184,
-0.040532391518354416,
-0.04607977718114853,
0.00245284172706306,
-0.01938614435493946,
0.1231229156255722,
0.09319712221622467,
-0.06235475465655327,
-0.0032327999360859394,
-0.08169127255678177,
0.03616934269666672,
0.09411942958831787,
-0.0576653815805912,
-0.02872309274971485
] |
null | null |
transformers
|
# Wav2Vec2-Large-Robust finetuned on Librispeech
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/).
This model is a fine-tuned version of the [wav2vec2-large-robust](https://huggingface.co/facebook/wav2vec2-large-robust) model.
It has been pretrained on:
- [Libri-Light](https://github.com/facebookresearch/libri-light): open-source audio books from the LibriVox project; clean, read-out audio data
- [CommonVoice](https://huggingface.co/datasets/common_voice): crowd-source collected audio data; read-out text snippets
- [Switchboard](https://catalog.ldc.upenn.edu/LDC97S62): telephone speech corpus; noisy telephone data
- [Fisher](https://catalog.ldc.upenn.edu/LDC2004T19): conversational telephone speech; noisy telephone data
and subsequently been finetuned on 960 hours of
- [Librispeech](https://huggingface.co/datasets/librispeech_asr): open-source read-out audio data.
When using the model, make sure that your speech input is also sampled at 16kHz.
[Paper Robust Wav2Vec2](https://arxiv.org/abs/2104.01027)
Authors: Wei-Ning Hsu, Anuroop Sriram, Alexei Baevski, Tatiana Likhomanenko, Qiantong Xu, Vineel Pratap, Jacob Kahn, Ann Lee, Ronan Collobert, Gabriel Synnaeve, Michael Auli
**Abstract**
Self-supervised learning of speech representations has been a very active research area but most work is focused on a single domain such as read audio books for which there exist large quantities of labeled and unlabeled data. In this paper, we explore more general setups where the domain of the unlabeled data for pre-training data differs from the domain of the labeled data for fine-tuning, which in turn may differ from the test data domain. Our experiments show that using target domain data during pre-training leads to large performance improvements across a variety of setups. On a large-scale competitive setup, we show that pre-training on unlabeled in-domain data reduces the gap between models trained on in-domain and out-of-domain labeled data by 66%-73%. This has obvious practical implications since it is much easier to obtain unlabeled target domain data than labeled data. Moreover, we find that pre-training on multiple domains improves generalization performance on domains not seen during training. Code and models will be made available at this https URL.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
To transcribe audio files the model can be used as a standalone acoustic model as follows:
```python
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import soundfile as sf
import torch
# load model and processor
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-large-robust-ft-libri-960h")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-large-robust-ft-libri-960h")
# define function to read in sound file
def map_to_array(batch):
speech, _ = sf.read(batch["file"])
batch["speech"] = speech
return batch
# load dummy dataset and read soundfiles
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
ds = ds.map(map_to_array)
# preprocess the two example utterances (batch size 2)
input_values = processor(ds["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding="longest").input_values
# retrieve logits
logits = model(input_values).logits
# take argmax and decode
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
```
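Real-world recordings are often not stored at 16 kHz. The snippet below is a hedged sketch, not part of the original card, showing one way to resample an arbitrary file with `torchaudio` before running it through the same processor and model; the path `speech.wav` is a placeholder.
```python
# Hedged sketch: resample an arbitrary recording to the 16 kHz rate the model expects.
# "speech.wav" is a placeholder path, not part of the original model card.
import torch
import torchaudio
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-large-robust-ft-libri-960h")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-large-robust-ft-libri-960h")

waveform, sample_rate = torchaudio.load("speech.wav")  # (channels, samples)
waveform = waveform.mean(dim=0)                        # mix down to mono
if sample_rate != 16000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16000)

inputs = processor(waveform.numpy(), sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits
transcription = processor.batch_decode(torch.argmax(logits, dim=-1))
print(transcription[0])
```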
|
{"language": "en", "license": "apache-2.0", "tags": ["speech", "audio", "automatic-speech-recognition"], "datasets": ["libri_light", "common_voice", "switchboard", "fisher", "librispeech_asr"], "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-robust-ft-libri-960h
|
[
"transformers",
"pytorch",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"speech",
"audio",
"en",
"dataset:libri_light",
"dataset:common_voice",
"dataset:switchboard",
"dataset:fisher",
"dataset:librispeech_asr",
"arxiv:2104.01027",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2104.01027"
] |
[
"en"
] |
TAGS
#transformers #pytorch #safetensors #wav2vec2 #automatic-speech-recognition #speech #audio #en #dataset-libri_light #dataset-common_voice #dataset-switchboard #dataset-fisher #dataset-librispeech_asr #arxiv-2104.01027 #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-Large-Robust finetuned on Librispeech
Facebook's Wav2Vec2.
This model is a fine-tuned version of the wav2vec2-large-robust model.
It has been pretrained on:
- Libri-Light: open-source audio books from the LibriVox project; clean, read-out audio data
- CommonVoice: crowd-source collected audio data; read-out text snippets
- Switchboard: telephone speech corpus; noisy telephone data
- Fisher: conversational telephone speech; noisy telephone data
and subsequently been finetuned on 960 hours of
- Librispeech: open-source read-out audio data.
When using the model, make sure that your speech input is also sampled at 16kHz.
Paper Robust Wav2Vec2
Authors: Wei-Ning Hsu, Anuroop Sriram, Alexei Baevski, Tatiana Likhomanenko, Qiantong Xu, Vineel Pratap, Jacob Kahn, Ann Lee, Ronan Collobert, Gabriel Synnaeve, Michael Auli
Abstract
Self-supervised learning of speech representations has been a very active research area but most work is focused on a single domain such as read audio books for which there exist large quantities of labeled and unlabeled data. In this paper, we explore more general setups where the domain of the unlabeled data for pre-training data differs from the domain of the labeled data for fine-tuning, which in turn may differ from the test data domain. Our experiments show that using target domain data during pre-training leads to large performance improvements across a variety of setups. On a large-scale competitive setup, we show that pre-training on unlabeled in-domain data reduces the gap between models trained on in-domain and out-of-domain labeled data by 66%-73%. This has obvious practical implications since it is much easier to obtain unlabeled target domain data than labeled data. Moreover, we find that pre-training on multiple domains improves generalization performance on domains not seen during training. Code and models will be made available at this https URL.
The original model can be found under URL
# Usage
To transcribe audio files the model can be used as a standalone acoustic model as follows:
|
[
"# Wav2Vec2-Large-Robust finetuned on Librispeech\n\nFacebook's Wav2Vec2.\n\nThis model is a fine-tuned version of the wav2vec2-large-robust model.\nIt has been pretrained on:\n\n- Libri-Light: open-source audio books from the LibriVox project; clean, read-out audio data\n- CommonVoice: crowd-source collected audio data; read-out text snippets\n- Switchboard: telephone speech corpus; noisy telephone data\n- Fisher: conversational telephone speech; noisy telephone data\n\nand subsequently been finetuned on 960 hours of\n\n- Librispeech: open-source read-out audio data.\n\nWhen using the model make sure that your speech input is also sampled at 16Khz. \n\nPaper Robust Wav2Vec2\n\nAuthors: Wei-Ning Hsu, Anuroop Sriram, Alexei Baevski, Tatiana Likhomanenko, Qiantong Xu, Vineel Pratap, Jacob Kahn, Ann Lee, Ronan Collobert, Gabriel Synnaeve, Michael Auli\n\nAbstract\nSelf-supervised learning of speech representations has been a very active research area but most work is focused on a single domain such as read audio books for which there exist large quantities of labeled and unlabeled data. In this paper, we explore more general setups where the domain of the unlabeled data for pre-training data differs from the domain of the labeled data for fine-tuning, which in turn may differ from the test data domain. Our experiments show that using target domain data during pre-training leads to large performance improvements across a variety of setups. On a large-scale competitive setup, we show that pre-training on unlabeled in-domain data reduces the gap between models trained on in-domain and out-of-domain labeled data by 66%-73%. This has obvious practical implications since it is much easier to obtain unlabeled target domain data than labeled data. Moreover, we find that pre-training on multiple domains improves generalization performance on domains not seen during training. Code and models will be made available at this https URL.\n\nThe original model can be found under URL",
"# Usage\n\nTo transcribe audio files the model can be used as a standalone acoustic model as follows:"
] |
[
"TAGS\n#transformers #pytorch #safetensors #wav2vec2 #automatic-speech-recognition #speech #audio #en #dataset-libri_light #dataset-common_voice #dataset-switchboard #dataset-fisher #dataset-librispeech_asr #arxiv-2104.01027 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-Large-Robust finetuned on Librispeech\n\nFacebook's Wav2Vec2.\n\nThis model is a fine-tuned version of the wav2vec2-large-robust model.\nIt has been pretrained on:\n\n- Libri-Light: open-source audio books from the LibriVox project; clean, read-out audio data\n- CommonVoice: crowd-source collected audio data; read-out text snippets\n- Switchboard: telephone speech corpus; noisy telephone data\n- Fisher: conversational telephone speech; noisy telephone data\n\nand subsequently been finetuned on 960 hours of\n\n- Librispeech: open-source read-out audio data.\n\nWhen using the model make sure that your speech input is also sampled at 16Khz. \n\nPaper Robust Wav2Vec2\n\nAuthors: Wei-Ning Hsu, Anuroop Sriram, Alexei Baevski, Tatiana Likhomanenko, Qiantong Xu, Vineel Pratap, Jacob Kahn, Ann Lee, Ronan Collobert, Gabriel Synnaeve, Michael Auli\n\nAbstract\nSelf-supervised learning of speech representations has been a very active research area but most work is focused on a single domain such as read audio books for which there exist large quantities of labeled and unlabeled data. In this paper, we explore more general setups where the domain of the unlabeled data for pre-training data differs from the domain of the labeled data for fine-tuning, which in turn may differ from the test data domain. Our experiments show that using target domain data during pre-training leads to large performance improvements across a variety of setups. On a large-scale competitive setup, we show that pre-training on unlabeled in-domain data reduces the gap between models trained on in-domain and out-of-domain labeled data by 66%-73%. This has obvious practical implications since it is much easier to obtain unlabeled target domain data than labeled data. Moreover, we find that pre-training on multiple domains improves generalization performance on domains not seen during training. Code and models will be made available at this https URL.\n\nThe original model can be found under URL",
"# Usage\n\nTo transcribe audio files the model can be used as a standalone acoustic model as follows:"
] |
[
109,
492,
25
] |
[
"passage: TAGS\n#transformers #pytorch #safetensors #wav2vec2 #automatic-speech-recognition #speech #audio #en #dataset-libri_light #dataset-common_voice #dataset-switchboard #dataset-fisher #dataset-librispeech_asr #arxiv-2104.01027 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n"
] |
[
-0.12057500332593918,
0.1594502329826355,
-0.005581534933298826,
-0.026174034923315048,
0.05621945485472679,
-0.041354432702064514,
0.12538906931877136,
0.1098402813076973,
0.0057474588975310326,
0.022562967613339424,
0.06721238046884537,
0.13714227080345154,
0.029570432379841805,
0.09206154942512512,
-0.028501395136117935,
-0.15744608640670776,
0.08894356340169907,
0.03121625818312168,
0.07505930215120316,
0.10681844502687454,
0.09059139341115952,
-0.09162255376577377,
0.03739827126264572,
0.005179333500564098,
-0.0321350060403347,
0.004561275709420443,
0.048026349395513535,
-0.1356557011604309,
0.10957089811563492,
0.0188031867146492,
0.07086166739463806,
0.04364239051938057,
0.019780779257416725,
-0.1786506175994873,
0.022681070491671562,
0.03028629906475544,
0.01923602633178234,
0.06008237600326538,
0.031791191548109055,
-0.029217462986707687,
-0.04665033519268036,
-0.04975515231490135,
0.0033517854753881693,
0.06677324324846268,
-0.056995272636413574,
-0.262210875749588,
-0.03546247258782387,
0.05502816289663315,
-0.009984300471842289,
0.08313824981451035,
-0.04694889858365059,
0.15161356329917908,
-0.06277678906917572,
0.126662477850914,
0.13149790465831757,
-0.22263804078102112,
0.026312753558158875,
-0.02288193628191948,
0.04363158345222473,
0.05247235670685768,
-0.035297997295856476,
0.049590274691581726,
0.012176519259810448,
0.023376375436782837,
-0.01648200862109661,
-0.040229856967926025,
-0.2158535122871399,
-0.0020159536506980658,
-0.0882251188158989,
-0.02926204912364483,
0.24953597784042358,
-0.002212311839684844,
0.010087910108268261,
-0.11073135584592819,
-0.05115357041358948,
-0.014002880081534386,
-0.01713632233440876,
0.01902805268764496,
-0.022118303924798965,
0.004852582700550556,
-0.027920104563236237,
-0.014023790135979652,
-0.13522259891033173,
-0.06301402300596237,
-0.14099432528018951,
0.14836357533931732,
-0.008274965919554234,
0.04556918889284134,
-0.11137481033802032,
-0.006018195301294327,
0.04176001995801926,
-0.09380318969488144,
0.02242557518184185,
0.0023491771426051855,
-0.012597508728504181,
0.049563247710466385,
-0.010995441116392612,
-0.05702580511569977,
0.15502497553825378,
0.050470851361751556,
0.045008085668087006,
0.023228950798511505,
-0.11194875836372375,
0.07600777596235275,
0.026131030172109604,
0.07290367037057877,
-0.08980488032102585,
-0.040354352444410324,
0.043710362166166306,
0.009570802561938763,
0.0874713659286499,
-0.05228283256292343,
-0.05740363523364067,
-0.03837711364030838,
0.08224208652973175,
0.07624685764312744,
0.10901065915822983,
0.024799933657050133,
-0.06824179738759995,
0.003222040366381407,
0.05180441588163376,
-0.1191597580909729,
0.015365434810519218,
0.07830331474542618,
0.05772711709141731,
0.07613910734653473,
0.05188925936818123,
0.02953895926475525,
-0.0574374794960022,
-0.0009346713777631521,
-0.022333340719342232,
-0.008208906278014183,
0.04215506464242935,
-0.09378281980752945,
0.07819639891386032,
-0.07280465960502625,
0.007121448405086994,
-0.13640612363815308,
-0.07533007860183716,
-0.010421725921332836,
-0.026700278744101524,
0.018508154898881912,
-0.08158276975154877,
-0.031235964968800545,
-0.057639770209789276,
0.022548966109752655,
-0.11713415384292603,
0.034083761274814606,
-0.07181482762098312,
0.09819284081459045,
0.03142877668142319,
0.1007995456457138,
-0.11336927860975266,
0.05483659356832504,
-0.08367229253053665,
-0.024145307019352913,
-0.017674479633569717,
0.08872369676828384,
-0.1137240082025528,
0.0311653520911932,
-0.07146986573934555,
-0.04894739389419556,
-0.055084481835365295,
0.04895330220460892,
0.006529420148581266,
0.1178031712770462,
-0.2395305186510086,
-0.08361849933862686,
0.13683374226093292,
-0.13962149620056152,
-0.08129602670669556,
0.12752707302570343,
0.037120409309864044,
-0.03381028771400452,
0.12499237060546875,
0.32085946202278137,
0.01277014147490263,
-0.1204354390501976,
-0.08014540374279022,
0.07435717433691025,
-0.05752843618392944,
-0.07223201543092728,
0.07745791226625443,
-0.0380537211894989,
0.020717063918709755,
0.021944135427474976,
0.05016060918569565,
0.06971817463636398,
-0.0369865819811821,
-0.09317212551832199,
-0.029647164046764374,
-0.0982867032289505,
0.038863640278577805,
-0.005334209650754929,
-0.024270441383123398,
-0.0417807474732399,
-0.028994088992476463,
0.021442795172333717,
0.08997052162885666,
-0.06449239701032639,
0.06376563012599945,
-0.13128112256526947,
0.09831259399652481,
-0.050493452697992325,
0.015164422802627087,
-0.14676569402217865,
0.08155956864356995,
-0.03870842233300209,
0.021104704588651657,
0.0710754543542862,
0.022074997425079346,
0.07941501587629318,
-0.029541559517383575,
-0.007358415983617306,
-0.047106269747018814,
0.13009066879749298,
0.0653647854924202,
-0.001510092057287693,
-0.19163769483566284,
0.02366417460143566,
-0.08218173682689667,
0.038947559893131256,
-0.045983389019966125,
-0.015462461858987808,
0.1437452882528305,
0.1138916090130806,
0.023286988958716393,
-0.023365331813693047,
0.06823042035102844,
0.00864491518586874,
0.021012598648667336,
0.0015256934566423297,
0.0384431816637516,
-0.01319203246384859,
-0.06742478907108307,
0.22003018856048584,
-0.17419853806495667,
0.2380894422531128,
0.21586595475673676,
-0.07623546570539474,
0.09179243445396423,
0.09131427109241486,
-0.00978656392544508,
-0.018218278884887695,
0.031563181430101395,
-0.061403460800647736,
0.08302079886198044,
0.01063456479460001,
0.08399990946054459,
-0.05931314826011658,
0.019590141251683235,
0.025726694613695145,
-0.0344945453107357,
-0.03600914776325226,
0.10848432779312134,
-0.12113508582115173,
-0.05214221030473709,
0.1344289481639862,
0.055579718202352524,
-0.03787306696176529,
0.16161000728607178,
-0.04694286361336708,
-0.03710886836051941,
0.022078154608607292,
-0.029438793659210205,
-0.025052690878510475,
0.1444428414106369,
-0.22380337119102478,
-0.02888519875705242,
0.08653657138347626,
-0.0068925595842301846,
0.056641191244125366,
-0.15334396064281464,
-0.004889394156634808,
-0.01489891204982996,
-0.0660470575094223,
-0.09236018359661102,
0.04566166177392006,
0.005392357707023621,
0.09083086252212524,
-0.08154922723770142,
-0.11760808527469635,
0.027930373325943947,
-0.045731574296951294,
-0.08817238360643387,
0.08552445471286774,
-0.13016186654567719,
-0.3168913722038269,
-0.10901877284049988,
-0.05999169871211052,
-0.010066109709441662,
0.016726423054933548,
0.10807668417692184,
-0.10630962252616882,
-0.01951395347714424,
-0.04198796674609184,
0.013966327533125877,
0.003721785033121705,
0.025734465569257736,
0.023494673892855644,
-0.005693643353879452,
0.056971047073602676,
-0.1606934517621994,
0.0036784864496439695,
-0.046430282294750214,
0.03633597120642662,
0.06869065016508102,
0.05984703451395035,
0.07332651317119598,
0.15822164714336395,
0.049632005393505096,
0.007356974296271801,
-0.027024365961551666,
0.1703820377588272,
-0.06428423523902893,
-0.025712743401527405,
0.15490365028381348,
-0.03164796903729439,
0.008010282181203365,
0.18052777647972107,
0.02845609374344349,
-0.03813345730304718,
-0.04079074412584305,
-0.011977121233940125,
-0.052420370280742645,
-0.2241126000881195,
-0.15970194339752197,
-0.07525584101676941,
0.06793218106031418,
-0.012957442551851273,
0.052608225494623184,
0.03051045536994934,
-0.019651243463158607,
0.006870026234537363,
-0.08528786897659302,
0.06380417197942734,
0.010167652741074562,
0.2532077431678772,
-0.09604554623365402,
0.12086866796016693,
-0.05442407354712486,
-0.06070951744914055,
0.06746503710746765,
0.10548220574855804,
0.042650725692510605,
0.06780828535556793,
0.11632677912712097,
0.023864541202783585,
0.12067867070436478,
0.09360500425100327,
0.07353189587593079,
0.0075674159452319145,
0.0013510385761037469,
0.007382163777947426,
-0.09250686317682266,
-0.05489620566368103,
0.01790311187505722,
0.1514279693365097,
-0.043913912028074265,
0.02452872507274151,
-0.07665898650884628,
0.061502307653427124,
0.09202347695827484,
0.09373392909765244,
-0.18665677309036255,
-0.0013938889605924487,
0.05233908072113991,
-0.03400452435016632,
-0.03795120492577553,
0.07152807712554932,
0.11235327273607254,
-0.01371223945170641,
0.059580445289611816,
0.012106836773455143,
0.053926512598991394,
-0.10622353106737137,
0.058670055121183395,
-0.13349294662475586,
-0.04944092407822609,
0.02165110968053341,
0.05446796864271164,
-0.23194953799247742,
0.20891574025154114,
0.018815776333212852,
0.05019335448741913,
-0.028336176648736,
-0.002690979279577732,
-0.0007953762542456388,
0.05798913165926933,
0.14103834331035614,
0.021574009209871292,
-0.010092373006045818,
-0.11202017217874527,
-0.14192719757556915,
0.0647723376750946,
0.026559006422758102,
0.12298156321048737,
-0.04968777671456337,
0.00790866557508707,
-0.030707374215126038,
0.03810124844312668,
-0.0069796014577150345,
-0.13735534250736237,
-0.10382261872291565,
0.02641644887626171,
0.27962470054626465,
0.1032324880361557,
-0.03426113352179527,
-0.10219773650169373,
-0.1257411539554596,
0.06242358312010765,
-0.15468725562095642,
-0.02043095976114273,
-0.05234081298112869,
-0.12078234553337097,
0.15390253067016602,
-0.024921707808971405,
0.018557097762823105,
-0.0044663031585514545,
0.038597431033849716,
-0.057342901825904846,
-0.07892239838838577,
0.08896618336439133,
-0.0980636402964592,
-0.08164619654417038,
-0.03971957042813301,
0.2056562751531601,
0.0033350579906255007,
0.07038109749555588,
0.021138155832886696,
0.034621089696884155,
-0.053670432418584824,
-0.03547591716051102,
0.08628037571907043,
0.08716467022895813,
-0.035301949828863144,
0.08349437266588211,
0.0092652952298522,
-0.28113484382629395,
-0.04509911313652992,
-0.010061591863632202,
0.23052625358104706,
0.15201853215694427,
-0.05046422407031059,
0.1789272576570511,
0.2443399280309677,
-0.032059505581855774,
-0.2774682939052582,
-0.11021765321493149,
-0.08419975638389587,
0.0038869809359312057,
-0.04434848204255104,
-0.14965468645095825,
0.09555152803659439,
-0.047749876976013184,
-0.09010264277458191,
0.03606798127293587,
-0.1915789395570755,
-0.10731615871191025,
0.262455016374588,
-0.06461848318576813,
0.20215445756912231,
-0.1251685917377472,
-0.0850500836968422,
-0.07260534167289734,
-0.09363154321908951,
0.16466857492923737,
-0.11569543182849884,
0.09521866589784622,
0.03648233041167259,
0.06585521250963211,
0.02848663367331028,
-0.05975976958870888,
0.09926553070545197,
0.03459228575229645,
-0.033786527812480927,
-0.02096877247095108,
-0.059806693345308304,
0.04434464871883392,
0.025316964834928513,
0.07327999919652939,
-0.08984703570604324,
0.03348388150334358,
-0.08382715284824371,
-0.024088291451334953,
-0.10956530272960663,
0.08169150352478027,
0.05406370386481285,
-0.031219560652971268,
0.008472341112792492,
-0.04595711827278137,
0.004809908103197813,
0.0376187227666378,
0.18707486987113953,
-0.06260216236114502,
0.031092483550310135,
0.16971272230148315,
0.13313259184360504,
-0.12138519436120987,
-0.07023689150810242,
-0.04220467433333397,
-0.10388161987066269,
0.09943974763154984,
-0.017192618921399117,
0.06929973512887955,
0.06477732211351395,
0.05047241598367691,
0.06386994570493698,
0.047673363238573074,
-0.09399107843637466,
-0.005252557341009378,
0.08463383466005325,
... (remaining entries of the 768-dimensional embedding vector omitted)
] |
null | null |
transformers
|
# Wav2Vec2-Large-Robust finetuned on Switchboard
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/).
This model is a fine-tuned version of the [wav2vec2-large-robust](https://huggingface.co/facebook/wav2vec2-large-robust) model.
It has been pretrained on:
- [Libri-Light](https://github.com/facebookresearch/libri-light): open-source audio books from the LibriVox project; clean, read-out audio data
- [CommonVoice](https://huggingface.co/datasets/common_voice): crowd-sourced audio data; read-out text snippets
- [Switchboard](https://catalog.ldc.upenn.edu/LDC97S62): telephone speech corpus; noisy telephone data
- [Fisher](https://catalog.ldc.upenn.edu/LDC2004T19): conversational telephone speech; noisy telephone data
and has subsequently been finetuned on 300 hours of
- [Switchboard](https://catalog.ldc.upenn.edu/LDC97S62): telephone speech corpus; noisy telephone data
When using the model, make sure that your speech input is also sampled at 16kHz.
[Paper Robust Wav2Vec2](https://arxiv.org/abs/2104.01027)
Authors: Wei-Ning Hsu, Anuroop Sriram, Alexei Baevski, Tatiana Likhomanenko, Qiantong Xu, Vineel Pratap, Jacob Kahn, Ann Lee, Ronan Collobert, Gabriel Synnaeve, Michael Auli
**Abstract**
Self-supervised learning of speech representations has been a very active research area but most work is focused on a single domain such as read audio books for which there exist large quantities of labeled and unlabeled data. In this paper, we explore more general setups where the domain of the unlabeled data for pre-training data differs from the domain of the labeled data for fine-tuning, which in turn may differ from the test data domain. Our experiments show that using target domain data during pre-training leads to large performance improvements across a variety of setups. On a large-scale competitive setup, we show that pre-training on unlabeled in-domain data reduces the gap between models trained on in-domain and out-of-domain labeled data by 66%-73%. This has obvious practical implications since it is much easier to obtain unlabeled target domain data than labeled data. Moreover, we find that pre-training on multiple domains improves generalization performance on domains not seen during training. Code and models will be made available at this https URL.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
To transcribe audio files the model can be used as a standalone acoustic model as follows:
```python
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torch
# load model and processor
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-large-robust-ft-swbd-300h")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-large-robust-ft-swbd-300h")
# load dummy dataset and read soundfiles
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
# tokenize
input_values = processor(ds[0]["audio"]["array"], return_tensors="pt", padding="longest").input_values # Batch size 1
# retrieve logits
logits = model(input_values).logits
# take argmax and decode
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
```
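The model expects 16kHz mono input. If your recordings are sampled at a different rate, resample them first. The snippet below is a minimal sketch of that preprocessing step, not part of the official example: the file name `audio.wav` is a placeholder, and `torchaudio` is used purely as one possible resampler.
```python
import torch
import torchaudio
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-large-robust-ft-swbd-300h")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-large-robust-ft-swbd-300h")

# load an audio file at its native sampling rate (placeholder path)
speech, sampling_rate = torchaudio.load("audio.wav")  # speech shape: (channels, num_samples)

# convert to mono and resample to the 16kHz rate the model expects
speech = speech.mean(dim=0)
if sampling_rate != 16_000:
    speech = torchaudio.functional.resample(speech, orig_freq=sampling_rate, new_freq=16_000)

input_values = processor(speech.numpy(), sampling_rate=16_000, return_tensors="pt").input_values

# retrieve logits and decode the most likely tokens
with torch.no_grad():
    logits = model(input_values).logits
transcription = processor.batch_decode(torch.argmax(logits, dim=-1))
```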
|
{"language": "en", "license": "apache-2.0", "tags": ["speech", "audio", "automatic-speech-recognition"], "datasets": ["libri_light", "common_voice", "switchboard", "fisher"], "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-robust-ft-swbd-300h
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"speech",
"audio",
"en",
"dataset:libri_light",
"dataset:common_voice",
"dataset:switchboard",
"dataset:fisher",
"arxiv:2104.01027",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2104.01027"
] |
[
"en"
] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #speech #audio #en #dataset-libri_light #dataset-common_voice #dataset-switchboard #dataset-fisher #arxiv-2104.01027 #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-Large-Robust finetuned on Switchboard
Facebook's Wav2Vec2.
This model is a fine-tuned version of the wav2vec2-large-robust model.
It has been pretrained on:
- Libri-Light: open-source audio books from the LibriVox project; clean, read-out audio data
- CommonVoice: crowd-source collected audio data; read-out text snippets
- Switchboard: telephone speech corpus; noisy telephone data
- Fisher: conversational telephone speech; noisy telephone data
and subsequently been finetuned on 300 hours of
- Switchboard: telephone speech corpus; noisy telephone data
When using the model make sure that your speech input is also sampled at 16Khz.
Paper Robust Wav2Vec2
Authors: Wei-Ning Hsu, Anuroop Sriram, Alexei Baevski, Tatiana Likhomanenko, Qiantong Xu, Vineel Pratap, Jacob Kahn, Ann Lee, Ronan Collobert, Gabriel Synnaeve, Michael Auli
Abstract
Self-supervised learning of speech representations has been a very active research area but most work is focused on a single domain such as read audio books for which there exist large quantities of labeled and unlabeled data. In this paper, we explore more general setups where the domain of the unlabeled data for pre-training data differs from the domain of the labeled data for fine-tuning, which in turn may differ from the test data domain. Our experiments show that using target domain data during pre-training leads to large performance improvements across a variety of setups. On a large-scale competitive setup, we show that pre-training on unlabeled in-domain data reduces the gap between models trained on in-domain and out-of-domain labeled data by 66%-73%. This has obvious practical implications since it is much easier to obtain unlabeled target domain data than labeled data. Moreover, we find that pre-training on multiple domains improves generalization performance on domains not seen during training. Code and models will be made available at this https URL.
The original model can be found under URL
# Usage
To transcribe audio files the model can be used as a standalone acoustic model as follows:
|
[
"# Wav2Vec2-Large-Robust finetuned on Switchboard\n\nFacebook's Wav2Vec2.\n\nThis model is a fine-tuned version of the wav2vec2-large-robust model.\nIt has been pretrained on:\n\n- Libri-Light: open-source audio books from the LibriVox project; clean, read-out audio data\n- CommonVoice: crowd-source collected audio data; read-out text snippets\n- Switchboard: telephone speech corpus; noisy telephone data\n- Fisher: conversational telephone speech; noisy telephone data\n\nand subsequently been finetuned on 300 hours of\n\n- Switchboard: telephone speech corpus; noisy telephone data\n\nWhen using the model make sure that your speech input is also sampled at 16Khz. \n\nPaper Robust Wav2Vec2\n\nAuthors: Wei-Ning Hsu, Anuroop Sriram, Alexei Baevski, Tatiana Likhomanenko, Qiantong Xu, Vineel Pratap, Jacob Kahn, Ann Lee, Ronan Collobert, Gabriel Synnaeve, Michael Auli\n\nAbstract\nSelf-supervised learning of speech representations has been a very active research area but most work is focused on a single domain such as read audio books for which there exist large quantities of labeled and unlabeled data. In this paper, we explore more general setups where the domain of the unlabeled data for pre-training data differs from the domain of the labeled data for fine-tuning, which in turn may differ from the test data domain. Our experiments show that using target domain data during pre-training leads to large performance improvements across a variety of setups. On a large-scale competitive setup, we show that pre-training on unlabeled in-domain data reduces the gap between models trained on in-domain and out-of-domain labeled data by 66%-73%. This has obvious practical implications since it is much easier to obtain unlabeled target domain data than labeled data. Moreover, we find that pre-training on multiple domains improves generalization performance on domains not seen during training. Code and models will be made available at this https URL.\n\nThe original model can be found under URL",
"# Usage\n\nTo transcribe audio files the model can be used as a standalone acoustic model as follows:"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #speech #audio #en #dataset-libri_light #dataset-common_voice #dataset-switchboard #dataset-fisher #arxiv-2104.01027 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-Large-Robust finetuned on Switchboard\n\nFacebook's Wav2Vec2.\n\nThis model is a fine-tuned version of the wav2vec2-large-robust model.\nIt has been pretrained on:\n\n- Libri-Light: open-source audio books from the LibriVox project; clean, read-out audio data\n- CommonVoice: crowd-source collected audio data; read-out text snippets\n- Switchboard: telephone speech corpus; noisy telephone data\n- Fisher: conversational telephone speech; noisy telephone data\n\nand subsequently been finetuned on 300 hours of\n\n- Switchboard: telephone speech corpus; noisy telephone data\n\nWhen using the model make sure that your speech input is also sampled at 16Khz. \n\nPaper Robust Wav2Vec2\n\nAuthors: Wei-Ning Hsu, Anuroop Sriram, Alexei Baevski, Tatiana Likhomanenko, Qiantong Xu, Vineel Pratap, Jacob Kahn, Ann Lee, Ronan Collobert, Gabriel Synnaeve, Michael Auli\n\nAbstract\nSelf-supervised learning of speech representations has been a very active research area but most work is focused on a single domain such as read audio books for which there exist large quantities of labeled and unlabeled data. In this paper, we explore more general setups where the domain of the unlabeled data for pre-training data differs from the domain of the labeled data for fine-tuning, which in turn may differ from the test data domain. Our experiments show that using target domain data during pre-training leads to large performance improvements across a variety of setups. On a large-scale competitive setup, we show that pre-training on unlabeled in-domain data reduces the gap between models trained on in-domain and out-of-domain labeled data by 66%-73%. This has obvious practical implications since it is much easier to obtain unlabeled target domain data than labeled data. Moreover, we find that pre-training on multiple domains improves generalization performance on domains not seen during training. Code and models will be made available at this https URL.\n\nThe original model can be found under URL",
"# Usage\n\nTo transcribe audio files the model can be used as a standalone acoustic model as follows:"
] |
[
94,
488,
25
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #speech #audio #en #dataset-libri_light #dataset-common_voice #dataset-switchboard #dataset-fisher #arxiv-2104.01027 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n"
] |
[
... (768-dimensional embedding vector omitted)
] |
null | null |
transformers
|
# Wav2Vec2-Large-Robust
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/)
The large model pretrained on 16kHz sampled speech audio.
Speech datasets from multiple domains were used to pretrain the model:
- [Libri-Light](https://github.com/facebookresearch/libri-light): open-source audio books from the LibriVox project; clean, read-out audio data
- [CommonVoice](https://huggingface.co/datasets/common_voice): crowd-sourced audio data; read-out text snippets
- [Switchboard](https://catalog.ldc.upenn.edu/LDC97S62): telephone speech corpus; noisy telephone data
- [Fisher](https://catalog.ldc.upenn.edu/LDC2004T19): conversational telephone speech; noisy telephone data
When using the model, make sure that your speech input is also sampled at 16kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model **for speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for a more in-detail explanation of how to fine-tune the model.
[Paper Robust Wav2Vec2](https://arxiv.org/abs/2104.01027)
Authors: Wei-Ning Hsu, Anuroop Sriram, Alexei Baevski, Tatiana Likhomanenko, Qiantong Xu, Vineel Pratap, Jacob Kahn, Ann Lee, Ronan Collobert, Gabriel Synnaeve, Michael Auli
**Abstract**
Self-supervised learning of speech representations has been a very active research area but most work is focused on a single domain such as read audio books for which there exist large quantities of labeled and unlabeled data. In this paper, we explore more general setups where the domain of the unlabeled data for pre-training data differs from the domain of the labeled data for fine-tuning, which in turn may differ from the test data domain. Our experiments show that using target domain data during pre-training leads to large performance improvements across a variety of setups. On a large-scale competitive setup, we show that pre-training on unlabeled in-domain data reduces the gap between models trained on in-domain and out-of-domain labeled data by 66%-73%. This has obvious practical implications since it is much easier to obtain unlabeled target domain data than labeled data. Moreover, we find that pre-training on multiple domains improves generalization performance on domains not seen during training. Code and models will be made available at this https URL.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
See [this notebook](https://colab.research.google.com/drive/1FjTsqbYKphl9kL-eILgUc-bl4zVThL8F?usp=sharing) for more information on how to fine-tune the model.
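A rough sketch of the setup described in the note above (not the full recipe from the notebook): build a character-level CTC tokenizer from your own labeled transcripts and load this checkpoint with a freshly initialized CTC head. The vocabulary below is a hypothetical placeholder, and `vocab.json` is a file you would generate yourself.
```python
import json
from transformers import (
    Wav2Vec2CTCTokenizer,
    Wav2Vec2FeatureExtractor,
    Wav2Vec2Processor,
    Wav2Vec2ForCTC,
)

# hypothetical character-level vocabulary derived from your labeled transcripts
vocab = {"<pad>": 0, "<unk>": 1, "|": 2, "a": 3, "b": 4, "c": 5}  # extend with the characters in your data
with open("vocab.json", "w") as f:
    json.dump(vocab, f)

tokenizer = Wav2Vec2CTCTokenizer(
    "vocab.json", unk_token="<unk>", pad_token="<pad>", word_delimiter_token="|"
)
feature_extractor = Wav2Vec2FeatureExtractor(
    feature_size=1, sampling_rate=16_000, padding_value=0.0, do_normalize=True, return_attention_mask=True
)
processor = Wav2Vec2Processor(feature_extractor=feature_extractor, tokenizer=tokenizer)

# load the pretrained checkpoint with a CTC head sized to the new vocabulary;
# the head is randomly initialized and must be trained on labeled audio/text pairs
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-large-robust",
    ctc_loss_reduction="mean",
    pad_token_id=processor.tokenizer.pad_token_id,
    vocab_size=len(processor.tokenizer),
)
model.freeze_feature_encoder()  # the convolutional feature encoder is commonly kept frozen during fine-tuning
```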
|
{"language": "en", "license": "apache-2.0", "tags": ["speech"], "datasets": ["libri_light", "common_voice", "switchboard", "fisher"]}
| null |
facebook/wav2vec2-large-robust
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"speech",
"en",
"dataset:libri_light",
"dataset:common_voice",
"dataset:switchboard",
"dataset:fisher",
"arxiv:2104.01027",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2104.01027"
] |
[
"en"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #speech #en #dataset-libri_light #dataset-common_voice #dataset-switchboard #dataset-fisher #arxiv-2104.01027 #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-Large-Robust
Facebook's Wav2Vec2
The large model pretrained on 16kHz sampled speech audio.
Speech datasets from multiple domains were used to pretrain the model:
- Libri-Light: open-source audio books from the LibriVox project; clean, read-out audio data
- CommonVoice: crowd-source collected audio data; read-out text snippets
- Switchboard: telephone speech corpus; noisy telephone data
- Fisher: conversational telephone speech; noisy telephone data
When using the model make sure that your speech input is also sampled at 16Khz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for more in-detail explanation of how to fine-tune the model.
Paper Robust Wav2Vec2
Authors: Wei-Ning Hsu, Anuroop Sriram, Alexei Baevski, Tatiana Likhomanenko, Qiantong Xu, Vineel Pratap, Jacob Kahn, Ann Lee, Ronan Collobert, Gabriel Synnaeve, Michael Auli
Abstract
Self-supervised learning of speech representations has been a very active research area but most work is focused on a single domain such as read audio books for which there exist large quantities of labeled and unlabeled data. In this paper, we explore more general setups where the domain of the unlabeled data for pre-training data differs from the domain of the labeled data for fine-tuning, which in turn may differ from the test data domain. Our experiments show that using target domain data during pre-training leads to large performance improvements across a variety of setups. On a large-scale competitive setup, we show that pre-training on unlabeled in-domain data reduces the gap between models trained on in-domain and out-of-domain labeled data by 66%-73%. This has obvious practical implications since it is much easier to obtain unlabeled target domain data than labeled data. Moreover, we find that pre-training on multiple domains improves generalization performance on domains not seen during training. Code and models will be made available at this https URL.
The original model can be found under URL
# Usage
See this notebook for more information on how to fine-tune the model.
|
[
"# Wav2Vec2-Large-Robust\n\nFacebook's Wav2Vec2\n\nThe large model pretrained on 16kHz sampled speech audio. \nSpeech datasets from multiple domains were used to pretrain the model:\n- Libri-Light: open-source audio books from the LibriVox project; clean, read-out audio data\n- CommonVoice: crowd-source collected audio data; read-out text snippets\n- Switchboard: telephone speech corpus; noisy telephone data\n- Fisher: conversational telephone speech; noisy telephone data\n\nWhen using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for more in-detail explanation of how to fine-tune the model.\n\nPaper Robust Wav2Vec2\n\nAuthors: Wei-Ning Hsu, Anuroop Sriram, Alexei Baevski, Tatiana Likhomanenko, Qiantong Xu, Vineel Pratap, Jacob Kahn, Ann Lee, Ronan Collobert, Gabriel Synnaeve, Michael Auli\n\nAbstract\nSelf-supervised learning of speech representations has been a very active research area but most work is focused on a single domain such as read audio books for which there exist large quantities of labeled and unlabeled data. In this paper, we explore more general setups where the domain of the unlabeled data for pre-training data differs from the domain of the labeled data for fine-tuning, which in turn may differ from the test data domain. Our experiments show that using target domain data during pre-training leads to large performance improvements across a variety of setups. On a large-scale competitive setup, we show that pre-training on unlabeled in-domain data reduces the gap between models trained on in-domain and out-of-domain labeled data by 66%-73%. This has obvious practical implications since it is much easier to obtain unlabeled target domain data than labeled data. Moreover, we find that pre-training on multiple domains improves generalization performance on domains not seen during training. Code and models will be made available at this https URL.\nThe original model can be found under URL",
"# Usage\n\nSee this notebook for more information on how to fine-tune the model."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #speech #en #dataset-libri_light #dataset-common_voice #dataset-switchboard #dataset-fisher #arxiv-2104.01027 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-Large-Robust\n\nFacebook's Wav2Vec2\n\nThe large model pretrained on 16kHz sampled speech audio. \nSpeech datasets from multiple domains were used to pretrain the model:\n- Libri-Light: open-source audio books from the LibriVox project; clean, read-out audio data\n- CommonVoice: crowd-source collected audio data; read-out text snippets\n- Switchboard: telephone speech corpus; noisy telephone data\n- Fisher: conversational telephone speech; noisy telephone data\n\nWhen using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for more in-detail explanation of how to fine-tune the model.\n\nPaper Robust Wav2Vec2\n\nAuthors: Wei-Ning Hsu, Anuroop Sriram, Alexei Baevski, Tatiana Likhomanenko, Qiantong Xu, Vineel Pratap, Jacob Kahn, Ann Lee, Ronan Collobert, Gabriel Synnaeve, Michael Auli\n\nAbstract\nSelf-supervised learning of speech representations has been a very active research area but most work is focused on a single domain such as read audio books for which there exist large quantities of labeled and unlabeled data. In this paper, we explore more general setups where the domain of the unlabeled data for pre-training data differs from the domain of the labeled data for fine-tuning, which in turn may differ from the test data domain. Our experiments show that using target domain data during pre-training leads to large performance improvements across a variety of setups. On a large-scale competitive setup, we show that pre-training on unlabeled in-domain data reduces the gap between models trained on in-domain and out-of-domain labeled data by 66%-73%. This has obvious practical implications since it is much easier to obtain unlabeled target domain data than labeled data. Moreover, we find that pre-training on multiple domains improves generalization performance on domains not seen during training. Code and models will be made available at this https URL.\nThe original model can be found under URL",
"# Usage\n\nSee this notebook for more information on how to fine-tune the model."
] |
[
84,
530,
18
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #speech #en #dataset-libri_light #dataset-common_voice #dataset-switchboard #dataset-fisher #arxiv-2104.01027 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n"
] |
[
... (768-dimensional embedding vector omitted)
] |
null | null |
transformers
|
# Wav2Vec2-large-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) large model pretrained only on **romance** languages, using **101.5k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **romance**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
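As a rough sketch of what that setup could look like (following the blog linked above; `vocab.json` is a hypothetical character vocabulary built from your own labeled romance-language text and is not shipped with this model), the checkpoint can be loaded into a CTC head alongside a freshly created tokenizer:

```python
from transformers import (
    Wav2Vec2CTCTokenizer,
    Wav2Vec2FeatureExtractor,
    Wav2Vec2Processor,
    Wav2Vec2ForCTC,
)

# Tokenizer over your own character vocabulary (vocab.json is an assumption, built from your labeled text)
tokenizer = Wav2Vec2CTCTokenizer(
    "vocab.json", unk_token="[UNK]", pad_token="[PAD]", word_delimiter_token="|"
)

# Feature extractor matching the 16kHz pretraining setup
feature_extractor = Wav2Vec2FeatureExtractor(
    feature_size=1, sampling_rate=16000, padding_value=0.0,
    do_normalize=True, return_attention_mask=True,
)
processor = Wav2Vec2Processor(feature_extractor=feature_extractor, tokenizer=tokenizer)

# Load the pretrained encoder and attach a randomly initialized CTC head sized to the new vocabulary
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-large-romance-voxpopuli-v2",
    ctc_loss_reduction="mean",
    pad_token_id=processor.tokenizer.pad_token_id,
    vocab_size=len(processor.tokenizer),
)
```

From here, training proceeds as in the blog, e.g. with the `Trainer` API on 16kHz audio paired with transcriptions.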
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
|
{"language": "romance", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-romance-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"romance"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-large-VoxPopuli-V2
Facebook's Wav2Vec2 large model pretrained only on romance languages, using 101.5k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in romance. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in romance on 101.5 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in romance. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in romance on 101.5 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in romance. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
70,
251
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in romance on 101.5 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in romance. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.0687626302242279,
0.10878966003656387,
-0.002763911848887801,
0.007184646092355251,
0.0636642724275589,
-0.026248304173350334,
0.13648000359535217,
0.04773334786295891,
0.02108432725071907,
0.09059999883174896,
-0.013979923911392689,
-0.0678090751171112,
0.08377836644649506,
0.134928897023201,
0.046412307769060135,
-0.25656092166900635,
0.03106132708489895,
-0.0664597898721695,
0.051254753023386,
0.041084494441747665,
0.12811259925365448,
-0.07827087491750717,
0.041785385459661484,
0.04938960075378418,
-0.027998976409435272,
0.01785132847726345,
-0.0560704730451107,
-0.06524048745632172,
0.05908508226275444,
0.058972425758838654,
-0.031420107930898666,
0.03890049457550049,
0.10066094249486923,
-0.213800847530365,
0.030495287850499153,
0.02747207134962082,
0.02917519211769104,
0.011958218179643154,
0.10592533648014069,
0.032678376883268356,
0.16192753612995148,
-0.040675096213817596,
0.01343685295432806,
0.0763935074210167,
-0.04205479100346565,
-0.06508292257785797,
-0.07476044446229935,
0.14071016013622284,
0.08607616275548935,
0.11087072640657425,
-0.06969526410102844,
0.08071709424257278,
-0.028763463720679283,
0.029744621366262436,
0.06122548133134842,
-0.18085139989852905,
-0.05451403185725212,
0.04843432456254959,
0.11288602650165558,
0.03219246491789818,
-0.08674206584692001,
0.07065911591053009,
0.040642742067575455,
-0.011618212796747684,
-0.06751634180545807,
-0.023575715720653534,
0.15219014883041382,
-0.10722474008798599,
-0.11572320759296417,
-0.002268443815410137,
0.17844799160957336,
0.06516630202531815,
-0.076822429895401,
-0.15689830482006073,
0.016679266467690468,
0.21215403079986572,
-0.06755772233009338,
-0.09925279766321182,
0.01175638660788536,
0.01183145958930254,
0.05571461468935013,
-0.07217678427696228,
-0.07455218583345413,
-0.002035544253885746,
0.024786118417978287,
0.098017618060112,
0.02613164484500885,
-0.012425205670297146,
-0.0686914324760437,
-0.019366154447197914,
-0.11054114997386932,
-0.10353345423936844,
0.00920006912201643,
-0.0730394721031189,
-0.06483496725559235,
-0.04075277969241142,
-0.00025229380116797984,
-0.08802445977926254,
0.03448687493801117,
0.07194633781909943,
0.06449155509471893,
0.05902988091111183,
-0.04280490055680275,
-0.03425013646483421,
0.12841036915779114,
0.09770701825618744,
-0.10001165419816971,
-0.035485394299030304,
0.01714252308011055,
-0.007796354126185179,
0.009907101280987263,
-0.028611555695533752,
-0.047118764370679855,
-0.016141628846526146,
-0.021761588752269745,
0.0500715970993042,
0.054512955248355865,
-0.032936930656433105,
-0.03472721949219704,
-0.08612097054719925,
0.10915356129407883,
-0.06613641232252121,
0.029498254880309105,
0.0565418116748333,
-0.011042000725865364,
0.10440263897180557,
-0.06737956404685974,
0.07152073085308075,
-0.10019936412572861,
-0.003040690440684557,
-0.022929685190320015,
-0.016632456332445145,
0.032138168811798096,
-0.02625141106545925,
0.03837858512997627,
0.0011743010254576802,
0.004414675757288933,
-0.11998399347066879,
0.012129449285566807,
-0.10435117036104202,
-0.022520393133163452,
-0.08732859790325165,
-0.055929988622665405,
-0.05173812806606293,
0.02600943297147751,
0.004254123196005821,
-0.0014520431868731976,
0.015688829123973846,
-0.010400310158729553,
-0.0086374431848526,
-0.010348637588322163,
0.04844279587268829,
0.05834450200200081,
0.08482808619737625,
-0.018476704135537148,
-0.02724319137632847,
-0.10458879917860031,
0.11590731143951416,
-0.06729846447706223,
-0.015440920367836952,
-0.1395464539527893,
-0.038711581379175186,
-0.013354811817407608,
0.03147348389029503,
0.006108120549470186,
0.123381607234478,
-0.19157251715660095,
-0.06390594691038132,
0.10363367199897766,
-0.12414386123418808,
0.007981439121067524,
0.1993235945701599,
0.003540086792781949,
0.08128923922777176,
0.08839119970798492,
0.22784753143787384,
0.0323362834751606,
-0.16816243529319763,
0.001400511828251183,
-0.03516312688589096,
0.03488069027662277,
0.13206732273101807,
0.05192356929183006,
-0.058591559529304504,
0.0519854910671711,
-0.01645173691213131,
-0.03586959093809128,
-0.08053341507911682,
-0.0073542059399187565,
-0.051297374069690704,
0.013554343953728676,
-0.052114833146333694,
0.017968662083148956,
-0.009035932831466198,
-0.02324008010327816,
-0.01263429969549179,
-0.09095586836338043,
-0.10613230615854263,
0.12449000775814056,
-0.05750805139541626,
0.030203845351934433,
-0.09252281486988068,
0.06583710759878159,
0.09734923392534256,
0.0057730828411877155,
-0.10398687422275543,
0.10891768336296082,
0.04493372514843941,
-0.054924774914979935,
0.1223088875412941,
0.06939724087715149,
-0.03224007040262222,
0.017637144774198532,
-0.01624620333313942,
0.018179012462496758,
-0.015214516781270504,
0.011929796077311039,
-0.03528757020831108,
-0.11201678961515427,
-0.016887089237570763,
-0.06539130955934525,
0.12969271838665009,
-0.13847461342811584,
-0.019838731735944748,
0.024485867470502853,
0.09764115512371063,
-0.021171899512410164,
-0.029429176822304726,
0.07171463221311569,
0.0469868965446949,
0.036714933812618256,
-0.0160059854388237,
0.030764319002628326,
-0.009278927929699421,
-0.0008932014461606741,
0.05460073798894882,
-0.1475733071565628,
-0.16524522006511688,
0.09913076460361481,
0.008589688688516617,
-0.022791104391217232,
0.04393891617655754,
0.017813367769122124,
-0.006449452135711908,
-0.0380668006837368,
0.00788478460162878,
0.22883732616901398,
-0.009303281083703041,
0.06733060628175735,
-0.07433952391147614,
0.004276665393263102,
0.0157942958176136,
-0.05584080144762993,
-0.10584019869565964,
0.08951542526483536,
-0.0054040467366576195,
-0.07793419808149338,
-0.04843227192759514,
0.04345495626330376,
0.06967595964670181,
0.15492142736911774,
0.00516566913574934,
-0.07911057770252228,
-0.03403910622000694,
-0.06536215543746948,
-0.021149685606360435,
0.03289905562996864,
-0.15103819966316223,
-0.033507343381643295,
0.02960679493844509,
0.0004655680386349559,
0.05385388806462288,
-0.03586849942803383,
0.03504568710923195,
0.012899231165647507,
-0.04453117772936821,
-0.0691719725728035,
0.03740846738219261,
-0.028295675292611122,
0.047781024128198624,
0.00561638455837965,
-0.0010477396426722407,
-0.047343239188194275,
-0.060256294906139374,
-0.15040595829486847,
0.07709193229675293,
-0.0694933831691742,
-0.28334948420524597,
-0.08155830204486847,
-0.041975464671850204,
-0.024304231628775597,
0.009646044112741947,
0.043589528650045395,
-0.1231243684887886,
-0.11020592600107193,
-0.08017441630363464,
0.12495537102222443,
-0.05280527472496033,
-0.05780726671218872,
0.10417903959751129,
0.00286557013168931,
0.003516626311466098,
-0.09469502419233322,
0.010567975230515003,
-0.048571281135082245,
-0.03509977459907532,
-0.017589358612895012,
0.0030617062002420425,
0.05564418062567711,
0.1157696321606636,
0.00599676463752985,
-0.0047948542051017284,
0.0038976275827735662,
0.20851624011993408,
-0.12305618077516556,
0.07932490110397339,
0.24021948873996735,
-0.04547203704714775,
-0.0014051309553906322,
0.1384822577238083,
-0.012051243335008621,
-0.04632248356938362,
0.03792431205511093,
-0.012743380852043629,
-0.01428274903446436,
-0.22648972272872925,
-0.13566304743289948,
-0.0338110513985157,
-0.020982299000024796,
0.025646362453699112,
0.012430441565811634,
-0.008519584313035011,
0.008298927918076515,
-0.08392130583524704,
-0.047256749123334885,
0.06429551541805267,
0.032501231878995895,
0.17582981288433075,
0.005624921526759863,
0.04075351357460022,
-0.04806198552250862,
-0.016702260822057724,
0.0958842784166336,
-0.03770722821354866,
0.04790867865085602,
0.07086053490638733,
0.10297485440969467,
0.049669619649648666,
0.049929626286029816,
0.052946560084819794,
-0.0325397327542305,
-0.002719336189329624,
-0.005422980058938265,
-0.03586604818701744,
-0.05898299440741539,
0.0071932110004127026,
0.055044542998075485,
0.16400505602359772,
-0.1440047323703766,
-0.13584189116954803,
0.04135628417134285,
0.0141531340777874,
0.1161598190665245,
0.08840852975845337,
-0.023556560277938843,
-0.0913500115275383,
0.03908286616206169,
-0.08828473836183548,
-0.032853711396455765,
0.05168485641479492,
0.05649027228355408,
-0.15886077284812927,
0.09208578616380692,
0.09409423917531967,
0.07854695618152618,
-0.06169293448328972,
0.03510603308677673,
-0.05395510047674179,
0.06358294188976288,
0.00724797323346138,
0.0748356282711029,
-0.1962403804063797,
0.12235313653945923,
0.018462037667632103,
0.08851739764213562,
-0.04592270031571388,
0.022445524111390114,
0.04223936051130295,
0.010305090807378292,
0.12403526902198792,
-0.009328683838248253,
-0.08265351504087448,
-0.002354198135435581,
-0.11468132585287094,
0.026117421686649323,
0.055191997438669205,
-0.038281504064798355,
0.05286998301744461,
-0.016526831313967705,
-0.009932294487953186,
-0.037353698164224625,
0.032062266021966934,
-0.25426310300827026,
-0.1426294893026352,
0.04205554723739624,
-0.028110457584261894,
0.019649192690849304,
-0.02897089719772339,
-0.08124943822622299,
-0.12768681347370148,
0.09410927444696426,
0.04388635233044624,
-0.008312220685184002,
-0.07302626222372055,
0.00867613684386015,
0.10735829174518585,
-0.05984336882829666,
0.009852432645857334,
0.042083390057086945,
0.1502329707145691,
-0.08225395530462265,
-0.03240823745727539,
0.013750208541750908,
-0.09724316000938416,
-0.13434796035289764,
0.008053073659539223,
0.175797700881958,
0.11510659009218216,
0.07371021807193756,
0.09462198615074158,
0.016823377460241318,
-0.010282855480909348,
-0.09334045648574829,
0.0509590283036232,
0.050059180706739426,
-0.09441199898719788,
0.0280477162450552,
0.0007974114851094782,
-0.2766375243663788,
-0.14847077429294586,
-0.06765568256378174,
0.09640819579362869,
0.21854612231254578,
-0.02228885516524315,
0.16145797073841095,
0.2733500003814697,
-0.09166299551725388,
-0.22423186898231506,
-0.031143896281719208,
0.0008186834747903049,
0.020536204800009727,
0.017415033653378487,
-0.1975923329591751,
0.061900269240140915,
-0.016477491706609726,
0.01365325041115284,
-0.08829115331172943,
-0.202421635389328,
-0.12598325312137604,
0.16893412172794342,
-0.021078038960695267,
0.05773354321718216,
-0.025489797815680504,
-0.06714963912963867,
-0.05078672990202904,
-0.06171555817127228,
0.0194456297904253,
-0.05216077342629433,
0.06428564339876175,
0.04574897885322571,
0.0208261888474226,
0.02822866290807724,
0.01858440227806568,
0.11480636894702911,
0.074479840695858,
-0.03025374747812748,
-0.06928738951683044,
0.017023256048560143,
0.008770185522735119,
-0.017960894852876663,
0.1117451936006546,
0.05270494893193245,
0.0074609718285501,
-0.038913413882255554,
-0.08373314887285233,
-0.07506858557462692,
0.05180460959672928,
-0.0821782723069191,
-0.011945081874728203,
-0.05295155569911003,
0.0859108716249466,
0.030729126185178757,
-0.01060185581445694,
-0.11076939851045609,
-0.08021077513694763,
-0.052058376371860504,
0.10959769785404205,
0.2185082584619522,
-0.03712157905101776,
0.0023195575922727585,
-0.02710876427590847,
-0.04931109398603439,
0.04222225397825241,
-0.003240321297198534,
0.037537552416324615,
0.05864224210381508,
0.013632073067128658,
0.08139518648386002,
-0.032264385372400284,
-0.13204118609428406,
0.016572363674640656,
0.0319770984351635,
-0.060010265558958054,
-0.1899256408214569,
-0.04562709107995033,
-0.014270361512899399,
-0.018797336146235466,
-0.05934840813279152,
0.18262872099876404,
-0.013618601486086845,
-0.058819837868213654,
-0.007773958146572113,
0.0690702572464943,
-0.0006286404677666724,
0.11021079123020172,
0.0382411852478981,
0.028955617919564247,
-0.08504768460988998,
0.06307824701070786,
0.10897894948720932,
-0.052771106362342834,
0.05465316027402878,
0.10812903940677643,
-0.04177792742848396,
-0.05340903624892235,
-0.08808021992444992,
0.009417325258255005,
0.04090233892202377,
-0.058015692979097366,
-0.004080875776708126,
-0.12465815246105194,
0.0031390790827572346,
0.022722141817212105,
0.0030796541832387447,
-0.047694627195596695,
-0.057805005460977554,
0.0018469663336873055,
-0.11254160106182098,
0.07041119784116745,
0.10962224006652832,
-0.030647626146674156,
-0.10088629275560379,
0.11490004509687424,
0.004676605574786663,
0.0645606517791748,
-0.03396115452051163,
-0.06216045841574669,
-0.0809483453631401,
-0.009696001186966896,
-0.1208195760846138,
0.040772464126348495,
-0.12960010766983032,
-0.012985247187316418,
-0.055389873683452606,
-0.025116298347711563,
-0.0046622855588793755,
0.05044075474143028,
-0.03008834458887577,
-0.0037332421634346247,
-0.03508194908499718,
0.08453571796417236,
-0.10769782215356827,
0.059738706797361374,
0.055974725633859634,
-0.056311387568712234,
0.11514554172754288,
0.028130343183875084,
-0.042682915925979614,
0.032030899077653885,
-0.2127595841884613,
-0.06806453317403793,
-0.04155619814991951,
0.042461056262254715,
-0.0025923384819179773,
-0.1864718347787857,
-0.006110341753810644,
0.010762939229607582,
-0.0066773854196071625,
-0.018327122554183006,
0.08930418640375137,
-0.022714370861649513,
-0.005313136149197817,
-0.059660740196704865,
-0.07424689084291458,
-0.039580024778842926,
0.059999678283929825,
0.0857706367969513,
0.0022030866239219904,
0.08053109049797058,
-0.08715865015983582,
0.08488910645246506,
-0.08724351972341537,
0.02316298708319664,
-0.03323034569621086,
0.03133474290370941,
-0.07326304912567139,
-0.07753515988588333,
0.09139669686555862,
-0.013128543272614479,
0.08359849452972412,
0.044016990810632706,
-0.018280712887644768,
0.0346253365278244,
-0.026710590347647667,
-0.04862386733293533,
0.04291101172566414,
0.13910649716854095,
0.050082989037036896,
0.019595466554164886,
-0.004226659424602985,
-0.040915343910455704,
0.00120971177238971,
0.1492851823568344,
0.1440606266260147,
0.15959085524082184,
0.11656907200813293,
0.03785508871078491,
0.08383092284202576,
-0.05276675894856453,
-0.08047525584697723,
0.08381031453609467,
-0.05989006534218788,
0.03745175898075104,
-0.057578347623348236,
-0.058580849319696426,
0.05734684690833092,
-0.1521473526954651,
0.07474104315042496,
-0.047984350472688675,
-0.09355542808771133,
-0.10736729204654694,
-0.13191939890384674,
-0.06834204494953156,
-0.04013852775096893,
0.006031820084899664,
-0.1080135628581047,
0.023759901523590088,
-0.004869580734521151,
0.017396777868270874,
-0.09931648522615433,
0.06896859407424927,
-0.10836110264062881,
-0.11893613636493683,
0.16150613129138947,
-0.020113492384552956,
-0.006306193303316832,
0.007579716853797436,
0.044121257960796356,
0.023448120802640915,
0.08033356815576553,
0.047536831349134445,
0.058845266699790955,
0.018879789859056473,
0.04077461361885071,
-0.08772889524698257,
-0.0727265253663063,
0.027522316202521324,
-0.02116125263273716,
0.1081136018037796,
0.19036155939102173,
0.07405402511358261,
-0.0832393616437912,
0.013336870819330215,
0.14325909316539764,
0.03061608038842678,
-0.10405511409044266,
-0.1349479854106903,
0.054214101284742355,
-0.03366626054048538,
0.0017036269418895245,
0.0221676267683506,
-0.07649426907300949,
0.01873095892369747,
0.19768980145454407,
0.1946084052324295,
-0.04353225603699684,
0.01446482539176941,
0.007317123934626579,
0.007056362461298704,
0.02184602990746498,
0.0849226787686348,
0.09647990018129349,
0.18108588457107544,
-0.025455236434936523,
0.06577800214290619,
-0.03184814378619194,
-0.09031752496957779,
-0.09863978624343872,
0.1267511397600174,
0.017476145178079605,
-0.029127849265933037,
-0.00990322232246399,
0.1606103479862213,
-0.09351889044046402,
-0.2129078507423401,
-0.11215841770172119,
-0.06631772220134735,
-0.11032786220312119,
0.024845123291015625,
-0.04615059122443199,
0.13642390072345734,
0.07619983702898026,
-0.0058435373939573765,
0.010387091897428036,
0.18868117034435272,
0.01966070383787155,
0.017733819782733917,
-0.025266550481319427,
0.1138124167919159,
-0.10721200704574585,
0.12032736837863922,
-0.0015499508008360863,
0.06939635425806046,
0.030908027663826942,
0.04666142910718918,
-0.07097745686769485,
0.03537290543317795,
0.036535270512104034,
0.008820817805826664,
0.0528804250061512,
0.16552211344242096,
-0.016838790848851204,
0.10180319845676422,
0.1262752115726471,
-0.07081003487110138,
0.043748173862695694,
-0.026514120399951935,
-0.02307567000389099,
-0.052016180008649826,
0.15591296553611755,
-0.15717433393001556,
0.12444829195737839,
0.11488615721464157,
-0.07355859875679016,
-0.04599085822701454,
-0.01258878968656063,
0.03757679462432861,
-0.050955019891262054,
0.1148262619972229,
0.0006212901207618415,
-0.16437645256519318,
0.026869844645261765,
-0.09602108597755432,
0.0785672515630722,
-0.24067232012748718,
-0.0499524287879467,
-0.06697487831115723,
-0.011845449917018414,
0.0019668557215481997,
0.10556665062904358,
0.08152565360069275,
-0.04061879962682724,
-0.012587417848408222,
-0.022424330934882164,
0.010203620418906212,
0.09351612627506256,
-0.07484804838895798,
-0.02760555036365986
] |
null | null |
transformers
|
# Wav2Vec2-large-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) large model pretrained only on **slavic** languages, using **89.0k** hours of unlabeled data from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.
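For example, audio loaded through the `datasets` library can be decoded and resampled on the fly to the expected 16kHz. This is a minimal sketch; the Hub dataset id and the Polish config are illustrative stand-ins for whatever slavic-language data you actually use:

```python
from datasets import load_dataset, Audio

# Illustrative dataset and config; any dataset with an "audio" column works the same way
ds = load_dataset("facebook/voxpopuli", "pl", split="train[:1%]")

# Decode and resample every example to the 16kHz rate the model was pretrained on
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))

print(ds[0]["audio"]["sampling_rate"])  # -> 16000
```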
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **slavic**. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
|
{"language": "slavic", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-slavic-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"slavic"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-large-VoxPopuli-V2
Facebook's Wav2Vec2 large model pretrained only on slavic languages, using 89.0k hours of unlabeled data from the VoxPopuli corpus.
The model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in slavic. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in slavic on 88.99999999999999 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in slavic. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in slavic on 88.99999999999999 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in slavic. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
70,
258
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in slavic on 88.99999999999999 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in slavic. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.09110580384731293,
0.09801976382732391,
-0.0034045742359012365,
0.012329486198723316,
0.09053938090801239,
-0.06210178881883621,
0.13995948433876038,
0.031991586089134216,
0.017758920788764954,
0.1093728169798851,
-0.019901392981410027,
-0.04824995622038841,
0.06048526614904404,
0.1094266027212143,
0.055525410920381546,
-0.2493736296892166,
0.053374044597148895,
-0.09171797335147858,
0.04831759259104729,
0.04597506299614906,
0.11995595693588257,
-0.09783323109149933,
0.020683718845248222,
0.042108118534088135,
-0.02276751771569252,
0.03266837075352669,
-0.03896760195493698,
-0.09505701065063477,
0.04714178666472435,
0.039943087846040726,
-0.019785545766353607,
0.02876700833439827,
0.08472318202257156,
-0.17058829963207245,
0.03712829202413559,
0.02195107936859131,
0.023247653618454933,
-0.012751801870763302,
0.1182531788945198,
-0.01405132096260786,
0.20197686553001404,
-0.048190824687480927,
-0.008780566975474358,
0.08238562196493149,
-0.05776960402727127,
-0.10589137673377991,
-0.08375381678342819,
0.1501944214105606,
0.08416109532117844,
0.1099664643406868,
-0.0840209573507309,
0.0724552646279335,
-0.03914676979184151,
0.057528793811798096,
0.05524744465947151,
-0.1889079511165619,
-0.03899099677801132,
0.07366059720516205,
0.09391665458679199,
0.02863696776330471,
-0.09115041047334671,
0.08804425597190857,
0.03814885392785072,
-0.02336445264518261,
-0.05286683142185211,
-0.029602136462926865,
0.0735672265291214,
-0.08956390619277954,
-0.12202440947294235,
-0.015085999853909016,
0.18911610543727875,
0.04177088290452957,
-0.06802552938461304,
-0.13563601672649384,
0.02276550605893135,
0.18428181111812592,
-0.046480488032102585,
-0.060677263885736465,
0.005375598557293415,
0.022217506542801857,
0.04568547010421753,
-0.04779551923274994,
-0.07545269280672073,
-0.009064268320798874,
0.007810934446752071,
0.10039030760526657,
0.010619791224598885,
-0.01751762069761753,
-0.06012589856982231,
-0.01637502759695053,
-0.09068882465362549,
-0.1314307451248169,
-0.028453640639781952,
-0.05779222026467323,
-0.06375432014465332,
-0.03864039480686188,
-0.022356590256094933,
-0.14607645571231842,
0.034873101860284805,
0.10374557226896286,
0.0765545591711998,
0.047202907502651215,
-0.04447105526924133,
-0.028893016278743744,
0.1052536591887474,
0.058741211891174316,
-0.10319335013628006,
-0.013012618757784367,
0.0039628599770367146,
-0.042954329401254654,
0.03524394333362579,
-0.03863831236958504,
-0.040347881615161896,
0.005078841000795364,
-0.03387419134378433,
0.057721104472875595,
0.03932538256049156,
-0.027217896655201912,
-0.03376687690615654,
-0.07135726511478424,
0.06993992626667023,
-0.09173224121332169,
0.03401833400130272,
0.061206597834825516,
-0.002890863688662648,
0.10392037034034729,
-0.06281857937574387,
0.0708719938993454,
-0.12015388906002045,
0.04559260606765747,
-0.013178057037293911,
0.010620882734656334,
0.00833434984087944,
-0.039768412709236145,
0.0527019277215004,
-0.000023365739252767526,
0.007246085908263922,
-0.12721586227416992,
-0.008886028081178665,
-0.10863187164068222,
-0.0429929718375206,
-0.06740090250968933,
-0.02020123414695263,
-0.05619960278272629,
0.006319738924503326,
0.008719961158931255,
-0.008600779809057713,
-0.012949726544320583,
-0.03929007053375244,
-0.002248442033305764,
-0.004901344422250986,
0.056830670684576035,
0.056958530098199844,
0.07848609983921051,
-0.038448967039585114,
-0.0383942537009716,
-0.09996501356363297,
0.1235344186425209,
-0.09574361890554428,
0.0023243315517902374,
-0.11955752968788147,
-0.01845269650220871,
-0.03554533049464226,
0.038335397839546204,
0.020321156829595566,
0.14199094474315643,
-0.18151581287384033,
-0.0740354135632515,
0.16976699233055115,
-0.13301581144332886,
0.006550367455929518,
0.17550651729106903,
0.005890022497624159,
0.027931595221161842,
0.1158750131726265,
0.21049316227436066,
0.04120708629488945,
-0.16714917123317719,
-0.04952748864889145,
-0.07423137128353119,
0.03972812741994858,
0.09953532367944717,
0.07523956894874573,
-0.06262529641389847,
0.0999857634305954,
-0.015257112681865692,
-0.01516895741224289,
-0.06950068473815918,
0.01128509920090437,
-0.04750693589448929,
0.006707345135509968,
-0.03906404599547386,
0.019400369375944138,
-0.015170158818364143,
-0.005371556617319584,
-0.0396764874458313,
-0.08088517934083939,
-0.03788549453020096,
0.12244879454374313,
-0.05982128903269768,
0.03492043539881706,
-0.08985555917024612,
0.09032414853572845,
0.0673864409327507,
0.018465736880898476,
-0.13714094460010529,
0.11812929064035416,
0.022774813696742058,
-0.02455499954521656,
0.12329407036304474,
0.05543062463402748,
-0.02029361017048359,
-0.00497931195423007,
-0.012990516610443592,
0.030694803223013878,
-0.024424314498901367,
-0.004440934397280216,
-0.025379523634910583,
-0.11020532995462418,
0.010502482764422894,
-0.061067789793014526,
0.1132260262966156,
-0.15113185346126556,
-0.011474152095615864,
0.08778645098209381,
0.1301508992910385,
-0.0039553893730044365,
-0.04649445414543152,
0.08724584430456161,
0.05726861208677292,
0.01907099224627018,
0.000026270567104802467,
0.01889118365943432,
-0.020610354840755463,
-0.012915766797959805,
0.09257694333791733,
-0.1385984867811203,
-0.1666756421327591,
0.11703980714082718,
0.00413016090169549,
-0.004802287556231022,
0.04444430395960808,
0.0457988865673542,
-0.013184714131057262,
-0.0464099682867527,
-0.008114303462207317,
0.19630059599876404,
-0.011382346041500568,
0.06371067464351654,
-0.0796930342912674,
-0.004160979762673378,
0.01067983079701662,
-0.04410768300294876,
-0.08493811637163162,
0.10143738240003586,
-0.007506814319640398,
-0.04059438407421112,
-0.00630634231492877,
0.05352583900094032,
0.06284289807081223,
0.1907460242509842,
-0.01074985135346651,
-0.11448266357183456,
-0.011791943572461605,
-0.0537850484251976,
0.0010952547891065478,
0.037829749286174774,
-0.14359411597251892,
-0.03455743566155434,
0.03434249013662338,
0.007895072922110558,
0.029194189235568047,
-0.025930218398571014,
0.033257536590099335,
0.019757619127631187,
-0.05570487678050995,
-0.09717445820569992,
0.03476257622241974,
-0.02781863324344158,
0.03544158115983009,
-0.01644846983253956,
0.030022451654076576,
-0.052810944616794586,
-0.05884486436843872,
-0.1337668001651764,
0.08124689012765884,
-0.07784998416900635,
-0.29069462418556213,
-0.11901982873678207,
-0.02715255878865719,
-0.038686323910951614,
0.023328062146902084,
0.0631123036146164,
-0.10631934553384781,
-0.11774658411741257,
-0.07537292689085007,
0.12537938356399536,
-0.006097429431974888,
-0.0694698691368103,
0.1220029816031456,
0.009942805394530296,
0.0013482182985171676,
-0.08135320246219635,
0.016746802255511284,
-0.02871674858033657,
-0.01304724719375372,
-0.009410737082362175,
0.03408955782651901,
0.07598072290420532,
0.1181659922003746,
0.03889106959104538,
-0.013551173731684685,
0.008627310395240784,
0.17904157936573029,
-0.14884880185127258,
0.04919687658548355,
0.23464685678482056,
-0.052240658551454544,
-0.01205404382199049,
0.15047112107276917,
-0.003750495845451951,
-0.04510496184229851,
0.04909639060497284,
0.007232266943901777,
-0.029434414580464363,
-0.24574799835681915,
-0.13487355411052704,
-0.06827917695045471,
-0.01693766936659813,
0.036189004778862,
0.026165327057242393,
-0.022219007834792137,
0.02733553759753704,
-0.10322275757789612,
-0.0640711858868599,
0.05388692393898964,
0.04688616096973419,
0.17900343239307404,
0.01791749894618988,
0.06660257279872894,
-0.06591638177633286,
-0.005765897687524557,
0.11388809978961945,
-0.06816475093364716,
0.06449547410011292,
0.06831514835357666,
0.12721699476242065,
0.060174912214279175,
0.029534757137298584,
0.05157886818051338,
-0.02346274070441723,
-0.002671860856935382,
-0.0037863817997276783,
-0.011764539405703545,
-0.08781619369983673,
0.007805757690221071,
0.048253461718559265,
0.12111608684062958,
-0.12852485477924347,
-0.11673670262098312,
0.03608496114611626,
0.04175911843776703,
0.1500706523656845,
0.1006789579987526,
-0.03700392320752144,
-0.13007071614265442,
0.04469858855009079,
-0.09061706811189651,
-0.020792240276932716,
0.04195228964090347,
0.1029089018702507,
-0.14875611662864685,
0.10045521706342697,
0.08733376860618591,
0.09454191476106644,
-0.026099596172571182,
0.02427322044968605,
-0.07452178746461868,
0.046774957329034805,
0.0011891934555023909,
0.062681183218956,
-0.19110263884067535,
0.10231678932905197,
0.015990985557436943,
0.09422166645526886,
-0.06795932352542877,
0.020827820524573326,
0.060091134160757065,
0.00702222716063261,
0.12369483709335327,
0.0019936494063585997,
-0.14617325365543365,
-0.0016055017476901412,
-0.11972897499799728,
0.008374246768653393,
0.062367163598537445,
-0.06663960218429565,
0.07327070087194443,
-0.006750592961907387,
-0.012052162550389767,
-0.047941356897354126,
-0.032144878059625626,
-0.2240118682384491,
-0.1388377994298935,
0.042804911732673645,
0.028999250382184982,
0.0659056156873703,
-0.03386726975440979,
-0.09292732179164886,
-0.1229751780629158,
0.07945170253515244,
-0.0307729821652174,
-0.029804471880197525,
-0.07851769775152206,
0.0034470066893845797,
0.08077976107597351,
-0.06939119100570679,
0.028725342825055122,
0.04787754267454147,
0.13656526803970337,
-0.05937289446592331,
-0.03513640537858009,
0.025494813919067383,
-0.08808579295873642,
-0.13487020134925842,
0.012653221376240253,
0.18287765979766846,
0.10680974274873734,
0.06308187544345856,
0.07252174615859985,
0.04392114281654358,
0.015826812013983727,
-0.0875786766409874,
0.00070528918877244,
0.05738819018006325,
-0.05434984341263771,
0.06847942620515823,
-0.0019896177109330893,
-0.28623420000076294,
-0.13509626686573029,
-0.09925519675016403,
0.09971510618925095,
0.18314878642559052,
-0.028658421710133553,
0.1851353496313095,
0.2544407546520233,
-0.09059693664312363,
-0.25050485134124756,
-0.030540915206074715,
-0.008233311586081982,
0.04746820777654648,
0.061229199171066284,
-0.211570143699646,
0.10380439460277557,
0.010840803384780884,
0.013523914851248264,
-0.027153676375746727,
-0.22615018486976624,
-0.14940133690834045,
0.14152373373508453,
-0.026117227971553802,
0.047956667840480804,
-0.04099136218428612,
-0.0813509002327919,
-0.03216293081641197,
-0.08107791095972061,
0.0016247819876298308,
-0.08757130801677704,
0.08703836053609848,
0.05571091175079346,
0.03486737981438637,
0.03287890553474426,
0.01286518108099699,
0.12040470540523529,
0.0809093490242958,
-0.021254783496260643,
-0.10323768109083176,
0.05095589905977249,
0.010442066006362438,
-0.020061977207660675,
0.09010984003543854,
0.006528904661536217,
-0.0036261233035475016,
-0.06684678047895432,
-0.08655070513486862,
-0.0541347935795784,
0.06538385152816772,
-0.062162939459085464,
-0.004174844827502966,
-0.05804544314742088,
0.07615645229816437,
0.020951446145772934,
-0.01743868552148342,
-0.01905742846429348,
-0.10899736732244492,
-0.029854102060198784,
0.08398353308439255,
0.21468375623226166,
-0.042542099952697754,
0.018628284335136414,
-0.03735929727554321,
-0.04634902626276016,
0.05504843592643738,
-0.039220940321683884,
0.05269541218876839,
0.06552300602197647,
0.03196660429239273,
0.06920748203992844,
-0.034398749470710754,
-0.12119828909635544,
0.043046265840530396,
0.05491962283849716,
-0.07774224132299423,
-0.16243401169776917,
-0.03174002096056938,
-0.018255945295095444,
-0.009549789130687714,
-0.03174621984362602,
0.20326834917068481,
-0.03219952806830406,
-0.04607784375548363,
0.0017856014892458916,
0.050566416233778,
0.0031269551254808903,
0.11487388610839844,
0.0356924906373024,
0.03430035710334778,
-0.0903758630156517,
0.06531526148319244,
0.1156761646270752,
-0.0791531503200531,
0.052252981811761856,
0.11395467072725296,
-0.053461331874132156,
-0.06300843507051468,
-0.11455491185188293,
-0.002838586922734976,
0.03061732091009617,
-0.06386739015579224,
-0.01092865876853466,
-0.12035900354385376,
0.021547183394432068,
0.046023525297641754,
0.017358694225549698,
-0.037465039640665054,
-0.0298900343477726,
0.004074256867170334,
-0.09061530977487564,
0.07293206453323364,
0.1003594696521759,
-0.036192066967487335,
-0.11308086663484573,
0.11216401308774948,
-0.0056837317533791065,
0.04775131866335869,
-0.03544868156313896,
-0.06441211700439453,
-0.09743361920118332,
0.0004695711249951273,
-0.11169764399528503,
0.03906029835343361,
-0.159127339720726,
-0.008726787753403187,
-0.046677835285663605,
-0.026629867032170296,
-0.014703853987157345,
0.08379838615655899,
-0.038928840309381485,
-0.00002783702984743286,
-0.04412660747766495,
0.10505559295415878,
-0.13923954963684082,
0.06044244021177292,
0.04430170729756355,
-0.0406707264482975,
0.10754077136516571,
0.05592102184891701,
-0.04631029814481735,
0.04214078187942505,
-0.2301187664270401,
-0.04034431651234627,
-0.017320161685347557,
0.05195486545562744,
-0.009251249022781849,
-0.1361207813024521,
0.004320115316659212,
0.027512509375810623,
0.026628436520695686,
-0.01071130484342575,
0.04541904106736183,
-0.03076780214905739,
-0.027385691180825233,
-0.05157363414764404,
-0.05448249354958534,
-0.0333656407892704,
0.07466157525777817,
0.0764734297990799,
0.017303628847002983,
0.1176924854516983,
-0.09998656064271927,
0.06582242995500565,
-0.05550600588321686,
0.03738265857100487,
-0.03107805922627449,
0.008032001554965973,
-0.10579178482294083,
-0.07274773716926575,
0.07914379984140396,
-0.007062583230435848,
0.06018256023526192,
0.017183993011713028,
-0.02794087864458561,
0.06486903131008148,
-0.026232095435261726,
-0.04153447970747948,
0.042788147926330566,
0.14438600838184357,
0.061752233654260635,
0.00763760507106781,
-0.00020775952725671232,
-0.037570759654045105,
0.015233895741403103,
0.13101445138454437,
0.12488431483507156,
0.18714246153831482,
0.11451208591461182,
0.04762570559978485,
0.06965530663728714,
-0.035505134612321854,
-0.09256888926029205,
0.04018654301762581,
-0.09799789637327194,
0.03921518102288246,
-0.06446689367294312,
-0.026235904544591904,
0.07799074798822403,
-0.1395367830991745,
0.07133254408836365,
-0.042569659650325775,
-0.07397082448005676,
-0.09871614724397659,
-0.15208934247493744,
-0.06909440457820892,
-0.03880538046360016,
0.004462648183107376,
-0.10990023612976074,
0.0441468209028244,
0.02233901433646679,
0.04210450127720833,
-0.10151911526918411,
0.10997727513313293,
-0.13280805945396423,
-0.1228945255279541,
0.15421943366527557,
-0.0485246479511261,
-0.005750207230448723,
0.013179678469896317,
0.026256265118718147,
0.030959563329815865,
0.08494925498962402,
0.040500860661268234,
0.057894058525562286,
0.029774267226457596,
0.008404649794101715,
-0.10627059638500214,
-0.06857997179031372,
0.035838838666677475,
0.007423484232276678,
0.10318315774202347,
0.19458724558353424,
0.09849698841571808,
-0.06590498238801956,
0.01373390294611454,
0.17089295387268066,
0.02168850786983967,
-0.10245261341333389,
-0.15499448776245117,
-0.011865717358887196,
-0.020884428173303604,
0.0033965162001550198,
-0.010325761511921883,
-0.1071699932217598,
0.011824044398963451,
0.1978851854801178,
0.15609095990657806,
-0.023400889709591866,
0.03450607880949974,
-0.03428303822875023,
0.015506434254348278,
0.017300011590123177,
0.0704885795712471,
0.07722431421279907,
0.17248639464378357,
0.008707326836884022,
0.050807125866413116,
-0.026946010068058968,
-0.08420246094465256,
-0.13478781282901764,
0.07895582169294357,
-0.03617309778928757,
-0.04726992920041084,
0.0030305287800729275,
0.18538567423820496,
-0.11931084096431732,
-0.1690123975276947,
-0.10181927680969238,
-0.048368729650974274,
-0.11407973617315292,
0.019910519942641258,
-0.02407311275601387,
0.16007795929908752,
0.03162383288145065,
0.007021037861704826,
0.0016257748939096928,
0.17358361184597015,
0.04833010584115982,
0.02674565464258194,
-0.03485672175884247,
0.09818364679813385,
-0.0881943330168724,
0.1126350462436676,
-0.008004598319530487,
0.05058261752128601,
0.04392070695757866,
0.06060438230633736,
-0.07184503972530365,
0.015893880277872086,
0.04546608403325081,
-0.04359547048807144,
0.05013357475399971,
0.16850268840789795,
-0.023394718766212463,
0.11156082153320312,
0.11703763157129288,
-0.0625319704413414,
0.028312090784311295,
0.012139293365180492,
0.025078382343053818,
-0.05839075520634651,
0.15905889868736267,
-0.15402495861053467,
0.11962901055812836,
0.11913172900676727,
-0.06325863301753998,
-0.054926417768001556,
-0.009708714671432972,
0.04722996801137924,
-0.06344719231128693,
0.05929745361208916,
-0.00900716707110405,
-0.1804616004228592,
0.03247734159231186,
-0.09113133698701859,
0.07187041640281677,
-0.2632749080657959,
-0.04525025188922882,
-0.052441202104091644,
-0.009807941503822803,
-0.005383296404033899,
0.1095365509390831,
0.09348143637180328,
-0.05819293111562729,
-0.0077574108727276325,
-0.0942290797829628,
0.03704068809747696,
0.10626234859228134,
-0.07889170199632645,
-0.03419925644993782
] |
null | null |
transformers
|
# Wav2Vec2-Large-VoxPopuli
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) large model pretrained on the **sv** unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/)
# Fine-Tuning
Please refer to [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) on how to fine-tune this model on a specific language. Note that you should replace `"facebook/wav2vec2-large-xlsr-53"` with this checkpoint for fine-tuning.
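Concretely, that swap is just the checkpoint name in the `from_pretrained` call; the remaining hyper-parameters below simply echo the linked blog and are illustrative rather than prescriptive:

```python
from transformers import Wav2Vec2ForCTC

# Same call the blog uses for "facebook/wav2vec2-large-xlsr-53", pointed at this checkpoint instead
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-large-sv-voxpopuli",
    attention_dropout=0.1,
    hidden_dropout=0.1,
    feat_proj_dropout=0.0,
    mask_time_prob=0.05,
    layerdrop=0.1,
    ctc_loss_reduction="mean",
)
# The CTC head is freshly initialized; in practice also pass pad_token_id and vocab_size
# from the tokenizer you create for your target language, as in the blog.
model.freeze_feature_encoder()  # keep the convolutional feature encoder frozen during fine-tuning
```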
|
{"language": "sv", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-sv-voxpopuli
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli",
"sv",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"sv"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #sv #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
|
# Wav2Vec2-Large-VoxPopuli
Facebook's Wav2Vec2 large model pretrained on the sv unlabeled subset of the VoxPopuli corpus.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the official website for more information, here
# Fine-Tuning
Please refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '"facebook/wav2vec2-large-xlsr-53"' with this checkpoint for fine-tuning.
|
[
"# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the sv unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #sv #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the sv unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here",
"# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
72,
132,
57
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #sv #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Large-VoxPopuli\n\nFacebook's Wav2Vec2 large model pretrained on the sv unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning."
] |
[
-0.06429700553417206,
0.03374921903014183,
-0.004574027378112078,
0.0125862592831254,
0.12825971841812134,
-0.028856122866272926,
0.083600252866745,
0.015179859474301338,
0.042256928980350494,
0.013881825841963291,
0.013027173466980457,
0.028572140261530876,
0.07156909257173538,
0.09620904922485352,
0.00008217849972425029,
-0.2987090051174164,
0.05631163716316223,
0.01428634487092495,
0.0784062072634697,
0.06008395552635193,
0.1263027787208557,
-0.08553262054920197,
0.019877227023243904,
0.05310901626944542,
-0.09999855607748032,
-0.005730852950364351,
0.027307450771331787,
-0.0888870358467102,
0.12848615646362305,
0.0851716697216034,
0.08057796210050583,
0.06179148703813553,
0.04267727583646774,
-0.1551172137260437,
0.029504310339689255,
0.04701380059123039,
-0.03884690999984741,
-0.0014049727469682693,
0.12057392299175262,
-0.02263331227004528,
0.18986839056015015,
-0.02850470319390297,
-0.03500739857554436,
0.08724861592054367,
-0.12562419474124908,
-0.16728612780570984,
-0.061812374740839005,
0.13166607916355133,
0.13963359594345093,
0.08072793483734131,
-0.07384057343006134,
0.04220251739025116,
-0.05602455139160156,
0.0684327483177185,
0.08712579309940338,
-0.2937783896923065,
-0.03670505806803703,
0.1239653080701828,
0.0719723328948021,
-0.025061191990971565,
-0.10040121525526047,
0.08461745828390121,
0.016424736008048058,
0.004906074609607458,
0.002544036600738764,
-0.086180679500103,
0.03564164415001869,
-0.07754208892583847,
-0.11923013627529144,
-0.008553688414394855,
0.1911894530057907,
0.040857814252376556,
-0.0595560297369957,
-0.09368739277124405,
-0.023255588486790657,
0.17430830001831055,
-0.06093094125390053,
-0.1650274097919464,
0.011603443883359432,
0.030773669481277466,
0.061370160430669785,
-0.15407533943653107,
-0.08227147161960602,
-0.02686598151922226,
-0.0433419868350029,
0.11904967576265335,
0.03545966371893883,
-0.024892691522836685,
-0.06709050387144089,
0.028697805479168892,
-0.08158105611801147,
-0.06534648686647415,
0.007542176637798548,
-0.0977688729763031,
-0.07648324966430664,
-0.012582329101860523,
-0.07041513174772263,
-0.09827440232038498,
-0.0284283384680748,
0.07077018171548843,
0.02078389935195446,
0.04155238717794418,
-0.041790515184402466,
0.03657756373286247,
0.013247893191874027,
0.09312079846858978,
-0.12930148839950562,
0.007934204302728176,
0.0066459872759878635,
-0.03411736711859703,
-0.010474787093698978,
-0.03491253778338432,
-0.07622288167476654,
-0.08201699703931808,
0.013443449512124062,
0.0680062472820282,
0.02150094509124756,
0.022026846185326576,
-0.07052683085203171,
-0.09458497166633606,
0.035128284245729446,
-0.06787924468517303,
0.016360444948077202,
0.03126847371459007,
-0.002773863961920142,
0.18535281717777252,
-0.0023907539434731007,
0.05789434537291527,
-0.15696536004543304,
0.02662874013185501,
-0.014582871459424496,
0.014555997215211391,
-0.006665696389973164,
-0.05878826603293419,
0.021632924675941467,
-0.02659149095416069,
-0.010713714174926281,
-0.14613431692123413,
-0.0642881840467453,
-0.07759758085012436,
-0.0063306107185781,
-0.036307644098997116,
-0.08287980407476425,
-0.042511533945798874,
-0.006138796452432871,
-0.02248745784163475,
-0.033921025693416595,
-0.020423544570803642,
-0.02298927865922451,
0.022361714392900467,
-0.017616668716073036,
0.07060951739549637,
-0.04194675385951996,
0.08493553847074509,
0.007814968936145306,
-0.031122226268053055,
-0.13048455119132996,
0.13201788067817688,
-0.07105465233325958,
-0.06611110270023346,
-0.15390226244926453,
-0.06636373698711395,
-0.051179591566324234,
0.059246379882097244,
0.00424804724752903,
0.15160919725894928,
-0.18016642332077026,
-0.10898890346288681,
0.2671624422073364,
-0.09659227728843689,
0.03180382028222084,
0.1916671097278595,
0.01967148669064045,
0.04392291232943535,
0.1642347127199173,
0.13993340730667114,
0.0370013602077961,
-0.11533777415752411,
0.058080848306417465,
-0.04314948990941048,
-0.018249770626425743,
0.0502791590988636,
0.06278210133314133,
-0.012561357580125332,
0.0007911011343821883,
0.0000509193487232551,
-0.08117786049842834,
-0.05268891528248787,
-0.017444834113121033,
-0.061910852789878845,
0.027550334110856056,
-0.022118205204606056,
0.10184060037136078,
0.005829500034451485,
0.002390958834439516,
0.017705490812659264,
-0.0929742306470871,
-0.0371558703482151,
0.07456444203853607,
-0.0575423389673233,
0.07148732244968414,
-0.10778460651636124,
0.05540452525019646,
0.10454101860523224,
0.05414990335702896,
-0.15402014553546906,
0.059338293969631195,
-0.013308276422321796,
0.08711078763008118,
0.10290824621915817,
0.20294329524040222,
-0.02695094794034958,
-0.04204912856221199,
-0.07892128080129623,
0.01984838768839836,
-0.03692254424095154,
-0.0387488454580307,
-0.03239276632666588,
-0.08516450971364975,
-0.03368133679032326,
-0.03550725430250168,
0.06706885248422623,
-0.1623830795288086,
0.00405195914208889,
0.055862389504909515,
0.04959502071142197,
0.0100579047575593,
0.011350136250257492,
0.02049555629491806,
0.1005689948797226,
0.042658135294914246,
0.015687741339206696,
0.09500324726104736,
-0.007896475493907928,
-0.05797833576798439,
0.09929785132408142,
-0.05912328511476517,
0.01139977015554905,
0.135630264878273,
-0.10802989453077316,
-0.007592976093292236,
0.011813975870609283,
0.019691647961735725,
-0.0006062343600206077,
0.002672431757673621,
-0.016341375187039375,
0.2176971286535263,
0.018352048471570015,
0.08416102826595306,
-0.08111722022294998,
0.023366518318653107,
-0.010093169286847115,
-0.04954155907034874,
-0.0411965548992157,
0.08777059614658356,
0.017033537849783897,
-0.07800346612930298,
0.00354283326305449,
0.0972309485077858,
-0.014813907444477081,
0.14344251155853271,
0.020705336704850197,
-0.018937507644295692,
0.01347337570041418,
-0.050696976482868195,
-0.02241484262049198,
-0.01198482047766447,
-0.16347095370292664,
-0.02592722326517105,
0.036214396357536316,
0.025526536628603935,
0.05652331933379173,
-0.07614107429981232,
-0.005540644284337759,
0.00753667950630188,
-0.08784640580415726,
-0.04465518146753311,
0.046901438385248184,
-0.011107953265309334,
0.08331754803657532,
-0.03857804089784622,
-0.015706053003668785,
-0.010221612639725208,
-0.035943545401096344,
-0.10826397687196732,
0.11558482050895691,
-0.05518155172467232,
-0.3580605685710907,
-0.09122168272733688,
-0.08486168831586838,
-0.08693455159664154,
0.026775097474455833,
0.05135341361165047,
-0.09513519704341888,
-0.06685236096382141,
0.009518597275018692,
0.16358546912670135,
-0.038018397986888885,
-0.0836418941617012,
0.031631216406822205,
0.009298669174313545,
-0.010047328658401966,
-0.10009869188070297,
0.005507263820618391,
-0.041258953511714935,
-0.1335357427597046,
0.011218965984880924,
-0.016229858621954918,
0.041985996067523956,
0.14534378051757812,
0.03686807304620743,
-0.017956752330064774,
-0.02657722681760788,
0.20918838679790497,
-0.1163511648774147,
0.06971566379070282,
0.28704023361206055,
-0.002097019227221608,
0.018835538998246193,
0.12950530648231506,
-0.002000237349420786,
-0.061571475118398666,
-0.0031473932322114706,
0.049640510231256485,
-0.011748770251870155,
-0.26215043663978577,
-0.1303526759147644,
-0.06488357484340668,
-0.014844337478280067,
0.03289768844842911,
-0.0022037869784981012,
0.017048342153429985,
0.041201137006282806,
-0.09561996161937714,
-0.04268816486001015,
0.0657699853181839,
0.036053191870450974,
0.20968173444271088,
-0.033675968647003174,
0.14729554951190948,
-0.02176469750702381,
-0.029357369989156723,
0.06365660578012466,
0.03595634549856186,
0.07857337594032288,
0.0949423536658287,
0.09444588422775269,
0.08840573579072952,
0.05895456671714783,
0.025424115359783173,
0.00011049677414121106,
0.0016996946651488543,
-0.01807134412229061,
-0.052989810705184937,
-0.02182588167488575,
-0.043182529509067535,
0.004231198690831661,
0.12669266760349274,
-0.1488039493560791,
-0.13697277009487152,
0.0004643501015380025,
0.029354264959692955,
0.16768504679203033,
0.05397259443998337,
-0.0893053412437439,
-0.05669732019305229,
0.0362960547208786,
-0.0905727744102478,
-0.03829031437635422,
0.05617406219244003,
0.08189075440168381,
-0.16459381580352783,
0.11908812075853348,
0.031085478141903877,
0.09929159283638,
-0.02156853675842285,
0.05776060372591019,
-0.156537264585495,
-0.004396817646920681,
0.04431690275669098,
0.08496309816837311,
-0.26242172718048096,
0.21188369393348694,
0.005296120420098305,
0.06621670722961426,
-0.07459811866283417,
-0.0038754332344979048,
0.039728112518787384,
0.10746031254529953,
0.1482754349708557,
-0.010821279138326645,
-0.011534113436937332,
0.005467010196298361,
-0.018952414393424988,
0.03228047117590904,
0.029460016638040543,
-0.035726264119148254,
0.04607922583818436,
-0.013179267756640911,
0.012671775184571743,
-0.005786904599517584,
0.10105147212743759,
-0.2148606777191162,
-0.12704508006572723,
0.03192051127552986,
0.021052632480859756,
0.09610607475042343,
-0.0063322363421320915,
-0.07586676627397537,
-0.12989658117294312,
0.11255621910095215,
-0.007852939888834953,
-0.031055763363838196,
-0.0964399203658104,
0.051588673144578934,
0.02003033086657524,
-0.1094740554690361,
0.0259616207331419,
0.047239188104867935,
0.11313027143478394,
-0.09069836884737015,
-0.05055185407400131,
0.04864031448960304,
-0.08795952051877975,
-0.079894058406353,
0.048200689256191254,
0.19011148810386658,
0.0970364660024643,
0.03882423788309097,
0.10999637097120285,
-0.03815915808081627,
0.030750691890716553,
-0.11374661326408386,
0.06900861114263535,
0.016623802483081818,
-0.010315296240150928,
0.023570196703076363,
-0.05023960769176483,
-0.26078975200653076,
-0.11581820994615555,
-0.01812564767897129,
0.19454950094223022,
0.195024311542511,
-0.0012738012010231614,
0.15554749965667725,
0.24918133020401,
-0.08193953335285187,
-0.2570629119873047,
-0.06562980264425278,
-0.01583154872059822,
0.04353789612650871,
0.031206371262669563,
-0.26109540462493896,
0.061029188334941864,
0.05891309678554535,
0.003168204566463828,
-0.09507615864276886,
-0.2363004982471466,
-0.12760774791240692,
0.1823665201663971,
0.05082494020462036,
0.14593786001205444,
-0.08751458674669266,
-0.04271797090768814,
-0.0667015016078949,
-0.10365931689739227,
0.08679773658514023,
-0.10329996049404144,
0.09586003422737122,
0.04806984215974808,
-0.002206383505836129,
0.012797650881111622,
0.04432297870516777,
0.12155736982822418,
0.06787463277578354,
0.0020730728283524513,
-0.03264370560646057,
0.01624203659594059,
0.019206862896680832,
0.026596810668706894,
0.042240649461746216,
0.020785652101039886,
-0.019254133105278015,
-0.05293953791260719,
-0.11541727930307388,
-0.11914654821157455,
0.08181498944759369,
-0.05620912089943886,
-0.01855914667248726,
-0.01881810836493969,
0.09573283046483994,
0.015611414797604084,
0.014789937064051628,
-0.06677334010601044,
-0.1464856117963791,
0.03178661689162254,
0.11278023570775986,
0.24439044296741486,
-0.1350785493850708,
-0.027607832103967667,
-0.05141320824623108,
-0.04701623320579529,
0.08849449455738068,
0.0062761432491242886,
0.0652068480849266,
0.044200167059898376,
0.0060411482118070126,
0.09162996709346771,
0.025782564654946327,
-0.07026365399360657,
0.01570781134068966,
0.02942388318479061,
-0.07342320680618286,
-0.2613798677921295,
-0.06450961530208588,
-0.013787935487926006,
0.01679586060345173,
0.01902489736676216,
0.18111462891101837,
-0.018258024007081985,
-0.06248026341199875,
-0.010018136352300644,
0.0422234982252121,
-0.040810104459524155,
0.06500513851642609,
0.04155556857585907,
0.04879045486450195,
-0.11037866026163101,
0.04903285950422287,
0.10042592138051987,
-0.1262454092502594,
0.04628636687994003,
0.059653669595718384,
-0.05864860489964485,
-0.0979488268494606,
-0.1369543820619583,
0.01008713711053133,
-0.007687492296099663,
-0.07648134231567383,
0.024461109191179276,
-0.16536597907543182,
0.027151452377438545,
0.0764002874493599,
0.045612867921590805,
-0.014407750219106674,
-0.0646386593580246,
-0.040621090680360794,
-0.038457535207271576,
0.011647587642073631,
0.12799924612045288,
-0.06769884377717972,
-0.13545523583889008,
0.15827058255672455,
0.015061017125844955,
0.09950816631317139,
-0.04382410645484924,
-0.052735909819602966,
-0.1281915158033371,
0.024403482675552368,
-0.1275172084569931,
0.016950415447354317,
-0.12582699954509735,
-0.0005895911599509418,
-0.05110926553606987,
-0.016616377979516983,
-0.025922611355781555,
0.033121056854724884,
-0.10276266187429428,
0.011608587577939034,
-0.003846141044050455,
0.08044592291116714,
-0.10120996087789536,
0.07201477885246277,
0.06789632886648178,
-0.019072867929935455,
0.09040355682373047,
0.020572442561388016,
-0.04459523782134056,
0.10079817473888397,
-0.17383889853954315,
-0.04620346054434776,
0.03544904664158821,
0.029288053512573242,
-0.00895677600055933,
-0.1570405513048172,
0.010458983480930328,
0.03115740790963173,
0.04299243912100792,
-0.0017490360187366605,
0.07170109450817108,
-0.0690387487411499,
-0.011583148501813412,
-0.05037321150302887,
-0.08610519766807556,
-0.02894650585949421,
0.0728285014629364,
0.09576089680194855,
0.023698287084698677,
0.11346375942230225,
-0.0786147341132164,
0.06253673881292343,
-0.10678155720233917,
0.06767139583826065,
-0.047202032059431076,
-0.029471782967448235,
0.008211687207221985,
-0.128581240773201,
0.06331790238618851,
-0.00772059615701437,
0.06164735183119774,
0.007336515933275223,
-0.010579725727438927,
0.012804657220840454,
-0.09300485998392105,
-0.10031765699386597,
0.027965914458036423,
0.127399742603302,
0.07378975301980972,
-0.011276815086603165,
0.035986144095659256,
0.008882121182978153,
0.011206566356122494,
0.2120262086391449,
0.20312173664569855,
0.2122138887643814,
0.05288251116871834,
0.07718520611524582,
0.010586651973426342,
-0.05352070927619934,
-0.04876818507909775,
0.019357522949576378,
-0.06955485045909882,
0.03270876407623291,
-0.06797927618026733,
-0.03033163957297802,
0.08293633162975311,
-0.1397007256746292,
0.12260702252388,
0.011788388714194298,
-0.08581686019897461,
-0.14722640812397003,
-0.18699809908866882,
-0.06240210682153702,
-0.08894813805818558,
-0.022511277347803116,
-0.1225789338350296,
-0.030885113403201103,
0.031666215509176254,
0.02736932598054409,
-0.12436345219612122,
0.075728639960289,
-0.1325836032629013,
-0.15915976464748383,
0.1894141584634781,
-0.04403197392821312,
0.01161364559084177,
-0.022301720455288887,
0.01027547288686037,
0.0072622150182724,
0.10351446270942688,
0.02972012385725975,
0.038094598799943924,
-0.0250043123960495,
0.03752364218235016,
-0.07845613360404968,
-0.06142880395054817,
-0.003002871759235859,
0.0391804538667202,
0.13368991017341614,
0.20403790473937988,
0.03693673759698868,
-0.06980203837156296,
0.006743063218891621,
0.16703446209430695,
0.03169720247387886,
-0.11224882304668427,
-0.1349266767501831,
0.07828030735254288,
0.018083665519952774,
0.011066019535064697,
-0.02053285948932171,
-0.059923697263002396,
0.013270875439047813,
0.27347004413604736,
0.18082116544246674,
-0.06236298009753227,
0.029062172397971153,
0.0016404286725446582,
0.03294237330555916,
0.07974982261657715,
0.11537032574415207,
0.09052760154008865,
0.1834292858839035,
-0.04198372736573219,
0.015439008362591267,
-0.011201032437384129,
-0.05571243166923523,
-0.0846903920173645,
0.13965639472007751,
0.03555982559919357,
-0.07318733632564545,
-0.0011723185889422894,
0.15199878811836243,
-0.14327463507652283,
-0.06894713640213013,
-0.08128906786441803,
-0.05313226953148842,
-0.11058197170495987,
-0.01200076937675476,
-0.03945746272802353,
0.10608670115470886,
0.10384200513362885,
-0.022221727296710014,
-0.008326658047735691,
0.17544087767601013,
0.05037544667720795,
0.0008624223992228508,
-0.030404480174183846,
0.12799742817878723,
-0.0020383840892463923,
0.06800512969493866,
-0.0157638993114233,
0.07453057169914246,
0.05619271844625473,
0.04848816618323326,
-0.0195794515311718,
0.06187531724572182,
0.0010645780712366104,
0.03212404623627663,
0.0902710109949112,
0.13494153320789337,
0.009473258629441261,
0.0319426953792572,
0.08765026926994324,
-0.1550588309764862,
0.02185036800801754,
0.04557562619447708,
-0.04317561537027359,
-0.016754191368818283,
0.17122511565685272,
-0.18784554302692413,
0.05043114721775055,
0.1491258442401886,
-0.03904590383172035,
-0.03650316596031189,
-0.04212062433362007,
0.05951699614524841,
-0.028844859451055527,
0.0487678162753582,
-0.03744296357035637,
-0.14098669588565826,
0.006857805885374546,
-0.07602851092815399,
0.028914624825119972,
-0.18769824504852295,
-0.007576881442219019,
-0.034150782972574234,
-0.0073458547703921795,
-0.045526131987571716,
0.0869128629565239,
0.010538461618125439,
-0.052375540137290955,
0.019140444695949554,
-0.07413791865110397,
0.013418057933449745,
0.10833719372749329,
-0.0980757623910904,
-0.049442991614341736
] |
null | null |
transformers
|
# Wav2Vec2-large-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) large model pretrained only on **uralic** audio (**42.5** unlabeled data) from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16 kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16 kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **uralic**; a minimal setup sketch follows below. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
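The note above outlines the general recipe: build a CTC tokenizer from your labeled uralic text, wrap it in a processor, and load this checkpoint with a freshly initialized CTC head. A minimal, hypothetical sketch of that setup is shown below; the vocabulary file `./vocab.json`, the special tokens, and the config overrides are placeholders taken from the usual fine-tuning recipe, not artifacts shipped with this checkpoint.
```python
from transformers import (
    Wav2Vec2CTCTokenizer,
    Wav2Vec2FeatureExtractor,
    Wav2Vec2Processor,
    Wav2Vec2ForCTC,
)

# Hypothetical vocabulary file built from your own labeled uralic text
# (one entry per character, plus the CTC special tokens).
tokenizer = Wav2Vec2CTCTokenizer(
    "./vocab.json",
    unk_token="[UNK]",
    pad_token="[PAD]",
    word_delimiter_token="|",
)

# Feature extractor matching the pretraining setup: raw 16 kHz waveforms, normalized.
feature_extractor = Wav2Vec2FeatureExtractor(
    feature_size=1,
    sampling_rate=16_000,
    padding_value=0.0,
    do_normalize=True,
    return_attention_mask=True,
)
processor = Wav2Vec2Processor(feature_extractor=feature_extractor, tokenizer=tokenizer)

# Load the pretrained checkpoint; the CTC head is newly initialized to the new vocabulary.
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-large-uralic-voxpopuli-v2",
    ctc_loss_reduction="mean",
    pad_token_id=processor.tokenizer.pad_token_id,
    vocab_size=len(processor.tokenizer),
)
model.freeze_feature_encoder()  # common practice: keep the convolutional feature encoder frozen
```
From here, the model can be trained with the usual `Trainer` setup described in the fine-tuning blog linked above.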
|
{"language": "uralic", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-uralic-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"uralic"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-large-VoxPopuli-V2
Facebook's Wav2Vec2 large model pretrained only on uralic audio (42.5 unlabeled data) from the VoxPopuli corpus.
The model is pretrained on 16 kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16 kHz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in uralic. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in uralic on 42.5 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in uralic. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in uralic on 42.5 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in uralic. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
70,
253
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in uralic on 42.5 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in uralic. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.06679213792085648,
0.12218085676431656,
-0.003304916899651289,
0.0039359694346785545,
0.08233853429555893,
-0.044057492166757584,
0.14404986798763275,
0.03610902652144432,
-0.002150424290448427,
0.09838239848613739,
-0.015431174077093601,
-0.058589205145835876,
0.07163230329751968,
0.10792652517557144,
0.061429914087057114,
-0.2646651566028595,
0.034876398742198944,
-0.06780911237001419,
0.03926054388284683,
0.04618022218346596,
0.11911674588918686,
-0.0857013389468193,
0.03247386962175369,
0.05531132593750954,
-0.028044577687978745,
0.02953210286796093,
-0.04657241329550743,
-0.06939323991537094,
0.048467863351106644,
0.04966317489743233,
-0.02746962010860443,
0.03434261307120323,
0.09756511449813843,
-0.18946079909801483,
0.03605882078409195,
0.03719323128461838,
0.024679571390151978,
0.011323670856654644,
0.09899160265922546,
0.024022098630666733,
0.15938334167003632,
-0.028535038232803345,
-0.010255408473312855,
0.07691226899623871,
-0.052194613963365555,
-0.07022212445735931,
-0.06552724540233612,
0.162655770778656,
0.08357901871204376,
0.10434844344854355,
-0.07817128300666809,
0.08325688540935516,
-0.026149088516831398,
0.04041282832622528,
0.07067332416772842,
-0.1927577555179596,
-0.05422862246632576,
0.05914205312728882,
0.10439242422580719,
0.026822207495570183,
-0.08270608633756638,
0.0710597038269043,
0.04906462877988815,
-0.011805989779531956,
-0.07385575771331787,
-0.03422797843813896,
0.1354355365037918,
-0.1039784774184227,
-0.11632087826728821,
0.0005505118169821799,
0.16417834162712097,
0.061780013144016266,
-0.07439187914133072,
-0.1320757269859314,
0.010408801026642323,
0.22187912464141846,
-0.05564865842461586,
-0.09339266270399094,
0.017227569594979286,
0.02167147397994995,
0.03639549762010574,
-0.07369669526815414,
-0.07548446208238602,
0.00039214169373735785,
0.012681430205702782,
0.10726926475763321,
0.02391722984611988,
-0.01470252126455307,
-0.07123005390167236,
-0.010693348944187164,
-0.08827623724937439,
-0.11783590167760849,
-0.005765123292803764,
-0.06601816415786743,
-0.06283319741487503,
-0.04007065296173096,
-0.002615100471302867,
-0.11343958973884583,
0.03672744333744049,
0.09621184319257736,
0.06333106756210327,
0.04958638176321983,
-0.05327850580215454,
-0.03060944937169552,
0.1207575649023056,
0.08485763520002365,
-0.12774795293807983,
-0.010690545663237572,
0.016123177483677864,
-0.0190520528703928,
0.01471323799341917,
-0.028531406074762344,
-0.04137510061264038,
0.007861056365072727,
-0.02949470281600952,
0.04860060662031174,
0.05564187467098236,
-0.033658627420663834,
-0.032734084874391556,
-0.09679485112428665,
0.10132889449596405,
-0.07724004238843918,
0.024026719853281975,
0.04642101004719734,
0.0016689416952431202,
0.10803759098052979,
-0.06365970522165298,
0.08073458075523376,
-0.11435555666685104,
0.0015413586515933275,
-0.028355078771710396,
-0.006032205652445555,
0.024139592424035072,
-0.02702726051211357,
0.031959276646375656,
0.004709859378635883,
0.0034620407968759537,
-0.11846142262220383,
0.008119823411107063,
-0.09961218386888504,
-0.019564637914299965,
-0.0803103819489479,
-0.038607802242040634,
-0.04303883761167526,
0.024050675332546234,
-0.008525010198354721,
0.00031831578235141933,
0.005171709693968296,
-0.014479786157608032,
-0.00915958359837532,
0.0028154964093118906,
0.04842708632349968,
0.05765838921070099,
0.080806203186512,
-0.02358466200530529,
-0.018111055716872215,
-0.09732873737812042,
0.10893809795379639,
-0.0724656730890274,
-0.013603299856185913,
-0.14082317054271698,
-0.04413120076060295,
-0.038355130702257156,
0.029234526678919792,
0.015820110216736794,
0.12353433668613434,
-0.16783447563648224,
-0.07156147807836533,
0.12728729844093323,
-0.12383916229009628,
0.024238107725977898,
0.18366579711437225,
0.0062866429798305035,
0.06113589555025101,
0.10292216390371323,
0.20596401393413544,
0.01903633214533329,
-0.16833338141441345,
-0.012141790241003036,
-0.051700785756111145,
0.033208705484867096,
0.12495037168264389,
0.0627456083893776,
-0.06394008547067642,
0.07136382907629013,
-0.016843248158693314,
-0.01743374764919281,
-0.07769974321126938,
-0.009630812332034111,
-0.04351217672228813,
0.014766260050237179,
-0.04968627542257309,
0.018384961411356926,
-0.002594145480543375,
-0.015909044072031975,
-0.007254990749061108,
-0.09350641071796417,
-0.059511471539735794,
0.12161517143249512,
-0.058940403163433075,
0.02750614657998085,
-0.10057169198989868,
0.067203089594841,
0.0640505999326706,
0.003601768985390663,
-0.12635070085525513,
0.12271998077630997,
0.030517825856804848,
-0.039852287620306015,
0.14616207778453827,
0.0787363350391388,
-0.028400056064128876,
0.006746463477611542,
-0.01931794360280037,
0.02377854473888874,
-0.029302649199962616,
0.0075520118698477745,
-0.023148251697421074,
-0.10022065043449402,
-0.0014262134209275246,
-0.06173133850097656,
0.09181895107030869,
-0.13344155251979828,
-0.020651644095778465,
0.031688980758190155,
0.11628228425979614,
-0.013279429636895657,
-0.036124225705862045,
0.08983255922794342,
0.04524047672748566,
0.028274258598685265,
-0.019687002524733543,
0.02301265113055706,
-0.017841091379523277,
0.004021919332444668,
0.04946158081293106,
-0.1264350563287735,
-0.15620361268520355,
0.09604743123054504,
0.02365446835756302,
-0.0113876573741436,
0.07375504821538925,
0.022933755069971085,
-0.014020639471709728,
-0.03779883310198784,
0.010274979285895824,
0.22673076391220093,
-0.010828954167664051,
0.062328238040208817,
-0.07643290609121323,
-0.0007051934953778982,
0.01486184448003769,
-0.0521099716424942,
-0.09232009202241898,
0.08554210513830185,
0.015090271830558777,
-0.09809833765029907,
-0.042227402329444885,
0.05128008872270584,
0.06127733737230301,
0.15441936254501343,
0.003153822850435972,
-0.09031858295202255,
-0.029462287202477455,
-0.05920063704252243,
-0.007146582938730717,
0.031127626076340675,
-0.12955494225025177,
-0.0209257323294878,
0.0240175724029541,
0.004868092015385628,
0.04772434011101723,
-0.022978052496910095,
0.04204849153757095,
0.007131802383810282,
-0.04621051251888275,
-0.08446735143661499,
0.032554011791944504,
-0.03214649483561516,
0.038827553391456604,
-0.004365770146250725,
0.007129765581339598,
-0.04834762588143349,
-0.05727645009756088,
-0.13772830367088318,
0.08788841217756271,
-0.05971049517393112,
-0.3010159134864807,
-0.09492090344429016,
-0.046372320502996445,
-0.03274693712592125,
0.010036464780569077,
0.04553624615073204,
-0.11957567185163498,
-0.11141181737184525,
-0.06774488091468811,
0.12822064757347107,
-0.03292817622423172,
-0.06109493598341942,
0.1170426607131958,
0.0050177183002233505,
0.024344859644770622,
-0.09400007873773575,
0.015140734612941742,
-0.04727662727236748,
-0.032685957849025726,
-0.02105521224439144,
0.014236852526664734,
0.06112920492887497,
0.12245308607816696,
0.019846497103571892,
-0.00821647234261036,
0.015509873628616333,
0.22087827324867249,
-0.1271946132183075,
0.07918154448270798,
0.24103805422782898,
-0.05146843194961548,
0.001216449891217053,
0.14502696692943573,
-0.006549202837049961,
-0.04962736740708351,
0.045464348047971725,
-0.00002112738548021298,
-0.014856114983558655,
-0.23279811441898346,
-0.1185762956738472,
-0.046733610332012177,
-0.03769926726818085,
0.034624818712472916,
0.017106661573052406,
-0.025057118386030197,
0.01766510121524334,
-0.08408047258853912,
-0.03736687824130058,
0.046559348702430725,
0.031418558210134506,
0.15257583558559418,
0.010087624192237854,
0.050018493086099625,
-0.049180954694747925,
-0.031428586691617966,
0.10470520704984665,
-0.04057737812399864,
0.056074801832437515,
0.08035384118556976,
0.1091451346874237,
0.06288658827543259,
0.03282955288887024,
0.06144118681550026,
-0.01994655653834343,
-0.017114408314228058,
-0.005004661623388529,
-0.030034543946385384,
-0.06220294535160065,
0.021031221374869347,
0.0408642552793026,
0.15171624720096588,
-0.13809986412525177,
-0.1095689982175827,
0.04214546084403992,
0.012241971679031849,
0.1387832760810852,
0.10031437873840332,
-0.02671266905963421,
-0.09983013570308685,
0.03549996390938759,
-0.09124045073986053,
-0.028392594307661057,
0.04710966721177101,
0.08124011754989624,
-0.15027759969234467,
0.08410860598087311,
0.07970330119132996,
0.08727724850177765,
-0.04391014575958252,
0.03250511735677719,
-0.0587364062666893,
0.06352634727954865,
0.0023211000952869654,
0.0704474002122879,
-0.1652188003063202,
0.1145772635936737,
0.014461176469922066,
0.08652323484420776,
-0.04414498805999756,
0.024841371923685074,
0.045880697667598724,
0.0179941114038229,
0.12185415625572205,
-0.004831329919397831,
-0.09279141575098038,
0.0015858325641602278,
-0.12174975872039795,
0.023244263604283333,
0.06665144860744476,
-0.05247411131858826,
0.06132354959845543,
-0.0098118232563138,
-0.009047419764101505,
-0.0318603478372097,
-0.00443534180521965,
-0.26402100920677185,
-0.13893428444862366,
0.04624801129102707,
-0.0116416085511446,
0.04242425039410591,
-0.03442935645580292,
-0.07907267659902573,
-0.12458838522434235,
0.11821053922176361,
0.005833027418702841,
-0.019371571019291878,
-0.06938659399747849,
0.02232000231742859,
0.10468059778213501,
-0.06929850578308105,
0.009753343649208546,
0.03978726267814636,
0.1409120410680771,
-0.06581315398216248,
-0.04600637033581734,
0.01265290193259716,
-0.09654150903224945,
-0.129105806350708,
0.011334619484841824,
0.17154061794281006,
0.10368438065052032,
0.06277693063020706,
0.08831657469272614,
0.021123381331562996,
-0.008034481666982174,
-0.0956345945596695,
0.026535270735621452,
0.02177570015192032,
-0.07022274285554886,
0.03874426335096359,
-0.00533097330480814,
-0.27108559012413025,
-0.14166080951690674,
-0.0669359639286995,
0.08501376211643219,
0.18302376568317413,
-0.022170638665556908,
0.15949466824531555,
0.2835913896560669,
-0.08635371178388596,
-0.22664588689804077,
-0.051323194056749344,
-0.0029779626056551933,
0.02904835343360901,
0.0487021766602993,
-0.21028999984264374,
0.10076865553855896,
-0.008541390299797058,
0.012696925550699234,
-0.06333071738481522,
-0.20457793772220612,
-0.13240161538124084,
0.1629994809627533,
-0.025052925571799278,
0.058336369693279266,
-0.028137536719441414,
-0.06778953969478607,
-0.038578297942876816,
-0.058217376470565796,
0.0060096909292042255,
-0.07819560915231705,
0.08056513220071793,
0.04950176924467087,
0.02350284904241562,
0.023678768426179886,
0.013469748198986053,
0.10751847922801971,
0.07968109101057053,
-0.02936129830777645,
-0.07898526638746262,
0.009384353645145893,
0.0050704763270914555,
-0.010200868360698223,
0.09483078122138977,
0.04724067077040672,
0.018505625426769257,
-0.03454084321856499,
-0.09069641679525375,
-0.06160912290215492,
0.050188593566417694,
-0.06823703646659851,
-0.01151097472757101,
-0.05590975284576416,
0.0932115912437439,
0.017893055453896523,
0.0001479511265642941,
-0.068564273416996,
-0.08925169706344604,
-0.03815733268857002,
0.11630312353372574,
0.21513532102108002,
-0.04979417845606804,
-0.0017876708880066872,
-0.047007642686367035,
-0.0472099706530571,
0.04727006331086159,
-0.010883771814405918,
0.04399022459983826,
0.053475990891456604,
0.02486843429505825,
0.08341579139232635,
-0.0342632420361042,
-0.12747450172901154,
0.03320667892694473,
0.040454644709825516,
-0.06044190004467964,
-0.2018066793680191,
-0.04344472289085388,
-0.021218866109848022,
-0.02698654867708683,
-0.0365581288933754,
0.1949251890182495,
-0.01513863354921341,
-0.05587548762559891,
0.0020359449554234743,
0.0622953362762928,
-0.0025872872211039066,
0.11179564148187637,
0.04626813158392906,
0.033427972346544266,
-0.08653515577316284,
0.05763203650712967,
0.12496675550937653,
-0.024711469188332558,
0.04380664974451065,
0.08742548525333405,
-0.048230528831481934,
-0.0539209321141243,
-0.0970882996916771,
-0.009784361347556114,
0.05721912905573845,
-0.05622785538434982,
-0.014163115061819553,
-0.1050579771399498,
0.009960339404642582,
-0.002455265959724784,
0.011517315171658993,
-0.04723993316292763,
-0.04886805638670921,
-0.0013848620001226664,
-0.08956123143434525,
0.06183204427361488,
0.10123738646507263,
-0.035004932433366776,
-0.10900435596704483,
0.09991065412759781,
0.013898099772632122,
0.07707950472831726,
-0.035225506871938705,
-0.059588704258203506,
-0.0943506583571434,
-0.006061762571334839,
-0.11136641353368759,
0.037360452115535736,
-0.13369764387607574,
-0.013661008328199387,
-0.0503566637635231,
-0.038493186235427856,
-0.016417045146226883,
0.07386378198862076,
-0.03020920231938362,
0.00027532997773960233,
-0.0321233905851841,
0.0896100178360939,
-0.12370900064706802,
0.07682006806135178,
0.05775454640388489,
-0.04455576464533806,
0.1090771034359932,
0.013232594355940819,
-0.05056161433458328,
0.03546047583222389,
-0.21955649554729462,
-0.061178743839263916,
-0.03688839077949524,
0.04617149382829666,
-0.00950680859386921,
-0.16565844416618347,
0.0027338124345988035,
0.016487251967191696,
0.013274146243929863,
-0.021449364721775055,
0.04939551651477814,
-0.027132507413625717,
-0.004931834060698748,
-0.07293204963207245,
-0.05863171070814133,
-0.0367201529443264,
0.060929931700229645,
0.06820105761289597,
0.004657509736716747,
0.09789112955331802,
-0.08906129747629166,
0.07536810636520386,
-0.08522368967533112,
0.029421405866742134,
-0.026257185265421867,
0.028191670775413513,
-0.09235791862010956,
-0.07617935538291931,
0.07807718217372894,
-0.014988535083830357,
0.06726527214050293,
0.024105535820126534,
-0.02815200202167034,
0.04533252492547035,
-0.03583648428320885,
-0.06531299650669098,
0.037971705198287964,
0.12420860677957535,
0.055649567395448685,
0.019125061109662056,
-0.01104298047721386,
-0.053722988814115524,
0.00401234021410346,
0.1414976269006729,
0.14941559731960297,
0.17711788415908813,
0.10302837193012238,
0.03357645869255066,
0.06860394775867462,
-0.03896772861480713,
-0.09842760860919952,
0.09177335351705551,
-0.06893552094697952,
0.03697972372174263,
-0.04702886939048767,
-0.06595152616500854,
0.07481835037469864,
-0.13211409747600555,
0.07124510407447815,
-0.03325941041111946,
-0.07723358273506165,
-0.10591834783554077,
-0.14868202805519104,
-0.07175537198781967,
-0.04131411388516426,
0.009671981446444988,
-0.10771685838699341,
0.028540512546896935,
0.015004166401922703,
0.026778649538755417,
-0.0940098986029625,
0.09120474755764008,
-0.10612943768501282,
-0.12419074773788452,
0.14736926555633545,
-0.037198979407548904,
-0.016118187457323074,
0.014274454675614834,
0.03981092572212219,
0.026072213426232338,
0.0933757871389389,
0.05101443827152252,
0.055153973400592804,
0.0234171524643898,
0.028341617435216904,
-0.10105545073747635,
-0.06726076453924179,
0.02780335023999214,
-0.008346849121153355,
0.10057517886161804,
0.18287532031536102,
0.08848954737186432,
-0.07599420845508575,
0.012274229899048805,
0.13442668318748474,
0.030670294538140297,
-0.11869263648986816,
-0.1463208645582199,
0.04312539100646973,
-0.03703462705016136,
0.0017935894429683685,
0.00804878119379282,
-0.09928624331951141,
0.01569177210330963,
0.22115205228328705,
0.1686011105775833,
-0.044484153389930725,
0.015923578292131424,
-0.007846899330615997,
0.008161848410964012,
0.024370500817894936,
0.08150674402713776,
0.0893622562289238,
0.17903858423233032,
-0.01343432068824768,
0.038571909070014954,
-0.02639237605035305,
-0.09502293169498444,
-0.11419329047203064,
0.09446514397859573,
0.007423086557537317,
-0.028611622750759125,
-0.010456336662173271,
0.18353551626205444,
-0.11455808579921722,
-0.21873024106025696,
-0.11645084619522095,
-0.05004636570811272,
-0.11625229567289352,
0.02369842492043972,
-0.040638696402311325,
0.1363164633512497,
0.05696561932563782,
-0.009318799711763859,
0.005026905331760645,
0.17274977266788483,
0.032664522528648376,
0.0261690691113472,
-0.01467880792915821,
0.11342637240886688,
-0.08574268221855164,
0.11354482918977737,
-0.0006264257826842368,
0.051347095519304276,
0.03708643838763237,
0.04079655930399895,
-0.06821247935295105,
0.02843685820698738,
0.031469956040382385,
-0.00944103766232729,
0.05009244754910469,
0.1610618680715561,
-0.009647526778280735,
0.08590596914291382,
0.10911770164966583,
-0.048116106539964676,
0.026358945295214653,
-0.007169111166149378,
0.0028027610387653112,
-0.06089426949620247,
0.15641961991786957,
-0.15368321537971497,
0.12483742833137512,
0.11084693670272827,
-0.06691168248653412,
-0.04426982253789902,
-0.011015296913683414,
0.053557299077510834,
-0.0565800666809082,
0.09479445964097977,
-0.009766473434865475,
-0.16455169022083282,
0.025004718452692032,
-0.12081698328256607,
0.07260458171367645,
-0.25856637954711914,
-0.04204718768596649,
-0.04766065999865532,
-0.009030253626406193,
0.003837834345176816,
0.11071062088012695,
0.09084881842136383,
-0.04431910067796707,
-0.008647920563817024,
-0.04329057037830353,
0.008343820460140705,
0.08305724710226059,
-0.08236958831548691,
-0.025888845324516296
] |
null | null |
transformers
|
# Wav2Vec2-large-VoxPopuli-V2
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) large model pretrained only on **west_germanic** audio (**66.3** unlabeled data) from the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
The model is pretrained on 16 kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16 kHz.
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data in **west_germanic**; a small feature-extraction sketch follows below. Check out [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for a more detailed explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/).
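Even without a tokenizer, the checkpoint can already be used as a speech feature encoder. The following is a small illustrative sketch, not an official usage example: `waveform` is a placeholder for a mono 16 kHz recording, and the all-zeros array merely stands in for real audio.
```python
import torch
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

model_id = "facebook/wav2vec2-large-west_germanic-voxpopuli-v2"

# The pretraining recipe expects raw 16 kHz waveforms, normalized to zero mean / unit variance.
feature_extractor = Wav2Vec2FeatureExtractor(
    feature_size=1, sampling_rate=16_000, padding_value=0.0, do_normalize=True
)
model = Wav2Vec2Model.from_pretrained(model_id)
model.eval()

# `waveform` is a placeholder: one second of silence instead of a real recording.
waveform = torch.zeros(16_000).numpy()
inputs = feature_extractor(waveform, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    hidden_states = model(**inputs).last_hidden_state  # (batch, frames, 1024)
```
The resulting frame-level representations can be probed or pooled for downstream tasks until a fine-tuned ASR head is available.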
|
{"language": "west_germanic", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli-v2"], "datasets": ["voxpopuli"], "inference": false}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-west_germanic-voxpopuli-v2
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli-v2",
"dataset:voxpopuli",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2101.00390"
] |
[
"west_germanic"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
|
# Wav2Vec2-large-VoxPopuli-V2
Facebook's Wav2Vec2 large model pretrained only on west_germanic audio (66.3 unlabeled data) from the VoxPopuli corpus.
The model is pretrained on 16 kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16 kHz.
Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in west_germanic. Check out this blog for a more detailed explanation of how to fine-tune the model.
Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation*
Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.
See the official website for more information, here.
|
[
"# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in west_germanic on 66.3 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in west_germanic. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n",
"# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in west_germanic on 66.3 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in west_germanic. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
70,
259
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli-v2 #dataset-voxpopuli #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-large-VoxPopuli-V2\n\nFacebook's Wav2Vec2 large model pretrained only in west_germanic on 66.3 unlabeled datat of the VoxPopuli corpus.\n\nThe model is pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data in west_germanic. Check out this blog for a more in-detail explanation of how to fine-tune the model. \n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*.\n\nSee the official website for more information, here."
] |
[
-0.10055942088365555,
0.10846403241157532,
-0.0028996195178478956,
0.009561765938997269,
0.07908529788255692,
-0.05772261694073677,
0.13543981313705444,
0.021322252228856087,
0.04986365884542465,
0.10018405318260193,
-0.04566335305571556,
-0.07889857143163681,
0.0736563578248024,
0.10486911982297897,
0.043743472546339035,
-0.22985835373401642,
0.04683598875999451,
-0.08182681351900101,
0.04357025399804115,
0.04501890018582344,
0.10840938985347748,
-0.09098195284605026,
0.034836504608392715,
0.05633452162146568,
-0.017465291544795036,
0.0354093536734581,
-0.03703007102012634,
-0.082724928855896,
0.044102687388658524,
0.04125240072607994,
-0.011894788593053818,
0.024597052484750748,
0.07833389192819595,
-0.1627790629863739,
0.032443806529045105,
0.0284563098102808,
0.011464983224868774,
-0.006846443749964237,
0.13035352528095245,
-0.0289467703551054,
0.17508786916732788,
-0.014505982398986816,
-0.009033946320414543,
0.099030502140522,
-0.07375254482030869,
-0.09048540145158768,
-0.08530038595199585,
0.18184062838554382,
0.09188250452280045,
0.10809297859668732,
-0.08549953997135162,
0.04628373309969902,
-0.03751026839017868,
0.04852110520005226,
0.05191200599074364,
-0.18843577802181244,
-0.027287354692816734,
0.06869777292013168,
0.09791141003370285,
0.024913042783737183,
-0.11218316853046417,
0.06885920464992523,
0.040668584406375885,
-0.009718310087919235,
-0.06769047677516937,
-0.03676370531320572,
0.1077386662364006,
-0.0872519239783287,
-0.12085811793804169,
-0.011048532091081142,
0.19143958389759064,
0.04711835831403732,
-0.07917603850364685,
-0.13204611837863922,
0.017139364033937454,
0.18540656566619873,
-0.04453737661242485,
-0.06699008494615555,
-0.0035297309514135122,
0.03804337605834007,
0.0462537482380867,
-0.04899619519710541,
-0.07084843516349792,
-0.009834319353103638,
-0.0017136838287115097,
0.11020559817552567,
0.0052452730014920235,
-0.01715322956442833,
-0.06753891706466675,
0.007381107658147812,
-0.10481434315443039,
-0.13313397765159607,
-0.0206003338098526,
-0.0697285383939743,
-0.06799732148647308,
-0.039519309997558594,
-0.016520680859684944,
-0.12007580697536469,
0.01715664751827717,
0.10903174430131912,
0.0559583455324173,
0.04352274537086487,
-0.04665448144078255,
-0.023599794134497643,
0.12124070525169373,
0.06963605433702469,
-0.08886721730232239,
-0.031791605055332184,
-0.0020839343778789043,
-0.03958619758486748,
0.00931018777191639,
-0.03036048263311386,
-0.023791514337062836,
0.006388348992913961,
-0.03401292487978935,
0.05242697522044182,
0.041086263954639435,
-0.025541730225086212,
-0.04967369884252548,
-0.0865211933851242,
0.11458566039800644,
-0.09474293887615204,
0.02819344587624073,
0.06856979429721832,
-0.0020360718481242657,
0.11630336940288544,
-0.045741595327854156,
0.07334733754396439,
-0.11513152718544006,
0.025439370423555374,
-0.03617971017956734,
0.002272679703310132,
0.018109802156686783,
-0.035892143845558167,
0.04685089364647865,
0.00827434565871954,
0.0014576069079339504,
-0.12823499739170074,
-0.023742232471704483,
-0.10083546489477158,
-0.0368935652077198,
-0.05537667125463486,
-0.0014755093725398183,
-0.04176200553774834,
0.020552020519971848,
0.00881979800760746,
-0.007899019867181778,
-0.02630508691072464,
-0.03075973130762577,
0.0005595137481577694,
-0.0015301128150895238,
0.04812074452638626,
0.06887652724981308,
0.08334574103355408,
-0.04248684272170067,
-0.02184882201254368,
-0.11439091712236404,
0.1271718442440033,
-0.09580566734075546,
0.007328616920858622,
-0.13265138864517212,
-0.003456723876297474,
-0.027830448001623154,
0.04218399524688721,
0.019833015277981758,
0.15214702486991882,
-0.16762183606624603,
-0.08280825614929199,
0.15964831411838531,
-0.13934269547462463,
-0.002951329108327627,
0.1727449744939804,
0.007091355975717306,
0.041929468512535095,
0.11723864078521729,
0.2142491340637207,
0.028934037312865257,
-0.20101486146450043,
-0.05116597190499306,
-0.08005286753177643,
0.033452413976192474,
0.11145642399787903,
0.06700499355792999,
-0.06620405614376068,
0.06704842299222946,
-0.02363465540111065,
-0.027239877730607986,
-0.059997353702783585,
0.009719735942780972,
-0.04924425855278969,
0.012769089080393314,
-0.042746495455503464,
0.03939132019877434,
-0.01738479547202587,
-0.02438928373157978,
-0.030176017433404922,
-0.08970469981431961,
-0.04080849885940552,
0.11662561446428299,
-0.06936474144458771,
0.024484090507030487,
-0.08246297389268875,
0.05212919041514397,
0.06815210729837418,
0.02048354037106037,
-0.13039228320121765,
0.09965433925390244,
0.021064262837171555,
-0.05466258525848389,
0.13505256175994873,
0.07783778756856918,
-0.014502570033073425,
0.005548128392547369,
-0.014974679797887802,
0.004997986368834972,
-0.03548039123415947,
-0.0003137645835522562,
-0.024830497801303864,
-0.1070311963558197,
0.013213032856583595,
-0.05855410173535347,
0.08826807141304016,
-0.15644609928131104,
-0.004033462610095739,
0.08588173985481262,
0.11939717084169388,
-0.0034491028636693954,
-0.04149189963936806,
0.0794631838798523,
0.04322509095072746,
0.016635864973068237,
-0.00018163191271014512,
0.011578799225389957,
-0.02091100439429283,
-0.015018870122730732,
0.11062078922986984,
-0.15017996728420258,
-0.16731786727905273,
0.11571265757083893,
0.0016677515814080834,
-0.008854168467223644,
0.030803995206952095,
0.02578488551080227,
-0.02212803065776825,
-0.04752360284328461,
-0.03046114556491375,
0.21484161913394928,
-0.012573574669659138,
0.06644601374864578,
-0.0828726515173912,
-0.015756826847791672,
0.010535682551562786,
-0.04808569699525833,
-0.08610869199037552,
0.0998769998550415,
-0.020938336849212646,
-0.049075495451688766,
-0.0009021605947054923,
0.11235560476779938,
0.06834271550178528,
0.1885237991809845,
-0.0036877791862934828,
-0.10260061174631119,
-0.01408484485000372,
-0.0588858537375927,
-0.003842555684968829,
0.03755732253193855,
-0.14368630945682526,
-0.02796526439487934,
0.03130596503615379,
0.031148847192525864,
0.04021942615509033,
-0.028844166547060013,
0.03521887585520744,
0.004850547760725021,
-0.042849015444517136,
-0.08867691457271576,
0.034831564873456955,
-0.032222554087638855,
0.03330685943365097,
-0.02323915809392929,
0.03870011121034622,
-0.04487147554755211,
-0.054941147565841675,
-0.14818930625915527,
0.08825752884149551,
-0.07750873267650604,
-0.29108577966690063,
-0.11188027262687683,
-0.07263719290494919,
-0.046427398920059204,
0.025097861886024475,
0.06268387287855148,
-0.1051308810710907,
-0.123069629073143,
-0.06206300109624863,
0.15530335903167725,
-0.011701117269694805,
-0.07298719882965088,
0.10552002489566803,
0.005234967451542616,
0.0122351860627532,
-0.09021355956792831,
0.008708116598427296,
-0.019680039957165718,
-0.046874307096004486,
-0.015057140029966831,
0.02086617425084114,
0.07480748742818832,
0.13231982290744781,
0.0469716414809227,
-0.02101193182170391,
0.013976526446640491,
0.208998441696167,
-0.14743787050247192,
0.0709623172879219,
0.23785080015659332,
-0.0621897391974926,
-0.00015009487105999142,
0.1422273963689804,
-0.0033621920738369226,
-0.04953715577721596,
0.049326952546834946,
0.0068671307526528835,
-0.019338062033057213,
-0.25778064131736755,
-0.13042688369750977,
-0.06344851851463318,
-0.0025837935972958803,
0.04565596953034401,
0.02286396734416485,
0.021598950028419495,
0.017739754170179367,
-0.10637301951646805,
-0.04565489664673805,
0.05561857298016548,
0.04969066008925438,
0.1513112336397171,
0.013499640859663486,
0.06409620493650436,
-0.06031416729092598,
-0.00033214373979717493,
0.1163274496793747,
-0.013734961859881878,
0.07083921134471893,
0.09260175377130508,
0.11671612411737442,
0.07192201912403107,
0.013960406184196472,
0.06504634767770767,
-0.013093040324747562,
-0.0032100470270961523,
-0.013888916000723839,
-0.014559425413608551,
-0.08497356623411179,
0.019684992730617523,
0.062277618795633316,
0.10675295442342758,
-0.13271784782409668,
-0.1330120861530304,
0.0011616443516686559,
0.04454532638192177,
0.10921676456928253,
0.10023393481969833,
-0.006864956580102444,
-0.1290965974330902,
0.05073479935526848,
-0.10019008815288544,
-0.019506938755512238,
0.04534631595015526,
0.09719820320606232,
-0.1517857313156128,
0.09179456532001495,
0.08725465089082718,
0.10920781642198563,
-0.02985512651503086,
0.015562532469630241,
-0.09027405828237534,
0.05395286902785301,
-0.0014005262637510896,
0.0607881061732769,
-0.18987871706485748,
0.10780684649944305,
0.02690901793539524,
0.09386277198791504,
-0.06332936882972717,
0.013970797881484032,
0.06023633852601051,
0.022686777636408806,
0.12371805310249329,
0.008250981569290161,
-0.12873061001300812,
-0.003288071136921644,
-0.10549050569534302,
0.003908724989742041,
0.06427691876888275,
-0.06970219314098358,
0.06820519268512726,
0.00463792122900486,
-0.004137932788580656,
-0.04665553569793701,
-0.014670750126242638,
-0.24381758272647858,
-0.1288306713104248,
0.03280758857727051,
0.0073857614770531654,
0.07635662704706192,
-0.03223065286874771,
-0.07593517750501633,
-0.15321452915668488,
0.08179765194654465,
-0.057703688740730286,
-0.02217438817024231,
-0.09300095587968826,
-0.013753793202340603,
0.06898374110460281,
-0.0753185823559761,
0.021795472130179405,
0.044552434235811234,
0.13144829869270325,
-0.07834683358669281,
-0.04770035296678543,
0.040560636669397354,
-0.10785485804080963,
-0.13989754021167755,
0.003332774620503187,
0.18183551728725433,
0.14107109606266022,
0.061998892575502396,
0.07331008464097977,
0.02548450045287609,
-0.0007262584986165166,
-0.10070711374282837,
0.02603212371468544,
0.043799564242362976,
-0.04364892467856407,
0.06378497183322906,
-0.014199107885360718,
-0.29281264543533325,
-0.13875871896743774,
-0.07618283480405807,
0.06902124732732773,
0.1701488345861435,
-0.011355943977832794,
0.18058432638645172,
0.28146523237228394,
-0.08345358073711395,
-0.26536157727241516,
-0.019010089337825775,
0.0013270598137751222,
0.03820621594786644,
0.06431920826435089,
-0.21021965146064758,
0.12436439841985703,
0.013874964788556099,
0.010861357674002647,
-0.011655312031507492,
-0.1996770203113556,
-0.14156407117843628,
0.13215211033821106,
-0.024830281734466553,
0.036631859838962555,
-0.046496931463479996,
-0.07508973777294159,
-0.013221348635852337,
-0.13094711303710938,
0.015734102576971054,
-0.08170576393604279,
0.08301842212677002,
0.05787167325615883,
0.048290517181158066,
0.020378878340125084,
0.015116182155907154,
0.11596551537513733,
0.11157497018575668,
-0.01302731316536665,
-0.08636422455310822,
0.05103667452931404,
0.04920965060591698,
-0.01751684583723545,
0.09526114165782928,
-0.0019632738549262285,
0.0019777773413807154,
-0.06315082311630249,
-0.0862564817070961,
-0.06376200914382935,
0.07666074484586716,
-0.06776159256696701,
-0.01936737634241581,
-0.05555013567209244,
0.08585462719202042,
0.01470548752695322,
-0.010236591100692749,
-0.04126383364200592,
-0.10699916630983353,
-0.043612729758024216,
0.08944498002529144,
0.2142111361026764,
-0.02736721746623516,
-0.010282892733812332,
-0.05744529515504837,
-0.054315630346536636,
0.06355564296245575,
-0.06990708410739899,
0.0630783960223198,
0.06227123364806175,
0.028714269399642944,
0.0922648087143898,
-0.03296002000570297,
-0.1307058036327362,
0.03279513120651245,
0.0487477146089077,
-0.07814298570156097,
-0.16286543011665344,
-0.053489260375499725,
0.00862294901162386,
-0.005361660849303007,
-0.010770128108561039,
0.1922033280134201,
-0.01864289864897728,
-0.04557088017463684,
0.0008172858506441116,
0.0496629923582077,
-0.01147508341819048,
0.10621265321969986,
0.034646227955818176,
0.041604433208703995,
-0.08946244418621063,
0.0690198764204979,
0.10894905030727386,
-0.08757846057415009,
0.05346260592341423,
0.11137880384922028,
-0.04849709942936897,
-0.060994166880846024,
-0.13644571602344513,
0.028947308659553528,
0.027768125757575035,
-0.0584760457277298,
0.012166468426585197,
-0.1166432723402977,
0.018992438912391663,
0.03946651518344879,
0.01530543714761734,
-0.03906160593032837,
-0.027437882497906685,
0.00222172774374485,
-0.07435260713100433,
0.07153475284576416,
0.09173212945461273,
-0.03546793386340141,
-0.0996364876627922,
0.10065673291683197,
0.008702069520950317,
0.06399400532245636,
-0.03260407969355583,
-0.07540100812911987,
-0.09812209010124207,
-0.00971240270882845,
-0.12288058549165726,
0.01815309002995491,
-0.14861558377742767,
-0.008669121190905571,
-0.061367928981781006,
-0.03744008392095566,
-0.01610851287841797,
0.07825864851474762,
-0.03885854408144951,
0.0015445284079760313,
-0.034513186663389206,
0.10052464157342911,
-0.13266681134700775,
0.06277532130479813,
0.06656618416309357,
-0.050575096160173416,
0.09987536072731018,
0.05249859020113945,
-0.04557548463344574,
0.05047763139009476,
-0.21866297721862793,
-0.041091080754995346,
-0.028578869998455048,
0.04998388513922691,
-0.008157155476510525,
-0.13701914250850677,
-0.006111833266913891,
0.023582976311445236,
0.019345704466104507,
-0.017313584685325623,
0.033832941204309464,
-0.030522890388965607,
-0.017432762309908867,
-0.039504460990428925,
-0.0659051313996315,
-0.04527046158909798,
0.07396291196346283,
0.08407595008611679,
0.021285029128193855,
0.11316639930009842,
-0.08560255914926529,
0.07649075984954834,
-0.07105422019958496,
0.02834337204694748,
-0.015514461323618889,
0.010982484556734562,
-0.07541952282190323,
-0.07785367220640182,
0.06385184824466705,
-0.006025026552379131,
0.07988733053207397,
0.002141897799447179,
-0.03495778143405914,
0.05995812267065048,
-0.021588506177067757,
-0.059399642050266266,
0.03826387599110603,
0.16024281084537506,
0.04802867770195007,
0.012375866062939167,
-0.0148261534050107,
-0.03168385475873947,
0.009509215131402016,
0.13291288912296295,
0.14907877147197723,
0.17432843148708344,
0.06671475619077682,
0.03896189108490944,
0.05810810253024101,
-0.0335986353456974,
-0.11207623034715652,
0.036444634199142456,
-0.09169486165046692,
0.021863512694835663,
-0.06538818776607513,
-0.027031060308218002,
0.08106347173452377,
-0.14729684591293335,
0.07863039523363113,
-0.018535256385803223,
-0.07495064288377762,
-0.09012708812952042,
-0.1554635465145111,
-0.05433451384305954,
-0.041801974177360535,
0.005444241687655449,
-0.10536067932844162,
0.043288882821798325,
-0.01335665863007307,
0.03881155326962471,
-0.10414959490299225,
0.11533505469560623,
-0.13620319962501526,
-0.12465862184762955,
0.14332371950149536,
-0.042839862406253815,
-0.0037398997228592634,
0.02272280864417553,
0.04417469725012779,
0.025332167744636536,
0.08315151929855347,
0.03953639790415764,
0.06046347692608833,
0.018325941637158394,
0.011790405958890915,
-0.11002051830291748,
-0.07608278840780258,
0.027200767770409584,
0.003320375457406044,
0.08263912796974182,
0.19498272240161896,
0.09128217399120331,
-0.05850543826818466,
0.016098080202937126,
0.14535389840602875,
0.023270010948181152,
-0.10307531803846359,
-0.15618960559368134,
0.017634673044085503,
-0.025797221809625626,
0.003758481238037348,
-0.0059296367689967155,
-0.11618798971176147,
0.02315545827150345,
0.22028976678848267,
0.13730673491954803,
-0.04035820811986923,
0.032117657363414764,
-0.03046208992600441,
0.014768528752028942,
0.03981771692633629,
0.06792250275611877,
0.06770094484090805,
0.21656940877437592,
0.007897301577031612,
0.03817356750369072,
-0.02846311219036579,
-0.10439890623092651,
-0.10612714290618896,
0.10881905257701874,
-0.029245268553495407,
-0.056087613105773926,
-0.01901247166097164,
0.1802521049976349,
-0.1083642914891243,
-0.16655142605304718,
-0.10812663286924362,
-0.03443918377161026,
-0.0986436977982521,
0.009935774840414524,
-0.024101875722408295,
0.1533859819173813,
0.03632134199142456,
0.013600826263427734,
-0.0009083829936571419,
0.1956881433725357,
0.04122711718082428,
0.024069897830486298,
-0.020484374836087227,
0.09515762329101562,
-0.07863211631774902,
0.09467897564172745,
0.0035483345855027437,
0.048994649201631546,
0.04156693443655968,
0.05991719290614128,
-0.06545199453830719,
0.02138536050915718,
0.03823677450418472,
-0.043893635272979736,
0.02357441745698452,
0.17779022455215454,
-0.019249770790338516,
0.07708805799484253,
0.1107076108455658,
-0.078536756336689,
0.02660604752600193,
0.009448434226214886,
0.0020855586044490337,
-0.06627315282821655,
0.16319116950035095,
-0.13557347655296326,
0.12459689378738403,
0.1024610623717308,
-0.061933282762765884,
-0.042909495532512665,
-0.02012776955962181,
0.05659595504403114,
-0.051278650760650635,
0.06326460838317871,
-0.006086008157581091,
-0.1776876300573349,
0.029533345252275467,
-0.08396710455417633,
0.065625861287117,
-0.2527446448802948,
-0.042811695486307144,
-0.04789385199546814,
-0.0007678398978896439,
-0.02326199784874916,
0.12326769530773163,
0.09969455748796463,
-0.05700601637363434,
0.0019396230345591903,
-0.0590878389775753,
0.03534600883722305,
0.09845863282680511,
-0.060405611991882324,
-0.03284122422337532
] |
null | null |
transformers
|
## Evaluation on Common Voice NL Test
```python
import re

import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_name = "facebook/wav2vec2-large-xlsr-53-dutch"
device = "cuda"

# Punctuation stripped from the reference transcriptions before scoring.
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"]'  # noqa: W605

model = Wav2Vec2ForCTC.from_pretrained(model_name).to(device)
processor = Wav2Vec2Processor.from_pretrained(model_name)

ds = load_dataset("common_voice", "nl", split="test", data_dir="./cv-corpus-6.1-2020-12-11")

# Common Voice audio is 48 kHz; the model expects 16 kHz input.
resampler = torchaudio.transforms.Resample(orig_freq=48_000, new_freq=16_000)

def map_to_array(batch):
    # Load and resample the clip, and normalize the reference sentence.
    speech, _ = torchaudio.load(batch["path"])
    batch["speech"] = resampler.forward(speech.squeeze(0)).numpy()
    batch["sampling_rate"] = resampler.new_freq
    batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower().replace("’", "'")
    return batch

ds = ds.map(map_to_array)

def map_to_pred(batch):
    features = processor(batch["speech"], sampling_rate=batch["sampling_rate"][0], padding=True, return_tensors="pt")
    input_values = features.input_values.to(device)
    attention_mask = features.attention_mask.to(device)
    with torch.no_grad():
        logits = model(input_values, attention_mask=attention_mask).logits
    # Greedy CTC decoding.
    pred_ids = torch.argmax(logits, dim=-1)
    batch["predicted"] = processor.batch_decode(pred_ids)
    batch["target"] = batch["sentence"]
    return batch

result = ds.map(map_to_pred, batched=True, batch_size=16, remove_columns=list(ds.features.keys()))

wer = load_metric("wer")
print(wer.compute(predictions=result["predicted"], references=result["target"]))
```
**Result**: 21.1 %
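
As a hedged addition (not part of the original card), the same `result` object can also be scored with character error rate, which can complement WER for Dutch. The sketch below assumes a `jiwer` release recent enough to expose `cer` alongside `wer`:

```python
# Sketch: CER alongside WER, assuming jiwer >= 2.2 provides `cer`.
from jiwer import wer as jiwer_wer, cer as jiwer_cer

predictions = result["predicted"]
references = result["target"]

print("WER:", jiwer_wer(references, predictions))  # should roughly match the datasets metric above
print("CER:", jiwer_cer(references, predictions))
```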
|
{"language": "nl", "license": "apache-2.0", "tags": ["speech", "audio", "automatic-speech-recognition"], "datasets": ["common_voice"]}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-xlsr-53-dutch
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"speech",
"audio",
"nl",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"nl"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #nl #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us
|
## Evaluation on Common Voice NL Test
Result: 21.1 %
|
[
"## Evaluation on Common Voice NL Test\n\n\n\nResult: 21.1 %"
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #nl #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n",
"## Evaluation on Common Voice NL Test\n\n\n\nResult: 21.1 %"
] |
[
65,
14
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #nl #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n## Evaluation on Common Voice NL Test\n\n\n\nResult: 21.1 %"
] |
[
-0.15278781950473785,
0.09730726480484009,
-0.003582725767046213,
-0.08663743734359741,
-0.03407006710767746,
-0.06276139616966248,
0.07327625900506973,
0.11807696521282196,
0.03449227288365364,
0.04240690544247627,
0.04329346865415573,
0.12225348502397537,
-0.006924822926521301,
-0.11025644838809967,
-0.07211896777153015,
-0.08836159855127335,
0.0580136701464653,
0.0904899537563324,
0.09593435376882553,
0.07921431213617325,
0.0810670405626297,
-0.009918224066495895,
-0.029153315350413322,
0.10858999937772751,
-0.030009519308805466,
0.03976946696639061,
0.08917101472616196,
-0.13112251460552216,
0.15089216828346252,
0.04580434784293175,
-0.016245434060692787,
0.02457619644701481,
-0.0003953156410716474,
-0.25096115469932556,
0.00008523985161446035,
-0.03696000948548317,
0.07391511648893356,
0.0009295042837038636,
0.026327813044190407,
-0.03735227510333061,
0.08939610421657562,
0.1459418088197708,
-0.03785831853747368,
0.11127033084630966,
0.03365343064069748,
-0.23651662468910217,
-0.03311651945114136,
0.018256468698382378,
0.06662006676197052,
0.06769794225692749,
-0.053734686225652695,
0.13148917257785797,
-0.14912833273410797,
0.11422700434923172,
0.11283461004495621,
-0.2783283293247223,
0.01944386586546898,
-0.0062745120376348495,
0.05910590663552284,
-0.0689321905374527,
-0.03400466963648796,
0.0629749447107315,
0.10716124624013901,
0.05562014877796173,
-0.1885501742362976,
-0.042911507189273834,
-0.19826672971248627,
-0.005502347368746996,
-0.058179475367069244,
-0.029788712039589882,
0.23609285056591034,
0.0558386892080307,
-0.04393766075372696,
-0.09704884886741638,
0.04165349528193474,
-0.021767567843198776,
0.0233369879424572,
-0.06862793862819672,
-0.01484103687107563,
0.02615644596517086,
-0.08835852146148682,
0.044577158987522125,
-0.08793500810861588,
-0.10964855551719666,
-0.11640697717666626,
0.1468510776758194,
0.037521298974752426,
0.01643620990216732,
-0.019292375072836876,
0.02928316406905651,
-0.07336196303367615,
-0.0506318062543869,
-0.06012220308184624,
0.05761750414967537,
0.014356044121086597,
0.004571389872580767,
-0.04088904336094856,
-0.10851997882127762,
0.1639864295721054,
0.063858263194561,
0.08041395246982574,
0.02641948126256466,
-0.1534566879272461,
0.0615851916372776,
0.008586724288761616,
0.140497088432312,
-0.012819573283195496,
-0.0022187361028045416,
0.05388380587100983,
-0.04115675017237663,
0.10215145349502563,
0.028505435213446617,
-0.022037848830223083,
-0.04129965230822563,
0.10462194681167603,
0.0975428894162178,
-0.039134640246629715,
-0.091416135430336,
-0.04305929318070412,
0.015333372168242931,
-0.04879368469119072,
-0.09053066372871399,
0.005663453601300716,
0.04093269258737564,
0.05196836590766907,
0.14856331050395966,
0.01583140715956688,
0.04573744907975197,
-0.06007993593811989,
0.001850857399404049,
0.06770740449428558,
0.005646361503750086,
0.08642245829105377,
-0.003917829133570194,
0.10633336752653122,
0.015301132574677467,
0.04166685789823532,
-0.05479339882731438,
-0.02704213559627533,
-0.05347286909818649,
-0.029191287234425545,
0.006455911323428154,
-0.14624792337417603,
-0.09546208381652832,
-0.04736091196537018,
0.01672186888754368,
-0.10780835151672363,
-0.018323302268981934,
-0.05877385661005974,
0.09481591731309891,
0.08248970657587051,
-0.013061324134469032,
-0.021814530715346336,
0.047225356101989746,
-0.05123307928442955,
-0.005359163973480463,
0.08827660232782364,
0.06191246584057808,
-0.08432509750127792,
-0.0300733782351017,
-0.07182422280311584,
-0.11105149239301682,
-0.0554535873234272,
0.04240837320685387,
0.01303824596107006,
0.05646810680627823,
-0.13606025278568268,
-0.1003866046667099,
0.0795787125825882,
-0.10474740713834763,
-0.07259315252304077,
0.21332678198814392,
0.04323720559477806,
-0.03550092130899429,
0.15088145434856415,
0.2794983685016632,
0.07018216699361801,
-0.15291337668895721,
-0.04916944354772568,
0.07591469585895538,
0.02062453329563141,
-0.10137670487165451,
0.05153987929224968,
-0.07328839600086212,
-0.09210408478975296,
0.03452427685260773,
0.06630681455135345,
0.06424576789140701,
-0.04035194590687752,
-0.05069142207503319,
-0.04086797311902046,
-0.08939523994922638,
-0.019023437052965164,
0.055477458983659744,
0.0024382651317864656,
-0.12948252260684967,
-0.06872204691171646,
-0.008372582495212555,
0.1177537739276886,
-0.10729219019412994,
0.06654120236635208,
-0.1257580667734146,
0.13661281764507294,
-0.13340939581394196,
-0.07316013425588608,
-0.08161625266075134,
0.20168161392211914,
-0.0170558113604784,
-0.032405462116003036,
0.12051226198673248,
0.0867396891117096,
0.02192351222038269,
-0.09816737473011017,
-0.038167454302310944,
-0.002678769873455167,
0.11093313992023468,
0.005564986728131771,
0.0030392243061214685,
-0.1490584760904312,
0.06997812539339066,
-0.038536857813596725,
-0.006348209921270609,
-0.016323933377861977,
-0.08001308888196945,
0.03713598847389221,
0.10277470201253891,
-0.05578724294900894,
0.03138379752635956,
0.05939042568206787,
0.051406290382146835,
-0.031793102622032166,
0.06333059072494507,
0.010211101733148098,
0.012774398550391197,
-0.11760518699884415,
0.21158769726753235,
-0.06589958816766739,
0.10082855820655823,
0.11838927865028381,
-0.13565205037593842,
0.07576745748519897,
0.1658993363380432,
-0.01618110202252865,
-0.012371018528938293,
-0.03714969754219055,
-0.029034465551376343,
0.2358412891626358,
-0.036507997661828995,
0.08918730914592743,
-0.11014539748430252,
-0.06026136502623558,
0.0007545711123384535,
-0.06700383871793747,
-0.005485072731971741,
0.11109903454780579,
-0.023892879486083984,
-0.04795655980706215,
0.03591448813676834,
0.048167284578084946,
-0.18108855187892914,
0.12440543621778488,
-0.09932835400104523,
-0.05597091093659401,
0.03223839029669762,
0.019417928531765938,
-0.0341406911611557,
0.08802252262830734,
-0.19402360916137695,
0.018282197415828705,
0.046791743487119675,
0.05836424604058266,
0.09918057918548584,
-0.18345673382282257,
0.02449282445013523,
-0.019656674936413765,
-0.12194088846445084,
-0.10354997217655182,
0.10291782021522522,
-0.01848270185291767,
0.038310062140226364,
-0.13653439283370972,
-0.1816483587026596,
0.038398828357458115,
-0.051841046661138535,
-0.1319705843925476,
0.03709051385521889,
-0.04259183630347252,
-0.17381970584392548,
-0.11545320600271225,
-0.0006999819306656718,
-0.05843396857380867,
-0.02798151969909668,
0.1296708732843399,
-0.11120681464672089,
-0.019031276926398277,
-0.021369721740484238,
-0.030781332403421402,
-0.06175258383154869,
0.016829291358590126,
-0.025365136563777924,
0.01083530206233263,
0.06815474480390549,
-0.16146284341812134,
-0.0288362018764019,
-0.037246134132146835,
0.007920198142528534,
-0.005522750783711672,
-0.0306599922478199,
0.013180126436054707,
0.2064610719680786,
0.08212114125490189,
-0.0042005483992397785,
-0.02298184297978878,
0.09669797122478485,
-0.08250436186790466,
-0.1251603215932846,
0.1242288202047348,
-0.012685955502092838,
-0.0752040296792984,
0.13333207368850708,
0.017283882945775986,
-0.04977080971002579,
-0.06859133392572403,
0.014977061189711094,
-0.028120655566453934,
-0.34263095259666443,
-0.1195240244269371,
-0.07045652717351913,
-0.0032913729082792997,
-0.07871050387620926,
0.053103115409612656,
0.11762213706970215,
-0.06798877567052841,
0.004460994154214859,
-0.11072108149528503,
0.009470495395362377,
-0.026359308511018753,
0.3267207741737366,
-0.038349997252225876,
0.09153775870800018,
-0.08403250575065613,
-0.06962788850069046,
0.02401324361562729,
0.04125722870230675,
0.06335390359163284,
0.10832864791154861,
0.08023332059383392,
0.025386543944478035,
0.13227543234825134,
0.08280894160270691,
0.012883506715297699,
0.03783946484327316,
0.006161727011203766,
0.04034806787967682,
-0.01998676173388958,
-0.016612207517027855,
0.03247402235865593,
0.23372477293014526,
0.04937364533543587,
-0.019475623965263367,
-0.14120832085609436,
0.03882845491170883,
0.1728757619857788,
0.17784383893013,
-0.09191691875457764,
-0.019863931462168694,
0.015705343335866928,
-0.17478801310062408,
0.0011046048020944,
0.1353948563337326,
0.026398641988635063,
-0.03924392908811569,
0.13654184341430664,
0.04671630263328552,
0.07262270897626877,
-0.006056071724742651,
0.024923522025346756,
-0.10737540572881699,
-0.07918155938386917,
0.0576011948287487,
0.10077623277902603,
-0.21312129497528076,
0.21305517852306366,
0.02048877626657486,
0.10405979305505753,
-0.004436103161424398,
0.007440070156008005,
0.006439538672566414,
0.02594422921538353,
0.11263377219438553,
-0.026333367452025414,
-0.09523618221282959,
0.015344709157943726,
-0.07530418038368225,
0.1043829396367073,
0.0030165263451635838,
0.18287406861782074,
-0.03927800804376602,
0.021264573559165,
-0.01856582798063755,
0.020023318007588387,
-0.152406707406044,
-0.15507523715496063,
0.018991105258464813,
-0.022458568215370178,
0.26876553893089294,
0.09705174714326859,
-0.026566462591290474,
-0.10584407299757004,
-0.20472335815429688,
0.13401491940021515,
-0.0934697613120079,
0.0059017944149672985,
0.009455233812332153,
-0.09404107928276062,
0.1843412071466446,
-0.015219053253531456,
-0.059267595410346985,
0.008120105601847172,
-0.02116807922720909,
-0.0015146484365686774,
-0.04201774299144745,
0.03917446732521057,
-0.07519881427288055,
-0.07654116302728653,
-0.01857515424489975,
0.3489355146884918,
-0.026623625308275223,
0.13431981205940247,
0.06328827887773514,
-0.0002866709546651691,
0.011456213891506195,
-0.02545219101011753,
0.03951689228415489,
0.06470252573490143,
-0.22326499223709106,
0.03380202874541283,
0.060400232672691345,
-0.08784400671720505,
-0.12054596096277237,
-0.043106816709041595,
0.2461029589176178,
0.10111969709396362,
-0.02362718991935253,
0.11300178617238998,
0.18945951759815216,
-0.081475168466568,
-0.1743738055229187,
-0.11840666085481644,
0.013472196646034718,
0.09709284454584122,
-0.10423526167869568,
-0.040294352918863297,
0.16320322453975677,
0.012056591920554638,
-0.09122508019208908,
0.013858874328434467,
-0.18585222959518433,
-0.10112571716308594,
0.24631346762180328,
-0.14652061462402344,
0.22269301116466522,
-0.02738090045750141,
-0.07088067382574081,
-0.01918172836303711,
-0.03309451416134834,
-0.01904807612299919,
-0.0019327932968735695,
0.07614115625619888,
0.013567198067903519,
0.1119130402803421,
0.050826262682676315,
-0.016323594376444817,
0.134155735373497,
0.027614202350378036,
-0.0504964180290699,
0.04612194001674652,
0.11221667379140854,
-0.10173783451318741,
0.05680157616734505,
0.12069644778966904,
-0.16959089040756226,
0.03222618252038956,
-0.016587091609835625,
-0.06668586283922195,
-0.06780728697776794,
0.11831291764974594,
0.07056786119937897,
0.018513372167944908,
-0.06434904038906097,
-0.09306581318378448,
-0.01106363907456398,
0.026441218331456184,
0.17239224910736084,
-0.0961463525891304,
0.11695729941129684,
0.04569721966981888,
0.24402283132076263,
-0.16734352707862854,
-0.06318527460098267,
0.004178906325250864,
-0.11448471993207932,
0.07395375519990921,
-0.06948507577180862,
0.06718355417251587,
0.137414813041687,
0.07329824566841125,
0.07003758102655411,
0.006947087123990059,
-0.11265960335731506,
0.0580579973757267,
0.0427328497171402,
-0.017309801653027534,
-0.09979324042797089,
-0.03309762477874756,
-0.008858311921358109,
0.0037025241181254387,
0.08669522404670715,
0.13525784015655518,
-0.06561041623353958,
-0.0008078599930740893,
0.008954851888120174,
-0.02906932681798935,
-0.17614135146141052,
0.2364652007818222,
0.04296623542904854,
0.017725365236401558,
-0.12080130726099014,
0.048158496618270874,
-0.06049496307969093,
-0.10002858936786652,
0.020464716479182243,
-0.07160104811191559,
0.029180871322751045,
-0.0618462860584259,
-0.04664674028754234,
0.043074917048215866,
0.06096884608268738,
-0.1779584139585495,
-0.08469878882169724,
-0.1666930913925171,
0.04590056836605072,
0.1540258526802063,
0.0704420655965805,
0.09592795372009277,
-0.07086152583360672,
-0.0780937522649765,
-0.03771280124783516,
0.05706827715039253,
0.043739914894104004,
0.02034948766231537,
-0.15845797955989838,
0.0058166696690022945,
0.041699282824993134,
0.07867532968521118,
-0.05531829223036766,
-0.07587293535470963,
-0.0027413363568484783,
0.1142270639538765,
-0.1656811535358429,
-0.005241977050900459,
-0.0707801952958107,
0.024461163207888603,
0.0964512974023819,
-0.10254444181919098,
-0.08183814585208893,
0.061498261988162994,
-0.1025688424706459,
0.059431858360767365,
0.03098195232450962,
0.09694761037826538,
-0.06781390309333801,
0.02334495820105076,
0.059191543608903885,
-0.04919962212443352,
0.06710153073072433,
0.10709016770124435,
-0.1151047870516777,
0.15281140804290771,
-0.22447162866592407,
-0.12193453311920166,
0.16304272413253784,
0.07853050529956818,
-0.08486031740903854,
-0.16230861842632294,
0.006913866847753525,
0.2249143272638321,
0.06801385432481766,
0.02598770335316658,
-0.05906558409333229,
-0.05183430016040802,
-0.09590840339660645,
-0.12516047060489655,
-0.1016165018081665,
-0.011927952989935875,
-0.06878001242876053,
0.1293637454509735,
0.12108705937862396,
0.1588190197944641,
-0.06804920732975006,
0.005035506561398506,
-0.07312706857919693,
0.04488687217235565,
-0.08108645677566528,
-0.04385372996330261,
-0.20436842739582062,
0.010182918980717659,
0.01826453022658825,
-0.03723468258976936,
0.16896070539951324,
-0.09784014523029327,
-0.060773808509111404,
0.0008615387487225235,
-0.0136024821549654,
-0.07091236859560013,
-0.00117595330812037,
0.3045240640640259,
0.05331309512257576,
0.0012157161254435778,
-0.04586152732372284,
-0.11985939741134644,
-0.006385502405464649,
0.07573391497135162,
-0.09756990522146225,
0.13955581188201904,
0.06703611463308334,
0.11464592069387436,
0.10352414101362228,
-0.0367201566696167,
-0.01465737633407116,
0.06559760123491287,
-0.04048615321516991,
0.07943091541528702,
0.0559874027967453,
0.24922412633895874,
0.10144917666912079,
-0.010413662530481815,
0.038235876709222794,
-0.0309313777834177,
-0.04534260556101799,
-0.220989391207695,
-0.06906314939260483,
-0.13286499679088593,
-0.16163557767868042,
0.07023880630731583,
-0.04528366029262543,
0.04990926384925842,
0.014192398637533188,
0.08872954547405243,
-0.057886071503162384,
-0.009663851000368595,
-0.06683016568422318,
-0.07267562299966812,
0.10318573564291,
-0.08092522621154785,
-0.006414295639842749,
-0.08862625807523727,
0.03215327113866806,
0.1264043152332306,
0.011144112795591354,
0.007015514653176069,
-0.011037194170057774,
-0.05681479722261429,
0.02373134158551693,
-0.1604716181755066,
-0.047100163996219635,
0.010775717906653881,
-0.021198485046625137,
0.13120779395103455,
0.1543625295162201,
0.09143747389316559,
-0.03154251351952553,
0.08171367645263672,
0.055013563483953476,
-0.07165966182947159,
-0.16492211818695068,
-0.10143252462148666,
0.20642200112342834,
-0.04470852017402649,
0.020508097484707832,
-0.0022517750039696693,
-0.010303348302841187,
0.008470872417092323,
0.12312895804643631,
0.1646479368209839,
0.090682253241539,
0.024352243170142174,
-0.07808493077754974,
-0.027358701452612877,
-0.051181476563215256,
-0.11401382088661194,
0.09665174037218094,
0.20910830795764923,
-0.011538640595972538,
-0.08039616048336029,
-0.08633376657962799,
-0.015811052173376083,
0.019243380054831505,
0.022312773391604424,
-0.07354683429002762,
-0.17439107596874237,
-0.0029812785796821117,
0.14094915986061096,
-0.12769734859466553,
0.07703763246536255,
-0.08302245289087296,
-0.10051941871643066,
-0.09059033542871475,
0.030941788107156754,
0.10650505870580673,
0.11225422471761703,
-0.0790039673447609,
-0.11317387223243713,
-0.05302570387721062,
0.07915279269218445,
-0.05984145402908325,
-0.17033860087394714,
-0.007120825350284576,
0.007431281730532646,
0.032092176377773285,
0.008143081329762936,
0.07654525339603424,
0.24178077280521393,
0.015416759997606277,
0.10812946408987045,
0.006136956624686718,
0.16101650893688202,
0.034526996314525604,
-0.18719549477100372,
0.07590686529874802,
0.13264943659305573,
0.05407660827040672,
0.18044255673885345,
0.07813579589128494,
-0.046864431351423264,
0.017112916335463524,
-0.16278688609600067,
-0.06848401576280594,
-0.17753815650939941,
0.036424994468688965,
-0.04125390946865082,
0.03095216117799282,
-0.031357113271951675,
-0.05229741334915161,
-0.025922145694494247,
-0.03465423732995987,
0.022356105968356133,
0.034875378012657166,
-0.08396128565073013,
-0.0026232320815324783,
-0.2061278522014618,
0.020591914653778076,
-0.08925505727529526,
-0.05827184394001961,
-0.0985088124871254,
-0.057228218764066696,
-0.02248350717127323,
-0.0606277696788311,
-0.009952562861144543,
0.029254106804728508,
0.09317027032375336,
-0.01377334538847208,
0.019175337627530098,
-0.05623883381485939,
0.07575015723705292,
0.0434434600174427,
-0.10011120140552521,
-0.07769717276096344
] |
null | null |
transformers
|
## Evaluation on Common Voice FR Test
```python
import torchaudio
from datasets import load_dataset, load_metric
from transformers import (
Wav2Vec2ForCTC,
Wav2Vec2Processor,
)
import torch
import re
import sys
model_name = "facebook/wav2vec2-large-xlsr-53-french"
device = "cuda"
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"]' # noqa: W605
model = Wav2Vec2ForCTC.from_pretrained(model_name).to(device)
processor = Wav2Vec2Processor.from_pretrained(model_name)
ds = load_dataset("common_voice", "fr", split="test", data_dir="./cv-corpus-6.1-2020-12-11")
resampler = torchaudio.transforms.Resample(orig_freq=48_000, new_freq=16_000)
def map_to_array(batch):
speech, _ = torchaudio.load(batch["path"])
batch["speech"] = resampler.forward(speech.squeeze(0)).numpy()
batch["sampling_rate"] = resampler.new_freq
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower().replace("’", "'")
return batch
ds = ds.map(map_to_array)
def map_to_pred(batch):
features = processor(batch["speech"], sampling_rate=batch["sampling_rate"][0], padding=True, return_tensors="pt")
input_values = features.input_values.to(device)
attention_mask = features.attention_mask.to(device)
with torch.no_grad():
logits = model(input_values, attention_mask=attention_mask).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["predicted"] = processor.batch_decode(pred_ids)
batch["target"] = batch["sentence"]
return batch
result = ds.map(map_to_pred, batched=True, batch_size=16, remove_columns=list(ds.features.keys()))
wer = load_metric("wer")
print(wer.compute(predictions=result["predicted"], references=result["target"]))
```
**Result**: 25.2 %
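
Before running the full French test set, a quick dry run on a small slice catches configuration problems cheaply. This is a sketch, not part of the original card; the 1 % slice size is an arbitrary illustrative choice, and it reuses `map_to_array`, `map_to_pred`, and the `wer` metric defined above:

```python
# Sketch: sanity-check the pipeline on ~1% of the test split first.
small_ds = load_dataset(
    "common_voice", "fr",
    split="test[:1%]",  # standard datasets slicing notation
    data_dir="./cv-corpus-6.1-2020-12-11",
)
small_ds = small_ds.map(map_to_array)
small_result = small_ds.map(
    map_to_pred,
    batched=True,
    batch_size=16,
    remove_columns=list(small_ds.features.keys()),
)
print(wer.compute(predictions=small_result["predicted"], references=small_result["target"]))
```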
|
{"language": "fr", "license": "apache-2.0", "tags": ["speech", "audio", "automatic-speech-recognition"], "datasets": ["common_voice"]}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-xlsr-53-french
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"speech",
"audio",
"fr",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"fr"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #fr #dataset-common_voice #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
## Evaluation on Common Voice FR Test
Result: 25.2 %
|
[
"## Evaluation on Common Voice FR Test\n\n\nResult: 25.2 %"
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #fr #dataset-common_voice #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"## Evaluation on Common Voice FR Test\n\n\nResult: 25.2 %"
] |
[
69,
13
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #fr #dataset-common_voice #license-apache-2.0 #endpoints_compatible #has_space #region-us \n## Evaluation on Common Voice FR Test\n\n\nResult: 25.2 %"
] |
[
-0.1827288120985031,
0.06936447322368622,
-0.003351193852722645,
-0.0624869130551815,
-0.007106277626007795,
-0.055582378059625626,
0.03272747993469238,
0.11619596928358078,
0.01762944832444191,
0.06664461642503738,
0.04588604345917702,
0.07609887421131134,
0.022935299202799797,
-0.057216692715883255,
-0.11820793151855469,
-0.08734637498855591,
0.04680206626653671,
0.06886280328035355,
0.047024283558130264,
0.08439002931118011,
0.100523442029953,
-0.0358491949737072,
-0.0025413399562239647,
0.10670673102140427,
-0.08385927230119705,
0.015142418444156647,
0.09799834340810776,
-0.12573358416557312,
0.13055822253227234,
0.06778712570667267,
-0.02473306469619274,
0.05584068223834038,
0.0034640238154679537,
-0.1914909929037094,
0.010375208221375942,
-0.016324810683727264,
0.04748928174376488,
0.03072727471590042,
0.040599524974823,
-0.01324473973363638,
0.09669806063175201,
0.16887758672237396,
-0.04238394275307655,
0.10546014457941055,
0.018481988459825516,
-0.24697557091712952,
-0.057185057550668716,
0.011425372213125229,
0.03958934172987938,
0.09226369857788086,
-0.05463149771094322,
0.12119236588478088,
-0.14007523655891418,
0.10431492328643799,
0.1304108202457428,
-0.26835253834724426,
0.009053998626768589,
0.031891148537397385,
0.04868035390973091,
-0.055480942130088806,
-0.05814342573285103,
0.03284325823187828,
0.10045648366212845,
0.053167060017585754,
-0.1305692493915558,
-0.051654644310474396,
-0.1960040032863617,
-0.00984096247702837,
-0.07447928935289383,
-0.04060417041182518,
0.2710605263710022,
0.04957820475101471,
-0.05763520672917366,
-0.07731018960475922,
0.010767717845737934,
-0.025481348857283592,
0.009247642010450363,
-0.05837305262684822,
-0.0028683145064860582,
0.028921842575073242,
-0.03243114799261093,
0.03304588794708252,
-0.12339730560779572,
-0.12139350175857544,
-0.13335245847702026,
0.12732133269309998,
0.028411416336894035,
0.030388541519641876,
-0.014521865174174309,
0.008759514428675175,
-0.12821121513843536,
-0.05364450067281723,
-0.06161295995116234,
0.0343332402408123,
0.008481238037347794,
0.009646257385611534,
-0.09216117858886719,
-0.11819058656692505,
0.16918231546878815,
0.07439824938774109,
-0.015430635772645473,
0.009679562412202358,
-0.10648129880428314,
0.04293612763285637,
0.006614617072045803,
0.11460106819868088,
-0.028775375336408615,
0.01658153347671032,
0.03440609574317932,
-0.01550772599875927,
0.03829711675643921,
0.013791896402835846,
-0.023751646280288696,
-0.06205042079091072,
0.07439298927783966,
0.047003306448459625,
-0.040607329457998276,
-0.0709335058927536,
-0.02329977974295616,
0.0361778698861599,
0.017371803522109985,
-0.08466967940330505,
-0.002720860531553626,
0.0811217799782753,
0.05008965730667114,
0.10434922575950623,
0.02980227768421173,
0.05027155578136444,
-0.04083944484591484,
0.0038866547401994467,
0.07393641769886017,
0.008950813673436642,
0.0790829136967659,
-0.03179024159908295,
0.07991345971822739,
-0.0340045690536499,
0.03615500032901764,
-0.0740065649151802,
0.03858615458011627,
-0.07439345121383667,
-0.015178322792053223,
-0.020182710140943527,
-0.1566658318042755,
-0.027704153209924698,
-0.0765523612499237,
0.03445306420326233,
-0.08169374614953995,
0.034168168902397156,
-0.053826216608285904,
0.09937736392021179,
0.07829897850751877,
0.0043997010216116905,
-0.050786178559064865,
0.05778425931930542,
-0.028919242322444916,
-0.027164071798324585,
0.06066230311989784,
0.054834816604852676,
-0.09040235728025436,
-0.035187192261219025,
-0.07786757498979568,
-0.09104709327220917,
-0.12114289402961731,
0.029930375516414642,
-0.0016809296794235706,
0.0852002203464508,
-0.1414002925157547,
-0.07561316341161728,
0.07978106290102005,
-0.0949540063738823,
-0.07877688854932785,
0.1979769766330719,
0.049336548894643784,
-0.032299838960170746,
0.09663865715265274,
0.2643032371997833,
0.0572052001953125,
-0.1950099915266037,
-0.02516687661409378,
0.12123618274927139,
0.015906313434243202,
-0.08794981986284256,
0.0993543192744255,
-0.1198832169175148,
-0.06521310657262802,
0.025966668501496315,
0.03442549332976341,
0.04951908439397812,
-0.03899148851633072,
-0.041946057230234146,
-0.05617418512701988,
-0.09074236452579498,
-0.03077531047165394,
0.05359568074345589,
0.04110150411725044,
-0.11451815813779831,
-0.03276900202035904,
-0.042306993156671524,
0.09284789115190506,
-0.07480023801326752,
0.04563535749912262,
-0.09288859367370605,
0.17014749348163605,
-0.15301260352134705,
-0.060607150197029114,
-0.13874754309654236,
0.2670106589794159,
-0.015437483787536621,
0.009468381293118,
0.08388515561819077,
0.13382002711296082,
0.03994365781545639,
-0.10149699449539185,
-0.017146073281764984,
0.010092846117913723,
0.11455535143613815,
0.010406365618109703,
-0.03121310845017433,
-0.16812750697135925,
0.061775580048561096,
-0.05426694080233574,
0.030896563082933426,
-0.05787072703242302,
-0.05629856511950493,
0.07573720812797546,
0.09577178955078125,
-0.04214106500148773,
0.032218482345342636,
0.043347351253032684,
0.008480323478579521,
-0.021989651024341583,
0.06821691244840622,
0.004359632730484009,
0.039497021585702896,
-0.050543706864118576,
0.22855980694293976,
-0.04454033449292183,
0.18244528770446777,
0.14552877843379974,
-0.11537529528141022,
0.048280127346515656,
0.14585882425308228,
-0.021545931696891785,
0.012196037918329239,
-0.04272761568427086,
-0.03835431858897209,
0.17003943026065826,
-0.054442133754491806,
0.11752109974622726,
-0.11562550067901611,
-0.048671185970306396,
0.048632100224494934,
-0.05402665212750435,
-0.002629668917506933,
0.10181441158056259,
-0.000051426679419819266,
-0.03484754636883736,
0.06686291843652725,
0.07784891128540039,
-0.14852219820022583,
0.0884711816906929,
-0.10605480521917343,
-0.07665400952100754,
0.0246169064193964,
0.02038935385644436,
-0.05171606317162514,
0.12037822604179382,
-0.2101244181394577,
-0.0078723831102252,
0.07430575042963028,
0.03235238417983055,
0.08209507167339325,
-0.17886674404144287,
0.050672709941864014,
0.019102737307548523,
-0.1131468117237091,
-0.1171000599861145,
0.13552998006343842,
-0.01745559833943844,
0.04559845104813576,
-0.13186617195606232,
-0.17239026725292206,
0.023435791954398155,
-0.03904058039188385,
-0.1513831913471222,
0.03541276603937149,
-0.03467254340648651,
-0.1325167566537857,
-0.086550273001194,
0.04779715836048126,
-0.030731642618775368,
-0.001492612762376666,
0.12954245507717133,
-0.07540462166070938,
-0.01582977920770645,
-0.053620804101228714,
-0.051751162856817245,
-0.0777270719408989,
0.04767420142889023,
-0.03954102098941803,
0.00559282349422574,
0.09260306507349014,
-0.13335783779621124,
-0.015631716698408127,
-0.03834998235106468,
0.027018338441848755,
0.009136193431913853,
-0.0037665904965251684,
0.03352848067879677,
0.15829116106033325,
0.06555584073066711,
0.005570616107434034,
-0.02780604362487793,
0.18003977835178375,
-0.08831299841403961,
-0.1047133058309555,
0.15765975415706635,
-0.006403626408427954,
-0.07292482256889343,
0.1695265769958496,
0.009648261591792107,
-0.03978702798485756,
-0.06906752288341522,
-0.00009456900443183258,
-0.022994523867964745,
-0.29941490292549133,
-0.0739155113697052,
-0.10930188745260239,
-0.008329790085554123,
-0.07673029601573944,
0.05790828540921211,
0.10953836888074875,
-0.042521946132183075,
0.011561443097889423,
-0.09617052227258682,
0.011407414451241493,
-0.05429871752858162,
0.27352192997932434,
-0.049475155770778656,
0.10513035953044891,
-0.08464308083057404,
-0.040015749633312225,
0.05551884323358536,
-0.00016634963685646653,
0.0875391885638237,
0.09324605762958527,
0.05870424211025238,
0.022102857008576393,
0.187844917178154,
0.07523884624242783,
0.028548574075102806,
0.0021592017728835344,
0.003507893765345216,
0.022481704130768776,
-0.01742926612496376,
-0.0011986785102635622,
0.044588204473257065,
0.25233355164527893,
0.00994953140616417,
-0.046105530112981796,
-0.1618296355009079,
0.03674963861703873,
0.2106105387210846,
0.14667771756649017,
-0.08083469420671463,
-0.03405844792723656,
-0.014377828687429428,
-0.14323730766773224,
-0.018682222813367844,
0.10640969127416611,
0.07967735826969147,
-0.05498180910944939,
0.08577828854322433,
0.03263986483216286,
0.0726575180888176,
0.061835095286369324,
0.0333891287446022,
-0.09446050971746445,
-0.09642244130373001,
0.0544574111700058,
0.06851296871900558,
-0.2512226998806,
0.20314443111419678,
0.014784923754632473,
0.0682636946439743,
-0.01030189823359251,
0.020945854485034943,
0.030139951035380363,
0.05001683160662651,
0.11662424355745316,
-0.030955960974097252,
-0.09301190823316574,
0.006827433593571186,
-0.03855578601360321,
0.09385783225297928,
0.028370067477226257,
0.12340575456619263,
-0.0361500047147274,
-0.003465539775788784,
-0.025453737005591393,
0.03829145058989525,
-0.08131485432386398,
-0.16113309562206268,
-0.009289867244660854,
-0.006975384429097176,
0.2069493681192398,
0.02733469009399414,
-0.031152140349149704,
-0.08578548580408096,
-0.263333261013031,
0.033409975469112396,
-0.11064333468675613,
0.027001524344086647,
-0.03297959268093109,
-0.07854463905096054,
0.16343776881694794,
-0.01360235270112753,
0.0005883072153665125,
-0.01944718323647976,
0.0009296851931139827,
0.006287602707743645,
-0.0446307472884655,
0.06450574845075607,
-0.10801110416650772,
-0.09833185374736786,
-0.01436829473823309,
0.30605176091194153,
-0.06033499911427498,
0.1332162618637085,
0.0508170910179615,
0.028737971559166908,
-0.03793840855360031,
-0.02205779030919075,
0.050167348235845566,
0.010247540660202503,
-0.15194392204284668,
0.015067974105477333,
0.0718109980225563,
-0.16835662722587585,
-0.07511139661073685,
-0.04252573475241661,
0.22709904611110687,
0.11147776246070862,
-0.0591653548181057,
0.10346335917711258,
0.1314074993133545,
-0.03900562971830368,
-0.20406904816627502,
-0.0840739980340004,
0.027826713398098946,
0.10725564509630203,
-0.07044270634651184,
-0.02449369989335537,
0.10256139934062958,
-0.013231243938207626,
-0.10654949396848679,
0.016377301886677742,
-0.16494973003864288,
-0.10037844628095627,
0.2331072986125946,
-0.15098261833190918,
0.1970430463552475,
-0.05831577628850937,
-0.08910883218050003,
-0.0042724283412098885,
-0.07242126017808914,
-0.037378426641225815,
-0.10792464017868042,
0.06870979815721512,
0.03410946950316429,
0.0012955087004229426,
0.04620766639709473,
-0.022728819400072098,
0.1397942453622818,
0.013895628042519093,
-0.04330674558877945,
0.043405722826719284,
0.07727434486150742,
-0.10009849816560745,
0.06504971534013748,
0.1257065236568451,
-0.162325918674469,
0.06446383893489838,
-0.029480496421456337,
-0.05248815938830376,
-0.08140405267477036,
0.0991176888346672,
0.055898234248161316,
0.043071143329143524,
-0.07990001142024994,
-0.09426768869161606,
0.00013364505139179528,
0.01950543187558651,
0.1467040330171585,
-0.10508675128221512,
0.09373292326927185,
0.18801429867744446,
0.1886134296655655,
-0.12591390311717987,
-0.07046112418174744,
0.017046978697180748,
-0.10214735567569733,
0.08505412936210632,
-0.14317859709262848,
0.07165654003620148,
0.11274273693561554,
0.06533917039632797,
0.09967272728681564,
0.004309718497097492,
-0.09638724476099014,
0.04758067429065704,
0.044507648795843124,
-0.02466285042464733,
-0.13875190913677216,
-0.024774016812443733,
-0.07054130733013153,
-0.03083132579922676,
0.06934798508882523,
0.12430345267057419,
-0.07490837574005127,
0.007057950831949711,
-0.00885363481938839,
-0.03351283445954323,
-0.16597050428390503,
0.23689471185207367,
0.05827654153108597,
0.013069581240415573,
-0.13369977474212646,
0.0386296883225441,
-0.055511973798274994,
-0.11055173724889755,
0.00417456915602088,
-0.10895992070436478,
-0.018527505919337273,
-0.083367720246315,
-0.025362031534314156,
0.01784502901136875,
0.061066266149282455,
-0.15400582551956177,
-0.09134481847286224,
-0.19040851294994354,
0.06277703493833542,
0.16324885189533234,
0.0869453176856041,
0.06893117725849152,
-0.03981656953692436,
-0.055398132652044296,
-0.013777080923318863,
0.06752116978168488,
0.0427839532494545,
0.014857334084808826,
-0.15759633481502533,
0.021093368530273438,
0.026689283549785614,
0.07077515125274658,
-0.05179358646273613,
-0.03290436789393425,
0.009048751555383205,
0.07080414891242981,
-0.13602787256240845,
-0.0076327878050506115,
-0.03978992998600006,
0.0010190869215875864,
0.1073310524225235,
-0.11045651882886887,
-0.06104102358222008,
0.07297562807798386,
-0.1213374063372612,
0.055409666150808334,
0.015949301421642303,
0.11234302073717117,
-0.06115571781992912,
0.02861282415688038,
0.05820220708847046,
-0.057420648634433746,
0.08131692558526993,
0.1536519080400467,
-0.11188145726919174,
0.11310034245252609,
-0.19276383519172668,
-0.12839677929878235,
0.1539163738489151,
0.0760369822382927,
-0.057039059698581696,
-0.13037803769111633,
0.02647615782916546,
0.208451509475708,
0.05799856781959534,
0.017620300874114037,
0.0020632355008274317,
-0.0472058430314064,
-0.08573875576257706,
-0.14597934484481812,
-0.058758337050676346,
-0.0007385663338936865,
-0.07507620006799698,
0.13388696312904358,
0.15046356618404388,
0.1717139333486557,
-0.05198238044977188,
-0.0001591253385413438,
-0.10643593221902847,
0.05548514798283577,
-0.09377548098564148,
-0.05793178826570511,
-0.24478879570960999,
0.004380114376544952,
0.035516683012247086,
-0.02560502476990223,
0.17979033291339874,
-0.05458615720272064,
-0.031291618943214417,
0.019245080649852753,
0.0009013054077513516,
0.001257892232388258,
0.017903465777635574,
0.27787715196609497,
0.05615067109465599,
-0.007699651177972555,
-0.0604441873729229,
-0.07343969494104385,
0.015325007028877735,
0.03651970997452736,
-0.1030246838927269,
0.17590831220149994,
0.06098388135433197,
0.1236594021320343,
0.12271041423082352,
-0.06783140450716019,
0.052557773888111115,
0.08122922480106354,
-0.07841747999191284,
0.07338573783636093,
0.02635716274380684,
0.22622886300086975,
0.13878223299980164,
-0.04434576630592346,
0.05365889146924019,
-0.07127951830625534,
-0.03018828295171261,
-0.19489984214305878,
-0.01873028092086315,
-0.1318499743938446,
-0.16884273290634155,
0.04813820868730545,
-0.03077942319214344,
0.06101013347506523,
0.057597242295742035,
0.07523537427186966,
-0.02318904921412468,
0.026120474562048912,
-0.08029067516326904,
-0.0749770998954773,
0.1204439103603363,
-0.07572654634714127,
-0.02643420733511448,
-0.08047822117805481,
0.004854095168411732,
0.14697857201099396,
-0.0012050189543515444,
0.01409907266497612,
-0.02896902523934841,
-0.0670170783996582,
0.02301875315606594,
-0.13008037209510803,
-0.050012778490781784,
0.011873787268996239,
-0.026099227368831635,
0.09609536081552505,
0.19725236296653748,
0.08476869761943817,
-0.000776737229898572,
0.07121750712394714,
0.08152613043785095,
-0.10587233304977417,
-0.12958599627017975,
-0.1567889004945755,
0.09011081606149673,
0.003055574605241418,
0.04924953356385231,
-0.00021231347636785358,
-0.050769463181495667,
-0.001198765356093645,
0.15005014836788177,
0.20397989451885223,
0.027680616825819016,
0.04063315689563751,
-0.029468104243278503,
-0.0241690743714571,
-0.021283190697431564,
-0.1330937296152115,
0.10883200168609619,
0.17303061485290527,
-0.0159898791462183,
-0.046509575098752975,
-0.07071588188409805,
-0.011672914028167725,
0.0289204902946949,
0.06366235762834549,
-0.06524975597858429,
-0.17333462834358215,
0.028578752651810646,
0.11634358763694763,
-0.05687694624066353,
-0.044509030878543854,
-0.06209273263812065,
-0.10512129217386246,
-0.08091268688440323,
0.02500605583190918,
0.05343227460980415,
0.13525618612766266,
-0.08642815798521042,
-0.11834458261728287,
-0.02820981852710247,
0.10967264324426651,
-0.06125669553875923,
-0.14526206254959106,
-0.007695755921304226,
0.028342638164758682,
-0.038442812860012054,
-0.008238818496465683,
0.06480744481086731,
0.2388743907213211,
0.01746843196451664,
0.11057555675506592,
0.0017175407847389579,
0.1363385170698166,
0.030253054574131966,
-0.18619650602340698,
0.062209732830524445,
0.12041684240102768,
0.060726575553417206,
0.1384870409965515,
0.05897244065999985,
-0.09165814518928528,
0.039205484092235565,
-0.16147181391716003,
-0.08776333928108215,
-0.17632517218589783,
0.044420573860406876,
-0.05794651061296463,
0.018884645774960518,
-0.0060937474481761456,
-0.046096134930849075,
-0.023728830739855766,
-0.019061245024204254,
0.014693676494061947,
0.05351060628890991,
-0.06344282627105713,
-0.023496797308325768,
-0.18995513021945953,
0.045907843858003616,
-0.11572638154029846,
-0.053039561957120895,
-0.09668547660112381,
-0.06101423129439354,
-0.02966429479420185,
-0.045189905911684036,
0.016021879389882088,
0.011330039240419865,
0.0905860885977745,
-0.005030447151511908,
0.014186497777700424,
-0.014407177455723286,
0.061303891241550446,
0.07911648601293564,
-0.08976999670267105,
-0.06155848875641823
] |
null | null |
transformers
|
## Evaluation on Common Voice DE Test
```python
import torchaudio
from datasets import load_dataset, load_metric
from transformers import (
Wav2Vec2ForCTC,
Wav2Vec2Processor,
)
import torch
import re
import sys
model_name = "facebook/wav2vec2-large-xlsr-53-german"
device = "cuda"
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"]' # noqa: W605
model = Wav2Vec2ForCTC.from_pretrained(model_name).to(device)
processor = Wav2Vec2Processor.from_pretrained(model_name)
ds = load_dataset("common_voice", "de", split="test", data_dir="./cv-corpus-6.1-2020-12-11")
resampler = torchaudio.transforms.Resample(orig_freq=48_000, new_freq=16_000)
def map_to_array(batch):
speech, _ = torchaudio.load(batch["path"])
batch["speech"] = resampler.forward(speech.squeeze(0)).numpy()
batch["sampling_rate"] = resampler.new_freq
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower().replace("’", "'")
return batch
ds = ds.map(map_to_array)
def map_to_pred(batch):
features = processor(batch["speech"], sampling_rate=batch["sampling_rate"][0], padding=True, return_tensors="pt")
input_values = features.input_values.to(device)
attention_mask = features.attention_mask.to(device)
with torch.no_grad():
logits = model(input_values, attention_mask=attention_mask).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["predicted"] = processor.batch_decode(pred_ids)
batch["target"] = batch["sentence"]
return batch
result = ds.map(map_to_pred, batched=True, batch_size=16, remove_columns=list(ds.features.keys()))
wer = load_metric("wer")
print(wer.compute(predictions=result["predicted"], references=result["target"]))
```
**Result**: 18.5 %
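
One detail worth making explicit (a hedged note, not from the original card): the 48 kHz → 16 kHz resampling exists because wav2vec2 checkpoints expect 16 kHz input. Instead of hard-coding the target rate, it can be read from the processor that is already loaded above:

```python
# Sketch: derive the target sampling rate from the processor instead of hard-coding it.
target_sr = processor.feature_extractor.sampling_rate  # 16000 for wav2vec2 models
assert target_sr == 16_000, f"unexpected sampling rate: {target_sr}"

resampler = torchaudio.transforms.Resample(orig_freq=48_000, new_freq=target_sr)
```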
|
{"language": "de", "license": "apache-2.0", "tags": ["speech", "audio", "automatic-speech-recognition"], "datasets": ["common_voice"]}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-xlsr-53-german
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"speech",
"audio",
"de",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"de"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #de #dataset-common_voice #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
## Evaluation on Common Voice DE Test
Result: 18.5 %
|
[
"## Evaluation on Common Voice DE Test\n\nResult: 18.5 %"
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #de #dataset-common_voice #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"## Evaluation on Common Voice DE Test\n\nResult: 18.5 %"
] |
[
69,
13
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #de #dataset-common_voice #license-apache-2.0 #endpoints_compatible #has_space #region-us \n## Evaluation on Common Voice DE Test\n\nResult: 18.5 %"
] |
[
-0.18374396860599518,
0.07834956049919128,
-0.00331890513189137,
-0.08275613188743591,
-0.010281085036695004,
-0.042587097734212875,
0.06880640238523483,
0.1255570948123932,
0.02068345993757248,
0.047840144485235214,
0.04311007633805275,
0.11305556446313858,
0.01920103095471859,
-0.06719290465116501,
-0.09244067966938019,
-0.07795384526252747,
0.07028181850910187,
0.0744030550122261,
0.1156536117196083,
0.0854984000325203,
0.10098148882389069,
-0.05489257350564003,
-0.01737954653799534,
0.09205996990203857,
-0.05682298168540001,
0.024784697219729424,
0.10018891096115112,
-0.1470477283000946,
0.14733640849590302,
0.03567349165678024,
-0.008521612733602524,
0.056837137788534164,
-0.020907150581479073,
-0.1730782687664032,
0.003263245802372694,
-0.007691232953220606,
0.06964688748121262,
0.017284134402871132,
0.03566968813538551,
0.0053685507737100124,
0.05320627614855766,
0.15158070623874664,
-0.019090985879302025,
0.0970991924405098,
0.02623474784195423,
-0.25935620069503784,
-0.04804880544543266,
-0.03583204373717308,
0.06877913326025009,
0.06427377462387085,
-0.07503672689199448,
0.1380460113286972,
-0.13304221630096436,
0.08950266987085342,
0.10201002657413483,
-0.29518353939056396,
0.02126683108508587,
0.028088463470339775,
0.09978871047496796,
-0.02132531628012657,
-0.03904593363404274,
0.05165610834956169,
0.10245607048273087,
0.058099016547203064,
-0.1324106603860855,
-0.03332265466451645,
-0.18185651302337646,
0.004570949822664261,
-0.05519992113113403,
-0.036961037665605545,
0.3031642735004425,
0.03277651220560074,
-0.05292753875255585,
-0.12224452197551727,
0.026810498908162117,
-0.009756455197930336,
0.022439733147621155,
-0.05874904990196228,
0.008873790502548218,
0.0419311560690403,
-0.0598333366215229,
0.05896162614226341,
-0.12372507154941559,
-0.11591336131095886,
-0.13183830678462982,
0.07625467330217361,
0.031405139714479446,
0.015434556640684605,
-0.03050980530679226,
0.02132364735007286,
-0.07380549609661102,
-0.06413955241441727,
-0.0415174625813961,
0.026784395799040794,
0.011891813017427921,
0.013727540150284767,
-0.07934001088142395,
-0.13373802602291107,
0.16632463037967682,
0.039311617612838745,
0.0661691278219223,
0.019089072942733765,
-0.13471704721450806,
0.07206320017576218,
0.014800065197050571,
0.162125363945961,
-0.05528881400823593,
-0.003244562540203333,
0.03179975971579552,
-0.012186008505523205,
0.06521371006965637,
0.010684391483664513,
-0.031745534390211105,
-0.07279834151268005,
0.12154386192560196,
0.07183270156383514,
-0.009778078645467758,
-0.06941545009613037,
-0.05639661103487015,
0.030009957030415535,
-0.016724850982427597,
-0.08230408281087875,
0.0031723976135253906,
0.03344816341996193,
0.05969389155507088,
0.13493385910987854,
0.01857650652527809,
0.051700886338949203,
-0.05190868675708771,
-0.0007387971272692084,
0.05874417722225189,
-0.00952211208641529,
0.06600061058998108,
-0.028426097705960274,
0.08121263980865479,
-0.0160654466599226,
0.047331616282463074,
-0.07374396175146103,
0.02385503239929676,
-0.07282409816980362,
-0.024435268715023994,
-0.0015434065135195851,
-0.16228285431861877,
-0.0440453477203846,
-0.05568750575184822,
0.024283960461616516,
-0.12002325057983398,
0.022199632599949837,
-0.06126568466424942,
0.10880367457866669,
0.05059529468417168,
0.0028457592707127333,
-0.01977633684873581,
0.06768197566270828,
-0.04500321298837662,
-0.02341003157198429,
0.02563450299203396,
0.0672229528427124,
-0.062195707112550735,
-0.008676487021148205,
-0.08029503375291824,
-0.09323197603225708,
-0.06342405825853348,
0.04545864835381508,
0.018227461725473404,
0.0625855103135109,
-0.19783945381641388,
-0.07740305364131927,
0.06469987332820892,
-0.0932687297463417,
-0.10794313251972198,
0.21606898307800293,
0.040921006351709366,
-0.012166761793196201,
0.125951886177063,
0.28888237476348877,
0.02803990989923477,
-0.1570403128862381,
-0.0464097261428833,
0.07482349127531052,
0.016795862466096878,
-0.08056362718343735,
0.07958716154098511,
-0.11626286804676056,
-0.046693701297044754,
0.031991489231586456,
0.041227176785469055,
0.035223208367824554,
-0.024957776069641113,
-0.04736190661787987,
-0.05164630338549614,
-0.08878585696220398,
-0.011200990527868271,
0.03820367157459259,
0.00864186231046915,
-0.14026808738708496,
-0.0665668249130249,
-0.019274411723017693,
0.09591112285852432,
-0.08125613629817963,
0.06274375319480896,
-0.10880298167467117,
0.174493208527565,
-0.15894991159439087,
-0.06667771190404892,
-0.11253522336483002,
0.2472318857908249,
-0.01058475486934185,
0.010589892975986004,
0.06616444140672684,
0.11152705550193787,
0.01712152361869812,
-0.11803704500198364,
-0.020633039996027946,
-0.021515630185604095,
0.11625024676322937,
0.019307786598801613,
-0.03097977116703987,
-0.1581033170223236,
0.06437893211841583,
-0.04922924563288689,
-0.01601654849946499,
0.014012932777404785,
-0.05766528844833374,
0.035620879381895065,
0.08272022008895874,
-0.045368753373622894,
0.033770911395549774,
0.05155118182301521,
0.02634364739060402,
-0.03794374316930771,
0.06025855615735054,
0.006997175049036741,
0.02837114781141281,
-0.09192778170108795,
0.23331406712532043,
-0.09294889867305756,
0.1761821061372757,
0.16276557743549347,
-0.15202178061008453,
0.07634592056274414,
0.17343877255916595,
0.0016380943125113845,
0.00034861371386796236,
-0.022720687091350555,
-0.05410723760724068,
0.21709050238132477,
-0.04761062562465668,
0.08837968111038208,
-0.13296371698379517,
-0.045687172561883926,
0.02028832398355007,
-0.04112285375595093,
-0.012595742009580135,
0.07250513881444931,
-0.019061695784330368,
0.010661163367331028,
0.059390999376773834,
0.14004483819007874,
-0.14472009241580963,
0.11890942603349686,
-0.10660016536712646,
-0.06933104991912842,
0.042329780757427216,
-0.009649712592363358,
-0.0628051906824112,
0.09816481173038483,
-0.26089054346084595,
-0.010012871585786343,
0.07503137737512589,
0.058179136365652084,
0.09025759249925613,
-0.1721123307943344,
0.04140951484441757,
-0.009280418045818806,
-0.10142732411623001,
-0.11565648764371872,
0.12208039313554764,
-0.005657171830534935,
0.06199073791503906,
-0.11851292103528976,
-0.16953033208847046,
0.03351723030209541,
-0.04874429106712341,
-0.13635560870170593,
0.046057090163230896,
-0.05059739574790001,
-0.13442716002464294,
-0.07648974657058716,
0.049836814403533936,
-0.02158639207482338,
0.005574556067585945,
0.15580014884471893,
-0.09970612823963165,
-0.014786984771490097,
-0.030660197138786316,
-0.035738930106163025,
-0.07773107290267944,
0.03610555827617645,
-0.030537810176610947,
0.0043611410073935986,
0.0775880292057991,
-0.1515543907880783,
-0.02746373601257801,
-0.02834876999258995,
0.012584955431520939,
0.006452965084463358,
-0.011681480333209038,
0.01769198477268219,
0.1919720619916916,
0.06858939677476883,
-0.02273711748421192,
-0.04966835677623749,
0.1156710833311081,
-0.08935016393661499,
-0.10985995084047318,
0.1673366278409958,
-0.0047860643826425076,
-0.07017604261636734,
0.18646429479122162,
-0.002514412859454751,
-0.05077246204018593,
-0.06915102154016495,
-0.02288128063082695,
-0.018679015338420868,
-0.3281026780605316,
-0.08980109542608261,
-0.0964217409491539,
-0.006123185623437166,
-0.09711374342441559,
0.04311750829219818,
0.06318552047014236,
-0.037614136934280396,
-0.0008264618809334934,
-0.11012391000986099,
0.021990425884723663,
-0.03309992328286171,
0.2725374400615692,
-0.0648641362786293,
0.1288035660982132,
-0.07770901173353195,
-0.041125837713479996,
0.048031169921159744,
0.03815542161464691,
0.06761599332094193,
0.10844714939594269,
0.045469533652067184,
0.042042482644319534,
0.15175209939479828,
0.07247953116893768,
0.031108148396015167,
0.026541365310549736,
0.018859194591641426,
0.0019468707032501698,
-0.01894347555935383,
-0.011737962253391743,
0.04280141741037369,
0.24121583998203278,
0.003887290833517909,
-0.05084801837801933,
-0.16058698296546936,
0.06318514049053192,
0.2049352526664734,
0.1775522381067276,
-0.11341840028762817,
-0.016433000564575195,
0.02332441136240959,
-0.15332932770252228,
-0.034129366278648376,
0.11904294043779373,
0.029678799211978912,
-0.05012445151805878,
0.10664596408605576,
0.05232492834329605,
0.07022801786661148,
0.03516850620508194,
0.045879729092121124,
-0.11109092831611633,
-0.11804244667291641,
0.04238961637020111,
0.06785771250724792,
-0.2660045325756073,
0.20138972997665405,
0.011011685244739056,
0.08246837556362152,
-0.011909645982086658,
0.027831071987748146,
0.025303734466433525,
0.04246838390827179,
0.10380403697490692,
-0.03742779791355133,
-0.10977020859718323,
0.05198619142174721,
-0.04155292361974716,
0.11290793865919113,
-0.019034113734960556,
0.13581043481826782,
-0.045264922082424164,
0.00793220940977335,
0.004599159117788076,
0.033841896802186966,
-0.08120420575141907,
-0.13933296501636505,
0.005294830072671175,
-0.014797245152294636,
0.23783151805400848,
0.0843825414776802,
-0.03365040943026543,
-0.09834321588277817,
-0.23699559271335602,
0.08621316403150558,
-0.1209835335612297,
-0.011535385623574257,
-0.0003174750308971852,
-0.1228395402431488,
0.17745448648929596,
0.004038200713694096,
-0.01990903913974762,
-0.005065454635769129,
-0.0010886316886171699,
-0.003254187759011984,
-0.04677637666463852,
0.07972162961959839,
-0.09288003295660019,
-0.09136614948511124,
-0.04288208857178688,
0.2922472357749939,
-0.07478184998035431,
0.13136006891727448,
0.048277344554662704,
0.021737702190876007,
-0.05260186269879341,
0.00061964918859303,
0.0658460408449173,
0.06535767018795013,
-0.12353358417749405,
0.02691786177456379,
0.06462201476097107,
-0.2135535627603531,
-0.08832605183124542,
-0.049919404089450836,
0.21445666253566742,
0.10636875033378601,
-0.04992415010929108,
0.10301974415779114,
0.19552403688430786,
-0.06499519944190979,
-0.19010812044143677,
-0.0973474383354187,
-0.0015185856027528644,
0.09667639434337616,
-0.06311487406492233,
-0.02463519759476185,
0.11666679382324219,
-0.01153248455375433,
-0.10730033367872238,
-0.0250248983502388,
-0.16426002979278564,
-0.10965706408023834,
0.24979324638843536,
-0.170502170920372,
0.22065754234790802,
-0.0569535568356514,
-0.06319170445203781,
-0.0273659098893404,
-0.03174830973148346,
-0.08912332355976105,
-0.046054475009441376,
0.06362909823656082,
0.03332579508423805,
0.0815473347902298,
0.04401354119181633,
-0.026667719706892967,
0.14751695096492767,
0.025294343009591103,
-0.04410439729690552,
0.03335168957710266,
0.08281754702329636,
-0.10958873480558395,
0.043452367186546326,
0.1262277364730835,
-0.1696435809135437,
0.0496777780354023,
-0.034140948206186295,
-0.048272907733917236,
-0.0632256492972374,
0.08760777860879898,
0.06199284642934799,
0.02115681581199169,
-0.07364712655544281,
-0.0958712249994278,
-0.0214727520942688,
0.009992538020014763,
0.14016622304916382,
-0.10146588087081909,
0.10114046931266785,
0.07317373156547546,
0.2374633252620697,
-0.17523545026779175,
-0.07580993324518204,
0.021937038749456406,
-0.11671892553567886,
0.09911466389894485,
-0.12640489637851715,
0.0733344554901123,
0.10684869438409805,
0.02161192148923874,
0.07813036441802979,
0.018104832619428635,
-0.1180940791964531,
0.07426127046346664,
0.05188073217868805,
-0.026442857459187508,
-0.08201029151678085,
-0.0017058398807421327,
0.00892204511910677,
-0.03393230214715004,
0.08351495116949081,
0.11599884182214737,
-0.05088959261775017,
-0.012556854635477066,
-0.008824760094285011,
-0.005375671666115522,
-0.1560174524784088,
0.23766884207725525,
0.024632301181554794,
0.02024093084037304,
-0.12061111629009247,
0.05393512174487114,
-0.045720893889665604,
-0.13629430532455444,
0.02951027639210224,
-0.13158279657363892,
0.001299458323046565,
-0.07968901842832565,
-0.04064416512846947,
0.04448939859867096,
0.07556772977113724,
-0.16055014729499817,
-0.08388759195804596,
-0.18437457084655762,
0.05602847784757614,
0.11321062594652176,
0.08451709151268005,
0.08515740931034088,
-0.04571300745010376,
-0.07932037115097046,
0.006183760706335306,
0.06364184617996216,
0.04015357419848442,
0.027771275490522385,
-0.1602773815393448,
0.00745523301884532,
0.035029102116823196,
0.058482807129621506,
-0.04922931641340256,
-0.03184416517615318,
0.005902704782783985,
0.09454435855150223,
-0.16890184581279755,
-0.00421461695805192,
-0.05686841532588005,
0.004268337972462177,
0.10108679533004761,
-0.10956886410713196,
-0.06926970928907394,
0.05706043168902397,
-0.10850372910499573,
0.06150807812809944,
0.022545017302036285,
0.13112226128578186,
-0.06144849583506584,
0.023039784282445908,
0.048289954662323,
-0.06098512187600136,
0.0954919308423996,
0.11327282339334488,
-0.11059486865997314,
0.10747267305850983,
-0.1679352968931198,
-0.13790126144886017,
0.1497882902622223,
0.06677574664354324,
-0.07547914236783981,
-0.1300525665283203,
0.012929579243063927,
0.21471251547336578,
0.054536011070013046,
0.020835986360907555,
-0.011746525764465332,
-0.04923132807016373,
-0.06858731061220169,
-0.1049184575676918,
-0.07558941841125488,
-0.008685082197189331,
-0.07990595698356628,
0.15330225229263306,
0.13442347943782806,
0.1686914563179016,
-0.05862473323941231,
-0.009635248221457005,
-0.10022063553333282,
0.056498706340789795,
-0.10247772932052612,
-0.06341508030891418,
-0.19356387853622437,
-0.0021734125912189484,
0.039330873638391495,
-0.05473851040005684,
0.19419093430042267,
-0.03089558333158493,
-0.03363312780857086,
0.026588400825858116,
0.02791263908147812,
-0.014122501015663147,
0.03286711126565933,
0.2765495777130127,
0.029996506869792938,
-0.0013052942231297493,
-0.0732707679271698,
-0.08827687054872513,
0.004265925381332636,
0.07820454984903336,
-0.13261747360229492,
0.16822825372219086,
0.08136747032403946,
0.1324511021375656,
0.10402770340442657,
-0.05529143661260605,
-0.023869570344686508,
0.032796382904052734,
-0.047836415469646454,
0.0773392915725708,
0.03265177831053734,
0.21643997728824615,
0.13840912282466888,
-0.023167219012975693,
0.04670851677656174,
-0.08751828223466873,
-0.027815701439976692,
-0.1982550024986267,
-0.017652930691838264,
-0.1278110295534134,
-0.15508344769477844,
0.03537182882428169,
-0.035221166908741,
0.0456504262983799,
0.07260006666183472,
0.061076778918504715,
-0.05201438441872597,
0.012650946155190468,
-0.046789418905973434,
-0.08054523169994354,
0.11479618400335312,
-0.06867314130067825,
-0.006029109470546246,
-0.07365208119153976,
0.002714760135859251,
0.1416763961315155,
0.038786597549915314,
0.010913782753050327,
-0.023669198155403137,
-0.10302961617708206,
0.011689402163028717,
-0.14248929917812347,
-0.05099807307124138,
0.0008469417225569487,
-0.0380316823720932,
0.0927535742521286,
0.1745259314775467,
0.07829541712999344,
-0.019564956426620483,
0.08500316739082336,
0.06713827699422836,
-0.0968695804476738,
-0.12316364794969559,
-0.1336365044116974,
0.12992236018180847,
-0.02398124895989895,
0.022583220154047012,
0.007671153638511896,
-0.05205848440527916,
-0.008533255197107792,
0.13595527410507202,
0.20166492462158203,
0.0571715272963047,
0.014714382588863373,
-0.05807702615857124,
-0.02462383545935154,
-0.053182363510131836,
-0.09071827679872513,
0.11017647385597229,
0.16491304337978363,
-0.01868629641830921,
-0.014528296887874603,
-0.08635611832141876,
-0.004944637883454561,
0.022136544808745384,
0.04916071519255638,
-0.06932234019041061,
-0.17398333549499512,
0.02553807944059372,
0.14071124792099,
-0.04147225245833397,
-0.015312588773667812,
-0.0809168890118599,
-0.10669048130512238,
-0.08245612680912018,
0.04628865793347359,
0.14018096029758453,
0.11020752787590027,
-0.07437005639076233,
-0.11485801637172699,
-0.03449089452624321,
0.0810941681265831,
-0.06443409621715546,
-0.12760721147060394,
-0.003233149880543351,
0.02579919993877411,
-0.011264116503298283,
0.01618601568043232,
0.06484249979257584,
0.22423648834228516,
0.020154815167188644,
0.12566758692264557,
-0.0034295860677957535,
0.15612637996673584,
0.031017569825053215,
-0.15033350884914398,
0.09127259999513626,
0.09992548823356628,
0.04430470988154411,
0.14078356325626373,
0.08351355791091919,
-0.06104934960603714,
0.0336388498544693,
-0.20761553943157196,
-0.08322775363922119,
-0.1760455220937729,
0.056016698479652405,
-0.048304446041584015,
0.02754604071378708,
-0.05917627736926079,
-0.06051892042160034,
-0.030845744535326958,
-0.023887578397989273,
-0.0002462354605086148,
0.041627198457717896,
-0.07581917196512222,
-0.027051517739892006,
-0.20002761483192444,
0.0441446490585804,
-0.09260454028844833,
-0.03851689398288727,
-0.1156187504529953,
-0.03198438510298729,
-0.007013289257884026,
-0.04187755659222603,
-0.017044447362422943,
0.014054755680263042,
0.0796108990907669,
-0.024129312485456467,
0.00561683950945735,
0.00213226187042892,
0.06683908402919769,
0.07008898258209229,
-0.10447801649570465,
-0.07292954623699188
] |
null | null |
transformers
|
## Evaluation on Common Voice IT Test
```python
import torchaudio
from datasets import load_dataset, load_metric
from transformers import (
    Wav2Vec2ForCTC,
    Wav2Vec2Processor,
)
import torch
import re
import sys

model_name = "facebook/wav2vec2-large-xlsr-53-italian"
device = "cuda"
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"]'  # noqa: W605

model = Wav2Vec2ForCTC.from_pretrained(model_name).to(device)
processor = Wav2Vec2Processor.from_pretrained(model_name)

ds = load_dataset("common_voice", "it", split="test", data_dir="./cv-corpus-6.1-2020-12-11")

resampler = torchaudio.transforms.Resample(orig_freq=48_000, new_freq=16_000)

def map_to_array(batch):
    speech, _ = torchaudio.load(batch["path"])
    batch["speech"] = resampler.forward(speech.squeeze(0)).numpy()
    batch["sampling_rate"] = resampler.new_freq
    batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower().replace("’", "'")
    return batch

ds = ds.map(map_to_array)

def map_to_pred(batch):
    features = processor(batch["speech"], sampling_rate=batch["sampling_rate"][0], padding=True, return_tensors="pt")
    input_values = features.input_values.to(device)
    attention_mask = features.attention_mask.to(device)
    with torch.no_grad():
        logits = model(input_values, attention_mask=attention_mask).logits
    pred_ids = torch.argmax(logits, dim=-1)
    batch["predicted"] = processor.batch_decode(pred_ids)
    batch["target"] = batch["sentence"]
    return batch

result = ds.map(map_to_pred, batched=True, batch_size=16, remove_columns=list(ds.features.keys()))

wer = load_metric("wer")
print(wer.compute(predictions=result["predicted"], references=result["target"]))
```
**Result**: 22.1 %
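
The number above is a word error rate produced by `load_metric("wer")`. As a minimal sketch of that same computation on two hand-written Italian sentence pairs (toy data invented for illustration, not Common Voice samples, and assuming a `datasets` version that still ships `load_metric`):

```python
from datasets import load_metric

# Toy predictions/references, invented for illustration only.
predictions = ["ciao come stai", "oggi piove molto"]
references = ["ciao come stai", "oggi piove poco"]

wer = load_metric("wer")
# One substituted word out of six reference words -> 1/6 ≈ 0.167
print(wer.compute(predictions=predictions, references=references))
```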
|
{"language": "it", "license": "apache-2.0", "tags": ["speech", "audio", "automatic-speech-recognition"], "datasets": ["common_voice"]}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-xlsr-53-italian
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"speech",
"audio",
"it",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"it"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #it #dataset-common_voice #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
## Evaluation on Common Voice IT Test
Result: 22.1 %
|
[
"## Evaluation on Common Voice IT Test\n\nResult: 22.1 %"
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #it #dataset-common_voice #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"## Evaluation on Common Voice IT Test\n\nResult: 22.1 %"
] |
[
69,
13
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #it #dataset-common_voice #license-apache-2.0 #endpoints_compatible #has_space #region-us \n## Evaluation on Common Voice IT Test\n\nResult: 22.1 %"
] |
[
-0.165798157453537,
0.098654605448246,
-0.002835373394191265,
-0.09144255518913269,
-0.021955914795398712,
-0.02838749997317791,
0.08051187545061111,
0.13095246255397797,
0.02713633142411709,
0.026070691645145416,
0.04377855360507965,
0.09539125114679337,
0.021570226177573204,
-0.049453869462013245,
-0.0626644641160965,
-0.07553178071975708,
0.07081910967826843,
0.0634947195649147,
0.07667623460292816,
0.08867587894201279,
0.08674686402082443,
-0.036617521196603775,
-0.02393455244600773,
0.0780799612402916,
-0.06003371998667717,
0.014731505885720253,
0.10155675560235977,
-0.13312284648418427,
0.1453477144241333,
0.024354111403226852,
-0.006370615679770708,
0.03541213274002075,
-0.016461245715618134,
-0.19018959999084473,
0.009269322268664837,
-0.0038112804759293795,
0.057058610022068024,
0.027467774227261543,
0.055438946932554245,
0.024408511817455292,
0.06235729157924652,
0.16951800882816315,
-0.03022475354373455,
0.08506161719560623,
0.034577105194330215,
-0.22613102197647095,
-0.06369485706090927,
0.019271159544587135,
0.038775283843278885,
0.06296444684267044,
-0.0710601881146431,
0.15551631152629852,
-0.14890828728675842,
0.10194814205169678,
0.08320222795009613,
-0.28473708033561707,
0.015297963283956051,
0.010391672141849995,
0.0931297093629837,
-0.02708648517727852,
-0.040683746337890625,
0.05719437822699547,
0.08723627030849457,
0.06405854970216751,
-0.14786161482334137,
-0.04344110190868378,
-0.21285642683506012,
-0.006641303189098835,
-0.053656261414289474,
-0.0644850805401802,
0.2972860038280487,
0.04179758578538895,
-0.03986701741814613,
-0.1001482903957367,
0.010771692730486393,
-0.024710094556212425,
0.02839447744190693,
-0.04864398390054703,
0.010837230831384659,
0.047495681792497635,
-0.05207804590463638,
0.05425931513309479,
-0.12275445461273193,
-0.1303281933069229,
-0.145775705575943,
0.0957338958978653,
0.027984699234366417,
0.03325977176427841,
-0.017034731805324554,
0.009403424337506294,
-0.0674799382686615,
-0.05426252633333206,
-0.05077781155705452,
0.029670696705579758,
0.009402013383805752,
0.009014317765831947,
-0.07710453122854233,
-0.09775499999523163,
0.1765693575143814,
0.03258831426501274,
0.07466871291399002,
0.014904960989952087,
-0.1082349494099617,
0.06950995326042175,
0.023844175040721893,
0.14823301136493683,
-0.052337996661663055,
-0.011711971834301949,
0.01908070780336857,
-0.0021234815940260887,
0.0689358189702034,
0.004388251807540655,
-0.03612471744418144,
-0.057071104645729065,
0.14252617955207825,
0.058575406670570374,
0.017176413908600807,
-0.057406604290008545,
-0.03959368169307709,
0.028417354449629784,
-0.04916000738739967,
-0.08934462070465088,
0.026974182575941086,
0.038903091102838516,
0.06898519396781921,
0.14331920444965363,
0.0052446527406573296,
0.04411625862121582,
-0.05118127912282944,
-0.01918705925345421,
0.05932106450200081,
0.005072572268545628,
0.05458623915910721,
-0.029993195086717606,
0.08874941617250443,
-0.02491406723856926,
0.03696201741695404,
-0.08216837048530579,
0.01334272138774395,
-0.07269420474767685,
-0.012083575129508972,
0.0047784592024981976,
-0.13428430259227753,
-0.024105112999677658,
-0.06949445605278015,
0.047508399933576584,
-0.11846014857292175,
0.01751949079334736,
-0.07188607007265091,
0.10363901406526566,
0.032344914972782135,
0.00902183074504137,
-0.04714484140276909,
0.0522683784365654,
-0.03305748477578163,
-0.01741759479045868,
0.05365649610757828,
0.05917063355445862,
-0.07177077978849411,
-0.0006068508373573422,
-0.06288296729326248,
-0.06657366454601288,
-0.06848378479480743,
0.05413488298654556,
0.007894983515143394,
0.06890042126178741,
-0.2116236388683319,
-0.07064490765333176,
0.05737278237938881,
-0.11312855035066605,
-0.09688645601272583,
0.21442550420761108,
0.058633942157030106,
-0.007829069159924984,
0.12099608778953552,
0.29783377051353455,
0.02680819295346737,
-0.13757459819316864,
-0.06215806305408478,
0.10572519153356552,
-0.0020549537148326635,
-0.06746865063905716,
0.0910048633813858,
-0.10819017142057419,
-0.06327980011701584,
0.026859335601329803,
0.03711935132741928,
0.030851807445287704,
-0.027013422921299934,
-0.04849632456898689,
-0.05872909352183342,
-0.07595895230770111,
-0.010623990558087826,
0.028205247595906258,
0.012559201568365097,
-0.12078066915273666,
-0.07541218400001526,
-0.002966309431940317,
0.10957145690917969,
-0.06595604121685028,
0.05134887620806694,
-0.1157791018486023,
0.1817142516374588,
-0.14693401753902435,
-0.04909050837159157,
-0.11370136588811874,
0.25075969099998474,
-0.008133006282150745,
0.019352765753865242,
0.0676426813006401,
0.11920833587646484,
0.02279089018702507,
-0.1252661794424057,
0.002075745025649667,
-0.02157670073211193,
0.0949060320854187,
0.01871408522129059,
-0.022692864760756493,
-0.15400220453739166,
0.049116350710392,
-0.04265551269054413,
0.0017721297917887568,
0.004989147186279297,
-0.07495453953742981,
0.04246785491704941,
0.07621300220489502,
-0.05726498365402222,
0.03582887724041939,
0.03738631680607796,
0.022358587011694908,
-0.03610370308160782,
0.0548819936811924,
0.022754644975066185,
0.03947556018829346,
-0.07885532826185226,
0.23457187414169312,
-0.08656079322099686,
0.21231456100940704,
0.1746910661458969,
-0.13739359378814697,
0.07586809247732162,
0.17867891490459442,
0.011783741414546967,
-0.006920370738953352,
0.012688886374235153,
-0.06270649284124374,
0.17845404148101807,
-0.04242640361189842,
0.08713197708129883,
-0.13064488768577576,
-0.0349893718957901,
0.02304108440876007,
-0.05876968055963516,
0.0016593049513176084,
0.06562097370624542,
-0.035008229315280914,
-0.012563034892082214,
0.05369281396269798,
0.14077067375183105,
-0.17089031636714935,
0.14188046753406525,
-0.09704363346099854,
-0.06262299418449402,
0.012529532425105572,
-0.018198253586888313,
-0.06904426962137222,
0.11294396221637726,
-0.2670908272266388,
-0.024163762107491493,
0.06492580473423004,
0.06183100864291191,
0.08219192177057266,
-0.18710017204284668,
0.04391181096434593,
-0.012516810558736324,
-0.09979403018951416,
-0.1382335126399994,
0.08845658600330353,
-0.011524300090968609,
0.05144377052783966,
-0.10652513056993484,
-0.1544935703277588,
0.02181624062359333,
-0.04778226464986801,
-0.12915465235710144,
0.030961468815803528,
-0.055206965655088425,
-0.1492975354194641,
-0.09578754752874374,
0.04711208865046501,
-0.01770772598683834,
0.001425804803147912,
0.1471853405237198,
-0.08294112980365753,
-0.00866552721709013,
-0.02814904972910881,
-0.026882408186793327,
-0.08651448041200638,
0.0344659723341465,
-0.02863951399922371,
0.0025919009931385517,
0.07184374332427979,
-0.13610516488552094,
-0.022214453667402267,
-0.011982139199972153,
-0.003047593869268894,
0.018930107355117798,
-0.031200295314192772,
0.01490556076169014,
0.17932070791721344,
0.06393369287252426,
-0.005016721785068512,
-0.03880889341235161,
0.10294151306152344,
-0.09713613241910934,
-0.093776173889637,
0.15518079698085785,
-0.001317783840931952,
-0.06935781985521317,
0.18340237438678741,
0.01602368801832199,
-0.02130071260035038,
-0.0785663053393364,
-0.03042924962937832,
-0.011791431345045567,
-0.31321200728416443,
-0.08014290034770966,
-0.08319351822137833,
0.004619151819497347,
-0.07849206775426865,
0.05992507562041283,
0.08587931841611862,
-0.04534638300538063,
0.015879737213253975,
-0.10545188188552856,
0.002684657694771886,
-0.03968275338411331,
0.23782049119472504,
-0.06378596276044846,
0.10806345194578171,
-0.07804007083177567,
-0.012311503291130066,
0.06799915432929993,
0.04291827604174614,
0.051902756094932556,
0.11304359883069992,
0.0583818182349205,
0.04172986373305321,
0.17741069197654724,
0.09395997226238251,
0.016648612916469574,
0.04323742911219597,
0.014170589856803417,
0.012227604165673256,
-0.01995416358113289,
-0.033689144998788834,
0.04443633556365967,
0.2581261694431305,
0.006010770797729492,
-0.017796142026782036,
-0.14708781242370605,
0.051512595266103745,
0.23074249923229218,
0.1522015929222107,
-0.10674604028463364,
-0.01185588538646698,
0.005233010742813349,
-0.16554832458496094,
-0.02711785025894642,
0.103736512362957,
0.0718570277094841,
-0.05767054855823517,
0.088078074157238,
0.06294357776641846,
0.0720628947019577,
0.02266988903284073,
0.02984822541475296,
-0.09393775463104248,
-0.11488363146781921,
0.05957869440317154,
0.07938060909509659,
-0.23487018048763275,
0.18619859218597412,
0.011636294424533844,
0.08302914351224899,
0.008603032678365707,
0.013462672010064125,
0.04580005630850792,
0.06110531836748123,
0.06752870231866837,
-0.022565044462680817,
-0.09987050294876099,
0.014911235310137272,
-0.046912770718336105,
0.0864117443561554,
-0.00375254824757576,
0.15934500098228455,
-0.042387545108795166,
-0.023850340396165848,
0.001094657345674932,
0.015811737626791,
-0.05974569916725159,
-0.13004636764526367,
0.011696715839207172,
-0.023052090778946877,
0.2434060275554657,
0.13417202234268188,
-0.05268290266394615,
-0.111140675842762,
-0.2267010509967804,
0.07755360007286072,
-0.12733863294124603,
0.01848658360540867,
0.004538676235824823,
-0.1286095529794693,
0.1639680713415146,
0.01367104146629572,
-0.005610811989754438,
-0.021057577803730965,
-0.007837478071451187,
0.00650416687130928,
-0.04561154171824455,
0.08259689807891846,
-0.0913567766547203,
-0.08049560338258743,
-0.037856340408325195,
0.29828983545303345,
-0.08201387524604797,
0.13609229028224945,
0.06252389401197433,
0.023510461673140526,
-0.06767793744802475,
-0.009465998969972134,
0.05988582968711853,
0.08914420008659363,
-0.12515681982040405,
0.03318324312567711,
0.07201278209686279,
-0.16576986014842987,
-0.06758376955986023,
-0.05016505345702171,
0.2291775643825531,
0.07493295520544052,
-0.065917007625103,
0.08264210820198059,
0.17479529976844788,
-0.07081281393766403,
-0.21446341276168823,
-0.12637357413768768,
0.030897922813892365,
0.08604064583778381,
-0.10572749376296997,
-0.02034003846347332,
0.11669959872961044,
-0.0036166219506412745,
-0.10276474803686142,
-0.006393285468220711,
-0.14873991906642914,
-0.09833231568336487,
0.22237056493759155,
-0.18644274771213531,
0.1902255415916443,
-0.06677041947841644,
-0.08120663464069366,
-0.018711043521761894,
-0.030908120796084404,
-0.04894298315048218,
-0.07633940130472183,
0.07054955512285233,
0.03392942622303963,
0.07246477156877518,
0.026232797652482986,
-0.021039970219135284,
0.15159647166728973,
0.0022702424321323633,
-0.054746028035879135,
0.0261580478399992,
0.08566483110189438,
-0.07805736362934113,
0.030031800270080566,
0.12024247646331787,
-0.16456332802772522,
0.047765061259269714,
-0.025085141882300377,
-0.038656845688819885,
-0.07532187551259995,
0.0763840451836586,
0.062374796718358994,
0.0426858514547348,
-0.04757390916347504,
-0.09374205023050308,
-0.04232148081064224,
0.012442228384315968,
0.15730994939804077,
-0.10255968570709229,
0.11173895001411438,
0.08988793194293976,
0.2059003710746765,
-0.2012985348701477,
-0.0937877669930458,
0.009023941121995449,
-0.12536397576332092,
0.10165141522884369,
-0.10269434750080109,
0.06792774051427841,
0.08816438168287277,
0.04623691365122795,
0.08048585057258606,
0.011134544387459755,
-0.09716660529375076,
0.061431884765625,
0.05904638394713402,
-0.015316378325223923,
-0.1049005463719368,
-0.002408892149105668,
0.04128910228610039,
-0.03821820393204689,
0.0638241395354271,
0.1177605614066124,
-0.04605019465088844,
-0.011496231891214848,
0.0016695645172148943,
-0.02169566974043846,
-0.17033778131008148,
0.22949685156345367,
0.026591073721647263,
0.030692730098962784,
-0.12529011070728302,
0.05887012183666229,
-0.061006613075733185,
-0.12069164216518402,
0.029552515596151352,
-0.10949082672595978,
-0.00583672383800149,
-0.09300126880407333,
-0.031751181930303574,
0.045100633054971695,
0.10378672182559967,
-0.18442250788211823,
-0.06331735104322433,
-0.1748969852924347,
0.04251954331994057,
0.1226738914847374,
0.08794821053743362,
0.10090087354183197,
-0.05400417000055313,
-0.07109816372394562,
0.029724229127168655,
0.05122243985533714,
0.024434935301542282,
0.02694440446794033,
-0.1870967000722885,
0.017324300482869148,
0.05692031607031822,
0.06745226681232452,
-0.051738761365413666,
-0.062243130058050156,
0.004307088442146778,
0.08322003483772278,
-0.14736783504486084,
0.024553043767809868,
-0.05877706781029701,
0.015510090626776218,
0.10870105028152466,
-0.09736412763595581,
-0.0755387544631958,
0.04531746730208397,
-0.10527206212282181,
0.03929947689175606,
0.021213550120592117,
0.11790132522583008,
-0.053771622478961945,
0.03870149329304695,
0.04374373331665993,
-0.05457179620862007,
0.08276602625846863,
0.1177520751953125,
-0.09805838763713837,
0.11266353726387024,
-0.12778888642787933,
-0.15282106399536133,
0.13416895270347595,
0.0655641257762909,
-0.0776427686214447,
-0.13834449648857117,
0.0012857536785304546,
0.22158122062683105,
0.059761933982372284,
0.020242564380168915,
0.014811056666076183,
-0.04505782201886177,
-0.07327093929052353,
-0.11309263110160828,
-0.07801318913698196,
-0.012036851607263088,
-0.06272712349891663,
0.1584789752960205,
0.12086310237646103,
0.18568868935108185,
-0.05681753158569336,
-0.02580702304840088,
-0.0995238646864891,
0.06464149802923203,
-0.09454651921987534,
-0.06007608026266098,
-0.22396287322044373,
-0.00946970283985138,
0.02955564856529236,
-0.04684534668922424,
0.18361513316631317,
-0.030177460983395576,
-0.03215666115283966,
0.02012004517018795,
0.03569045290350914,
-0.04403156414628029,
0.012362497858703136,
0.2964365780353546,
0.039408743381500244,
-0.009367435239255428,
-0.06861230731010437,
-0.09450319409370422,
-0.013667361810803413,
0.029437921941280365,
-0.14993344247341156,
0.17071641981601715,
0.11398894339799881,
0.12250298261642456,
0.11225929856300354,
-0.048038139939308167,
-0.043080564588308334,
0.03040299005806446,
-0.062227316200733185,
0.08327047526836395,
0.01500843558460474,
0.2281564474105835,
0.12375076115131378,
-0.019517984241247177,
0.046968020498752594,
-0.09040549397468567,
-0.027107328176498413,
-0.18054643273353577,
-0.03251165896654129,
-0.12025744467973709,
-0.18892937898635864,
0.023518703877925873,
-0.01973448134958744,
0.05114280804991722,
0.11157292872667313,
0.05292094498872757,
-0.048750199377536774,
-0.0017584672896191478,
-0.07973569631576538,
-0.10088236629962921,
0.1072954460978508,
-0.07156910002231598,
0.005086960271000862,
-0.10042916983366013,
0.004084693733602762,
0.13065344095230103,
0.0355370007455349,
0.02608891949057579,
-0.008148714900016785,
-0.08082246035337448,
0.02942848391830921,
-0.14943264424800873,
-0.05501813068985939,
0.016558416187763214,
-0.035625431686639786,
0.08515625447034836,
0.15310364961624146,
0.08039822429418564,
-0.027542291209101677,
0.09504618495702744,
0.09227903932332993,
-0.10123046487569809,
-0.1539323925971985,
-0.13751435279846191,
0.12564748525619507,
-0.06250622123479843,
0.025285029783844948,
-0.0032643156591802835,
-0.048088427633047104,
-0.02303118072450161,
0.13894213736057281,
0.19606024026870728,
0.04328577220439911,
0.012649825774133205,
-0.048193033784627914,
-0.023895569145679474,
-0.06303253024816513,
-0.0781809464097023,
0.09673985838890076,
0.1582636535167694,
-0.0028858312871307135,
-0.01329187210649252,
-0.08927883952856064,
-0.009219509549438953,
0.019047487527132034,
0.027052713558077812,
-0.07838395237922668,
-0.16970154643058777,
-0.005413366481661797,
0.12547166645526886,
-0.039332639425992966,
-0.019616272300481796,
-0.08680213987827301,
-0.09754780679941177,
-0.0825326219201088,
0.04223686456680298,
0.17433251440525055,
0.11978330463171005,
-0.07080640643835068,
-0.10467828810214996,
-0.024149484932422638,
0.07567241787910461,
-0.07721208781003952,
-0.1434420347213745,
-0.02662815898656845,
0.037831421941518784,
-0.04522665590047836,
0.006591038778424263,
0.04649452120065689,
0.2177800089120865,
0.022790227085351944,
0.11851667612791061,
-0.010910235345363617,
0.14727237820625305,
0.026975687593221664,
-0.17637966573238373,
0.07510554790496826,
0.08441868424415588,
0.055223409086465836,
0.13413722813129425,
0.08203231543302536,
-0.048911597579717636,
0.017817741259932518,
-0.21166516840457916,
-0.08605901896953583,
-0.181838259100914,
0.050953276455402374,
-0.03816695883870125,
0.02707020938396454,
-0.038408175110816956,
-0.05762946233153343,
-0.015024018473923206,
-0.034307971596717834,
-0.0006572632119059563,
0.03974656015634537,
-0.0855933353304863,
-0.029406696557998657,
-0.18740859627723694,
0.031974829733371735,
-0.11571755260229111,
-0.046631015837192535,
-0.10258565098047256,
-0.03997534140944481,
-0.021205732598900795,
-0.04042608290910721,
-0.007629226893186569,
0.0018403894500806928,
0.09094472229480743,
-0.0036168983206152916,
0.01197114959359169,
-0.00950881838798523,
0.07739438861608505,
0.06329246610403061,
-0.09487608820199966,
-0.07597042620182037
] |
null | null |
transformers
|
## Evaluation on Common Voice PL Test
```python
import torchaudio
from datasets import load_dataset, load_metric
from transformers import (
    Wav2Vec2ForCTC,
    Wav2Vec2Processor,
)
import torch
import re
import sys

model_name = "facebook/wav2vec2-large-xlsr-53-polish"
device = "cuda"
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"]'  # noqa: W605

model = Wav2Vec2ForCTC.from_pretrained(model_name).to(device)
processor = Wav2Vec2Processor.from_pretrained(model_name)

ds = load_dataset("common_voice", "pl", split="test", data_dir="./cv-corpus-6.1-2020-12-11")

resampler = torchaudio.transforms.Resample(orig_freq=48_000, new_freq=16_000)

def map_to_array(batch):
    speech, _ = torchaudio.load(batch["path"])
    batch["speech"] = resampler.forward(speech.squeeze(0)).numpy()
    batch["sampling_rate"] = resampler.new_freq
    batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower().replace("’", "'")
    return batch

ds = ds.map(map_to_array)

def map_to_pred(batch):
    features = processor(batch["speech"], sampling_rate=batch["sampling_rate"][0], padding=True, return_tensors="pt")
    input_values = features.input_values.to(device)
    attention_mask = features.attention_mask.to(device)
    with torch.no_grad():
        logits = model(input_values, attention_mask=attention_mask).logits
    pred_ids = torch.argmax(logits, dim=-1)
    batch["predicted"] = processor.batch_decode(pred_ids)
    batch["target"] = batch["sentence"]
    return batch

result = ds.map(map_to_pred, batched=True, batch_size=16, remove_columns=list(ds.features.keys()))

wer = load_metric("wer")
print(wer.compute(predictions=result["predicted"], references=result["target"]))
```
**Result**: 24.6 %
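
The `map_to_array` step above strips punctuation and lower-cases the reference sentences before scoring. A small sketch of that normalisation applied to an invented Polish sentence (not taken from the dataset):

```python
import re

chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"]'  # noqa: W605

# Invented example sentence; only the punctuation and casing handling matter here.
sentence = 'Dzień dobry, jak się masz?'
normalized = re.sub(chars_to_ignore_regex, '', sentence).lower().replace("’", "'")
print(normalized)  # -> dzień dobry jak się masz
```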
|
{"language": "nl", "license": "apache-2.0", "tags": ["speech", "audio", "automatic-speech-recognition"], "datasets": ["common_voice"]}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-xlsr-53-polish
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"speech",
"audio",
"nl",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"nl"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #nl #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us
|
## Evaluation on Common Voice PL Test
Result: 24.6 %
|
[
"## Evaluation on Common Voice PL Test\n\n\n\nResult: 24.6 %"
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #nl #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n",
"## Evaluation on Common Voice PL Test\n\n\n\nResult: 24.6 %"
] |
[
65,
13
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #nl #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n## Evaluation on Common Voice PL Test\n\n\n\nResult: 24.6 %"
] |
[
-0.19115856289863586,
0.04745778068900108,
-0.0033434254582971334,
-0.08263006806373596,
0.0013944467063993216,
-0.05149002745747566,
0.07184232771396637,
0.1105433776974678,
0.021846920251846313,
0.051274869590997696,
0.05062657967209816,
0.12816062569618225,
0.020671973004937172,
-0.10771273076534271,
-0.06407298892736435,
-0.10671517997980118,
0.06334591656923294,
0.09158845245838165,
0.1219959482550621,
0.10081643611192703,
0.08980398625135422,
-0.038730040192604065,
-0.024663470685482025,
0.10595966130495071,
-0.026503870263695717,
0.028048545122146606,
0.09090503305196762,
-0.1428995579481125,
0.15007853507995605,
0.044651228934526443,
-0.028484473004937172,
0.04058173671364784,
-0.00414673238992691,
-0.2037045806646347,
0.0023624941240996122,
-0.03361206501722336,
0.08244471251964569,
0.001210197457112372,
0.042455222457647324,
-0.01143060252070427,
0.024073734879493713,
0.19537416100502014,
-0.025180110707879066,
0.10358348488807678,
0.026891101151704788,
-0.23316216468811035,
-0.04977727681398392,
-0.015675384551286697,
0.04759357124567032,
0.09699441492557526,
-0.057532649487257004,
0.1894720196723938,
-0.1433330923318863,
0.09252528101205826,
0.07544621080160141,
-0.31396323442459106,
0.014859087765216827,
-0.04372229054570198,
0.06916029006242752,
-0.03475594520568848,
-0.031900472939014435,
0.031713809818029404,
0.08388298749923706,
0.06002706289291382,
-0.19547955691814423,
-0.038832105696201324,
-0.17463746666908264,
-0.025180568918585777,
-0.05974000319838524,
-0.041886404156684875,
0.25751879811286926,
0.032074060291051865,
-0.05915604159235954,
-0.10738544911146164,
0.050474073737859726,
-0.0130383912473917,
0.024919327348470688,
-0.03640207275748253,
-0.005868229549378157,
0.03271302208304405,
-0.09517994523048401,
0.06482289731502533,
-0.09601778537034988,
-0.13043780624866486,
-0.12139315903186798,
0.09153744578361511,
0.057853490114212036,
0.02544146589934826,
-0.009445116855204105,
0.01928231306374073,
-0.048660799860954285,
-0.04922432452440262,
-0.0662548691034317,
0.03990228474140167,
0.026003584265708923,
0.004128105938434601,
-0.057427145540714264,
-0.11652174592018127,
0.15937474370002747,
0.05582394078373909,
0.041111212223768234,
0.03807450085878372,
-0.14519254863262177,
0.06562472879886627,
0.00012400872947182506,
0.15255261957645416,
-0.011142543517053127,
0.03514442965388298,
0.03664657101035118,
-0.011606572195887566,
0.06485193222761154,
0.04642495885491371,
-0.021799998357892036,
-0.025066396221518517,
0.10874190181493759,
0.08101171255111694,
-0.053306665271520615,
-0.0797911062836647,
-0.05028974637389183,
0.028071563690900803,
-0.05200396478176117,
-0.09719648957252502,
-0.004327955190092325,
0.054043207317590714,
0.05811811611056328,
0.17976881563663483,
0.004601596854627132,
0.03861405327916145,
-0.08789411187171936,
-0.010345199145376682,
0.06682035326957703,
0.016937416046857834,
0.07488218694925308,
0.01391605380922556,
0.08015941828489304,
0.0036554881371557713,
0.024170728400349617,
-0.05496054142713547,
-0.01872687041759491,
-0.0723118782043457,
-0.03402099758386612,
0.006041104439646006,
-0.1618233323097229,
-0.047559674829244614,
-0.025797821581363678,
-0.005316728726029396,
-0.12063262611627579,
-0.015252994373440742,
-0.06368083506822586,
0.12134119123220444,
0.0952882319688797,
-0.02404888905584812,
-0.027775444090366364,
0.07090072333812714,
-0.06756236404180527,
-0.027907779440283775,
-0.01721213012933731,
0.059983380138874054,
-0.08187933266162872,
-0.014935635961592197,
-0.0871385782957077,
-0.12662777304649353,
-0.035960786044597626,
0.03631635382771492,
0.016752371564507484,
0.06339491903781891,
-0.12090973556041718,
-0.08671744912862778,
0.11756979674100876,
-0.11931674927473068,
-0.09498804807662964,
0.20111213624477386,
0.04528982937335968,
-0.03122827783226967,
0.15040700137615204,
0.263073593378067,
0.04895217716693878,
-0.12115731835365295,
-0.04396539926528931,
0.06333862990140915,
0.04487936943769455,
-0.10359267145395279,
0.08090852946043015,
-0.1279468536376953,
-0.10914549231529236,
0.03371965140104294,
0.05100707709789276,
0.0516057126224041,
-0.03591155633330345,
-0.0377608984708786,
-0.03531204164028168,
-0.09189741313457489,
-0.024000609293580055,
0.06919283419847488,
0.018735041841864586,
-0.12258562445640564,
-0.05445411801338196,
0.0017074806382879615,
0.09330330789089203,
-0.11185917258262634,
0.060515113174915314,
-0.12449250370264053,
0.16918690502643585,
-0.1449093520641327,
-0.08369463682174683,
-0.11682731658220291,
0.22555243968963623,
-0.012185942381620407,
0.03591305390000343,
0.10416153818368912,
0.036242879927158356,
0.028737178072333336,
-0.1010877713561058,
-0.021624812856316566,
-0.026257438585162163,
0.10938438773155212,
-0.007165309973061085,
-0.0031905286014080048,
-0.14853766560554504,
0.07214149832725525,
-0.0413934588432312,
-0.005591290071606636,
0.017048297449946404,
-0.08542271703481674,
0.032774586230516434,
0.10043546557426453,
-0.060568708926439285,
0.014122970402240753,
0.04513916000723839,
0.046914830803871155,
-0.012045176699757576,
0.07312501966953278,
0.02125673182308674,
0.02152409218251705,
-0.0825861319899559,
0.22432562708854675,
-0.023590628057718277,
0.15255124866962433,
0.15596339106559753,
-0.1551218330860138,
0.05841531231999397,
0.13392923772335052,
-0.025215310975909233,
0.0009293954935856164,
-0.02943478897213936,
-0.036767568439245224,
0.26460227370262146,
-0.038870323449373245,
0.10569416731595993,
-0.13071276247501373,
-0.03230739012360573,
0.02089976891875267,
-0.04102863743901253,
0.025575336068868637,
0.09988798946142197,
0.04133811965584755,
-0.0033985355403274298,
0.04682798683643341,
0.04570711404085159,
-0.128172367811203,
0.10618763417005539,
-0.13826145231723785,
-0.077873095870018,
0.06596801429986954,
0.026401342824101448,
-0.04683279991149902,
0.09962392598390579,
-0.23416410386562347,
0.01128490548580885,
0.056042399257421494,
0.04324423521757126,
0.08382026106119156,
-0.18900802731513977,
0.04458190128207207,
-0.0257473886013031,
-0.11669085919857025,
-0.11535878479480743,
0.08282284438610077,
-0.009104551747441292,
0.04024610295891762,
-0.1242137998342514,
-0.1593068689107895,
0.0333985835313797,
-0.02695656754076481,
-0.14158064126968384,
0.03752635419368744,
-0.0414879247546196,
-0.15773408114910126,
-0.10201526433229446,
0.0386987179517746,
-0.04132609814405441,
-0.024293359369039536,
0.12105684727430344,
-0.10977145284414291,
-0.013759900815784931,
-0.01897231489419937,
-0.028517520055174828,
-0.07525567710399628,
0.03942076861858368,
-0.035908784717321396,
-0.007500899955630302,
0.08589049428701401,
-0.14018245041370392,
-0.026628602296113968,
-0.037093088030815125,
0.0011055140057578683,
-0.0004544672556221485,
-0.02359583042562008,
-0.0036781872622668743,
0.15485471487045288,
0.08287619799375534,
0.009670673869550228,
-0.048312343657016754,
0.11067570745944977,
-0.08551952242851257,
-0.1322217583656311,
0.1703500896692276,
-0.0308064017444849,
-0.07380947470664978,
0.12140864878892899,
0.006666763685643673,
-0.05892348289489746,
-0.07352334260940552,
-0.012320881709456444,
-0.016892334446310997,
-0.33888673782348633,
-0.0949513390660286,
-0.07628810405731201,
0.00009588024840923026,
-0.06905128061771393,
0.04860952869057655,
0.048554833978414536,
-0.032826561480760574,
-0.002414524555206299,
-0.10889629274606705,
-0.017449775710701942,
-0.040581244975328445,
0.3067640960216522,
-0.06087559089064598,
0.10740362852811813,
-0.08678580075502396,
-0.0769166573882103,
0.031020594760775566,
-0.010904790833592415,
0.08425599336624146,
0.11272675544023514,
0.019791044294834137,
0.02325531467795372,
0.17969205975532532,
0.06634767353534698,
0.03879120945930481,
0.012343315407633781,
0.01309514045715332,
0.03342774510383606,
-0.01929626613855362,
-0.0319037139415741,
0.016093844547867775,
0.2613919675350189,
0.038768038153648376,
-0.04318670183420181,
-0.18213480710983276,
0.03867069259285927,
0.18053901195526123,
0.15407125651836395,
-0.12100894004106522,
-0.02306157723069191,
-0.015230463817715645,
-0.17517977952957153,
-0.027380384504795074,
0.1439477652311325,
-0.028451552614569664,
-0.06212543323636055,
0.13804058730602264,
0.039746906608343124,
0.07063499093055725,
-0.017821915447711945,
0.02307792752981186,
-0.0910695344209671,
-0.10852602124214172,
0.05979542061686516,
0.08155301213264465,
-0.2112003117799759,
0.19504673779010773,
0.015416642650961876,
0.10803396254777908,
-0.017971571534872055,
0.028558559715747833,
0.014429515227675438,
0.03734052926301956,
0.10448277741670609,
-0.03598611056804657,
-0.01827297918498516,
0.049245163798332214,
-0.0822642520070076,
0.11530043184757233,
0.008549177087843418,
0.1641535460948944,
-0.02594078704714775,
0.008910533972084522,
-0.00043350766645744443,
0.009830046445131302,
-0.1806773990392685,
-0.1645146757364273,
0.0208088681101799,
-0.01712775230407715,
0.23509104549884796,
0.13578683137893677,
-0.020518604665994644,
-0.11328158527612686,
-0.23832818865776062,
0.1087232455611229,
-0.07249177247285843,
-0.020599298179149628,
-0.0018445599125698209,
-0.10135949403047562,
0.1900012493133545,
0.007549030240625143,
-0.02473178692162037,
0.00029606427415274084,
-0.04710085317492485,
0.02113838493824005,
-0.026503998786211014,
0.05215981975197792,
-0.07079150527715683,
-0.09997216612100601,
-0.020505022257566452,
0.3410753607749939,
-0.06861232966184616,
0.11800720542669296,
0.05596242845058441,
0.01769329607486725,
-0.017136210575699806,
-0.020388944074511528,
0.05092741176486015,
0.024910785257816315,
-0.150713711977005,
0.04749631881713867,
0.0767349973320961,
-0.10885158181190491,
-0.10253416746854782,
-0.06090337038040161,
0.24493728578090668,
0.07207255065441132,
-0.012897740118205547,
0.10670647025108337,
0.16838611662387848,
-0.04742098227143288,
-0.17841081321239471,
-0.09557357430458069,
0.03809113800525665,
0.08812715113162994,
-0.10586866736412048,
-0.06368035823106766,
0.13761690258979797,
-0.036365486681461334,
-0.08620921522378922,
0.007590904366225004,
-0.17349931597709656,
-0.10252291709184647,
0.23553934693336487,
-0.1306079477071762,
0.2197670191526413,
-0.052087679505348206,
-0.090872623026371,
-0.0209847129881382,
-0.006013049744069576,
-0.06587155908346176,
-0.02672967128455639,
0.07501280307769775,
0.014167451299726963,
0.08927661180496216,
0.040370211005210876,
-0.028766119852662086,
0.1241699755191803,
0.003492494812235236,
-0.08222376555204391,
0.04274268448352814,
0.12659640610218048,
-0.11161111295223236,
0.08914965391159058,
0.12768462300300598,
-0.16346827149391174,
0.052824754267930984,
-0.06650710105895996,
-0.07097648084163666,
-0.06638684868812561,
0.08611134439706802,
0.07611607015132904,
0.03584042936563492,
-0.04845350980758667,
-0.10358711332082748,
-0.03768904507160187,
0.026503516361117363,
0.12449266761541367,
-0.10613972693681717,
0.08675733208656311,
0.05690465494990349,
0.24218103289604187,
-0.13928063213825226,
-0.12286704033613205,
-0.0006045473855920136,
-0.11571744829416275,
0.09006105363368988,
-0.06699991971254349,
0.060210756957530975,
0.12913838028907776,
0.04531130567193031,
0.06476515531539917,
0.020365837961435318,
-0.11659232527017593,
0.06388111412525177,
0.04154569283127785,
-0.04369847849011421,
-0.11837136745452881,
-0.016980810090899467,
-0.007878514006733894,
-0.014863348565995693,
0.10306356847286224,
0.131330206990242,
-0.06276094913482666,
-0.007810509763658047,
-0.0024937547277659178,
-0.03153650835156441,
-0.16229531168937683,
0.24482806026935577,
0.03325241804122925,
-0.0014323131181299686,
-0.14501923322677612,
0.03812789544463158,
-0.06607343256473541,
-0.13963738083839417,
0.010397903621196747,
-0.08514445275068283,
0.006510481704026461,
-0.07300133258104324,
-0.03486454486846924,
0.03024769388139248,
0.07027158886194229,
-0.1732827126979828,
-0.118204765021801,
-0.20591680705547333,
0.08132908493280411,
0.14365153014659882,
0.07136881351470947,
0.07154843956232071,
-0.05834526568651199,
-0.07349158078432083,
-0.005216488614678383,
0.03933892771601677,
0.03560873493552208,
0.02407822012901306,
-0.14046111702919006,
0.01751517690718174,
0.04388566315174103,
0.0624348409473896,
-0.0425683856010437,
-0.05834091454744339,
-0.014038573950529099,
0.1138872429728508,
-0.17584066092967987,
-0.010091091506183147,
-0.02825871668756008,
0.011385864578187466,
0.10883669555187225,
-0.09757973253726959,
-0.05826028436422348,
0.06291774660348892,
-0.10962274670600891,
0.057131629437208176,
0.029509730637073517,
0.1094110906124115,
-0.06505046039819717,
0.05057568848133087,
0.06083943322300911,
-0.05197243392467499,
0.097817063331604,
0.13373656570911407,
-0.1279430389404297,
0.13021445274353027,
-0.11823006719350815,
-0.1303597092628479,
0.1424424648284912,
0.07551680505275726,
-0.09048528224229813,
-0.12298725545406342,
0.026112930849194527,
0.2328612208366394,
0.05909864604473114,
0.04075669124722481,
0.018476715311408043,
-0.05945703387260437,
-0.08006781339645386,
-0.08991743624210358,
-0.07472370564937592,
0.0033874642103910446,
-0.0825520008802414,
0.10088441520929337,
0.13753986358642578,
0.1634581983089447,
-0.06846752762794495,
-0.000007043706318654586,
-0.09785924106836319,
0.05406133830547333,
-0.08637033402919769,
-0.04835737496614456,
-0.18229004740715027,
-0.0036694975569844246,
0.03681810945272446,
-0.028300320729613304,
0.19262093305587769,
-0.04690825939178467,
-0.030574673786759377,
0.006851922255009413,
0.04987334460020065,
-0.018866781145334244,
0.003164546797052026,
0.296455979347229,
0.05016212910413742,
0.0005649366648867726,
-0.06854913383722305,
-0.1103629395365715,
-0.009252894669771194,
0.03288739174604416,
-0.11858250200748444,
0.14811371266841888,
0.0734797790646553,
0.1046871542930603,
0.1269814819097519,
-0.01987094059586525,
0.028946757316589355,
0.0018501220038160682,
-0.02878694422543049,
0.08132930099964142,
0.061153773218393326,
0.2917945086956024,
0.0882243663072586,
0.011424864642322063,
0.036624982953071594,
-0.06456855684518814,
-0.045331429690122604,
-0.19424854218959808,
-0.006559653207659721,
-0.12898194789886475,
-0.179124653339386,
0.07131461054086685,
-0.030976392328739166,
0.037928588688373566,
0.0441768541932106,
0.08171454071998596,
-0.0438462495803833,
0.02970125526189804,
-0.04477812349796295,
-0.09649147093296051,
0.12297696620225906,
-0.07200977206230164,
-0.003400950925424695,
-0.07629413157701492,
0.002660897094756365,
0.1373407244682312,
0.0028444353956729174,
0.015168537385761738,
-0.011044587939977646,
-0.10282175987958908,
0.0178526584059,
-0.11191819608211517,
-0.033400602638721466,
0.008486812002956867,
-0.02611842378973961,
0.11183949559926987,
0.18781715631484985,
0.06920691579580307,
-0.0007991044549271464,
0.06439979374408722,
0.030564988031983376,
-0.09083741158246994,
-0.15182219445705414,
-0.09825973957777023,
0.17496304214000702,
-0.02570100873708725,
0.021797090768814087,
0.010353492572903633,
-0.021660542115569115,
0.026563521474599838,
0.18350031971931458,
0.15915757417678833,
0.0477135144174099,
0.016508545726537704,
-0.04536568000912666,
-0.023668238893151283,
-0.021558651700615883,
-0.07153545320034027,
0.11958771198987961,
0.18963085114955902,
-0.016619740054011345,
-0.05865195766091347,
-0.09728462249040604,
0.00292938738130033,
0.018648827448487282,
0.02292332611978054,
-0.05433286726474762,
-0.16854660212993622,
0.005908402614295483,
0.14831961691379547,
-0.0905524492263794,
0.044666390866041183,
-0.06364203989505768,
-0.08392327278852463,
-0.10267560184001923,
0.008763056248426437,
0.1344827264547348,
0.12106524407863617,
-0.09762873500585556,
-0.10811039060354233,
-0.03441441431641579,
0.08786944299936295,
-0.04588925465941429,
-0.1640472710132599,
-0.01632392592728138,
0.02898412011563778,
-0.041707661002874374,
-0.023319566622376442,
0.07104125618934631,
0.27091267704963684,
0.0014801813522353768,
0.12131178379058838,
0.01571534387767315,
0.1481863558292389,
0.027569066733121872,
-0.2009546011686325,
0.08096640557050705,
0.10528454929590225,
0.06855186820030212,
0.13452763855457306,
0.052831754088401794,
-0.016356760635972023,
0.0332774892449379,
-0.14194545149803162,
-0.0925067663192749,
-0.1773587018251419,
0.05276060849428177,
-0.06630271673202515,
0.02539902739226818,
-0.028440754860639572,
-0.05725601315498352,
-0.022384323179721832,
-0.030317658558487892,
0.019043542444705963,
0.04643283411860466,
-0.13518139719963074,
-0.015812469646334648,
-0.2294529527425766,
0.004600890912115574,
-0.07328687608242035,
-0.05154861509799957,
-0.1258910745382309,
-0.033461909741163254,
-0.020711399614810944,
-0.04760504886507988,
-0.03605238348245621,
0.021579088643193245,
0.05477810651063919,
-0.008627803064882755,
0.018967613577842712,
-0.0069577256217598915,
0.06980960816144943,
0.05964793637394905,
-0.10699394345283508,
-0.07422959804534912
] |
null | null |
transformers
|
## Evaluation on Common Voice PT Test
```python
import torchaudio
from datasets import load_dataset, load_metric
from transformers import (
    Wav2Vec2ForCTC,
    Wav2Vec2Processor,
)
import torch
import re
import sys

model_name = "facebook/wav2vec2-large-xlsr-53-portuguese"
device = "cuda"
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"]'  # noqa: W605

model = Wav2Vec2ForCTC.from_pretrained(model_name).to(device)
processor = Wav2Vec2Processor.from_pretrained(model_name)

ds = load_dataset("common_voice", "pt", split="test", data_dir="./cv-corpus-6.1-2020-12-11")

resampler = torchaudio.transforms.Resample(orig_freq=48_000, new_freq=16_000)

def map_to_array(batch):
    speech, _ = torchaudio.load(batch["path"])
    batch["speech"] = resampler.forward(speech.squeeze(0)).numpy()
    batch["sampling_rate"] = resampler.new_freq
    batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower().replace("’", "'")
    return batch

ds = ds.map(map_to_array)

def map_to_pred(batch):
    features = processor(batch["speech"], sampling_rate=batch["sampling_rate"][0], padding=True, return_tensors="pt")
    input_values = features.input_values.to(device)
    attention_mask = features.attention_mask.to(device)
    with torch.no_grad():
        logits = model(input_values, attention_mask=attention_mask).logits
    pred_ids = torch.argmax(logits, dim=-1)
    batch["predicted"] = processor.batch_decode(pred_ids)
    batch["target"] = batch["sentence"]
    return batch

result = ds.map(map_to_pred, batched=True, batch_size=16, remove_columns=list(ds.features.keys()))

wer = load_metric("wer")
print(wer.compute(predictions=result["predicted"], references=result["target"]))
```
**Result**: 27.1 %
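
The same checkpoint can also be used outside the evaluation loop to transcribe a single recording. A minimal sketch, assuming a local audio file at the hypothetical path `sample.wav` and running on CPU:

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_name = "facebook/wav2vec2-large-xlsr-53-portuguese"
processor = Wav2Vec2Processor.from_pretrained(model_name)
model = Wav2Vec2ForCTC.from_pretrained(model_name)

# "sample.wav" is a placeholder path; resample to the 16 kHz rate the model expects.
speech, sr = torchaudio.load("sample.wav")
if sr != 16_000:
    speech = torchaudio.transforms.Resample(orig_freq=sr, new_freq=16_000)(speech)

inputs = processor(speech.squeeze(0).numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding of the most likely token at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```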
|
{"language": "pt", "license": "apache-2.0", "tags": ["speech", "audio", "automatic-speech-recognition"], "datasets": ["common_voice"]}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-xlsr-53-portuguese
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"speech",
"audio",
"pt",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"pt"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #pt #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us
|
## Evaluation on Common Voice PT Test
Result: 27.1 %
|
[
"## Evaluation on Common Voice PT Test\n\nResult: 27.1 %"
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #pt #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n",
"## Evaluation on Common Voice PT Test\n\nResult: 27.1 %"
] |
[
65,
13
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #pt #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n## Evaluation on Common Voice PT Test\n\nResult: 27.1 %"
] |
[
-0.1785939782857895,
0.030758654698729515,
-0.003793761134147644,
-0.09761331975460052,
0.007135030813515186,
-0.03808102756738663,
0.05415958911180496,
0.13273215293884277,
-0.008013627491891384,
0.039840005338191986,
0.034556567668914795,
0.14713351428508759,
0.02947971224784851,
-0.05564971640706062,
-0.0694560781121254,
-0.13603636622428894,
0.057243458926677704,
0.10817121714353561,
0.13818754255771637,
0.11453670263290405,
0.07817986607551575,
-0.02101656049489975,
-0.011698083020746708,
0.10185260325670242,
-0.014647279866039753,
0.002556456020101905,
0.08693493902683258,
-0.13663935661315918,
0.1558714658021927,
0.030547022819519043,
-0.023073922842741013,
0.02545427903532982,
-0.022749118506908417,
-0.1894741654396057,
0.001709371106699109,
0.002134039532393217,
0.09434659779071808,
0.013521627523005009,
0.06469043344259262,
0.019539261236786842,
0.029683507978916168,
0.21306270360946655,
-0.000017134518202510662,
0.10374969244003296,
0.02677113562822342,
-0.2500626742839813,
-0.049047842621803284,
0.03458942472934723,
0.0967242568731308,
0.09126804769039154,
-0.05899865925312042,
0.24592582881450653,
-0.12113527953624725,
0.07796421647071838,
0.08416125178337097,
-0.2937060296535492,
0.010198064148426056,
-0.03171662986278534,
0.06260514259338379,
-0.017794348299503326,
-0.02362670935690403,
-0.004287718329578638,
0.08448244631290436,
0.06702377647161484,
-0.2003324031829834,
-0.05571914464235306,
-0.10382789373397827,
-0.0445401668548584,
-0.07607191801071167,
-0.0340881273150444,
0.2906646728515625,
0.034407008439302444,
-0.056164246052503586,
-0.09395366907119751,
0.02173781953752041,
-0.016456879675388336,
0.013855124823749065,
-0.08408735692501068,
-0.013804943300783634,
0.03086186945438385,
-0.08703988045454025,
0.064907968044281,
-0.11728730797767639,
-0.14054332673549652,
-0.13069505989551544,
-0.01322135142982006,
0.06700948625802994,
0.015379846096038818,
-0.02411644719541073,
0.04586512967944145,
-0.08793818205595016,
-0.01522212103009224,
-0.06777089834213257,
0.02048272266983986,
0.029276734218001366,
-0.018355589359998703,
-0.041633620858192444,
-0.10354825109243393,
0.1278182715177536,
0.054316580295562744,
0.05191295966506004,
0.05452949181199074,
-0.1520085334777832,
0.07283208519220352,
-0.007750832475721836,
0.14078855514526367,
-0.04631775990128517,
0.06623581796884537,
0.007760978303849697,
-0.028662733733654022,
0.03499813377857208,
0.047961145639419556,
-0.005404971074312925,
-0.04251239821314812,
0.1488889902830124,
0.07065105438232422,
-0.061377473175525665,
-0.06923488527536392,
-0.03431941941380501,
-0.011599431745707989,
0.013910278677940369,
-0.07133791595697403,
-0.016609326004981995,
0.05781560391187668,
0.08495692163705826,
0.20548509061336517,
0.021054135635495186,
0.0560779795050621,
-0.07779324799776077,
-0.01164618507027626,
0.06455768644809723,
0.011932301335036755,
0.07731670886278152,
0.007374416571110487,
0.07510748505592346,
-0.009642365388572216,
0.01690385676920414,
-0.07203175127506256,
-0.0005438033258542418,
-0.06857035309076309,
-0.05820128321647644,
0.01627071015536785,
-0.1540578305721283,
-0.03866032138466835,
-0.02395974099636078,
-0.005099380388855934,
-0.11289169639348984,
0.0282355435192585,
-0.056395478546619415,
0.11712787300348282,
0.055565934628248215,
-0.044964324682950974,
-0.021914223209023476,
0.06608206778764725,
-0.051777973771095276,
-0.027804458513855934,
-0.000630331109277904,
0.04623548686504364,
-0.05038570612668991,
-0.03621566295623779,
-0.10625996440649033,
-0.12781384587287903,
-0.0940253883600235,
0.03364192321896553,
-0.019206471741199493,
0.07684329897165298,
-0.15953989326953888,
-0.09241484105587006,
0.1233653575181961,
-0.11760058999061584,
-0.10200879722833633,
0.19738583266735077,
0.05113270878791809,
0.023703651502728462,
0.13673414289951324,
0.272065132856369,
0.052032120525836945,
-0.16902931034564972,
0.00093765341443941,
0.0698506236076355,
0.010884072631597519,
-0.11310629546642303,
0.09748433530330658,
-0.16375382244586945,
-0.1144244596362114,
0.017454812303185463,
-0.009374856948852539,
0.02507784590125084,
-0.04274150729179382,
-0.033956483006477356,
-0.026964327320456505,
-0.09354638308286667,
-0.03675829619169235,
0.07200349122285843,
0.0440436415374279,
-0.0994015634059906,
-0.04990772902965546,
0.008488114923238754,
0.11042296886444092,
-0.09143748879432678,
0.0727795660495758,
-0.12750713527202606,
0.11893674731254578,
-0.14203175902366638,
-0.07896462827920914,
-0.10148552805185318,
0.25457483530044556,
-0.0013532692100852728,
0.01498261746019125,
0.0838012844324112,
0.05979534983634949,
0.054374102503061295,
-0.12777256965637207,
-0.024793602526187897,
-0.02827252447605133,
0.12055502831935883,
-0.0018533682450652122,
-0.03579707443714142,
-0.12927088141441345,
0.07675254344940186,
-0.03328545391559601,
-0.001238725846633315,
0.04046684131026268,
-0.06750812381505966,
0.01570591889321804,
0.07941199839115143,
-0.04947933182120323,
-0.00005300933480612002,
0.04639175534248352,
0.0417906790971756,
-0.013543067499995232,
0.0602007620036602,
0.029574617743492126,
0.03347446024417877,
-0.07526393234729767,
0.2339407354593277,
-0.06240962818264961,
0.16207875311374664,
0.14490604400634766,
-0.12647530436515808,
0.03954390808939934,
0.12668688595294952,
-0.02426912449300289,
-0.012775389477610588,
-0.03902025893330574,
-0.058740150183439255,
0.24747022986412048,
-0.030517008155584335,
0.09301674365997314,
-0.12018968909978867,
-0.050271499902009964,
0.025546623393893242,
-0.05261066555976868,
0.005769378039985895,
0.08914464712142944,
0.016254594549536705,
0.008338644169270992,
0.045662056654691696,
0.0768311619758606,
-0.1256861388683319,
0.10752276331186295,
-0.11948027461767197,
-0.1009645015001297,
0.06061052531003952,
0.03202034905552864,
-0.03101375885307789,
0.0670848861336708,
-0.2541636824607849,
0.014048757962882519,
0.06327153742313385,
0.06796573102474213,
0.08933588117361069,
-0.18546229600906372,
0.03312506899237633,
-0.031163079664111137,
-0.09231287986040115,
-0.127654567360878,
0.10800568759441376,
-0.00005193545439396985,
0.04353936389088631,
-0.1097332239151001,
-0.15463757514953613,
0.031684763729572296,
-0.02206234075129032,
-0.171469047665596,
0.03866424784064293,
-0.04229448735713959,
-0.16241197288036346,
-0.09834922105073929,
0.0828046053647995,
0.007949454709887505,
-0.01272918563336134,
0.12603233754634857,
-0.12107952684164047,
0.008277716115117073,
-0.018018148839473724,
-0.026441164314746857,
-0.07042182981967926,
0.04522951692342758,
-0.05051747336983681,
0.00636686198413372,
0.08674796670675278,
-0.12009070813655853,
-0.02135017141699791,
-0.04844510182738304,
-0.06881324201822281,
0.0027130490634590387,
-0.07889652252197266,
-0.0010433309944346547,
0.2341660112142563,
0.10508336126804352,
0.013104821555316448,
-0.04814421758055687,
0.08194543421268463,
-0.10034557431936264,
-0.11420948058366776,
0.15105588734149933,
-0.023386958986520767,
-0.06886673718690872,
0.10373209416866302,
-0.017344225198030472,
-0.030826682224869728,
-0.05265238881111145,
0.0035597113892436028,
-0.021293457597494125,
-0.3365507125854492,
-0.08166735619306564,
-0.04897746071219444,
0.018313171342015266,
-0.06232145428657532,
0.056007061153650284,
0.09912677109241486,
-0.02341221272945404,
0.0016660975525155663,
-0.13852818310260773,
0.022385699674487114,
-0.026312481611967087,
0.2125197798013687,
-0.06350567936897278,
0.1147138699889183,
-0.08348624408245087,
-0.07192569971084595,
0.036021888256073,
0.03697924315929413,
0.07220003008842468,
0.1152043342590332,
-0.009262424893677235,
0.040830936282873154,
0.19707311689853668,
0.05929741635918617,
0.040545158088207245,
0.023206548765301704,
0.016473490744829178,
0.013050355948507786,
-0.011505057103931904,
-0.024083223193883896,
0.03894861415028572,
0.26518258452415466,
0.02274378575384617,
-0.058375101536512375,
-0.1756182461977005,
0.04669938236474991,
0.23416754603385925,
0.14784900844097137,
-0.10374342650175095,
-0.024761362001299858,
-0.008421349339187145,
-0.16621635854244232,
-0.019796350970864296,
0.1401786506175995,
-0.020563090220093727,
-0.09460553526878357,
0.12099353224039078,
0.02593510039150715,
0.07818791270256042,
-0.02797410637140274,
0.03135845810174942,
-0.056908149272203445,
-0.15156501531600952,
0.047617994248867035,
0.07112575322389603,
-0.27258607745170593,
0.20425187051296234,
0.00581339979544282,
0.10704447329044342,
-0.012589441612362862,
0.007823814637959003,
0.015866946429014206,
0.03618728369474411,
0.10703880339860916,
-0.04048136621713638,
0.018463922664523125,
0.0030952554661780596,
-0.08052562177181244,
0.10318820178508759,
-0.002988699357956648,
0.14635993540287018,
-0.05582207441329956,
-0.0015938787255436182,
0.022024301812052727,
0.02086106687784195,
-0.10950973629951477,
-0.1526695042848587,
0.06775061041116714,
-0.01256226934492588,
0.18647703528404236,
0.08147460222244263,
-0.023738032206892967,
-0.1104179099202156,
-0.2362624555826187,
0.03698888421058655,
-0.13617020845413208,
-0.005440697073936462,
0.0232391357421875,
-0.1614706963300705,
0.16879086196422577,
-0.012116090394556522,
-0.048812318593263626,
0.01847635768353939,
-0.02506236359477043,
0.03297825902700424,
-0.040111903101205826,
0.06837721914052963,
-0.07846671342849731,
-0.11072390526533127,
-0.016722939908504486,
0.3319510817527771,
-0.05297982320189476,
0.1255865842103958,
0.05166175961494446,
-0.0052794902585446835,
-0.011989075690507889,
-0.008596493862569332,
0.06565019488334656,
0.05439009889960289,
-0.1720312088727951,
0.015981687232851982,
0.054991185665130615,
-0.10402537882328033,
-0.12832793593406677,
-0.052231986075639725,
0.2201036661863327,
0.07147189974784851,
-0.009021750651299953,
0.09638388454914093,
0.15978819131851196,
-0.0652008131146431,
-0.17387989163398743,
-0.07802920788526535,
0.04367939755320549,
0.0445738211274147,
-0.08745600283145905,
-0.05268583074212074,
0.1155984103679657,
0.04138012230396271,
-0.08757335692644119,
-0.04790449142456055,
-0.20916573703289032,
-0.10486641526222229,
0.2866731882095337,
-0.12867535650730133,
0.16529572010040283,
-0.08142070472240448,
-0.08385121822357178,
-0.004257490858435631,
-0.03227226436138153,
-0.08651186525821686,
-0.07382261753082275,
0.06355861574411392,
0.022418973967432976,
0.09654592722654343,
0.031110938638448715,
-0.028361227363348007,
0.10905725508928299,
0.05052145943045616,
-0.0786556676030159,
0.04212041571736336,
0.12259043753147125,
-0.12695759534835815,
0.07946063578128815,
0.11698471754789352,
-0.07119543105363846,
0.03403010219335556,
-0.04682719334959984,
-0.06445202976465225,
-0.09151633828878403,
0.07000406086444855,
0.07397758960723877,
0.011796790175139904,
-0.041009221225976944,
-0.09283085912466049,
-0.03925250098109245,
0.011022777296602726,
0.1420823037624359,
-0.1150013878941536,
0.13761210441589355,
0.05228440463542938,
0.23567213118076324,
-0.13515588641166687,
-0.14146263897418976,
-0.01323755457997322,
-0.1303541213274002,
0.08156811445951462,
-0.03620182350277901,
0.06899494677782059,
0.11890141665935516,
0.03966382518410683,
0.08160313963890076,
0.030619200319051743,
-0.12516699731349945,
0.07845784723758698,
0.024966083467006683,
-0.01181920524686575,
-0.1166784018278122,
-0.002279845532029867,
0.002279249718412757,
-0.016345907002687454,
0.1208842545747757,
0.12616030871868134,
-0.048957761377096176,
-0.01771347038447857,
0.0011819966603070498,
-0.017234662547707558,
-0.16695837676525116,
0.2823474705219269,
0.03740827739238739,
-0.002630959264934063,
-0.13659372925758362,
0.06000296771526337,
-0.0767587423324585,
-0.1215609610080719,
0.01296994462609291,
-0.1077381893992424,
0.005060437601059675,
-0.07170065492391586,
0.0035776044242084026,
0.024626141414046288,
0.06290965527296066,
-0.16042982041835785,
-0.08052872121334076,
-0.17074836790561676,
0.08342369645833969,
0.10989657044410706,
0.07410931587219238,
0.08255031704902649,
-0.05578923225402832,
-0.059611860662698746,
-0.020829251036047935,
0.01234350260347128,
0.04173635318875313,
0.023665547370910645,
-0.16451846063137054,
0.06393484026193619,
0.05117068067193031,
0.06342767924070358,
-0.04780666530132294,
-0.07241616398096085,
-0.020507285371422768,
0.1190013438463211,
-0.11510270833969116,
-0.007659395225346088,
-0.0589723065495491,
-0.010412090457975864,
0.11293163895606995,
-0.09148357808589935,
-0.06812218576669693,
0.05773095414042473,
-0.1046709194779396,
0.07032862305641174,
0.027363980188965797,
0.10544563084840775,
-0.05213311314582825,
0.06344731152057648,
0.061692312359809875,
-0.04252331331372261,
0.092131108045578,
0.13132771849632263,
-0.12620025873184204,
0.14479199051856995,
-0.12707659602165222,
-0.09851949661970139,
0.15338870882987976,
0.0572301484644413,
-0.0950397402048111,
-0.1201455220580101,
0.01814189925789833,
0.2150912582874298,
0.06297208368778229,
0.04683998227119446,
0.06192707270383835,
-0.04674602299928665,
-0.05771607905626297,
-0.10336166620254517,
-0.07838309556245804,
-0.002083810744807124,
-0.07660572230815887,
0.112522192299366,
0.15625843405723572,
0.1428947150707245,
-0.07871731370687485,
0.0029482238460332155,
-0.1028272733092308,
0.044226791709661484,
-0.0773889422416687,
-0.03627608343958855,
-0.1514863222837448,
0.021554403007030487,
0.03267381340265274,
-0.019861958920955658,
0.1699991077184677,
-0.07863657176494598,
0.004668996203690767,
0.014674035832285881,
0.019350243732333183,
-0.04019094631075859,
-0.0027990841772407293,
0.31640321016311646,
0.051987625658512115,
-0.01324701588600874,
-0.043666139245033264,
-0.07975521683692932,
-0.02363811992108822,
0.059270817786455154,
-0.09657050669193268,
0.1605502814054489,
0.05286410450935364,
0.10402442514896393,
0.10902193188667297,
-0.019523920491337776,
0.023455388844013214,
-0.008432939648628235,
-0.01778116449713707,
0.06652171164751053,
0.05875512957572937,
0.2793610394001007,
0.10686948895454407,
-0.02704598754644394,
0.01989198848605156,
-0.07282132655382156,
-0.044467732310295105,
-0.23141607642173767,
-0.01354787964373827,
-0.1284329891204834,
-0.18327920138835907,
0.05668003857135773,
-0.02716648578643799,
0.01059106644243002,
0.06015462428331375,
0.09111091494560242,
-0.03902493789792061,
0.028874365612864494,
-0.06826357543468475,
-0.10570397228002548,
0.11826950311660767,
-0.07167049497365952,
-0.01708516664803028,
-0.1264999508857727,
0.004253777675330639,
0.16258318722248077,
0.0037963318172842264,
0.02844163030385971,
-0.04393533989787102,
-0.11134526878595352,
0.002708710264414549,
-0.14091569185256958,
-0.03792351484298706,
0.016499124467372894,
-0.017550015822052956,
0.07837456464767456,
0.16758263111114502,
0.08342932909727097,
-0.009471391327679157,
0.07337087392807007,
0.054956868290901184,
-0.08744614571332932,
-0.11267925053834915,
-0.13645413517951965,
0.21120785176753998,
-0.02339482493698597,
0.016387829557061195,
0.03326703608036041,
-0.016464415937662125,
0.01734740659594536,
0.18236136436462402,
0.17639535665512085,
0.05152982100844383,
0.00974037405103445,
-0.03321593254804611,
-0.025811869651079178,
-0.029155079275369644,
-0.04851638898253441,
0.129459410905838,
0.1563534438610077,
-0.016100293025374413,
-0.01430895458906889,
-0.05518695339560509,
-0.003629604820162058,
0.04108237475156784,
0.012412698939442635,
-0.05308796092867851,
-0.1494297981262207,
0.007159935776144266,
0.16055375337600708,
-0.08117393404245377,
0.049989230930805206,
-0.11086775362491608,
-0.05594740808010101,
-0.0932699590921402,
0.0289467666298151,
0.0971454307436943,
0.0929664894938469,
-0.08157375454902649,
-0.08922459185123444,
-0.017055440694093704,
0.11903412640094757,
-0.035661228001117706,
-0.15978068113327026,
-0.07400019466876984,
0.041061077266931534,
0.0004277941770851612,
-0.0024143841583281755,
0.06381367146968842,
0.2220074087381363,
-0.011001079343259335,
0.11977479606866837,
0.005462510045617819,
0.1555669754743576,
0.01690012775361538,
-0.16353674232959747,
0.06005382537841797,
0.0980800911784172,
0.05546136572957039,
0.14117634296417236,
0.07252567261457443,
-0.10869857668876648,
0.015733608976006508,
-0.19245092570781708,
-0.07257619500160217,
-0.1927361935377121,
0.07350674271583557,
-0.07936662435531616,
0.025404587388038635,
-0.031393863260746,
-0.04736468940973282,
-0.007348811253905296,
-0.0314839705824852,
0.01675095409154892,
0.03923868387937546,
-0.0772024393081665,
-0.01257976796478033,
-0.2134789675474167,
0.00005652864638250321,
-0.16420064866542816,
-0.05146193504333496,
-0.1324724406003952,
-0.0351954810321331,
-0.023385731503367424,
-0.07629382610321045,
-0.02712935395538807,
0.01980810984969139,
0.007448010146617889,
0.001773171708919108,
0.012977180071175098,
0.02423112839460373,
0.07193002104759216,
0.08160515874624252,
-0.10570429265499115,
-0.06186462566256523
] |
null | null |
transformers
|
## Evaluation on Common Voice ES Test
```python
import torchaudio
from datasets import load_dataset, load_metric
from transformers import (
Wav2Vec2ForCTC,
Wav2Vec2Processor,
)
import torch
import re
model_name = "facebook/wav2vec2-large-xlsr-53-spanish"
device = "cuda"
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"]' # noqa: W605
model = Wav2Vec2ForCTC.from_pretrained(model_name).to(device)
processor = Wav2Vec2Processor.from_pretrained(model_name)
ds = load_dataset("common_voice", "es", split="test", data_dir="./cv-corpus-6.1-2020-12-11")
resampler = torchaudio.transforms.Resample(orig_freq=48_000, new_freq=16_000)
def map_to_array(batch):
speech, _ = torchaudio.load(batch["path"])
batch["speech"] = resampler.forward(speech.squeeze(0)).numpy()
batch["sampling_rate"] = resampler.new_freq
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower().replace("’", "'")
return batch
ds = ds.map(map_to_array)
def map_to_pred(batch):
features = processor(batch["speech"], sampling_rate=batch["sampling_rate"][0], padding=True, return_tensors="pt")
input_values = features.input_values.to(device)
attention_mask = features.attention_mask.to(device)
with torch.no_grad():
logits = model(input_values, attention_mask=attention_mask).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["predicted"] = processor.batch_decode(pred_ids)
batch["target"] = batch["sentence"]
return batch
result = ds.map(map_to_pred, batched=True, batch_size=16, remove_columns=list(ds.features.keys()))
wer = load_metric("wer")
print(wer.compute(predictions=result["predicted"], references=result["target"]))
```
**Result (WER)**: 17.6 %
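For quick single-file transcription with this checkpoint, the snippet below is a minimal sketch that mirrors the evaluation loop above; the audio path `example_es.wav` is a placeholder, and the resampling step assumes the file is not already at 16 kHz.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-large-xlsr-53-spanish")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-large-xlsr-53-spanish")

# load an arbitrary Spanish recording and resample it to the 16 kHz the model expects
speech, sr = torchaudio.load("example_es.wav")  # placeholder path
speech = torchaudio.transforms.Resample(orig_freq=sr, new_freq=16_000)(speech.squeeze(0))

inputs = processor(speech.numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# greedy CTC decoding, as in the evaluation script above
print(processor.batch_decode(torch.argmax(logits, dim=-1)))
```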
|
{"language": "es", "license": "apache-2.0", "tags": ["speech", "audio", "automatic-speech-recognition"], "datasets": ["common_voice"]}
|
automatic-speech-recognition
|
facebook/wav2vec2-large-xlsr-53-spanish
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"speech",
"audio",
"es",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"es"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #es #dataset-common_voice #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
## Evaluation on Common Voice ES Test
Result: 17.6 %
|
[
"## Evaluation on Common Voice ES Test\n\nResult: 17.6 %"
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #es #dataset-common_voice #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"## Evaluation on Common Voice ES Test\n\nResult: 17.6 %"
] |
[
69,
13
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #speech #audio #es #dataset-common_voice #license-apache-2.0 #endpoints_compatible #has_space #region-us \n## Evaluation on Common Voice ES Test\n\nResult: 17.6 %"
] |
[
-0.17808641493320465,
0.038671914488077164,
-0.003617695299908519,
-0.08002137392759323,
-0.004942659754306078,
-0.038105569779872894,
0.0629613921046257,
0.12499396502971649,
0.07247012853622437,
0.048045966774225235,
0.051151227205991745,
0.13378873467445374,
0.015780873596668243,
-0.04177770018577576,
-0.08954208344221115,
-0.08962127566337585,
0.0694621205329895,
0.07840556651353836,
0.11720925569534302,
0.07589228451251984,
0.11524073779582977,
-0.05244586244225502,
-0.009089937433600426,
0.08382126688957214,
-0.033040162175893784,
0.06107369065284729,
0.08858351409435272,
-0.14936114847660065,
0.14211225509643555,
0.03958871588110924,
-0.005754999350756407,
0.055331047624349594,
-0.007294507697224617,
-0.19836363196372986,
-0.006286120507866144,
-0.019312119111418724,
0.06258508563041687,
0.0005139057175256312,
0.04019234701991081,
-0.017626309767365456,
0.008170102722942829,
0.12361519038677216,
-0.012075330130755901,
0.07402075082063675,
0.03972594812512398,
-0.2623136341571808,
-0.04174737632274628,
-0.057805392891168594,
0.0995885357260704,
0.05922713503241539,
-0.07143397629261017,
0.12293751537799835,
-0.13120409846305847,
0.09747903048992157,
0.13150274753570557,
-0.2681501805782318,
0.03190634399652481,
-0.009272216819226742,
0.09697980433702469,
-0.008419536985456944,
-0.01791706681251526,
0.05550709366798401,
0.11939121037721634,
0.05733156204223633,
-0.1356019824743271,
-0.02852959930896759,
-0.176804780960083,
0.010345305316150188,
-0.049548953771591187,
-0.0336202047765255,
0.3177267014980316,
0.03309731185436249,
-0.05135098099708557,
-0.10674668103456497,
0.013172002509236336,
0.015483282506465912,
0.05122801288962364,
-0.08107337355613708,
0.030917417258024216,
0.022521313279867172,
-0.03106905147433281,
0.05645597726106644,
-0.12381158024072647,
-0.1194906160235405,
-0.1301955282688141,
0.0736091136932373,
0.041367512196302414,
0.016965683549642563,
-0.030541714280843735,
0.009626652114093304,
-0.0810205265879631,
-0.06159621477127075,
-0.05891454964876175,
0.051562074571847916,
0.006402391940355301,
0.01952088437974453,
-0.05972154811024666,
-0.12581102550029755,
0.14529943466186523,
0.04535470902919769,
0.03107050061225891,
0.019218750298023224,
-0.08349170535802841,
0.07670792937278748,
0.02147034741938114,
0.16020739078521729,
-0.011401001363992691,
-0.021794889122247696,
0.018894124776124954,
-0.02547208033502102,
0.07078546285629272,
-0.0032235709950327873,
-0.048577431589365005,
-0.06103959307074547,
0.09052368998527527,
0.06262283772230148,
-0.02162417210638523,
-0.09432481974363327,
-0.057923249900341034,
0.044032420963048935,
0.0014311872655525804,
-0.09751997143030167,
0.02854952961206436,
0.0173476692289114,
0.0527755431830883,
0.1462879776954651,
0.013488116674125195,
0.045704010874032974,
-0.04941662400960922,
0.0028725690208375454,
0.05260729417204857,
-0.0034772881772369146,
0.0520012341439724,
-0.027615858241915703,
0.08534535020589828,
-0.012064190581440926,
0.0567304752767086,
-0.08446644991636276,
-0.0501825250685215,
-0.08096007257699966,
-0.03957279399037361,
0.007581348530948162,
-0.1634892374277115,
-0.06086710840463638,
-0.04830094426870346,
0.03685932978987694,
-0.12956950068473816,
-0.004613294266164303,
-0.06587304919958115,
0.08021951466798782,
0.0605354867875576,
0.0065337433479726315,
-0.014879484660923481,
0.06253974884748459,
-0.04365255683660507,
-0.033548254519701004,
0.01643090695142746,
0.08508016914129257,
-0.07023756206035614,
-0.021382644772529602,
-0.06718206405639648,
-0.0826711356639862,
-0.05199939012527466,
0.07203255593776703,
0.011536709032952785,
0.06223972886800766,
-0.21250151097774506,
-0.08030621707439423,
0.040210772305727005,
-0.10700821131467819,
-0.10380547493696213,
0.21241697669029236,
0.04763282090425491,
-0.032434407621622086,
0.1345718950033188,
0.304523766040802,
-0.012704753316938877,
-0.12953223288059235,
-0.0557117834687233,
0.08638162165880203,
0.042679451406002045,
-0.08593838661909103,
0.09927688539028168,
-0.11755984276533127,
-0.03680810704827309,
0.019808482378721237,
0.017791103571653366,
0.021646955981850624,
-0.035434212535619736,
-0.02756601944565773,
-0.04435313865542412,
-0.08639945834875107,
-0.0032094502821564674,
0.01607138104736805,
0.02356116473674774,
-0.15092872083187103,
-0.08086109906435013,
0.03352450206875801,
0.1012258306145668,
-0.09102044999599457,
0.07031992822885513,
-0.11089315265417099,
0.16065533459186554,
-0.13610932230949402,
-0.07575040310621262,
-0.09734805673360825,
0.2291155457496643,
-0.0017017885111272335,
0.024703092873096466,
0.06544346362352371,
0.09507446736097336,
0.007276230026036501,
-0.10893779247999191,
-0.015446305274963379,
-0.03804309293627739,
0.10796227306127548,
0.018214572221040726,
-0.04110228270292282,
-0.15511168539524078,
0.060883134603500366,
-0.05353831872344017,
-0.0728687047958374,
0.006606647279113531,
-0.05294691026210785,
0.021915379911661148,
0.09660253673791885,
-0.054472267627716064,
0.043324634432792664,
0.013606562279164791,
0.030693652108311653,
-0.045146647840738297,
0.05950741842389107,
-0.012741554528474808,
0.009027260355651379,
-0.08319470286369324,
0.22698535025119781,
-0.07790622115135193,
0.18464429676532745,
0.14974823594093323,
-0.1407860666513443,
0.0717187151312828,
0.19543765485286713,
0.014403978362679482,
0.003861892968416214,
-0.046386800706386566,
-0.08417721092700958,
0.2292913794517517,
-0.06295903772115707,
0.08553807437419891,
-0.14170536398887634,
-0.0527133010327816,
-0.0035037854686379433,
-0.05784192681312561,
-0.013242140412330627,
0.09006360918283463,
-0.03023306466639042,
0.01147694792598486,
0.07721330970525742,
0.1289556324481964,
-0.15203748643398285,
0.1297377645969391,
-0.08396755158901215,
-0.07231377810239792,
0.040564145892858505,
-0.016530318185687065,
-0.07472884654998779,
0.09618372470140457,
-0.23933370411396027,
-0.01710117608308792,
0.056564170867204666,
0.06483806669712067,
0.09892302006483078,
-0.1786177009344101,
0.04055808484554291,
0.011546107940375805,
-0.10558337718248367,
-0.137377068400383,
0.10423702746629715,
-0.02072686329483986,
0.050570014864206314,
-0.15544109046459198,
-0.16875304281711578,
0.02502647042274475,
-0.03941045701503754,
-0.14862582087516785,
0.06321793049573898,
-0.07981929928064346,
-0.1374255269765854,
-0.0773211121559143,
0.08435892313718796,
-0.005223891232162714,
0.0187196247279644,
0.18539820611476898,
-0.10776709765195847,
-0.019123107194900513,
-0.034502025693655014,
-0.015033205971121788,
-0.07636803388595581,
0.0386391282081604,
-0.004141886252909899,
-0.021185973659157753,
0.055781230330467224,
-0.15149162709712982,
-0.032962314784526825,
-0.024104945361614227,
0.008511582389473915,
-0.00574916647747159,
-0.005818746984004974,
0.016941620036959648,
0.18313461542129517,
0.04865887016057968,
-0.03607755899429321,
-0.05895669013261795,
0.1214170753955841,
-0.08266095817089081,
-0.10230061411857605,
0.16862580180168152,
0.002924666740000248,
-0.0672730877995491,
0.15143485367298126,
-0.004989953711628914,
-0.0528179332613945,
-0.07052856683731079,
-0.030144356191158295,
-0.02783340960741043,
-0.35814428329467773,
-0.07703480869531631,
-0.07853591442108154,
-0.01842583902180195,
-0.11500459164381027,
0.050639376044273376,
0.04648188129067421,
-0.03217028081417084,
-0.003948451951146126,
-0.09181997179985046,
0.015041394159197807,
-0.04367341101169586,
0.25009363889694214,
-0.06982891261577606,
0.12131325155496597,
-0.07019291818141937,
-0.03245199844241142,
0.038321055471897125,
0.05382375791668892,
0.05467066541314125,
0.1146959513425827,
0.03997402265667915,
0.039513710886240005,
0.12803925573825836,
0.05881946161389351,
0.037191241979599,
0.021597562357783318,
0.02100512944161892,
-0.00495292479172349,
-0.0009932802058756351,
-0.004618347622454166,
0.07229465246200562,
0.2520960569381714,
0.0194063913077116,
-0.06183341518044472,
-0.1409711092710495,
0.07178302854299545,
0.16208989918231964,
0.14630921185016632,
-0.133036807179451,
-0.004013338126242161,
0.04857911542057991,
-0.15012586116790771,
-0.018684890121221542,
0.1179867684841156,
0.052751775830984116,
-0.0357510931789875,
0.1262250691652298,
0.08451905101537704,
0.0672663226723671,
0.04615216329693794,
0.03928860276937485,
-0.10911267250776291,
-0.09428055584430695,
0.052385762333869934,
0.07331425696611404,
-0.23172567784786224,
0.19500967860221863,
0.005568642634898424,
0.09379231929779053,
0.0012397747486829758,
0.045897431671619415,
0.00792978797107935,
0.04879409819841385,
0.1208513155579567,
-0.03385886922478676,
-0.17468081414699554,
0.0387602224946022,
-0.029153982177376747,
0.10842327773571014,
-0.014343545772135258,
0.11707305163145065,
-0.04882873594760895,
0.025582076981663704,
0.00912800058722496,
0.03801592066884041,
-0.06981098651885986,
-0.16085249185562134,
-0.00855887308716774,
-0.03293021395802498,
0.24766595661640167,
0.059062834829092026,
-0.040200039744377136,
-0.10112069547176361,
-0.22760191559791565,
0.06558193266391754,
-0.14175981283187866,
-0.03066345676779747,
0.0019752082880586386,
-0.1397412121295929,
0.17329049110412598,
-0.012607710435986519,
-0.02336273342370987,
0.03528511896729469,
0.009605098515748978,
-0.01387591939419508,
-0.022418063133955002,
0.08052908629179001,
-0.10645585507154465,
-0.08336729556322098,
-0.05110064893960953,
0.2770298421382904,
-0.05580057203769684,
0.12773871421813965,
0.059366609901189804,
0.012935000471770763,
-0.0337720587849617,
-0.005344633478671312,
0.07271905243396759,
0.10323815792798996,
-0.1538112610578537,
0.05313287302851677,
0.06314560770988464,
-0.18679605424404144,
-0.08697710186243057,
-0.07060299068689346,
0.2019949108362198,
0.12417005747556686,
-0.051403582096099854,
0.10279827564954758,
0.19657711684703827,
-0.08931875228881836,
-0.1825341135263443,
-0.10174377262592316,
0.03211385756731033,
0.10405676066875458,
-0.04266420379281044,
-0.028646845370531082,
0.12009173631668091,
0.0020214729011058807,
-0.09855198115110397,
-0.04881675913929939,
-0.15502944588661194,
-0.11986576020717621,
0.23977909982204437,
-0.18262958526611328,
0.2026999443769455,
-0.05970769003033638,
-0.07173367589712143,
-0.037797506898641586,
0.022753512486815453,
-0.0685737207531929,
-0.02563045732676983,
0.05587374418973923,
0.03953962028026581,
0.08779974281787872,
0.03026498109102249,
0.002080801874399185,
0.17931270599365234,
0.044791288673877716,
-0.0428706519305706,
0.053257059305906296,
0.08647646009922028,
-0.13615109026432037,
0.03786730393767357,
0.12825161218643188,
-0.15998032689094543,
0.03953627869486809,
-0.06277039647102356,
-0.04986044019460678,
-0.054191794246435165,
0.09548518061637878,
0.07260989397764206,
0.024248821660876274,
-0.06626034528017044,
-0.12603436410427094,
-0.017525015398859978,
0.0028638250660151243,
0.17048311233520508,
-0.09031976759433746,
0.10236024856567383,
0.08190707117319107,
0.22391556203365326,
-0.20829813182353973,
-0.10648050159215927,
0.029881341382861137,
-0.114237941801548,
0.09958839416503906,
-0.12412795424461365,
0.0723433718085289,
0.10721515864133835,
0.02013428322970867,
0.08146779239177704,
0.029796695336699486,
-0.11761411279439926,
0.07755710929632187,
0.055811021476984024,
-0.0042914715595543385,
-0.09280760586261749,
-0.007388899568468332,
0.010105673223733902,
-0.001987715717405081,
0.0855436772108078,
0.12144976109266281,
-0.058094512671232224,
-0.018394894897937775,
-0.026616545394062996,
-0.0024236426688730717,
-0.14824271202087402,
0.26306766271591187,
0.01712041348218918,
0.01969859190285206,
-0.11933855712413788,
0.0664726048707962,
-0.05644512176513672,
-0.14900422096252441,
0.0294825229793787,
-0.1235983744263649,
0.024109171703457832,
-0.06821178644895554,
-0.04698524251580238,
0.03782663494348526,
0.037769343703985214,
-0.1801036298274994,
-0.09265944361686707,
-0.18527068197727203,
0.05656491219997406,
0.10713863372802734,
0.08625739067792892,
0.06758392602205276,
-0.028383664786815643,
-0.05076433718204498,
0.024958666414022446,
0.049881480634212494,
0.058296848088502884,
0.04435938596725464,
-0.17414747178554535,
-0.005970470607280731,
0.037649966776371,
0.06300181150436401,
-0.042097315192222595,
-0.009837712161242962,
0.012712989933788776,
0.09913577884435654,
-0.11686786264181137,
0.024341581389307976,
-0.04815216362476349,
-0.007850821129977703,
0.12733371555805206,
-0.11839834600687027,
-0.07714244723320007,
0.07085637003183365,
-0.11438801139593124,
0.06439285725355148,
0.02588530257344246,
0.14824289083480835,
-0.06377270072698593,
0.003952282480895519,
0.04065651074051857,
-0.06462119519710541,
0.07796739786863327,
0.11367478221654892,
-0.11423489451408386,
0.10903308540582657,
-0.2265922576189041,
-0.13652008771896362,
0.1622457057237625,
0.08326514065265656,
-0.06256775557994843,
-0.12511605024337769,
0.014847596175968647,
0.22200821340084076,
0.04155012220144272,
0.0015625219093635678,
-0.03453667461872101,
-0.04318232461810112,
-0.07766333222389221,
-0.10622955113649368,
-0.09290505200624466,
-0.002761994954198599,
-0.08884775638580322,
0.15644121170043945,
0.13043899834156036,
0.16418218612670898,
-0.056866586208343506,
-0.00674936780706048,
-0.09600729495286942,
0.06720279902219772,
-0.0869954526424408,
-0.0482349619269371,
-0.18280529975891113,
0.008957735262811184,
0.043198633939027786,
-0.05204031243920326,
0.20996755361557007,
-0.031980227679014206,
-0.004245894029736519,
0.024405747652053833,
0.028408687561750412,
0.0038711391389369965,
0.03576333075761795,
0.25143080949783325,
0.003557973774150014,
0.013437964022159576,
-0.09297982603311539,
-0.0866655632853508,
0.03427385166287422,
0.06728121638298035,
-0.1354254186153412,
0.14306969940662384,
0.08958360552787781,
0.16684618592262268,
0.08756896108388901,
-0.05902345851063728,
-0.03597652167081833,
-0.005049902480095625,
-0.018630681559443474,
0.058611560612916946,
0.05483401194214821,
0.1936684101819992,
0.1434164196252823,
-0.04714064300060272,
0.045537710189819336,
-0.07895288616418839,
-0.03346749022603035,
-0.21775837242603302,
-0.013647150248289108,
-0.12671133875846863,
-0.1796564906835556,
0.03766497224569321,
-0.05429219827055931,
0.05609121918678284,
0.06578519940376282,
0.05922449007630348,
-0.020531337708234787,
-0.015163308009505272,
-0.05657714977860451,
-0.07997161895036697,
0.08542174845933914,
-0.0666448324918747,
0.009570195339620113,
-0.08228051662445068,
-0.017882214859128,
0.1522398442029953,
0.029123777523636818,
-0.0014070047764107585,
-0.03606358915567398,
-0.10172372311353683,
0.015519550070166588,
-0.1223846822977066,
-0.03351841866970062,
-0.002651545451954007,
-0.045188065618276596,
0.08532446622848511,
0.1496942937374115,
0.08506133407354355,
-0.03099062852561474,
0.09067078679800034,
0.08186963200569153,
-0.10370354354381561,
-0.0985606461763382,
-0.11504950374364853,
0.09551922976970673,
0.010606871917843819,
0.06892788410186768,
0.020268984138965607,
-0.05943578854203224,
-0.04018735885620117,
0.09709260612726212,
0.1840236932039261,
0.07217416912317276,
0.0045015341602265835,
-0.08277030289173126,
-0.018771858885884285,
-0.05396094545722008,
-0.11802241951227188,
0.1051110103726387,
0.20354856550693512,
-0.01349464152008295,
-0.006329202558845282,
-0.0900176540017128,
0.01720624789595604,
0.026409653946757317,
0.050080012530088425,
-0.07585816830396652,
-0.15984338521957397,
0.037079863250255585,
0.13386861979961395,
-0.03139298036694527,
-0.042895473539829254,
-0.061031173914670944,
-0.11113642901182175,
-0.06645181775093079,
0.0651664063334465,
0.19473488628864288,
0.10246698558330536,
-0.06162558123469353,
-0.1084701418876648,
-0.04269390180706978,
0.07650759071111679,
-0.05576273053884506,
-0.1538676917552948,
-0.0014970462070778012,
0.011637682095170021,
-0.03559371456503868,
0.03369162976741791,
0.060280751436948776,
0.2563244700431824,
0.004581236280500889,
0.1314631551504135,
0.006422632839530706,
0.15502490103244781,
0.023306887596845627,
-0.14515253901481628,
0.10880638659000397,
0.10895790904760361,
0.02956320531666279,
0.16530777513980865,
0.07612483203411102,
-0.0803074911236763,
0.03754818066954613,
-0.142982617020607,
-0.09253358095884323,
-0.1649627834558487,
0.044708818197250366,
-0.06544958800077438,
0.010754848830401897,
-0.08078764379024506,
-0.056000422686338425,
-0.03473024442791939,
0.0013122732052579522,
0.017261452972888947,
0.04363613203167915,
-0.05466781556606293,
-0.00452842703089118,
-0.21375924348831177,
0.0406772717833519,
-0.08032390475273132,
-0.02147775888442993,
-0.05599622800946236,
-0.037589967250823975,
-0.025550734251737595,
-0.054200924932956696,
-0.028550731018185616,
0.008865095674991608,
0.08644108474254608,
-0.02210436575114727,
-0.003818045137450099,
-0.053496189415454865,
0.06033480539917946,
0.07520284503698349,
-0.08444634079933167,
-0.08288724720478058
] |
null | null |
transformers
|
# Wav2Vec2-XLSR-53
[Facebook's XLSR-Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/)
The base model was pretrained on 16 kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16 kHz. Note that this model should be fine-tuned on a downstream task, like Automatic Speech Recognition; check out [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for more information.
[Paper](https://arxiv.org/abs/2006.13979)
Authors: Alexis Conneau, Alexei Baevski, Ronan Collobert, Abdelrahman Mohamed, Michael Auli
**Abstract**
This paper presents XLSR which learns cross-lingual speech representations by pretraining a single model from the raw waveform of speech in multiple languages. We build on wav2vec 2.0 which is trained by solving a contrastive task over masked latent speech representations and jointly learns a quantization of the latents shared across languages. The resulting model is fine-tuned on labeled data and experiments show that cross-lingual pretraining significantly outperforms monolingual pretraining. On the CommonVoice benchmark, XLSR shows a relative phoneme error rate reduction of 72% compared to the best known results. On BABEL, our approach improves word error rate by 16% relative compared to a comparable system. Our approach enables a single multilingual speech recognition model which is competitive to strong individual models. Analysis shows that the latent discrete speech representations are shared across languages with increased sharing for related languages. We hope to catalyze research in low-resource speech understanding by releasing XLSR-53, a large model pretrained in 53 languages.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
See [this notebook](https://colab.research.google.com/github/patrickvonplaten/notebooks/blob/master/Fine_Tune_XLSR_Wav2Vec2_on_Turkish_ASR_with_%F0%9F%A4%97_Transformers.ipynb) for more information on how to fine-tune the model.

|
{"language": "multilingual", "license": "apache-2.0", "tags": ["speech"], "datasets": ["common_voice"]}
| null |
facebook/wav2vec2-large-xlsr-53
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"pretraining",
"speech",
"multilingual",
"dataset:common_voice",
"arxiv:2006.13979",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2006.13979"
] |
[
"multilingual"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #pretraining #speech #multilingual #dataset-common_voice #arxiv-2006.13979 #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-XLSR-53
Facebook's XLSR-Wav2Vec2
The base model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. Note that this model should be fine-tuned on a downstream task, like Automatic Speech Recognition. Check out this blog for more information.
Paper
Authors: Alexis Conneau, Alexei Baevski, Ronan Collobert, Abdelrahman Mohamed, Michael Auli
Abstract
This paper presents XLSR which learns cross-lingual speech representations by pretraining a single model from the raw waveform of speech in multiple languages. We build on wav2vec 2.0 which is trained by solving a contrastive task over masked latent speech representations and jointly learns a quantization of the latents shared across languages. The resulting model is fine-tuned on labeled data and experiments show that cross-lingual pretraining significantly outperforms monolingual pretraining. On the CommonVoice benchmark, XLSR shows a relative phoneme error rate reduction of 72% compared to the best known results. On BABEL, our approach improves word error rate by 16% relative compared to a comparable system. Our approach enables a single multilingual speech recognition model which is competitive to strong individual models. Analysis shows that the latent discrete speech representations are shared across languages with increased sharing for related languages. We hope to catalyze research in low-resource speech understanding by releasing XLSR-53, a large model pretrained in 53 languages.
The original model can be found under URL
# Usage
See this notebook for more information on how to fine-tune the model.
!model image
|
[
"# Wav2Vec2-XLSR-53 \n\nFacebook's XLSR-Wav2Vec2\n\nThe base model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. Note that this model should be fine-tuned on a downstream task, like Automatic Speech Recognition. Check out this blog for more information.\n\nPaper\n\nAuthors: Alexis Conneau, Alexei Baevski, Ronan Collobert, Abdelrahman Mohamed, Michael Auli\n\nAbstract\nThis paper presents XLSR which learns cross-lingual speech representations by pretraining a single model from the raw waveform of speech in multiple languages. We build on wav2vec 2.0 which is trained by solving a contrastive task over masked latent speech representations and jointly learns a quantization of the latents shared across languages. The resulting model is fine-tuned on labeled data and experiments show that cross-lingual pretraining significantly outperforms monolingual pretraining. On the CommonVoice benchmark, XLSR shows a relative phoneme error rate reduction of 72% compared to the best known results. On BABEL, our approach improves word error rate by 16% relative compared to a comparable system. Our approach enables a single multilingual speech recognition model which is competitive to strong individual models. Analysis shows that the latent discrete speech representations are shared across languages with increased sharing for related languages. We hope to catalyze research in low-resource speech understanding by releasing XLSR-53, a large model pretrained in 53 languages.\n\nThe original model can be found under URL",
"# Usage\n\nSee this notebook for more information on how to fine-tune the model.\n\n!model image"
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #pretraining #speech #multilingual #dataset-common_voice #arxiv-2006.13979 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-XLSR-53 \n\nFacebook's XLSR-Wav2Vec2\n\nThe base model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. Note that this model should be fine-tuned on a downstream task, like Automatic Speech Recognition. Check out this blog for more information.\n\nPaper\n\nAuthors: Alexis Conneau, Alexei Baevski, Ronan Collobert, Abdelrahman Mohamed, Michael Auli\n\nAbstract\nThis paper presents XLSR which learns cross-lingual speech representations by pretraining a single model from the raw waveform of speech in multiple languages. We build on wav2vec 2.0 which is trained by solving a contrastive task over masked latent speech representations and jointly learns a quantization of the latents shared across languages. The resulting model is fine-tuned on labeled data and experiments show that cross-lingual pretraining significantly outperforms monolingual pretraining. On the CommonVoice benchmark, XLSR shows a relative phoneme error rate reduction of 72% compared to the best known results. On BABEL, our approach improves word error rate by 16% relative compared to a comparable system. Our approach enables a single multilingual speech recognition model which is competitive to strong individual models. Analysis shows that the latent discrete speech representations are shared across languages with increased sharing for related languages. We hope to catalyze research in low-resource speech understanding by releasing XLSR-53, a large model pretrained in 53 languages.\n\nThe original model can be found under URL",
"# Usage\n\nSee this notebook for more information on how to fine-tune the model.\n\n!model image"
] |
[
70,
369,
21
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #pretraining #speech #multilingual #dataset-common_voice #arxiv-2006.13979 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# Wav2Vec2-XLSR-53 \n\nFacebook's XLSR-Wav2Vec2\n\nThe base model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. Note that this model should be fine-tuned on a downstream task, like Automatic Speech Recognition. Check out this blog for more information.\n\nPaper\n\nAuthors: Alexis Conneau, Alexei Baevski, Ronan Collobert, Abdelrahman Mohamed, Michael Auli\n\nAbstract\nThis paper presents XLSR which learns cross-lingual speech representations by pretraining a single model from the raw waveform of speech in multiple languages. We build on wav2vec 2.0 which is trained by solving a contrastive task over masked latent speech representations and jointly learns a quantization of the latents shared across languages. The resulting model is fine-tuned on labeled data and experiments show that cross-lingual pretraining significantly outperforms monolingual pretraining. On the CommonVoice benchmark, XLSR shows a relative phoneme error rate reduction of 72% compared to the best known results. On BABEL, our approach improves word error rate by 16% relative compared to a comparable system. Our approach enables a single multilingual speech recognition model which is competitive to strong individual models. Analysis shows that the latent discrete speech representations are shared across languages with increased sharing for related languages. We hope to catalyze research in low-resource speech understanding by releasing XLSR-53, a large model pretrained in 53 languages.\n\nThe original model can be found under URL# Usage\n\nSee this notebook for more information on how to fine-tune the model.\n\n!model image"
] |
[
-0.11283499747514725,
0.07704280316829681,
-0.0034502262715250254,
0.038050804287195206,
0.03414734825491905,
-0.026889530941843987,
0.08195896446704865,
0.08268766850233078,
-0.06691808253526688,
0.04348493367433548,
0.0010866106022149324,
-0.06909238547086716,
0.06956940144300461,
0.1260322630405426,
0.06930668652057648,
-0.19992676377296448,
0.02932550013065338,
-0.1200559213757515,
0.12697407603263855,
0.05808667838573456,
0.1129130870103836,
-0.06820153445005417,
0.038328055292367935,
0.03332068771123886,
-0.07377658784389496,
-0.0005262751947157085,
-0.013337521813809872,
-0.03926435485482216,
0.07440097630023956,
0.044180162250995636,
0.06734099239110947,
0.031037509441375732,
0.0767231434583664,
-0.15146437287330627,
0.02118813991546631,
0.03740987181663513,
0.0587102547287941,
0.04062088578939438,
0.11512063443660736,
0.010763823986053467,
0.10768996179103851,
-0.015109243802726269,
-0.010405332781374454,
0.11906745284795761,
-0.05944652482867241,
-0.14015890657901764,
-0.12054689973592758,
0.09377606958150864,
0.06689805537462234,
0.06900568306446075,
-0.05048405006527901,
0.046119049191474915,
0.016535047441720963,
0.05178651586174965,
0.027430159971117973,
-0.22681835293769836,
-0.02987687662243843,
0.027049031108617783,
0.0425972081720829,
-0.01515196729451418,
-0.06576208025217056,
0.02258678898215294,
0.024302829056978226,
0.01103728637099266,
-0.005804249085485935,
-0.040396060794591904,
0.011045499704778194,
-0.08678089082241058,
-0.11718442291021347,
0.03996855765581131,
0.04176706448197365,
-0.01357356645166874,
-0.13225984573364258,
-0.202390655875206,
-0.0071360087022185326,
0.045523472130298615,
0.0034832360688596964,
-0.03521784394979477,
0.009702560491859913,
0.042993709444999695,
0.08285598456859589,
-0.10496052354574203,
-0.10729768127202988,
-0.000552184646949172,
-0.03511091694235802,
0.12022317200899124,
0.06211083009839058,
0.019574308767914772,
-0.014080647379159927,
0.06921139359474182,
-0.08793629705905914,
-0.061459895223379135,
-0.04463055357336998,
-0.09274227917194366,
-0.11182362586259842,
-0.01847662404179573,
-0.04214486479759216,
-0.1649557650089264,
0.010533236898481846,
0.09181801974773407,
-0.03645629063248634,
0.06843661516904831,
-0.053998202085494995,
0.007807845715433359,
0.07611323148012161,
0.14440779387950897,
-0.0633731335401535,
-0.062276698648929596,
0.02457515522837639,
-0.03582986444234848,
0.012160600163042545,
0.013330441899597645,
-0.03867383673787117,
-0.03583814203739166,
-0.022983824834227562,
0.056513573974370956,
-0.0206863172352314,
-0.010427317582070827,
-0.052150677889585495,
-0.032844722270965576,
0.08205272257328033,
-0.13156366348266602,
0.022742100059986115,
0.0006358484970405698,
0.016855819150805473,
0.17085309326648712,
0.048776838928461075,
-0.016049252822995186,
-0.09975860267877579,
-0.017156092450022697,
-0.0032872019801288843,
0.01844099536538124,
-0.05840197950601578,
-0.06713380664587021,
0.020294880494475365,
-0.0010822487529367208,
-0.051788080483675,
-0.10197591781616211,
-0.03187349811196327,
-0.06571976840496063,
0.03069683164358139,
-0.07281821221113205,
-0.000808197888545692,
-0.053089309483766556,
-0.024632170796394348,
0.0102451853454113,
0.0011959890834987164,
-0.013123666867613792,
0.006567634176462889,
-0.01353750191628933,
-0.0028479734901338816,
0.04607672616839409,
-0.015777571126818657,
0.023468054831027985,
-0.022044075652956963,
0.0074428534135222435,
-0.0947200134396553,
0.1401156485080719,
-0.04339902848005295,
-0.06040806323289871,
-0.11973161995410919,
-0.018265606835484505,
-0.04230678081512451,
0.053886692970991135,
0.04866666719317436,
0.09946422278881073,
-0.19695541262626648,
-0.015559808351099491,
0.24213889241218567,
-0.14107102155685425,
0.05334136262536049,
0.14664709568023682,
-0.021091127768158913,
0.10414732247591019,
0.14276129007339478,
0.07025496661663055,
0.09914375096559525,
-0.17023862898349762,
-0.08004623651504517,
-0.005758018232882023,
-0.05031279847025871,
0.06524401158094406,
0.0798007994890213,
-0.06074079871177673,
0.06247705593705177,
0.013874506577849388,
0.04778173938393593,
0.02897368185222149,
-0.005286724306643009,
-0.038971830159425735,
0.0015157844172790647,
-0.052474260330200195,
0.00016775830590631813,
-0.026388349011540413,
0.0004782270116265863,
-0.039818476885557175,
-0.14448845386505127,
-0.07228972762823105,
0.10717350989580154,
-0.07040930539369583,
0.05792064592242241,
-0.11171428114175797,
-0.04007670655846596,
-0.007385880686342716,
0.0060163079760968685,
-0.1266362965106964,
0.0530371367931366,
0.025544701144099236,
-0.08274255692958832,
0.11550246924161911,
0.011584443971514702,
0.014121266081929207,
0.04717976599931717,
-0.031618714332580566,
0.043559107929468155,
-0.014187155291438103,
-0.009551961906254292,
-0.05460361763834953,
-0.08026959002017975,
-0.03072584979236126,
-0.0749308317899704,
0.050660379230976105,
-0.07456856220960617,
-0.011345206759870052,
0.03775358945131302,
0.04676692932844162,
0.01882355846464634,
-0.0622716024518013,
0.04494677856564522,
0.03683256357908249,
0.013572344556450844,
-0.03871281072497368,
0.023579759523272514,
-0.01338996086269617,
-0.0054725296795368195,
0.11264391243457794,
-0.07506272196769714,
-0.16662372648715973,
0.06687881797552109,
0.017158376052975655,
-0.11114933341741562,
0.018163533881306648,
0.001129700685851276,
-0.04700303077697754,
-0.06443007290363312,
-0.06132129952311516,
0.24240444600582123,
0.008614983409643173,
0.10173171013593674,
-0.054763492196798325,
-0.015700900927186012,
0.003383115166798234,
-0.010118221864104271,
-0.013387400656938553,
0.05956518277525902,
-0.0423242524266243,
-0.1275552362203598,
0.002239516470581293,
0.03372321277856827,
0.044146180152893066,
0.13201190531253815,
-0.0023073621559888124,
-0.09460053592920303,
-0.024619637057185173,
0.015081087127327919,
-0.0016451228875666857,
0.04562336951494217,
-0.024211663752794266,
-0.02221629209816456,
0.011465844698250294,
0.06933538615703583,
0.08737818896770477,
-0.03920019045472145,
0.08508072048425674,
0.017347456887364388,
-0.0450640544295311,
-0.04677311331033707,
0.004697092343121767,
-0.0037950691767036915,
0.04767856001853943,
0.0048940470442175865,
0.03096163645386696,
-0.004860817454755306,
-0.04518349841237068,
-0.0892462208867073,
0.10697918385267258,
-0.10980472713708878,
-0.21904192864894867,
-0.1896825134754181,
-0.0473775714635849,
-0.007763942703604698,
-0.008669554255902767,
0.015218897722661495,
-0.05936989188194275,
-0.08181404322385788,
-0.07919007539749146,
0.17118920385837555,
-0.07117429375648499,
-0.029393859207630157,
-0.01366912666708231,
0.03142266348004341,
0.007065217010676861,
-0.12779101729393005,
0.004925474990159273,
-0.021969832479953766,
-0.0886516273021698,
-0.01995660550892353,
0.01390411052852869,
-0.025857768952846527,
0.101711206138134,
0.0015545949572697282,
-0.027766680344939232,
-0.021200159564614296,
0.12738627195358276,
-0.04794665426015854,
0.06055799126625061,
0.17455025017261505,
-0.08971767127513885,
0.005061483476310968,
0.05400126427412033,
0.026283733546733856,
-0.07046114653348923,
0.026414813473820686,
-0.007846466265618801,
-0.046495381742715836,
-0.21715179085731506,
-0.08204295486211777,
-0.0488492026925087,
-0.032038167119026184,
0.0014175481628626585,
0.009766936302185059,
-0.08476303517818451,
-0.007142443209886551,
-0.03126058354973793,
-0.04982614517211914,
0.09195291250944138,
0.030772361904382706,
0.10084711015224457,
-0.04856426641345024,
0.07679544389247894,
-0.0515240803360939,
0.004845081828534603,
0.09261257946491241,
0.022538529708981514,
0.17815741896629333,
0.03483734279870987,
0.12303584814071655,
0.09187706559896469,
0.03709040954709053,
0.059943150728940964,
0.05836259946227074,
-0.011905217543244362,
0.03564769774675369,
-0.01829734817147255,
-0.04269767180085182,
0.021519077941775322,
0.017444593831896782,
0.05075501650571823,
-0.03406596556305885,
-0.04611707478761673,
-0.04592417925596237,
0.011161853559315205,
0.19328488409519196,
0.08140478283166885,
-0.07539240270853043,
-0.07437486201524734,
0.015299863182008266,
-0.13651402294635773,
-0.048359040170907974,
0.04291774332523346,
0.14068804681301117,
-0.10369038581848145,
0.056003015488386154,
0.00558832474052906,
0.08142341673374176,
-0.11012260615825653,
-0.011977572925388813,
-0.04626326635479927,
0.10255168378353119,
0.02769484557211399,
0.04583727940917015,
-0.10858642309904099,
0.09065558761358261,
0.025273242965340614,
0.11152034997940063,
-0.021957561373710632,
0.06420587748289108,
0.028139924630522728,
-0.03970003128051758,
0.12023671716451645,
-0.004444480873644352,
-0.022080648690462112,
-0.018383005633950233,
-0.16406236588954926,
0.013974078930914402,
0.09343486279249191,
0.01299375481903553,
0.08091814815998077,
-0.004407296422868967,
-0.006217413116246462,
-0.03629317507147789,
0.02371528558433056,
-0.19526484608650208,
-0.1686725914478302,
0.029412997886538506,
0.004987943917512894,
-0.008303053677082062,
-0.0339604914188385,
-0.05323220416903496,
-0.10104837268590927,
0.20070675015449524,
-0.18659259378910065,
-0.07802874594926834,
-0.09847905486822128,
-0.008485905826091766,
0.15352009236812592,
-0.040807344019412994,
0.013643020763993263,
0.03359578177332878,
0.0756131187081337,
-0.06665043532848358,
-0.09816280007362366,
0.015854759141802788,
-0.07816483080387115,
-0.11545655131340027,
0.010618330910801888,
0.18200305104255676,
0.13301780819892883,
0.02506907284259796,
0.028285037726163864,
0.0025658525992184877,
0.00692110788077116,
-0.09470397233963013,
-0.01807788386940956,
0.19004127383232117,
-0.02001386135816574,
0.04865728318691254,
-0.021541321650147438,
-0.19138720631599426,
-0.08770742267370224,
-0.027709119021892548,
0.08579464256763458,
0.15127117931842804,
-0.059365469962358475,
0.20173566043376923,
0.1408432424068451,
-0.12564091384410858,
-0.22071625292301178,
-0.034873928874731064,
0.055206265300512314,
0.09621904790401459,
0.04417577385902405,
-0.18590758740901947,
0.06935745477676392,
0.022356346249580383,
-0.01151628140360117,
-0.0137661537155509,
-0.2129279226064682,
-0.12447436153888702,
0.09049641340970993,
-0.03738657012581825,
0.13946664333343506,
-0.014348305761814117,
-0.022264821454882622,
-0.015063255093991756,
0.02054806426167488,
0.06713435053825378,
-0.07564068585634232,
0.10207773000001907,
0.034088339656591415,
-0.02914155274629593,
0.03703952208161354,
-0.008833153173327446,
0.0964091569185257,
0.02184983901679516,
0.0011378496419638395,
-0.0014462468679994345,
0.07926172763109207,
-0.02473088912665844,
0.0009421991417184472,
0.11238882690668106,
0.05741772800683975,
0.00042301518260501325,
-0.052761808037757874,
-0.08578813821077347,
-0.025785597041249275,
0.0442873015999794,
-0.01708618924021721,
0.011170590296387672,
-0.05188875272870064,
0.06552515178918839,
0.0035803867504000664,
0.0016711861826479435,
-0.0025152317248284817,
-0.08761642873287201,
-0.07796411216259003,
0.19044530391693115,
0.2202286273241043,
-0.014923905953764915,
0.005053902976214886,
-0.006962331011891365,
-0.0416666604578495,
0.05994202196598053,
-0.03930952399969101,
0.027905113995075226,
0.06746761500835419,
-0.025735676288604736,
0.046656202524900436,
0.005205246619880199,
-0.16501538455486298,
0.05044390261173248,
0.057878296822309494,
-0.02783169411122799,
-0.14065471291542053,
-0.03082495741546154,
-0.03134435787796974,
-0.03717954456806183,
0.010407686233520508,
0.21686747670173645,
-0.07992426306009293,
-0.03828940540552139,
-0.008829913102090359,
0.061169397085905075,
-0.07838037610054016,
0.13538934290409088,
-0.012289764359593391,
0.008312328718602657,
-0.06802379339933395,
0.12437205761671066,
0.06683008372783661,
-0.02162240631878376,
0.044530726969242096,
0.11219531297683716,
-0.05265665799379349,
-0.04668252915143967,
-0.07774978876113892,
0.010461756959557533,
0.06036316230893135,
-0.07908511906862259,
0.05860680714249611,
-0.07124106585979462,
-0.01211811974644661,
0.06987394392490387,
0.03444908559322357,
0.07317710667848587,
-0.06751953810453415,
-0.0041921730153262615,
-0.06758778542280197,
0.03651759400963783,
0.0618118941783905,
-0.030920328572392464,
-0.03947160020470619,
0.10271470248699188,
0.005975044798105955,
0.027117399498820305,
-0.039774443954229355,
-0.06063070148229599,
-0.07823672890663147,
0.004290222190320492,
-0.08343823254108429,
0.03917544335126877,
-0.09628622978925705,
-0.01823434978723526,
0.0036703546065837145,
-0.010923792608082294,
0.02353612706065178,
0.058837372809648514,
-0.03000332973897457,
-0.005469831638038158,
-0.06084141880273819,
0.06665240973234177,
-0.056770071387290955,
0.049211736768484116,
0.0157928504049778,
-0.054285675287246704,
0.09710726141929626,
-0.00563447829335928,
-0.051232289522886276,
0.06372831761837006,
-0.1380549669265747,
-0.029545489698648453,
-0.02082645520567894,
0.006704603787511587,
-0.031127501279115677,
-0.14940877258777618,
-0.00906412024050951,
0.03586835041642189,
0.01648540422320366,
-0.011016705073416233,
0.11984864622354507,
-0.039086125791072845,
-0.004250975325703621,
-0.033543605357408524,
0.021826155483722687,
-0.04238918796181679,
0.02359938621520996,
0.09527924656867981,
0.08956354111433029,
0.09130973368883133,
-0.07923871278762817,
0.049455560743808746,
-0.08496175706386566,
0.01758723333477974,
-0.020479002967476845,
0.0377681590616703,
-0.02226804383099079,
-0.06751756370067596,
0.06398941576480865,
0.00300971744582057,
0.11743538081645966,
-0.02775682508945465,
-0.02308448776602745,
0.018820006400346756,
-0.09734950959682465,
-0.12227129191160202,
0.04302840679883957,
0.04925796017050743,
0.028604643419384956,
0.011279027909040451,
-0.045370977371931076,
-0.007135163061320782,
-0.028899241238832474,
0.001197566743940115,
0.09007997810840607,
0.16800859570503235,
0.04174347594380379,
0.09917538613080978,
0.007680694572627544,
-0.024770474061369896,
-0.06150226667523384,
0.07336399704217911,
-0.043659698218107224,
-0.009165255352854729,
-0.08804965764284134,
0.039211951196193695,
0.05841079354286194,
-0.08158641308546066,
0.12114355713129044,
0.025002343580126762,
-0.06063664332032204,
-0.11551617085933685,
-0.14968885481357574,
-0.030171671882271767,
-0.015203814022243023,
-0.022118542343378067,
-0.08470948785543442,
0.05034681037068367,
0.0452386811375618,
0.058331217616796494,
-0.0365823395550251,
0.07871471345424652,
-0.13190525770187378,
-0.12587624788284302,
0.10044799745082855,
-0.01412445493042469,
0.05353058502078056,
0.04507525637745857,
0.034923747181892395,
0.020820224657654762,
0.09165626019239426,
0.09256342053413391,
0.06742732226848602,
0.035419002175331116,
-0.011751395650207996,
-0.10363729298114777,
-0.06137193366885185,
-0.01011671032756567,
0.0016104296082630754,
0.09280545264482498,
0.16405285894870758,
0.07801465690135956,
-0.11288043856620789,
-0.01808147504925728,
0.12032034993171692,
-0.032858457416296005,
-0.08029184490442276,
-0.16249261796474457,
0.13770844042301178,
0.0011141321156173944,
0.012146682478487492,
-0.017801934853196144,
-0.05307852849364281,
-0.004998805932700634,
0.23181940615177155,
0.16916675865650177,
0.0025036947336047888,
-0.009713759645819664,
-0.003937411122024059,
-0.002070182003080845,
-0.00020410583238117397,
0.14662574231624603,
0.0010627867886796594,
0.2565791606903076,
0.011359591037034988,
0.07089962810277939,
-0.004520745947957039,
-0.06445277482271194,
-0.05983234941959381,
0.07989686727523804,
-0.08835660666227341,
0.0002931035414803773,
-0.03713022544980049,
0.13718755543231964,
-0.03477713093161583,
-0.20463576912879944,
-0.0189613439142704,
-0.05907367914915085,
-0.08558008819818497,
0.021447353065013885,
-0.016458649188280106,
0.03831871598958969,
0.04300934448838234,
-0.001450648414902389,
0.026088356971740723,
0.20494985580444336,
0.006069448310881853,
-0.00702032120898366,
-0.019180092960596085,
0.032419633120298386,
-0.15483014285564423,
0.08659528940916061,
0.04011744633316994,
0.10108508914709091,
0.047117602080106735,
0.020989472046494484,
-0.09267856925725937,
0.02041364088654518,
-0.06336674839258194,
-0.018219327554106712,
0.040007516741752625,
0.2009275257587433,
-0.008667295798659325,
0.1262592077255249,
0.03163481131196022,
-0.038102176040410995,
0.025590643286705017,
-0.05354967713356018,
-0.030406499281525612,
-0.06603670120239258,
0.09842584282159805,
-0.09756781160831451,
0.16056010127067566,
0.11987930536270142,
-0.04972986876964569,
0.007655337918549776,
-0.013969780877232552,
0.028436459600925446,
-0.051757533103227615,
0.10627588629722595,
-0.03937399387359619,
-0.1760139763355255,
-0.024110954254865646,
-0.07518045604228973,
0.03948822617530823,
-0.1815205216407776,
-0.021795762702822685,
-0.023876531049609184,
0.0186954103410244,
-0.04463530331850052,
0.08487042039632797,
0.03684660419821739,
-0.009510884992778301,
-0.01204892247915268,
-0.07123006880283356,
0.04313994199037552,
0.04921945929527283,
-0.09017694741487503,
-0.037704452872276306
] |
null | null |
transformers
|
# Wav2Vec2-Large
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/)
The base model was pretrained on 16 kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16 kHz. Note that this model should be fine-tuned on a downstream task, like Automatic Speech Recognition; check out [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for more information.
[Paper](https://arxiv.org/abs/2006.11477)
Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli
**Abstract**
We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
See [this notebook](https://colab.research.google.com/drive/1FjTsqbYKphl9kL-eILgUc-bl4zVThL8F?usp=sharing) for more information on how to fine-tune the model.
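Since this checkpoint ships without a task-specific head, a quick sanity check is to pull contextual representations from it directly. The sketch below is illustrative only: the feature-extractor settings are typical defaults rather than values read from the repository, the zero-filled waveform stands in for real 16 kHz audio, and loading the pretraining-only weights into `Wav2Vec2Model` may warn about unused quantizer parameters.

```python
import torch
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

# typical feature-extractor settings for the large checkpoints (assumed, not read from the repo)
feature_extractor = Wav2Vec2FeatureExtractor(
    feature_size=1, sampling_rate=16_000, padding_value=0.0,
    do_normalize=True, return_attention_mask=True,
)
model = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-large")

# one second of silence as a stand-in for real 16 kHz speech
waveform = torch.zeros(16_000).numpy()
inputs = feature_extractor(waveform, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    outputs = model(inputs.input_values, attention_mask=inputs.attention_mask)

print(outputs.last_hidden_state.shape)  # (batch, frames, 1024) for the large architecture
```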
|
{"language": "en", "license": "apache-2.0", "tags": ["speech"], "datasets": ["librispeech_asr"]}
| null |
facebook/wav2vec2-large
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"speech",
"en",
"dataset:librispeech_asr",
"arxiv:2006.11477",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2006.11477"
] |
[
"en"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #speech #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-Large
Facebook's Wav2Vec2
The base model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. Note that this model should be fine-tuned on a downstream task, like Automatic Speech Recognition. Check out this blog for more information.
Paper
Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli
Abstract
We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.
The original model can be found under URL
# Usage
See this notebook for more information on how to fine-tune the model.
|
[
"# Wav2Vec2-Large \n\nFacebook's Wav2Vec2\n\nThe base model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. Note that this model should be fine-tuned on a downstream task, like Automatic Speech Recognition. Check out this blog for more information.\n\nPaper\n\nAuthors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli\n\nAbstract\nWe show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.\nThe original model can be found under URL",
"# Usage\n\nSee this notebook for more information on how to fine-tune the model."
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #speech #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-Large \n\nFacebook's Wav2Vec2\n\nThe base model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. Note that this model should be fine-tuned on a downstream task, like Automatic Speech Recognition. Check out this blog for more information.\n\nPaper\n\nAuthors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli\n\nAbstract\nWe show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.\nThe original model can be found under URL",
"# Usage\n\nSee this notebook for more information on how to fine-tune the model."
] |
[
66,
319,
18
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #speech #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# Wav2Vec2-Large \n\nFacebook's Wav2Vec2\n\nThe base model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. Note that this model should be fine-tuned on a downstream task, like Automatic Speech Recognition. Check out this blog for more information.\n\nPaper\n\nAuthors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli\n\nAbstract\nWe show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.\nThe original model can be found under URL# Usage\n\nSee this notebook for more information on how to fine-tune the model."
] |
[
-0.10663709789514542,
0.08504707366228104,
-0.005525061395019293,
-0.02761026658117771,
-0.013179843313992023,
-0.0241811852902174,
0.09775140136480331,
0.11102207005023956,
-0.05539820343255997,
0.1218370795249939,
0.015030414797365665,
-0.059694331139326096,
0.0700860470533371,
0.13341212272644043,
0.0251314640045166,
-0.230526864528656,
0.04218287020921707,
-0.07245875149965286,
0.10220040380954742,
0.06494098901748657,
0.1273716539144516,
-0.11771927028894424,
0.010298208333551884,
0.020042458549141884,
-0.015616384334862232,
-0.0035681454464793205,
-0.012452530674636364,
-0.07708340138196945,
0.069676473736763,
0.024527378380298615,
0.06319721043109894,
0.09761665761470795,
0.08661378920078278,
-0.19660741090774536,
0.014291871339082718,
0.03669994696974754,
0.04087051749229431,
0.06799447536468506,
0.055798761546611786,
0.010389991104602814,
0.08725277334451675,
-0.08027797937393188,
0.007200780790299177,
0.10296111553907394,
-0.06198037788271904,
-0.10369203984737396,
-0.11067145317792892,
0.1148793175816536,
0.051282450556755066,
0.019945120438933372,
-0.04716210439801216,
0.023503828793764114,
-0.0276800524443388,
0.06240534037351608,
0.15409180521965027,
-0.26330655813217163,
-0.020116811618208885,
0.023744674399495125,
-0.02288120612502098,
-0.012209352105855942,
-0.09243851155042648,
0.07487137615680695,
0.028942538425326347,
-0.013027013279497623,
0.025661785155534744,
0.005697139073163271,
0.014095768332481384,
-0.039175719022750854,
-0.11986197531223297,
-0.040420107543468475,
0.10388588160276413,
0.026928123086690903,
-0.13669352233409882,
-0.1659722924232483,
-0.04471960291266441,
-0.004500592593103647,
-0.04449436813592911,
-0.046311136335134506,
0.012687375769019127,
0.01890379562973976,
0.019988661631941795,
-0.05148034170269966,
-0.12596894800662994,
-0.021522613242268562,
-0.05280280485749245,
0.16139262914657593,
0.03723643720149994,
0.02084466814994812,
0.005233773496001959,
0.04331047087907791,
-0.05399421975016594,
-0.07356449961662292,
-0.07448091357946396,
-0.0497107170522213,
-0.08551105856895447,
-0.01899469457566738,
-0.04938190430402756,
-0.18939906358718872,
-0.01817752607166767,
0.14930611848831177,
0.0015965752536430955,
0.048416849225759506,
-0.007202248089015484,
0.012268032878637314,
0.06644662469625473,
0.1961575746536255,
-0.012051822617650032,
-0.06278745830059052,
0.02773386985063553,
-0.013436888344585896,
0.05581802874803543,
-0.030451886355876923,
-0.03240756690502167,
-0.03440236300230026,
0.0376201868057251,
0.0672793909907341,
-0.016104482114315033,
-0.017485497519373894,
-0.07131895422935486,
-0.025504695251584053,
0.07822748273611069,
-0.16335928440093994,
0.01952565647661686,
0.021575238555669785,
0.030703898519277573,
0.11903748661279678,
0.04073677211999893,
-0.029122503474354744,
-0.10558228939771652,
0.03235020861029625,
-0.005335413385182619,
-0.02660076506435871,
-0.020251117646694183,
-0.07983635365962982,
0.032149605453014374,
-0.023254791274666786,
-0.06399928033351898,
-0.09043390303850174,
-0.04132113978266716,
-0.014239158481359482,
0.03628407046198845,
-0.03376651927828789,
0.012685483321547508,
-0.023517334833741188,
-0.03353238105773926,
0.006483221892267466,
-0.01828068494796753,
0.004544857423752546,
-0.001144647249020636,
0.015553833916783333,
0.03459566831588745,
0.09214338660240173,
0.033441927284002304,
0.026904411613941193,
-0.002277673687785864,
0.010774281807243824,
-0.11821964383125305,
0.08963049203157425,
-0.060348186641931534,
-0.07013708353042603,
-0.11513279378414154,
-0.05637214705348015,
-0.069183848798275,
0.03086056374013424,
0.05431331321597099,
0.08563195168972015,
-0.18203362822532654,
-0.057643525302410126,
0.21064315736293793,
-0.11930117756128311,
0.042751915752887726,
0.1362331658601761,
0.016709402203559875,
0.08309926092624664,
0.12856127321720123,
0.11067312955856323,
0.07070209830999374,
-0.23732995986938477,
-0.11805320531129837,
-0.004097138065844774,
-0.012809813022613525,
0.12723177671432495,
0.06282151490449905,
-0.05404164642095566,
0.10298202931880951,
0.018427396193146706,
0.054532624781131744,
-0.0717044398188591,
-0.029197853058576584,
-0.06879763305187225,
0.00046761988778598607,
-0.04118158668279648,
0.07993987947702408,
-0.028481321409344673,
-0.004666508175432682,
-0.04901065304875374,
-0.13620389997959137,
-0.03068215772509575,
0.1280435025691986,
-0.06983078271150589,
0.06538590043783188,
-0.15117716789245605,
-0.022959493100643158,
-0.04059692099690437,
0.014153075404465199,
-0.1628294289112091,
0.08974798023700714,
0.017946818843483925,
-0.029254071414470673,
0.07368052750825882,
0.03984778746962547,
0.03803318366408348,
0.02648104354739189,
-0.036870844662189484,
0.042344775050878525,
-0.0921442061662674,
0.01322043314576149,
-0.08573523908853531,
-0.07072526961565018,
-0.07764071226119995,
-0.04181056469678879,
0.05010724067687988,
-0.05441979691386223,
0.020377788692712784,
0.12184739857912064,
0.05496230348944664,
0.03294863551855087,
-0.05768502876162529,
0.03819914907217026,
0.0026147919707000256,
0.0027014336083084345,
-0.028434397652745247,
0.010361098684370518,
-0.027309253811836243,
-0.001485391054302454,
0.05345342308282852,
-0.09393695741891861,
-0.14645721018314362,
0.06372981518507004,
0.03368069976568222,
-0.047075238078832626,
0.10332284867763519,
-0.05703553184866905,
-0.013737945817410946,
-0.11425983905792236,
-0.1063452661037445,
0.1537592113018036,
0.032463885843753815,
0.05731213465332985,
-0.07233450561761856,
-0.023241844028234482,
0.026746252551674843,
-0.009494570083916187,
-0.051699381321668625,
0.038901932537555695,
-0.027236348018050194,
-0.0770401805639267,
-0.03336542472243309,
0.0879230946302414,
0.07585714757442474,
0.10507889837026596,
-0.026182575151324272,
-0.13738292455673218,
-0.030842037871479988,
-0.02116386592388153,
0.011567129753530025,
0.08785014599561691,
-0.03367814049124718,
-0.0306911189109087,
0.03317160904407501,
0.04034917429089546,
0.05188484117388725,
-0.10368479043245316,
0.09573137760162354,
0.04251271113753319,
-0.083786740899086,
-0.06571920961141586,
-0.002703581703826785,
-0.01900997944176197,
0.06878336519002914,
0.017467213794589043,
0.02725534699857235,
-0.0364382378757,
-0.02419918030500412,
-0.09076913446187973,
0.05935220792889595,
-0.06591501832008362,
-0.25526779890060425,
-0.14047737419605255,
0.02851857803761959,
0.003759180661290884,
-0.014182979241013527,
0.005287242587655783,
-0.04719456657767296,
-0.11893193423748016,
-0.07931052148342133,
0.11237268894910812,
-0.03790464997291565,
0.029533887282013893,
0.13750270009040833,
0.0456157810986042,
0.021987704560160637,
-0.12330644577741623,
0.0002463829005137086,
-0.02592651918530464,
-0.045043036341667175,
0.03162946552038193,
-0.0007091037114150822,
0.028374623507261276,
0.104917973279953,
-0.021450994536280632,
-0.002721466589719057,
-0.021509507670998573,
0.16436505317687988,
-0.05378424748778343,
0.124606192111969,
0.15107519924640656,
-0.05726362019777298,
0.01329964678734541,
0.060615286231040955,
0.014007910154759884,
-0.09206894785165787,
0.015726791694760323,
0.029426032677292824,
-0.04146979749202728,
-0.19789007306098938,
-0.056733760982751846,
-0.07702036201953888,
0.03608794882893562,
0.0443941168487072,
0.024004679173231125,
-0.06280369311571121,
-0.030708029866218567,
-0.0886874571442604,
-0.027595628052949905,
0.07776928693056107,
0.03940669447183609,
0.0835714042186737,
-0.032580841332674026,
0.07788673788309097,
-0.08509532362222672,
-0.03190472722053528,
0.11206868290901184,
0.010968811810016632,
0.16727936267852783,
0.05775386840105057,
0.17715571820735931,
0.06340676546096802,
0.022421175613999367,
0.04977286234498024,
0.10611192137002945,
-0.02500087022781372,
0.021302133798599243,
-0.021646764129400253,
-0.07630378752946854,
-0.0385143905878067,
0.037247538566589355,
0.07990043610334396,
-0.08226056396961212,
-0.05467907711863518,
-0.027017971500754356,
0.07096449285745621,
0.2340129166841507,
0.12223777174949646,
-0.03818689286708832,
-0.14078406989574432,
0.019882000982761383,
-0.1433475762605667,
-0.033630166202783585,
0.008549257181584835,
0.11809944361448288,
-0.08101959526538849,
0.09414802491664886,
0.01737085171043873,
0.07823599874973297,
-0.054526083171367645,
0.02728082612156868,
-0.1549220234155655,
0.07570695132017136,
0.01647520624101162,
0.07680398970842361,
-0.12086805701255798,
0.05766328424215317,
0.04291000962257385,
0.15201030671596527,
-0.053138311952352524,
0.020505929365754128,
0.01302478276193142,
-0.03453684598207474,
0.1135958582162857,
0.003770675975829363,
-0.08937183022499084,
0.001332117710262537,
-0.18188199400901794,
0.016275759786367416,
0.14756326377391815,
0.003173117060214281,
0.10950078070163727,
-0.022228803485631943,
-0.0213193167001009,
0.00015036216063890606,
0.034286729991436005,
-0.1828182190656662,
-0.13446365296840668,
0.08404457569122314,
0.03141997754573822,
-0.04562174528837204,
-0.04178459197282791,
-0.06413211673498154,
-0.07511783391237259,
0.2171546369791031,
-0.15727517008781433,
-0.05257699638605118,
-0.10341732203960419,
-0.009598844684660435,
0.1338089406490326,
-0.03935305401682854,
0.015885965898633003,
0.05334323272109032,
0.1808612197637558,
-0.08090125024318695,
-0.09428709745407104,
-0.0045357574708759785,
-0.03780823573470116,
-0.17904874682426453,
0.011692876927554607,
0.18088310956954956,
0.03783770650625229,
0.07665710151195526,
0.03737378120422363,
0.027115069329738617,
0.01568186841905117,
-0.09243802726268768,
0.06009436026215553,
0.13986550271511078,
-0.05478613078594208,
-0.008539192378520966,
-0.03854510188102722,
-0.13939860463142395,
-0.06942804902791977,
-0.039181653410196304,
0.1169421374797821,
0.25157615542411804,
-0.0737110823392868,
0.21128156781196594,
0.14091303944587708,
-0.06768113374710083,
-0.2925198972225189,
-0.05737493932247162,
0.016414040699601173,
0.10033895075321198,
0.03934749215841293,
-0.19473455846309662,
0.022672628983855247,
-0.0055305431596934795,
-0.04622562974691391,
0.002186498837545514,
-0.18447059392929077,
-0.1286054104566574,
0.0996445044875145,
-0.014333457686007023,
0.15910282731056213,
-0.0231498870998621,
-0.009659986943006516,
-0.038721878081560135,
-0.06820797175168991,
0.06843258440494537,
-0.06593156605958939,
0.13736815750598907,
0.022802192717790604,
-0.0005105010350234807,
0.05978003516793251,
-0.0026851980946958065,
0.09768349677324295,
-0.02030416764318943,
-0.010639695450663567,
0.009768166579306126,
0.017315352335572243,
0.13118061423301697,
-0.038174599409103394,
0.1347823143005371,
0.02226892299950123,
0.021227603778243065,
-0.009901685640215874,
-0.07466308027505875,
-0.022015180438756943,
0.01908096671104431,
-0.05067460611462593,
-0.01734856888651848,
-0.07088582962751389,
0.039394013583660126,
0.03829207643866539,
-0.005873795598745346,
0.045861996710300446,
-0.053420133888721466,
-0.14841631054878235,
0.18634986877441406,
0.12976767122745514,
-0.02670864947140217,
0.03215232491493225,
-0.011130311526358128,
-0.048297151923179626,
0.048145148903131485,
-0.10550215095281601,
0.07055968046188354,
0.07778779417276382,
0.024366550147533417,
0.04760471731424332,
0.004345546010881662,
-0.15860524773597717,
-0.016490355134010315,
0.07740562409162521,
-0.07847809046506882,
-0.19236735999584198,
-0.0005196940037421882,
-0.06237015873193741,
-0.10200037807226181,
0.03280381113290787,
0.1787174642086029,
-0.04725191369652748,
-0.027019482105970383,
0.00028310302877798676,
0.06774435937404633,
-0.05279424786567688,
0.12297908961772919,
0.010970165021717548,
0.030868522822856903,
-0.06212623789906502,
0.13441622257232666,
0.09027934074401855,
-0.02823556959629059,
0.05545056238770485,
0.039584871381521225,
-0.060057684779167175,
-0.017941514030098915,
-0.06916020810604095,
-0.026593074202537537,
0.07264457643032074,
-0.04133974388241768,
-0.01884440705180168,
-0.0977988913655281,
0.02715718187391758,
0.06603291630744934,
-0.00041387020610272884,
0.0871974304318428,
-0.028752390295267105,
0.04799871891736984,
-0.07190213352441788,
0.05640950798988342,
0.020865293219685555,
-0.00200862530618906,
-0.04876973479986191,
0.15370917320251465,
-0.030774608254432678,
0.05736267939209938,
-0.048409804701805115,
-0.05644049867987633,
-0.04347249120473862,
0.00513031892478466,
-0.08274953067302704,
0.03889000415802002,
-0.04148019850254059,
-0.03899231180548668,
0.012870267033576965,
-0.010652265511453152,
-0.016506342217326164,
0.05823700502514839,
-0.058845724910497665,
-0.012622475624084473,
-0.07863491028547287,
0.01916273683309555,
-0.11083003878593445,
0.04556707292795181,
0.04593785107135773,
-0.10387209057807922,
0.1036170944571495,
0.025581449270248413,
-0.021975010633468628,
0.07736124098300934,
-0.12936341762542725,
-0.07650788128376007,
0.012930621393024921,
0.0486089363694191,
-0.00238181184977293,
-0.1041010245680809,
-0.027381500229239464,
0.026278669014573097,
-0.013862520456314087,
-0.005907096434384584,
0.10227181017398834,
-0.04572099447250366,
-0.008481720462441444,
-0.0673905462026596,
-0.011858795769512653,
-0.04209170117974281,
0.03292438015341759,
0.05100119858980179,
0.039062123745679855,
0.09628644585609436,
-0.06726842373609543,
0.020397869870066643,
-0.139324352145195,
0.034715402871370316,
-0.0179226603358984,
-0.00891879852861166,
-0.009166011586785316,
-0.07479589432477951,
0.056378323584795,
-0.0361938439309597,
0.10552889108657837,
-0.07173875719308853,
-0.042987365275621414,
0.036439746618270874,
-0.10003682971000671,
-0.16560079157352448,
0.025089455768465996,
0.14727403223514557,
0.048781391233205795,
-0.016369374468922615,
-0.05361609905958176,
-0.04223928600549698,
0.015959909185767174,
0.056730613112449646,
0.1679132580757141,
0.1312153935432434,
0.05735240876674652,
0.09556838124990463,
-0.02475009858608246,
-0.021679112687706947,
-0.12610086798667908,
0.11642282456159592,
-0.0716407299041748,
0.00987838115543127,
-0.05178319290280342,
-0.024421636015176773,
0.14842446148395538,
-0.11149363219738007,
0.1293209195137024,
-0.011947394348680973,
-0.05668462812900543,
-0.1318221539258957,
-0.11105402559041977,
-0.056005410850048065,
0.029084395617246628,
0.008492748253047466,
-0.09390457719564438,
0.05517078563570976,
-0.04774006828665733,
0.02701636031270027,
-0.017391465604305267,
0.11657296866178513,
-0.16151472926139832,
-0.07964982837438583,
0.11967963725328445,
-0.02535705268383026,
-0.003932951018214226,
0.07608216255903244,
0.029032494872808456,
0.07120484858751297,
0.08524445444345474,
0.12878720462322235,
0.0789913609623909,
0.026650240644812584,
0.00855537410825491,
-0.05486563965678215,
-0.08695188909769058,
-0.0070570893585681915,
-0.041837744414806366,
0.07814889401197433,
0.1102280467748642,
0.08091704547405243,
-0.06548062711954117,
0.005955396220088005,
0.1243690624833107,
-0.04224780946969986,
-0.13815554976463318,
-0.16389100253582,
0.09054749459028244,
0.060826219618320465,
0.03874533250927925,
0.03558768332004547,
-0.09135937690734863,
-0.012150580063462257,
0.20593835413455963,
0.11706258356571198,
0.0349271334707737,
0.030133375898003578,
-0.01373146753758192,
0.01628102920949459,
0.010347321629524231,
0.09142749011516571,
-0.017562704160809517,
0.24438810348510742,
0.0105211790651083,
0.055937305092811584,
-0.0029163251165300608,
-0.06629259884357452,
-0.09693306684494019,
0.15242427587509155,
-0.06416252255439758,
-0.008757187984883785,
-0.05821868032217026,
0.07126326113939285,
0.005946868099272251,
-0.28337398171424866,
0.019529489800333977,
-0.11054351925849915,
-0.07918459177017212,
0.047112058848142624,
-0.010450853034853935,
0.01871573179960251,
0.06499723345041275,
-0.01724027656018734,
0.026525095105171204,
0.1812143474817276,
0.012941849417984486,
0.0013558375649154186,
0.05911168083548546,
0.05759964510798454,
-0.10949834436178207,
0.09647572040557861,
0.044606782495975494,
0.059677109122276306,
0.04940967261791229,
0.004001009743660688,
-0.08793468028306961,
0.08938165754079819,
-0.019659608602523804,
-0.05347540229558945,
0.032647885382175446,
0.2203354835510254,
-0.055448878556489944,
0.07514143735170364,
0.056654468178749084,
-0.010379370301961899,
0.006676122080534697,
0.07818446308374405,
-0.036008428782224655,
-0.05436410382390022,
0.06932836771011353,
-0.07306321710348129,
0.10059982538223267,
0.10577546060085297,
-0.04310188814997673,
0.0025211875326931477,
-0.04420212656259537,
0.0425630621612072,
-0.007638896815478802,
0.06816543638706207,
-0.04090479388833046,
-0.13683171570301056,
0.0024715003091841936,
-0.03149564936757088,
0.06319468468427658,
-0.22370581328868866,
-0.042014073580503464,
-0.018317561596632004,
-0.016116594895720482,
-0.0583970732986927,
0.11007897555828094,
0.0968625470995903,
-0.017899056896567345,
-0.01990189030766487,
-0.061351608484983444,
0.04162953794002533,
0.05403352901339531,
-0.12091853469610214,
-0.03609886020421982
] |
null | null |
transformers
|
# Wav2Vec2-Large-LV60 finetuned on multi-lingual Common Voice
This checkpoint leverages the pretrained checkpoint [wav2vec2-large-lv60](https://huggingface.co/facebook/wav2vec2-large-lv60)
and is fine-tuned on [CommonVoice](https://huggingface.co/datasets/common_voice) to recognize phonetic labels in multiple languages.
When using the model make sure that your speech input is sampled at 16kHz.
Note that the model outputs a string of phonetic labels. A dictionary mapping phonetic label sequences to words
is needed to turn the phonetic output into words.
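No such dictionary is bundled with this checkpoint. As a minimal, purely hypothetical sketch of what that lookup could look like (the `lexicon` below is an invented two-entry example, not a resource shipped with the model):
```python
# hypothetical phoneme-sequence -> word lexicon; a real system would use a full
# pronunciation dictionary for the target language
lexicon = {
    "m ɪ s t ɚ": "mister",
    "k w ɪ l t ɚ": "quilter",
}

def phonemes_to_words(phoneme_string, lexicon):
    """Greedily match the longest phoneme span found in the lexicon; keep unknown labels as-is."""
    tokens = phoneme_string.split()
    words, i = [], 0
    while i < len(tokens):
        for j in range(len(tokens), i, -1):
            span = " ".join(tokens[i:j])
            if span in lexicon:
                words.append(lexicon[span])
                i = j
                break
        else:
            words.append(tokens[i])  # no dictionary entry for this label
            i += 1
    return " ".join(words)

# e.g. the start of the expected output shown in the usage example below
print(phonemes_to_words("m ɪ s t ɚ k w ɪ l t ɚ", lexicon))
# => "mister quilter"
```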
[Paper: Simple and Effective Zero-shot Cross-lingual Phoneme Recognition](https://arxiv.org/abs/2109.11680)
Authors: Qiantong Xu, Alexei Baevski, Michael Auli
**Abstract**
Recent progress in self-training, self-supervised pretraining and unsupervised learning enabled well performing speech recognition systems without any labeled data. However, in many cases there is labeled data available for related languages which is not utilized by these methods. This paper extends previous work on zero-shot cross-lingual transfer learning by fine-tuning a multilingually pretrained wav2vec 2.0 model to transcribe unseen languages. This is done by mapping phonemes of the training languages to the target language using articulatory features. Experiments show that this simple method significantly outperforms prior work which introduced task-specific architectures and used only part of a monolingually pretrained model.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
To transcribe audio files the model can be used as a standalone acoustic model as follows:
```python
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torch
# load model and processor
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-lv-60-espeak-cv-ft")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-lv-60-espeak-cv-ft")
# load dummy dataset and read soundfiles
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
# tokenize
input_values = processor(ds[0]["audio"]["array"], return_tensors="pt").input_values
# retrieve logits
with torch.no_grad():
    logits = model(input_values).logits
# take argmax and decode
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
# => should give ['m ɪ s t ɚ k w ɪ l t ɚ ɹ ɪ z ð ɪ ɐ p ɑː s əl ʌ v ð ə m ɪ d əl k l æ s ᵻ z æ n d w iː ɑːɹ ɡ l æ d t ə w ɛ l k ə m h ɪ z ɡ ɑː s p əl']
```
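The dummy LibriSpeech sample above is already stored at 16kHz. Your own recordings may first need to be resampled before being passed to the processor; a minimal sketch using torchaudio (the file path is a placeholder):
```python
import torchaudio
import torchaudio.functional as F

# "my_recording.wav" is a placeholder path for your own audio file
waveform, sample_rate = torchaudio.load("my_recording.wav")

# resample to the 16 kHz rate the model expects
if sample_rate != 16_000:
    waveform = F.resample(waveform, orig_freq=sample_rate, new_freq=16_000)

# torchaudio returns a (channels, time) tensor; average the channels to get a mono 1-D array
speech = waveform.mean(dim=0).numpy()
```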
|
{"language": "multilingual", "license": "apache-2.0", "tags": ["speech", "audio", "automatic-speech-recognition", "phoneme-recognition"], "datasets": ["common_voice"], "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
|
automatic-speech-recognition
|
facebook/wav2vec2-lv-60-espeak-cv-ft
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"speech",
"audio",
"phoneme-recognition",
"multilingual",
"dataset:common_voice",
"arxiv:2109.11680",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.11680"
] |
[
"multilingual"
] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #speech #audio #phoneme-recognition #multilingual #dataset-common_voice #arxiv-2109.11680 #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-Large-LV60 finetuned on multi-lingual Common Voice
This checkpoint leverages the pretrained checkpoint wav2vec2-large-lv60
and is fine-tuned on CommonVoice to recognize phonetic labels in multiple languages.
When using the model make sure that your speech input is sampled at 16kHz.
Note that the model outputs a string of phonetic labels. A dictionary mapping phonetic labels to words
has to be used to map the phonetic output labels to output words.
Paper: Simple and Effective Zero-shot Cross-lingual Phoneme Recognition
Authors: Qiantong Xu, Alexei Baevski, Michael Auli
Abstract
Recent progress in self-training, self-supervised pretraining and unsupervised learning enabled well performing speech recognition systems without any labeled data. However, in many cases there is labeled data available for related languages which is not utilized by these methods. This paper extends previous work on zero-shot cross-lingual transfer learning by fine-tuning a multilingually pretrained wav2vec 2.0 model to transcribe unseen languages. This is done by mapping phonemes of the training languages to the target language using articulatory features. Experiments show that this simple method significantly outperforms prior work which introduced task-specific architectures and used only part of a monolingually pretrained model.
The original model can be found under URL
# Usage
To transcribe audio files the model can be used as a standalone acoustic model as follows:
|
[
"# Wav2Vec2-Large-LV60 finetuned on multi-lingual Common Voice\n\nThis checkpoint leverages the pretrained checkpoint wav2vec2-large-lv60 \nand is fine-tuned on CommonVoice to recognize phonetic labels in multiple languages.\n\nWhen using the model make sure that your speech input is sampled at 16kHz. \nNote that the model outputs a string of phonetic labels. A dictionary mapping phonetic labels to words \nhas to be used to map the phonetic output labels to output words.\n\nPaper: Simple and Effective Zero-shot Cross-lingual Phoneme Recognition\n\nAuthors: Qiantong Xu, Alexei Baevski, Michael Auli\n\nAbstract\nRecent progress in self-training, self-supervised pretraining and unsupervised learning enabled well performing speech recognition systems without any labeled data. However, in many cases there is labeled data available for related languages which is not utilized by these methods. This paper extends previous work on zero-shot cross-lingual transfer learning by fine-tuning a multilingually pretrained wav2vec 2.0 model to transcribe unseen languages. This is done by mapping phonemes of the training languages to the target language using articulatory features. Experiments show that this simple method significantly outperforms prior work which introduced task-specific architectures and used only part of a monolingually pretrained model.\n\nThe original model can be found under URL",
"# Usage\n\nTo transcribe audio files the model can be used as a standalone acoustic model as follows:"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #speech #audio #phoneme-recognition #multilingual #dataset-common_voice #arxiv-2109.11680 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-Large-LV60 finetuned on multi-lingual Common Voice\n\nThis checkpoint leverages the pretrained checkpoint wav2vec2-large-lv60 \nand is fine-tuned on CommonVoice to recognize phonetic labels in multiple languages.\n\nWhen using the model make sure that your speech input is sampled at 16kHz. \nNote that the model outputs a string of phonetic labels. A dictionary mapping phonetic labels to words \nhas to be used to map the phonetic output labels to output words.\n\nPaper: Simple and Effective Zero-shot Cross-lingual Phoneme Recognition\n\nAuthors: Qiantong Xu, Alexei Baevski, Michael Auli\n\nAbstract\nRecent progress in self-training, self-supervised pretraining and unsupervised learning enabled well performing speech recognition systems without any labeled data. However, in many cases there is labeled data available for related languages which is not utilized by these methods. This paper extends previous work on zero-shot cross-lingual transfer learning by fine-tuning a multilingually pretrained wav2vec 2.0 model to transcribe unseen languages. This is done by mapping phonemes of the training languages to the target language using articulatory features. Experiments show that this simple method significantly outperforms prior work which introduced task-specific architectures and used only part of a monolingually pretrained model.\n\nThe original model can be found under URL",
"# Usage\n\nTo transcribe audio files the model can be used as a standalone acoustic model as follows:"
] |
[
83,
324,
25
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #speech #audio #phoneme-recognition #multilingual #dataset-common_voice #arxiv-2109.11680 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# Wav2Vec2-Large-LV60 finetuned on multi-lingual Common Voice\n\nThis checkpoint leverages the pretrained checkpoint wav2vec2-large-lv60 \nand is fine-tuned on CommonVoice to recognize phonetic labels in multiple languages.\n\nWhen using the model make sure that your speech input is sampled at 16kHz. \nNote that the model outputs a string of phonetic labels. A dictionary mapping phonetic labels to words \nhas to be used to map the phonetic output labels to output words.\n\nPaper: Simple and Effective Zero-shot Cross-lingual Phoneme Recognition\n\nAuthors: Qiantong Xu, Alexei Baevski, Michael Auli\n\nAbstract\nRecent progress in self-training, self-supervised pretraining and unsupervised learning enabled well performing speech recognition systems without any labeled data. However, in many cases there is labeled data available for related languages which is not utilized by these methods. This paper extends previous work on zero-shot cross-lingual transfer learning by fine-tuning a multilingually pretrained wav2vec 2.0 model to transcribe unseen languages. This is done by mapping phonemes of the training languages to the target language using articulatory features. Experiments show that this simple method significantly outperforms prior work which introduced task-specific architectures and used only part of a monolingually pretrained model.\n\nThe original model can be found under URL# Usage\n\nTo transcribe audio files the model can be used as a standalone acoustic model as follows:"
] |
[
-0.10005098581314087,
0.03528950735926628,
-0.0042594363912940025,
-0.006572366692125797,
0.11245253682136536,
-0.030818842351436615,
0.1818901002407074,
0.05696943402290344,
0.03415491059422493,
0.05692852661013603,
-0.11531136929988861,
0.04442331939935684,
0.06263837963342667,
0.0205406304448843,
0.028360558673739433,
-0.14688144624233246,
0.035729024559259415,
-0.05542737618088722,
0.22049248218536377,
0.0510304793715477,
0.04375874996185303,
-0.05367010459303856,
0.04288921132683754,
0.0132896164432168,
-0.06278056651353836,
0.017508748918771744,
0.010932535864412785,
-0.08568286150693893,
0.08894231170415878,
-0.0025494322180747986,
0.07138258963823318,
0.031187236309051514,
0.02432267554104328,
-0.13626188039779663,
0.006321468390524387,
0.05898549035191536,
0.002056347904726863,
-0.01652071252465248,
0.10949084907770157,
-0.060015350580215454,
0.04715043678879738,
-0.000033055013773264363,
-0.01176928449422121,
0.0687323585152626,
-0.052301883697509766,
-0.138194739818573,
-0.098430335521698,
0.07952804118394852,
0.028365524485707283,
0.1227678433060646,
-0.04322298616170883,
0.10114160925149918,
0.10125882923603058,
0.06627804040908813,
0.15397199988365173,
-0.25596052408218384,
0.033772725611925125,
0.057411763817071915,
0.09924084693193436,
0.08298247307538986,
-0.03418571501970291,
0.04837135970592499,
0.02094823308289051,
0.00017419444338884205,
-0.058899398893117905,
-0.04434405267238617,
0.01606867089867592,
-0.0920284315943718,
-0.08404344320297241,
-0.0006058310391381383,
0.12428024411201477,
-0.06380382925271988,
-0.11791470646858215,
-0.14226175844669342,
-0.020186875015497208,
0.008196074515581131,
0.0035523935221135616,
-0.021918484941124916,
0.01113054994493723,
0.012209118343889713,
0.036726467311382294,
-0.04856742173433304,
-0.0515165776014328,
-0.023473884910345078,
-0.09606170654296875,
0.09958622604608536,
0.048935480415821075,
0.0021491963416337967,
-0.047264572232961655,
0.049982257187366486,
-0.0700053870677948,
-0.028451355174183846,
-0.033392298966646194,
-0.026247523725032806,
-0.09252659976482391,
0.03900827094912529,
0.009131859056651592,
-0.1791945993900299,
0.0031087365932762623,
0.028414200991392136,
0.013323470950126648,
0.051315441727638245,
-0.13290785253047943,
0.0034623732790350914,
0.03448542580008507,
0.06825823336839676,
0.02176455780863762,
-0.04306356608867645,
0.00386586575768888,
-0.05388252064585686,
0.053347133100032806,
-0.03721051290631294,
-0.07195772230625153,
-0.04710609093308449,
0.05528615787625313,
0.0960567370057106,
0.010383999906480312,
-0.014437618665397167,
-0.013349591754376888,
-0.0608457513153553,
0.12238350510597229,
-0.11680664122104645,
0.0403568409383297,
0.013158229179680347,
0.06590522825717926,
0.13539326190948486,
0.05215509980916977,
0.01924450509250164,
-0.13155896961688995,
0.00916718877851963,
0.01826317235827446,
0.026784732937812805,
-0.04563946649432182,
-0.1352485567331314,
0.006907660048455,
-0.09439957141876221,
-0.08518388867378235,
-0.15516093373298645,
-0.12966465950012207,
-0.06377961486577988,
-0.04715360701084137,
-0.026136307045817375,
0.037407852709293365,
-0.03828956186771393,
0.0043043214827775955,
-0.03421563282608986,
-0.0008623632020317018,
-0.05218355357646942,
-0.031114501878619194,
-0.0022498168982565403,
-0.04108816012740135,
0.08958657085895538,
0.004810038022696972,
0.05363353341817856,
-0.03334090858697891,
-0.03313920646905899,
-0.05715949833393097,
0.1354607194662094,
-0.05078104883432388,
-0.04329679533839226,
-0.058059170842170715,
-0.011399759911000729,
-0.041103046387434006,
0.044927842915058136,
0.00946481991559267,
0.05563981831073761,
-0.1715363711118698,
-0.1080465018749237,
0.17399904131889343,
-0.1995670199394226,
0.020287886261940002,
0.10946253687143326,
0.004945449531078339,
0.09873054176568985,
0.10731988400220871,
0.15808303654193878,
0.1372753530740738,
-0.16360150277614594,
-0.10394343733787537,
-0.11218886077404022,
-0.04436437785625458,
0.02681923657655716,
0.025170397013425827,
-0.08462613821029663,
0.06455429643392563,
0.011953485198318958,
0.10816948860883713,
-0.0799000933766365,
-0.016073307022452354,
-0.03472229093313217,
0.002716558054089546,
-0.021534880623221397,
0.004981830716133118,
-0.01486949808895588,
0.0455845482647419,
0.00033454885124228895,
-0.04171280935406685,
0.04735369235277176,
0.0469517707824707,
-0.09263967722654343,
0.09098637104034424,
-0.08375533670186996,
0.03944801166653633,
-0.004071761853992939,
-0.0037438010331243277,
-0.1489436775445938,
0.027551647275686264,
0.06959719210863113,
-0.08571457117795944,
0.10015144944190979,
-0.02613145485520363,
-0.02072780579328537,
0.06209774315357208,
-0.04882508143782616,
0.020974649116396904,
0.014861270785331726,
-0.008308365941047668,
-0.05933462455868721,
-0.047380875796079636,
-0.000644828483927995,
-0.05631598085165024,
-0.03341243416070938,
-0.03597674146294594,
-0.004029212053865194,
-0.027271172031760216,
0.06338094174861908,
0.03358304873108864,
-0.06333492696285248,
0.02390453964471817,
0.10555354505777359,
-0.015684226527810097,
-0.011721760034561157,
0.03749343007802963,
-0.01376944500952959,
0.0291544608771801,
0.13047271966934204,
-0.16950272023677826,
-0.13189031183719635,
0.08858378231525421,
-0.01812785118818283,
-0.056036002933979034,
0.03188168257474899,
0.05044569820165634,
-0.05510912090539932,
-0.08918476104736328,
-0.037785373628139496,
0.21028703451156616,
-0.008387609384953976,
0.11997714638710022,
-0.07234978675842285,
0.029395252466201782,
0.03020966425538063,
-0.03162231296300888,
-0.0009139940375462174,
0.06719809770584106,
-0.05355025827884674,
-0.013490941375494003,
0.023991458117961884,
-0.02917391248047352,
0.0300217904150486,
0.20578758418560028,
0.027393432334065437,
-0.043895915150642395,
-0.008014857769012451,
-0.006021913606673479,
0.07271015644073486,
0.01819479651749134,
-0.12605002522468567,
-0.04432319477200508,
0.04730069637298584,
0.09520550072193146,
0.07532583177089691,
-0.06440122425556183,
0.07757865637540817,
0.0007237148238345981,
-0.046443842351436615,
-0.05923446640372276,
0.046985626220703125,
0.02140156738460064,
0.024079108610749245,
-0.05282071232795715,
0.05750902742147446,
0.005585297476500273,
-0.04354225844144821,
-0.13524216413497925,
0.07650785148143768,
-0.12801164388656616,
-0.2596122920513153,
-0.16665565967559814,
-0.009612959809601307,
-0.06240317225456238,
0.005576633382588625,
0.0775289535522461,
-0.010209820233285427,
-0.04439498856663704,
-0.07767250388860703,
0.07747730612754822,
-0.05930398777127266,
-0.08559180051088333,
-0.040879808366298676,
0.01646042801439762,
-0.05884450674057007,
-0.11358056217432022,
0.018661979585886,
-0.019378911703824997,
-0.12878024578094482,
0.04803283140063286,
-0.056109342724084854,
0.017964355647563934,
0.12848873436450958,
0.011255734600126743,
-0.02737114392220974,
-0.03487750515341759,
0.0028921400662511587,
-0.03631574660539627,
0.01143584493547678,
0.18999281525611877,
0.04768846556544304,
0.02714102901518345,
0.04635125398635864,
-0.035978514701128006,
-0.035748258233070374,
0.0027545031625777483,
0.022006412968039513,
-0.07275709509849548,
-0.21789871156215668,
-0.11134181916713715,
-0.06040163338184357,
0.03389682620763779,
-0.03654162958264351,
0.01312433835119009,
0.06132365018129349,
-0.010851242579519749,
-0.005129383876919746,
-0.03398016467690468,
0.05202252417802811,
0.10337597131729126,
0.14243058860301971,
0.01044203620404005,
0.09943807870149612,
-0.07596921175718307,
0.05101891607046127,
0.07465210556983948,
0.06398937106132507,
0.23893986642360687,
0.047709036618471146,
0.0735345333814621,
0.0763716846704483,
0.06543488800525665,
0.13383875787258148,
0.06894239038228989,
0.0041038477793335915,
0.05632106587290764,
-0.009413902647793293,
-0.06854669004678726,
0.027170762419700623,
0.025570986792445183,
0.02796091139316559,
-0.09752598404884338,
-0.02949412539601326,
-0.01571718230843544,
0.012600756250321865,
0.11745083332061768,
0.07459850609302521,
-0.02322046086192131,
-0.04958198219537735,
-0.014538111165165901,
-0.08662140369415283,
-0.08210889250040054,
0.0340263694524765,
0.08559247851371765,
-0.13470357656478882,
0.08882738649845123,
0.022923290729522705,
0.07371892035007477,
-0.10553589463233948,
-0.012421073392033577,
-0.06947861611843109,
0.01357025932520628,
-0.014473207294940948,
0.02557867020368576,
-0.1095912978053093,
0.10348566621541977,
0.0007023403886705637,
0.10179701447486877,
-0.04453409090638161,
0.011525499634444714,
0.008858496323227882,
0.0008665448985993862,
0.08945360034704208,
0.016499454155564308,
0.004667242523282766,
0.04588274285197258,
-0.08396779745817184,
0.012147260829806328,
0.11561375111341476,
0.0025042621418833733,
0.035475343465805054,
0.028326895087957382,
-0.015437306836247444,
-0.0546867661178112,
0.02383154258131981,
-0.18117694556713104,
-0.12646739184856415,
0.07623221725225449,
0.03780105710029602,
0.0926671177148819,
-0.01946115866303444,
-0.0669693872332573,
-0.15611431002616882,
0.10872098803520203,
-0.13123753666877747,
-0.06600791960954666,
-0.07315222173929214,
-0.1669376790523529,
0.10272794961929321,
-0.013754774816334248,
0.07251578569412231,
0.005938020534813404,
0.04991932958364487,
-0.10894809663295746,
-0.06792869418859482,
0.045055415481328964,
-0.06278714537620544,
-0.14570274949073792,
0.001826633233577013,
0.15045256912708282,
0.048467520624399185,
-0.0036783015821129084,
0.020361164584755898,
0.040216051042079926,
0.008724663406610489,
-0.06196073815226555,
0.001793361036106944,
0.13897109031677246,
0.006716957315802574,
0.1793082356452942,
-0.09287948161363602,
-0.3517741858959198,
-0.08067928999662399,
-0.12273859977722168,
0.07380697131156921,
0.16592460870742798,
-0.03183232247829437,
0.2011675387620926,
0.1527363806962967,
-0.15632788836956024,
-0.1891026645898819,
-0.052225228399038315,
0.020403869450092316,
0.052994027733802795,
-0.002787994919344783,
-0.16088566184043884,
0.03159328177571297,
0.02133917063474655,
0.014046636410057545,
-0.011060431599617004,
-0.1879875510931015,
-0.16565872728824615,
0.027460647746920586,
-0.022976180538535118,
0.10844682157039642,
-0.07199327647686005,
-0.06476496160030365,
-0.05034599453210831,
-0.05613286793231964,
0.10113532096147537,
-0.03170475363731384,
0.1358761191368103,
0.01957854814827442,
0.034696292132139206,
0.004242686089128256,
-0.019017811864614487,
0.05614320933818817,
0.10963258147239685,
0.0409410186111927,
0.02985898219048977,
0.02524743787944317,
0.023373493924736977,
0.014972337521612644,
0.05450575426220894,
0.09214947372674942,
0.019430171698331833,
-0.03010525554418564,
-0.07755943387746811,
-0.05529453977942467,
0.07713598012924194,
0.003695125924423337,
-0.005893701687455177,
-0.05624718219041824,
0.034672293812036514,
0.012353024445474148,
-0.04134200140833855,
0.03838903456926346,
-0.14586903154850006,
0.024451447650790215,
0.18743902444839478,
0.15678748488426208,
-0.018215201795101166,
0.00824139453470707,
-0.07402120530605316,
-0.08382125198841095,
0.06489317864179611,
0.02266193926334381,
0.047244843095541,
0.044579681009054184,
0.038659438490867615,
0.067842498421669,
0.014557731337845325,
-0.10894427448511124,
0.07876142114400864,
0.018856585025787354,
-0.02315072901546955,
-0.09126929193735123,
0.015796197578310966,
0.005369072314351797,
0.06780751794576645,
0.10566093772649765,
0.16991747915744781,
-0.05994167551398277,
-0.036202095448970795,
-0.036729101091623306,
0.01655556447803974,
-0.029129141941666603,
0.09349192678928375,
-0.05980394408106804,
0.015641961246728897,
-0.07594334334135056,
0.14200885593891144,
0.0359870046377182,
-0.0172968041151762,
0.029541894793510437,
0.09050290286540985,
-0.08790569007396698,
-0.05442266911268234,
-0.1552691012620926,
0.14323543012142181,
-0.08727609366178513,
-0.09228867292404175,
0.02047387696802616,
-0.06553206592798233,
-0.009215795435011387,
0.18025550246238708,
-0.017890404909849167,
0.042121078819036484,
-0.03873339667916298,
-0.025629499927163124,
-0.07495066523551941,
0.017767099663615227,
0.06751950830221176,
0.010556625202298164,
-0.07362151890993118,
0.1426721066236496,
0.043132029473781586,
0.001877865637652576,
-0.03593757003545761,
-0.0751756951212883,
-0.05424991250038147,
0.014026053249835968,
-0.14227668941020966,
0.005558738950639963,
-0.08766688406467438,
-0.03643031790852547,
0.001300137024372816,
0.014528412371873856,
0.019014297053217888,
0.05339875444769859,
-0.018378322944045067,
-0.004329361487179995,
-0.05387406423687935,
0.07304012030363083,
-0.06806789338588715,
0.053641125559806824,
0.007608980871737003,
-0.04653399437665939,
0.10913048684597015,
0.06336408108472824,
-0.05166945606470108,
0.02437085658311844,
-0.11480819433927536,
-0.008382130414247513,
0.005862370599061251,
0.03637427091598511,
-0.04191916435956955,
-0.10726769268512726,
-0.015283279120922089,
0.04906133934855461,
0.010864971205592155,
0.029937975108623505,
0.1349562406539917,
-0.07502295076847076,
0.045887239277362823,
-0.10784098505973816,
0.022489342838525772,
-0.04475422203540802,
0.04402364790439606,
0.07883122563362122,
0.06469674408435822,
0.05555754899978638,
-0.11149805039167404,
0.07559027522802353,
-0.025224454700946808,
-0.016764668747782707,
-0.01839226670563221,
-0.00005728771793656051,
0.01020367443561554,
-0.04666990414261818,
0.05617337301373482,
0.04738810658454895,
0.13459327816963196,
-0.005194969475269318,
-0.001958120847120881,
-0.013384278863668442,
-0.02142452448606491,
-0.1696324497461319,
-0.0030993211548775434,
0.0988817811012268,
0.045954763889312744,
0.0211301501840353,
0.009754661470651627,
0.018905892968177795,
-0.047200758010149,
0.10141752660274506,
0.07173027843236923,
0.16388748586177826,
-0.0015604887157678604,
0.06402172893285751,
0.029133394360542297,
-0.08514375984668732,
-0.12409084290266037,
-0.011450919322669506,
-0.07539777457714081,
0.06320271641016006,
-0.07966022193431854,
-0.03781770169734955,
0.0454368069767952,
-0.08003195375204086,
0.12595772743225098,
0.07221020013093948,
-0.06601684540510178,
-0.07377056032419205,
-0.10878849774599075,
-0.026956399902701378,
-0.05264753848314285,
-0.012463750317692757,
-0.0705297663807869,
0.06995082646608353,
0.038395147770643234,
0.048785846680402756,
0.011116902343928814,
0.07004164904356003,
-0.22016753256320953,
-0.10884885489940643,
0.08901134133338928,
-0.03876201808452606,
0.04809940233826637,
-0.006945617496967316,
0.0052444287575781345,
0.0403539203107357,
0.0716671347618103,
0.03643801808357239,
0.07897594571113586,
0.05948871746659279,
0.03795799985527992,
-0.07825345546007156,
-0.04246506094932556,
0.007885035127401352,
-0.0034800537396222353,
0.07070605456829071,
0.19896617531776428,
0.09223131835460663,
-0.09195680916309357,
0.023094424977898598,
0.14743389189243317,
-0.060161083936691284,
-0.1620747596025467,
-0.08400178700685501,
0.1675882488489151,
-0.027024058625102043,
0.01415464747697115,
-0.017512889578938484,
-0.06744958460330963,
0.002748489612713456,
0.21190853416919708,
0.13240757584571838,
0.04257917404174805,
0.0032650046050548553,
0.026475006714463234,
-0.0076241581700742245,
0.020578142255544662,
0.020234383642673492,
0.024345099925994873,
0.3368845283985138,
0.028466390445828438,
0.0945197120308876,
-0.05030088126659393,
-0.02829721011221409,
-0.12319528311491013,
0.08378329873085022,
-0.10626287013292313,
-0.06466406583786011,
0.009638753719627857,
0.09444863349199295,
-0.09053964912891388,
-0.17420314252376556,
0.07918354868888855,
-0.042196329683065414,
-0.04185894504189491,
0.0121493274345994,
0.0497613362967968,
0.01997133158147335,
0.011516926810145378,
0.029516546055674553,
0.016668003052473068,
0.11697731912136078,
0.00939916167408228,
-0.04329726845026016,
-0.003068107645958662,
-0.02062370628118515,
-0.10060971975326538,
0.012063504196703434,
-0.001713825506158173,
0.10144733637571335,
0.025592654943466187,
0.08036602288484573,
-0.024939974769949913,
0.11936204880475998,
-0.06328865885734558,
-0.07037901133298874,
0.03772781044244766,
0.13223832845687866,
-0.021591728553175926,
0.12373903393745422,
0.015625568106770515,
-0.03442371264100075,
0.04291606321930885,
0.04273279011249542,
-0.0427805595099926,
-0.04135901853442192,
0.0322219617664814,
-0.10780322551727295,
0.11957491934299469,
0.03625452145934105,
-0.008822979405522346,
-0.027819357812404633,
-0.005339204333722591,
0.038269516080617905,
-0.025593385100364685,
0.021260859444737434,
0.0018351355101913214,
-0.187013640999794,
-0.009372951462864876,
-0.04135551676154137,
0.060016877949237823,
-0.23305031657218933,
-0.016809189692139626,
-0.021527815610170364,
-0.016508426517248154,
-0.08673732727766037,
0.0569501630961895,
0.053669702261686325,
-0.021581321954727173,
-0.020882954820990562,
-0.12315286695957184,
0.0800706297159195,
0.1274019479751587,
-0.06867507845163345,
-0.07171186059713364
] |
null | null |
transformers
|
# Wav2Vec2-XLS-R-1B-21-EN
Facebook's Wav2Vec2 XLS-R fine-tuned for **Speech Translation.**

This is a [SpeechEncoderDecoderModel](https://huggingface.co/transformers/model_doc/speechencoderdecoder.html) model.
The encoder was warm-started from the [**`facebook/wav2vec2-xls-r-1b`**](https://huggingface.co/facebook/wav2vec2-xls-r-1b) checkpoint and
the decoder from the [**`facebook/mbart-large-50`**](https://huggingface.co/facebook/mbart-large-50) checkpoint.
Consequently, the encoder-decoder model was fine-tuned on 21 `{lang}` -> `en` translation pairs of the [Covost2 dataset](https://huggingface.co/datasets/covost2).
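The released checkpoint already contains these fine-tuned weights; as a rough sketch of the warm-starting idea only (not a recipe for reproducing the released model), a pretrained speech encoder and text decoder can be combined with `SpeechEncoderDecoderModel.from_encoder_decoder_pretrained`:
```python
from transformers import SpeechEncoderDecoderModel

# combine a pretrained speech encoder with a pretrained text decoder;
# the cross-attention weights are newly initialized and would still need fine-tuning
model = SpeechEncoderDecoderModel.from_encoder_decoder_pretrained(
    "facebook/wav2vec2-xls-r-1b", "facebook/mbart-large-50"
)
```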
The model can translate from the following spoken languages `{lang}` -> `en` (English):
{`fr`, `de`, `es`, `ca`, `it`, `ru`, `zh-CN`, `pt`, `fa`, `et`, `mn`, `nl`, `tr`, `ar`, `sv-SE`, `lv`, `sl`, `ta`, `ja`, `id`, `cy`} -> `en`
For more information, please refer to Section *5.1.2* of the [official XLS-R paper](https://arxiv.org/abs/2111.09296).
## Usage
### Demo
The model can be tested directly on the speech recognition widget on this model card!
Simply record some audio in one of the possible spoken languages or pick an example audio file to see how well the checkpoint can translate the input.
### Example
As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the
transcripts by passing the speech features to the model.
You can use the model directly via the ASR pipeline
```python
from datasets import load_dataset
from transformers import pipeline
# replace the following lines to load an audio file of your choice
librispeech_en = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
audio_file = librispeech_en[0]["file"]
asr = pipeline("automatic-speech-recognition", model="facebook/wav2vec2-xls-r-1b-21-to-en", feature_extractor="facebook/wav2vec2-xls-r-1b-21-to-en")
translation = asr(audio_file)
```
or step-by-step as follows:
```python
import torch
from transformers import Speech2Text2Processor, SpeechEncoderDecoderModel
from datasets import load_dataset
model = SpeechEncoderDecoderModel.from_pretrained("facebook/wav2vec2-xls-r-1b-21-to-en")
processor = Speech2Text2Processor.from_pretrained("facebook/wav2vec2-xls-r-1b-21-to-en")
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
inputs = processor(ds[0]["audio"]["array"], sampling_rate=ds[0]["audio"]["sampling_rate"], return_tensors="pt")
generated_ids = model.generate(inputs["input_values"], attention_mask=inputs["attention_mask"])
transcription = processor.batch_decode(generated_ids)
```
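One way to score translations like the ones reported in the Results section below is to compare the generated outputs against reference translations with BLEU. A minimal sketch using sacrebleu, assuming you have already collected model outputs and references as parallel lists (the two lists here are placeholders, not real results):
```python
import sacrebleu

# placeholders: in practice these would come from running the model over a
# CoVoST 2 evaluation split and from the dataset's reference translations
predictions = ["a sample translation produced by the model"]
references = ["a sample reference translation"]

bleu = sacrebleu.corpus_bleu(predictions, [references])
print(f"BLEU: {bleu.score:.2f}")
```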
## Results `{lang}` -> `en`
See the **XLS-R (1B)** row for this model's performance on [Covost2](https://huggingface.co/datasets/covost2).

## More XLS-R models for `{lang}` -> `en` Speech Translation
- [Wav2Vec2-XLS-R-300M-21-EN](https://huggingface.co/facebook/wav2vec2-xls-r-300m-21-to-en)
- [Wav2Vec2-XLS-R-1B-21-EN](https://huggingface.co/facebook/wav2vec2-xls-r-1b-21-to-en)
- [Wav2Vec2-XLS-R-2B-21-EN](https://huggingface.co/facebook/wav2vec2-xls-r-2b-21-to-en)
- [Wav2Vec2-XLS-R-2B-22-16](https://huggingface.co/facebook/wav2vec2-xls-r-2b-22-to-16)
|
{"language": ["multilingual", "fr", "de", "es", "ca", "it", "ru", "zh", "pt", "fa", "et", "mn", "nl", "tr", "ar", "sv", "lv", "sl", "ta", "ja", "id", "cy", "en"], "license": "apache-2.0", "tags": ["speech", "xls_r", "automatic-speech-recognition", "xls_r_translation"], "datasets": ["common_voice", "multilingual_librispeech", "covost2"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Swedish", "src": "https://cdn-media.huggingface.co/speech_samples/cv_swedish_1.mp3"}, {"example_title": "Arabic", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_ar_19058308.mp3"}, {"example_title": "Russian", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_ru_18849022.mp3"}, {"example_title": "German", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_de_17284683.mp3"}, {"example_title": "French", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_fr_17299386.mp3"}, {"example_title": "Indonesian", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_id_19051309.mp3"}, {"example_title": "Italian", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_it_17415776.mp3"}, {"example_title": "Japanese", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_ja_19482488.mp3"}, {"example_title": "Mongolian", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_mn_18565396.mp3"}, {"example_title": "Dutch", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_nl_17691471.mp3"}, {"example_title": "Russian", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_ru_18849022.mp3"}, {"example_title": "Turkish", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_tr_17341280.mp3"}, {"example_title": "Catalan", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_ca_17367522.mp3"}, {"example_title": "English", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_en_18301577.mp3"}, {"example_title": "Dutch", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_nl_17691471.mp3"}]}
|
automatic-speech-recognition
|
facebook/wav2vec2-xls-r-1b-21-to-en
|
[
"transformers",
"pytorch",
"speech-encoder-decoder",
"automatic-speech-recognition",
"speech",
"xls_r",
"xls_r_translation",
"multilingual",
"fr",
"de",
"es",
"ca",
"it",
"ru",
"zh",
"pt",
"fa",
"et",
"mn",
"nl",
"tr",
"ar",
"sv",
"lv",
"sl",
"ta",
"ja",
"id",
"cy",
"en",
"dataset:common_voice",
"dataset:multilingual_librispeech",
"dataset:covost2",
"arxiv:2111.09296",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2111.09296"
] |
[
"multilingual",
"fr",
"de",
"es",
"ca",
"it",
"ru",
"zh",
"pt",
"fa",
"et",
"mn",
"nl",
"tr",
"ar",
"sv",
"lv",
"sl",
"ta",
"ja",
"id",
"cy",
"en"
] |
TAGS
#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #fr #de #es #ca #it #ru #zh #pt #fa #et #mn #nl #tr #ar #sv #lv #sl #ta #ja #id #cy #en #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #region-us
|
# Wav2Vec2-XLS-R-1B-21-EN
Facebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.
!model image
This is a SpeechEncoderDecoderModel model.
The encoder was warm-started from the 'facebook/wav2vec2-xls-r-1b' checkpoint and
the decoder from the 'facebook/mbart-large-50' checkpoint.
Consequently, the encoder-decoder model was fine-tuned on 21 '{lang}' -> 'en' translation pairs of the Covost2 dataset.
The model can translate from the following spoken languages '{lang}' -> 'en' (English):
{'fr', 'de', 'es', 'ca', 'it', 'ru', 'zh-CN', 'pt', 'fa', 'et', 'mn', 'nl', 'tr', 'ar', 'sv-SE', 'lv', 'sl', 'ta', 'ja', 'id', 'cy'} -> 'en'
For more information, please refer to Section *5.1.2* of the official XLS-R paper.
## Usage
### Demo
The model can be tested directly on the speech recognition widget on this model card!
Simply record some audio in one of the possible spoken languages or pick an example audio file to see how well the checkpoint can translate the input.
### Example
As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the
transcripts by passing the speech features to the model.
You can use the model directly via the ASR pipeline
or step-by-step as follows:
## Results '{lang}' -> 'en'
See the row of XLS-R (1B) for the performance on Covost2 for this model.
!results image
## More XLS-R models for '{lang}' -> 'en' Speech Translation
- Wav2Vec2-XLS-R-300M-21-EN
- Wav2Vec2-XLS-R-1B-21-EN
- Wav2Vec2-XLS-R-2B-21-EN
- Wav2Vec2-XLS-R-2B-22-16
|
[
"# Wav2Vec2-XLS-R-2b-21-EN\n\nFacebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.\n\n!model image\n\nThis is a SpeechEncoderDecoderModel model. \nThe encoder was warm-started from the 'facebook/wav2vec2-xls-r-1b' checkpoint and\nthe decoder from the 'facebook/mbart-large-50' checkpoint.\nConsequently, the encoder-decoder model was fine-tuned on 21 '{lang}' -> 'en' translation pairs of the Covost2 dataset.\n\nThe model can translate from the following spoken languages '{lang}' -> 'en' (English):\n\n{'fr', 'de', 'es', 'ca', 'it', 'ru', 'zh-CN', 'pt', 'fa', 'et', 'mn', 'nl', 'tr', 'ar', 'sv-SE', 'lv', 'sl', 'ta', 'ja', 'id', 'cy'} -> 'en'\n\nFor more information, please refer to Section *5.1.2* of the official XLS-R paper.",
"## Usage",
"### Demo\n\nThe model can be tested directly on the speech recognition widget on this model card! \nSimple record some audio in one of the possible spoken languages or pick an example audio file to see how well the checkpoint can translate the input.",
"### Example \n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline\n\n\n\nor step-by-step as follows:",
"## Results '{lang}' -> 'en'\n\nSee the row of XLS-R (1B) for the performance on Covost2 for this model.\n\n!results image",
"## More XLS-R models for '{lang}' -> 'en' Speech Translation\n\n- Wav2Vec2-XLS-R-300M-21-EN\n- Wav2Vec2-XLS-R-1B-21-EN\n- Wav2Vec2-XLS-R-2B-21-EN\n- Wav2Vec2-XLS-R-2B-22-16"
] |
[
"TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #fr #de #es #ca #it #ru #zh #pt #fa #et #mn #nl #tr #ar #sv #lv #sl #ta #ja #id #cy #en #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# Wav2Vec2-XLS-R-2b-21-EN\n\nFacebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.\n\n!model image\n\nThis is a SpeechEncoderDecoderModel model. \nThe encoder was warm-started from the 'facebook/wav2vec2-xls-r-1b' checkpoint and\nthe decoder from the 'facebook/mbart-large-50' checkpoint.\nConsequently, the encoder-decoder model was fine-tuned on 21 '{lang}' -> 'en' translation pairs of the Covost2 dataset.\n\nThe model can translate from the following spoken languages '{lang}' -> 'en' (English):\n\n{'fr', 'de', 'es', 'ca', 'it', 'ru', 'zh-CN', 'pt', 'fa', 'et', 'mn', 'nl', 'tr', 'ar', 'sv-SE', 'lv', 'sl', 'ta', 'ja', 'id', 'cy'} -> 'en'\n\nFor more information, please refer to Section *5.1.2* of the official XLS-R paper.",
"## Usage",
"### Demo\n\nThe model can be tested directly on the speech recognition widget on this model card! \nSimple record some audio in one of the possible spoken languages or pick an example audio file to see how well the checkpoint can translate the input.",
"### Example \n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline\n\n\n\nor step-by-step as follows:",
"## Results '{lang}' -> 'en'\n\nSee the row of XLS-R (1B) for the performance on Covost2 for this model.\n\n!results image",
"## More XLS-R models for '{lang}' -> 'en' Speech Translation\n\n- Wav2Vec2-XLS-R-300M-21-EN\n- Wav2Vec2-XLS-R-1B-21-EN\n- Wav2Vec2-XLS-R-2B-21-EN\n- Wav2Vec2-XLS-R-2B-22-16"
] |
[
148,
280,
3,
52,
66,
39,
82
] |
[
"passage: TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #fr #de #es #ca #it #ru #zh #pt #fa #et #mn #nl #tr #ar #sv #lv #sl #ta #ja #id #cy #en #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #region-us \n# Wav2Vec2-XLS-R-2b-21-EN\n\nFacebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.\n\n!model image\n\nThis is a SpeechEncoderDecoderModel model. \nThe encoder was warm-started from the 'facebook/wav2vec2-xls-r-1b' checkpoint and\nthe decoder from the 'facebook/mbart-large-50' checkpoint.\nConsequently, the encoder-decoder model was fine-tuned on 21 '{lang}' -> 'en' translation pairs of the Covost2 dataset.\n\nThe model can translate from the following spoken languages '{lang}' -> 'en' (English):\n\n{'fr', 'de', 'es', 'ca', 'it', 'ru', 'zh-CN', 'pt', 'fa', 'et', 'mn', 'nl', 'tr', 'ar', 'sv-SE', 'lv', 'sl', 'ta', 'ja', 'id', 'cy'} -> 'en'\n\nFor more information, please refer to Section *5.1.2* of the official XLS-R paper.## Usage### Demo\n\nThe model can be tested directly on the speech recognition widget on this model card! \nSimple record some audio in one of the possible spoken languages or pick an example audio file to see how well the checkpoint can translate the input."
] |
[
-0.07305055856704712,
0.034510448575019836,
-0.005553706549108028,
0.029102865606546402,
0.047391463071107864,
-0.011453820392489433,
0.027848733589053154,
0.10892537981271744,
0.0537131242454052,
0.12744422256946564,
-0.001083217328414321,
0.09494483470916748,
0.08418974280357361,
0.10307976603507996,
-0.019589751958847046,
-0.19772514700889587,
0.03430858999490738,
-0.10577749460935593,
0.03563009575009346,
0.07786969840526581,
0.09295560419559479,
-0.07138144224882126,
0.04734759032726288,
-0.02010292373597622,
-0.01973387785255909,
0.031824734061956406,
0.003976298961788416,
-0.06263525038957596,
0.05877629294991493,
0.10304932296276093,
0.04402536526322365,
0.05538082867860794,
0.08446048200130463,
-0.290728360414505,
0.02360762655735016,
0.07697147876024246,
-0.010186771862208843,
0.02124089002609253,
0.12049490213394165,
-0.07532113045454025,
0.053658902645111084,
-0.044720232486724854,
-0.009994417428970337,
0.09637285768985748,
-0.06261035799980164,
-0.24420742690563202,
-0.06690678000450134,
0.08570130169391632,
0.09750024229288101,
0.04088331013917923,
-0.062213946133852005,
0.008007274940609932,
0.0014032813487574458,
0.08026978373527527,
0.16014619171619415,
-0.23541989922523499,
-0.00042501563439145684,
0.014913483522832394,
0.10230500996112823,
0.047503672540187836,
-0.0741867944598198,
0.05316714942455292,
0.0036073445808142424,
-0.001693365047685802,
-0.006259254179894924,
-0.053979698568582535,
0.0012223015073686838,
-0.03522803634405136,
-0.13404108583927155,
-0.04442547261714935,
0.02743750810623169,
0.05583957955241203,
-0.050019361078739166,
-0.1321238875389099,
-0.03572949394583702,
-0.038265906274318695,
-0.04348956421017647,
-0.05509012192487717,
-0.03883108124136925,
-0.003916336689144373,
0.0318562351167202,
-0.09337221086025238,
-0.1034046933054924,
-0.02606659010052681,
-0.061716098338365555,
0.1326795071363449,
0.01783766970038414,
0.022204741835594177,
0.03409183770418167,
0.05218745395541191,
-0.03212737664580345,
-0.09695127606391907,
-0.041604362428188324,
-0.054503146559000015,
-0.17856985330581665,
0.008195945993065834,
-0.01743098348379135,
-0.032506413757801056,
0.09250615537166595,
0.13208512961864471,
-0.04179053753614426,
0.06750955432653427,
-0.0836026668548584,
0.02760399505496025,
0.029822198674082756,
0.13108190894126892,
-0.08254243433475494,
-0.03328812122344971,
-0.03422295302152634,
-0.02832827903330326,
-0.03572919964790344,
-0.032447539269924164,
-0.06838740408420563,
-0.009610666893422604,
-0.01030912809073925,
0.082087941467762,
0.043941061943769455,
0.001265433500520885,
-0.047729890793561935,
-0.04425114020705223,
0.13165690004825592,
-0.15032047033309937,
0.053017809987068176,
0.10376214981079102,
-0.02462886832654476,
0.09119125455617905,
0.0008023507543839514,
0.019959021359682083,
-0.09118162840604782,
-0.011057587340474129,
0.02164354734122753,
0.02926129661500454,
-0.02296333946287632,
-0.11733106523752213,
0.021982314065098763,
-0.07096172869205475,
-0.06431668251752853,
-0.1296103298664093,
-0.03894375264644623,
-0.059900928288698196,
0.00946110300719738,
-0.07314682751893997,
0.04870535060763359,
-0.057726167142391205,
-0.03579603135585785,
0.01733311079442501,
-0.02139546535909176,
-0.02793663926422596,
-0.0363069511950016,
0.039121247828006744,
-0.04354691505432129,
0.0937192440032959,
0.03148958086967468,
0.04229623079299927,
-0.009867565706372261,
0.020940236747264862,
-0.17426857352256775,
0.2157215178012848,
-0.1295592486858368,
-0.021948309615254402,
-0.1587596833705902,
-0.04720216244459152,
0.0056133512407541275,
0.0333288237452507,
-0.006276797968894243,
0.06848740577697754,
-0.21455593407154083,
-0.06811419129371643,
0.24936529994010925,
-0.08098194748163223,
-0.014860464259982109,
0.14388200640678406,
0.014731476083397865,
-0.044011276215314865,
0.08447445929050446,
0.14737969636917114,
0.13610723614692688,
-0.1955982744693756,
-0.011535859666764736,
0.005078090820461512,
-0.014445408247411251,
0.1371600329875946,
0.10182802379131317,
-0.08460021018981934,
0.04807216301560402,
-0.003170217154547572,
-0.017428547143936157,
-0.01861473172903061,
-0.01563410833477974,
-0.029707632958889008,
0.03789505735039711,
-0.03591658174991608,
0.08758009225130081,
-0.021802017465233803,
-0.0654224082827568,
-0.007430541794747114,
-0.09925558418035507,
-0.011514068581163883,
0.07437171041965485,
-0.053815729916095734,
0.03448103368282318,
-0.10917176306247711,
0.05454893410205841,
-0.01839451678097248,
0.0171428881585598,
-0.20788255333900452,
-0.025433188304305077,
0.011026512831449509,
-0.06360042840242386,
0.09449035674333572,
0.06201329082250595,
0.0257259588688612,
0.029189713299274445,
0.016769742593169212,
-0.0006372581119649112,
0.04832442104816437,
-0.009784447029232979,
0.009944617748260498,
-0.10255079716444016,
-0.04904406517744064,
-0.059059176594018936,
0.10339494794607162,
-0.12463365495204926,
-0.02341572754085064,
0.08770321309566498,
0.12724550068378448,
0.01688770018517971,
0.005824984982609749,
-0.00919008906930685,
0.020925147458910942,
0.02835184894502163,
-0.004860293585807085,
-0.0033978288993239403,
-0.03655318170785904,
-0.025107651948928833,
0.09525998681783676,
-0.11298433691263199,
-0.03900589793920517,
0.0635189339518547,
0.015637099742889404,
-0.07983846217393875,
0.02358340658247471,
-0.033961765468120575,
0.010691110976040363,
-0.10232645273208618,
-0.06332365423440933,
0.20733888447284698,
0.09109757095575333,
0.09339676052331924,
-0.06993565708398819,
-0.06264728307723999,
0.031114261597394943,
-0.04571244493126869,
-0.033862486481666565,
0.12097953259944916,
-0.017371445894241333,
-0.15501020848751068,
0.03267405927181244,
-0.013110177591443062,
0.03497718274593353,
0.1754724681377411,
0.0058053662069141865,
-0.1137247383594513,
-0.03748990222811699,
0.026834284886717796,
-0.0024148498196154833,
-0.003658123780041933,
0.07791034877300262,
0.005650058388710022,
0.05126912519335747,
0.004624673631042242,
0.02282225340604782,
-0.060727380216121674,
0.04797903820872307,
0.021257847547531128,
-0.1066986471414566,
0.0211709626019001,
0.03413692116737366,
0.03207062929868698,
0.057582754641771317,
-0.004029290750622749,
-0.03409251943230629,
-0.07188471406698227,
-0.049029748886823654,
-0.10923326760530472,
0.11876965314149857,
-0.1313154250383377,
-0.3386295735836029,
-0.1567293256521225,
-0.05435074865818024,
-0.024005839601159096,
0.002087576547637582,
0.062488507479429245,
-0.08116790652275085,
-0.06353901326656342,
-0.059442851692438126,
0.04389754682779312,
0.005611713510006666,
-0.05975700542330742,
0.015528954565525055,
0.017042018473148346,
0.021378248929977417,
-0.08220973610877991,
0.008090942166745663,
0.02705991454422474,
-0.0480869859457016,
-0.02730281837284565,
0.03305799141526222,
0.0471588671207428,
0.1354503631591797,
0.0189145989716053,
0.03123835287988186,
-0.02412358857691288,
0.18319985270500183,
-0.10322298109531403,
0.07098368555307388,
0.10934007167816162,
-0.06115175783634186,
0.05866726487874985,
0.17348052561283112,
0.01696956343948841,
-0.029797125607728958,
-0.0022587659768760204,
0.031213440001010895,
0.005736667662858963,
-0.26681023836135864,
-0.09036356955766678,
-0.03257828578352928,
0.021910464391112328,
0.04013705253601074,
0.045778270810842514,
0.06610587239265442,
-0.022249441593885422,
-0.05667958781123161,
-0.08183839917182922,
0.08912041783332825,
0.027878539636731148,
0.1400003284215927,
-0.003617379115894437,
0.06899841129779816,
-0.0449339859187603,
-0.03673669695854187,
0.09956451505422592,
0.026451054960489273,
0.07503267377614975,
0.06758511811494827,
0.18398088216781616,
0.07693994045257568,
0.03920799121260643,
0.03293394297361374,
0.014022225514054298,
-0.023133959621191025,
0.03242915868759155,
0.028219949454069138,
-0.07120117545127869,
0.02682538516819477,
0.03740209713578224,
0.1748112291097641,
-0.09082254022359848,
-0.00004117922071600333,
-0.014607715420424938,
0.07427817583084106,
0.16130684316158295,
0.09994176775217056,
-0.13592509925365448,
-0.016659945249557495,
0.003241538070142269,
-0.06546590477228165,
-0.0520482063293457,
-0.0020270722452551126,
0.11372105032205582,
-0.11029411852359772,
0.09141328185796738,
0.015399092808365822,
0.09789355844259262,
-0.0955662652850151,
-0.011308967135846615,
0.01611429639160633,
0.07771269232034683,
0.0064775412902235985,
0.07987383008003235,
-0.1580999791622162,
0.1346561163663864,
0.022600827738642693,
0.020917294546961784,
-0.013003869913518429,
0.017091894522309303,
-0.003498275065794587,
-0.018881376832723618,
0.1703374683856964,
-0.016651788726449013,
-0.0832773819565773,
-0.11991555988788605,
-0.11921508610248566,
0.008730369620025158,
0.13280674815177917,
-0.09220000356435776,
0.06089406460523605,
-0.010747731663286686,
-0.09215196222066879,
-0.07146339863538742,
-0.054031550884246826,
-0.12591509521007538,
-0.13281786441802979,
0.05535056069493294,
0.014513570815324783,
0.06803299486637115,
0.006462928373366594,
-0.021003542467951775,
-0.13250218331813812,
0.0913516953587532,
-0.15557391941547394,
-0.06463529914617538,
-0.15451578795909882,
0.01217020582407713,
0.16648253798484802,
-0.1103067547082901,
0.07273650914430618,
0.016081562265753746,
0.17448899149894714,
-0.038017429411411285,
-0.09459324181079865,
0.044145021587610245,
-0.07017365843057632,
-0.10356347262859344,
-0.0004184823774266988,
0.17794270813465118,
0.1058209091424942,
0.0513380765914917,
0.06105102226138115,
0.03761184588074684,
0.012338824570178986,
-0.09389331191778183,
0.03151865303516388,
0.07531715929508209,
-0.02326834201812744,
0.045445483177900314,
-0.006357008591294289,
-0.19182324409484863,
-0.12473950535058975,
-0.011612347327172756,
0.18024900555610657,
0.15014007687568665,
-0.1212364211678505,
0.16032716631889343,
0.17990265786647797,
-0.07141630351543427,
-0.18534618616104126,
-0.126519575715065,
0.11601144820451736,
0.05135839059948921,
-0.020395616069436073,
-0.19076032936573029,
0.03277773782610893,
0.042287930846214294,
-0.004832088947296143,
0.04669418931007385,
-0.2520904242992401,
-0.10477878898382187,
0.12552237510681152,
-0.03471243754029274,
-0.13570930063724518,
-0.08213472366333008,
-0.11293481290340424,
-0.09751558303833008,
-0.12299392372369766,
0.08156506717205048,
-0.12126202136278152,
0.04068788141012192,
0.08388340473175049,
0.029278695583343506,
0.014795591123402119,
0.019936684519052505,
0.10402568429708481,
0.06820651888847351,
-0.022639915347099304,
-0.03952716290950775,
0.05753433704376221,
-0.06767105311155319,
-0.014636199921369553,
0.10548001527786255,
-0.010955878533422947,
0.015251440927386284,
-0.04005442559719086,
-0.0680203065276146,
-0.06915415078401566,
0.04416068270802498,
-0.017245670780539513,
-0.006965452805161476,
-0.0221725907176733,
0.0001999066589633003,
0.06591787934303284,
-0.020653011277318,
-0.03539169952273369,
-0.11314886808395386,
0.04770553112030029,
0.18614448606967926,
0.12507420778274536,
0.09249699860811234,
-0.12150007486343384,
-0.05164448171854019,
-0.03960653394460678,
0.013620438985526562,
-0.044531214982271194,
0.07350550591945648,
0.1133757084608078,
0.002881630090996623,
0.14939908683300018,
-0.022417228668928146,
-0.11350132524967194,
0.028316114097833633,
0.04423302784562111,
-0.06496021151542664,
-0.17448078095912933,
-0.005705615039914846,
-0.010725868865847588,
0.0035443026572465897,
-0.03629492595791817,
0.16408957540988922,
0.018809782341122627,
-0.058339282870292664,
-0.007300321478396654,
0.06034572049975395,
-0.048654455691576004,
0.14622598886489868,
0.06142394617199898,
0.08012891560792923,
-0.10880008339881897,
0.050792183727025986,
0.08716888725757599,
-0.07279890030622482,
0.01735926978290081,
0.12063601613044739,
-0.08410043269395828,
-0.08154089003801346,
-0.04130292683839798,
0.048348329961299896,
-0.051773540675640106,
-0.0504920668900013,
0.030821681022644043,
-0.10675620287656784,
0.05756378546357155,
0.22245842218399048,
0.015780793502926826,
0.019014092162251472,
0.018767839297652245,
-0.03959033265709877,
-0.019601179286837578,
0.1307384967803955,
0.04389159381389618,
-0.020578701049089432,
-0.09275658428668976,
0.08485589921474457,
0.022536741569638252,
0.08493734151124954,
-0.020325371995568275,
-0.040732163935899734,
-0.07336818426847458,
0.021791117265820503,
-0.14234961569309235,
0.07750498503446579,
-0.06498617678880692,
-0.013793508522212505,
0.012506237253546715,
-0.03934266045689583,
0.012689251452684402,
0.017010513693094254,
-0.1075209304690361,
-0.03334077447652817,
-0.0544205904006958,
0.07156768441200256,
-0.1795637160539627,
-0.006458833813667297,
0.05726011097431183,
-0.047150347381830215,
0.10214167833328247,
0.06264287978410721,
-0.030964117497205734,
0.05798570439219475,
-0.13659125566482544,
-0.07581335306167603,
0.02461055852472782,
0.06078578531742096,
0.04576125741004944,
-0.11425546556711197,
0.035398416221141815,
0.004842468537390232,
-0.01938515529036522,
-0.01744231954216957,
0.01704486273229122,
-0.08064167201519012,
0.022646348923444748,
-0.10077718645334244,
-0.009619943797588348,
-0.03909803926944733,
0.04069998115301132,
0.06346273422241211,
0.05563763156533241,
0.06963134557008743,
-0.091212198138237,
0.1393754482269287,
-0.10141201317310333,
-0.006803295575082302,
-0.010682135820388794,
-0.0029904984403401613,
0.026554716750979424,
-0.07616612315177917,
0.0858200192451477,
-0.019327601417899132,
0.10494630038738251,
0.0161762572824955,
0.09311176836490631,
-0.005702862981706858,
-0.12001106888055801,
-0.09035253524780273,
0.07429595291614532,
0.07749845832586288,
0.07443434000015259,
0.0016260029515251517,
0.008697010576725006,
-0.05908858776092529,
0.004043844994157553,
0.050969719886779785,
0.05665118619799614,
0.12253765016794205,
0.09546318650245667,
0.001485504675656557,
0.11312103271484375,
-0.11460626125335693,
-0.023436998948454857,
0.014309022575616837,
-0.13950803875923157,
0.03123662807047367,
-0.06855211406946182,
0.11777406185865402,
0.07559453696012497,
-0.14503419399261475,
0.08705007284879684,
0.019456323236227036,
-0.06627443432807922,
-0.13224749267101288,
-0.12277556210756302,
-0.07872799038887024,
-0.06762854754924774,
0.013463540934026241,
-0.08876614272594452,
0.062470272183418274,
0.010308823548257351,
0.06701163202524185,
0.013884274289011955,
0.09102212637662888,
-0.05046527087688446,
-0.12298709899187088,
0.10554829239845276,
-0.0018462390871718526,
0.03564123064279556,
0.04347294196486473,
0.05489221215248108,
0.09592262655496597,
0.04236884042620659,
0.0628633052110672,
0.05004047602415085,
-0.027791103348135948,
0.04154178872704506,
-0.08730940520763397,
-0.07213570922613144,
0.013712934218347073,
0.002684811595827341,
0.016389042139053345,
0.1401645690202713,
0.0957377478480339,
-0.04323554411530495,
0.00410450529307127,
0.10043255239725113,
-0.044722940772771835,
-0.13556069135665894,
-0.17070896923542023,
0.06347904354333878,
-0.0010904573136940598,
0.09526935964822769,
-0.005419556517153978,
-0.11746855080127716,
-0.04543175920844078,
0.2026982605457306,
0.11872539669275284,
-0.006582834757864475,
0.04417218267917633,
0.05814755707979202,
0.02867177687585354,
0.04677679017186165,
0.00860031507909298,
0.0559561587870121,
0.276046484708786,
-0.04230809584259987,
-0.029460281133651733,
-0.01182796061038971,
-0.09383764117956161,
-0.028215479105710983,
0.035801004618406296,
-0.08439014106988907,
-0.03134791925549507,
-0.008424975909292698,
0.15477055311203003,
-0.1232663244009018,
-0.18012122809886932,
-0.008138659410178661,
-0.013345963321626186,
-0.08365726470947266,
-0.0102594755589962,
0.05528067424893379,
0.09344495087862015,
0.012085258960723877,
0.007287229411303997,
-0.08625868707895279,
0.2492569237947464,
-0.002005533315241337,
-0.067659392952919,
0.002535212552174926,
0.0030327632557600737,
-0.09130873531103134,
0.07899600267410278,
-0.025419728830456734,
0.1260758638381958,
0.05758262798190117,
0.026450734585523605,
-0.05411994084715843,
0.06364569067955017,
0.061255164444446564,
-0.08008641004562378,
0.06388676166534424,
0.17725151777267456,
-0.0019872779957950115,
0.04575350880622864,
0.07902825623750687,
-0.11089763790369034,
0.05662202462553978,
0.05623674392700195,
-0.016082782298326492,
-0.04772571846842766,
0.052110277116298676,
-0.0961744412779808,
0.10278329998254776,
0.11666858196258545,
-0.016186753287911415,
0.016257336363196373,
-0.039212893694639206,
0.02852749265730381,
-0.006071560084819794,
0.06779475510120392,
-0.021608775481581688,
-0.15950623154640198,
0.008167627267539501,
-0.039787065237760544,
0.06799870729446411,
-0.1756812036037445,
-0.02559792250394821,
0.03150239586830139,
-0.00781024107709527,
-0.008374718017876148,
0.10373690724372864,
0.038205329328775406,
0.014505041763186455,
-0.023811187595129013,
-0.0884888619184494,
0.037019070237874985,
0.0986315980553627,
-0.12293197959661484,
-0.019328469410538673
] |
null | null |
transformers
|
# Wav2Vec2-XLS-R-1B-EN-15
Facebook's Wav2Vec2 XLS-R fine-tuned for **Speech Translation.**

This is a [SpeechEncoderDecoderModel](https://huggingface.co/transformers/model_doc/speechencoderdecoder.html) model.
The encoder was warm-started from the [**`facebook/wav2vec2-xls-r-1b`**](https://huggingface.co/facebook/wav2vec2-xls-r-1b) checkpoint and
the decoder from the [**`facebook/mbart-large-50`**](https://huggingface.co/facebook/mbart-large-50) checkpoint.
Consequently, the encoder-decoder model was fine-tuned on 15 `en` -> `{lang}` translation pairs of the [Covost2 dataset](https://huggingface.co/datasets/covost2).
The model can translate from spoken `en` (English) to the following written languages `{lang}`:
`en` -> {`de`, `tr`, `fa`, `sv-SE`, `mn`, `zh-CN`, `cy`, `ca`, `sl`, `et`, `id`, `ar`, `ta`, `lv`, `ja`}
For more information, please refer to Section *5.1.1* of the [official XLS-R paper](https://arxiv.org/abs/2111.09296).
## Usage
### Demo
The model can be tested on [**this space**](https://huggingface.co/spaces/facebook/XLS-R-1B-EN-15).
You can select the target language, record some audio in English,
and then sit back and see how well the checkpoint can translate the input.
### Example
As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the
transcripts by passing the speech features to the model.
You can use the model directly via the ASR pipeline. By default, the checkpoint will
translate spoken English to written German. To change the written target language,
you need to pass the correct `forced_bos_token_id` to `generate(...)` to condition
the decoder on the correct target language.
To select the correct `forced_bos_token_id` for your chosen language id, please make use
of the following mapping:
```python
MAPPING = {
"de": 250003,
"tr": 250023,
"fa": 250029,
"sv": 250042,
"mn": 250037,
"zh": 250025,
"cy": 250007,
"ca": 250005,
"sl": 250052,
"et": 250006,
"id": 250032,
"ar": 250001,
"ta": 250044,
"lv": 250017,
"ja": 250012,
}
```
As an example, if you would like to translate to Swedish, you can do the following:
```python
from datasets import load_dataset
from transformers import pipeline
# select correct `forced_bos_token_id`
forced_bos_token_id = MAPPING["sv"]
# replace the following lines to load an audio file of your choice
librispeech_en = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
audio_file = librispeech_en[0]["file"]
asr = pipeline("automatic-speech-recognition", model="facebook/wav2vec2-xls-r-1b-en-to-15", feature_extractor="facebook/wav2vec2-xls-r-1b-en-to-15")
translation = asr(audio_file, forced_bos_token_id=forced_bos_token_id)
```
or step-by-step as follows:
```python
import torch
from transformers import Speech2Text2Processor, SpeechEncoderDecoderModel
from datasets import load_dataset
model = SpeechEncoderDecoderModel.from_pretrained("facebook/wav2vec2-xls-r-1b-en-to-15")
processor = Speech2Text2Processor.from_pretrained("facebook/wav2vec2-xls-r-1b-en-to-15")
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
# select correct `forced_bos_token_id`
forced_bos_token_id = MAPPING["sv"]
inputs = processor(ds[0]["audio"]["array"], sampling_rate=ds[0]["audio"]["sampling_rate"], return_tensors="pt")
generated_ids = model.generate(input_ids=inputs["input_features"], attention_mask=inputs["attention_mask"], forced_bos_token_id=forced_bos_token_id)
transcription = processor.batch_decode(generated_ids)
```
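To run the step-by-step example on your own recording instead of the dummy dataset, the audio has to be sampled at 16 kHz before it is passed to the processor. Continuing from the snippet above, the following is only a sketch; the file path `my_recording.wav` and the use of `librosa` for resampling are assumptions for illustration:
```python
import librosa

# load a local file and resample it to 16 kHz (librosa resamples when `sr` is given)
speech, _ = librosa.load("my_recording.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
generated_ids = model.generate(input_ids=inputs["input_features"], attention_mask=inputs["attention_mask"], forced_bos_token_id=forced_bos_token_id)
translation = processor.batch_decode(generated_ids, skip_special_tokens=True)
```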
## Results `en` -> `{lang}`
See the row of **XLS-R (1B)** for the performance on [Covost2](https://huggingface.co/datasets/covost2) for this model.

## More XLS-R models for `en` -> `{lang}` Speech Translation
- [Wav2Vec2-XLS-R-300M-EN-15](https://huggingface.co/facebook/wav2vec2-xls-r-300m-en-to-15)
- [Wav2Vec2-XLS-R-1B-EN-15](https://huggingface.co/facebook/wav2vec2-xls-r-1b-en-to-15)
- [Wav2Vec2-XLS-R-2B-EN-15](https://huggingface.co/facebook/wav2vec2-xls-r-2b-en-to-15)
- [Wav2Vec2-XLS-R-2B-22-16](https://huggingface.co/facebook/wav2vec2-xls-r-2b-22-to-16)
|
{"language": ["multilingual", "en", "de", "tr", "fa", "sv", "mn", "zh", "cy", "ca", "sl", "et", "id", "ar", "ta", "lv", "ja"], "license": "apache-2.0", "tags": ["speech", "xls_r", "automatic-speech-recognition", "xls_r_translation"], "datasets": ["common_voice", "multilingual_librispeech", "covost2"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "English", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_en_18301577.mp3"}]}
|
automatic-speech-recognition
|
facebook/wav2vec2-xls-r-1b-en-to-15
|
[
"transformers",
"pytorch",
"speech-encoder-decoder",
"automatic-speech-recognition",
"speech",
"xls_r",
"xls_r_translation",
"multilingual",
"en",
"de",
"tr",
"fa",
"sv",
"mn",
"zh",
"cy",
"ca",
"sl",
"et",
"id",
"ar",
"ta",
"lv",
"ja",
"dataset:common_voice",
"dataset:multilingual_librispeech",
"dataset:covost2",
"arxiv:2111.09296",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2111.09296"
] |
[
"multilingual",
"en",
"de",
"tr",
"fa",
"sv",
"mn",
"zh",
"cy",
"ca",
"sl",
"et",
"id",
"ar",
"ta",
"lv",
"ja"
] |
TAGS
#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #en #de #tr #fa #sv #mn #zh #cy #ca #sl #et #id #ar #ta #lv #ja #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-XLS-R-1B-EN-15
Facebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.
!model image
This is a SpeechEncoderDecoderModel model.
The encoder was warm-started from the 'facebook/wav2vec2-xls-r-1b' checkpoint and
the decoder from the 'facebook/mbart-large-50' checkpoint.
Consequently, the encoder-decoder model was fine-tuned on 15 'en' -> '{lang}' translation pairs of the Covost2 dataset.
The model can translate from spoken 'en' (English) to the following written languages '{lang}':
'en' -> {'de', 'tr', 'fa', 'sv-SE', 'mn', 'zh-CN', 'cy', 'ca', 'sl', 'et', 'id', 'ar', 'ta', 'lv', 'ja'}
For more information, please refer to Section *5.1.1* of the official XLS-R paper.
## Usage
### Demo
The model can be tested on this space.
You can select the target language, record some audio in English,
and then sit back and see how well the checkpoint can translate the input.
### Example
As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the
transcripts by passing the speech features to the model.
You can use the model directly via the ASR pipeline. By default, the checkpoint will
translate spoken English to written German. To change the written target language,
you need to pass the correct 'forced_bos_token_id' to 'generate(...)' to condition
the decoder on the correct target language.
To select the correct 'forced_bos_token_id' for your chosen language id, please make use
of the following mapping:
As an example, if you would like to translate to Swedish, you can do the following:
or step-by-step as follows:
## Results 'en' -> '{lang}'
See the row of XLS-R (1B) for the performance on Covost2 for this model.
!results image
## More XLS-R models for '{lang}' -> 'en' Speech Translation
- Wav2Vec2-XLS-R-300M-EN-15
- Wav2Vec2-XLS-R-1B-EN-15
- Wav2Vec2-XLS-R-2B-EN-15
- Wav2Vec2-XLS-R-2B-22-16
|
[
"# Wav2Vec2-XLS-R-1B-EN-15\n\nFacebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.\n\n!model image\n\nThis is a SpeechEncoderDecoderModel model. \nThe encoder was warm-started from the 'facebook/wav2vec2-xls-r-1b' checkpoint and\nthe decoder from the 'facebook/mbart-large-50' checkpoint.\nConsequently, the encoder-decoder model was fine-tuned on 15 'en' -> '{lang}' translation pairs of the Covost2 dataset.\n\nThe model can translate from spoken 'en' (Engish) to the following written languages '{lang}':\n\n'en' -> {'de', 'tr', 'fa', 'sv-SE', 'mn', 'zh-CN', 'cy', 'ca', 'sl', 'et', 'id', 'ar', 'ta', 'lv', 'ja'}\n\nFor more information, please refer to Section *5.1.1* of the official XLS-R paper.",
"## Usage",
"### Demo\n\nThe model can be tested on this space. \nYou can select the target language, record some audio in English, \nand then sit back and see how well the checkpoint can translate the input.",
"### Example \n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline. By default, the checkpoint will \ntranslate spoken English to written German. To change the written target language, \nyou need to pass the correct 'forced_bos_token_id' to 'generate(...)' to condition \nthe decoder on the correct target language. \n\nTo select the correct 'forced_bos_token_id' given your choosen language id, please make use \nof the following mapping:\n\n\n\nAs an example, if you would like to translate to Swedish, you can do the following:\n\n\n\nor step-by-step as follows:",
"## Results 'en' -> '{lang}'\n\nSee the row of XLS-R (1B) for the performance on Covost2 for this model.\n\n!results image",
"## More XLS-R models for '{lang}' -> 'en' Speech Translation\n\n- Wav2Vec2-XLS-R-300M-EN-15\n- Wav2Vec2-XLS-R-1B-EN-15\n- Wav2Vec2-XLS-R-2B-EN-15\n- Wav2Vec2-XLS-R-2B-22-16"
] |
[
"TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #en #de #tr #fa #sv #mn #zh #cy #ca #sl #et #id #ar #ta #lv #ja #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-XLS-R-1B-EN-15\n\nFacebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.\n\n!model image\n\nThis is a SpeechEncoderDecoderModel model. \nThe encoder was warm-started from the 'facebook/wav2vec2-xls-r-1b' checkpoint and\nthe decoder from the 'facebook/mbart-large-50' checkpoint.\nConsequently, the encoder-decoder model was fine-tuned on 15 'en' -> '{lang}' translation pairs of the Covost2 dataset.\n\nThe model can translate from spoken 'en' (Engish) to the following written languages '{lang}':\n\n'en' -> {'de', 'tr', 'fa', 'sv-SE', 'mn', 'zh-CN', 'cy', 'ca', 'sl', 'et', 'id', 'ar', 'ta', 'lv', 'ja'}\n\nFor more information, please refer to Section *5.1.1* of the official XLS-R paper.",
"## Usage",
"### Demo\n\nThe model can be tested on this space. \nYou can select the target language, record some audio in English, \nand then sit back and see how well the checkpoint can translate the input.",
"### Example \n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline. By default, the checkpoint will \ntranslate spoken English to written German. To change the written target language, \nyou need to pass the correct 'forced_bos_token_id' to 'generate(...)' to condition \nthe decoder on the correct target language. \n\nTo select the correct 'forced_bos_token_id' given your choosen language id, please make use \nof the following mapping:\n\n\n\nAs an example, if you would like to translate to Swedish, you can do the following:\n\n\n\nor step-by-step as follows:",
"## Results 'en' -> '{lang}'\n\nSee the row of XLS-R (1B) for the performance on Covost2 for this model.\n\n!results image",
"## More XLS-R models for '{lang}' -> 'en' Speech Translation\n\n- Wav2Vec2-XLS-R-300M-EN-15\n- Wav2Vec2-XLS-R-1B-EN-15\n- Wav2Vec2-XLS-R-2B-EN-15\n- Wav2Vec2-XLS-R-2B-22-16"
] |
[
140,
258,
3,
43,
176,
39,
82
] |
[
"passage: TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #en #de #tr #fa #sv #mn #zh #cy #ca #sl #et #id #ar #ta #lv #ja #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# Wav2Vec2-XLS-R-1B-EN-15\n\nFacebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.\n\n!model image\n\nThis is a SpeechEncoderDecoderModel model. \nThe encoder was warm-started from the 'facebook/wav2vec2-xls-r-1b' checkpoint and\nthe decoder from the 'facebook/mbart-large-50' checkpoint.\nConsequently, the encoder-decoder model was fine-tuned on 15 'en' -> '{lang}' translation pairs of the Covost2 dataset.\n\nThe model can translate from spoken 'en' (Engish) to the following written languages '{lang}':\n\n'en' -> {'de', 'tr', 'fa', 'sv-SE', 'mn', 'zh-CN', 'cy', 'ca', 'sl', 'et', 'id', 'ar', 'ta', 'lv', 'ja'}\n\nFor more information, please refer to Section *5.1.1* of the official XLS-R paper.## Usage### Demo\n\nThe model can be tested on this space. \nYou can select the target language, record some audio in English, \nand then sit back and see how well the checkpoint can translate the input."
] |
[
-0.09161859005689621,
0.05811259523034096,
-0.005818515084683895,
0.02528746984899044,
0.01280263252556324,
-0.0020013058092445135,
0.03851017728447914,
0.1079026311635971,
0.07043342292308807,
0.11378801614046097,
0.011558081023395061,
0.04421146586537361,
0.10059160739183426,
0.15227898955345154,
-0.03041934035718441,
-0.12467458099126816,
0.04134620353579521,
-0.0852094441652298,
-0.002918141195550561,
0.07380708307027817,
0.08248639851808548,
-0.09221892803907394,
0.03747662901878357,
-0.03272510692477226,
0.01494493056088686,
0.003700734581798315,
-0.02735447883605957,
-0.06277142465114594,
0.06940104067325592,
0.08664892613887787,
0.07501380890607834,
0.08845441788434982,
0.04240880534052849,
-0.25271448493003845,
0.027892425656318665,
0.07324641197919846,
-0.016699692234396935,
0.00522144790738821,
0.09328191727399826,
-0.021771060302853584,
0.055623140186071396,
-0.07249606400728226,
-0.012038004584610462,
0.0628872886300087,
-0.06638649851083755,
-0.27163878083229065,
-0.0939636081457138,
-0.01324880588799715,
0.09834939986467361,
0.05457684397697449,
-0.06243489310145378,
0.05587330088019371,
-0.01830684021115303,
0.08777225017547607,
0.09782596677541733,
-0.26448866724967957,
-0.0041845692321658134,
0.02691834233701229,
0.11090957373380661,
0.05252549424767494,
-0.04800856113433838,
0.03914599493145943,
0.02119675651192665,
0.00978900771588087,
-0.03575378283858299,
-0.0729222297668457,
-0.03139110654592514,
-0.011253638193011284,
-0.15005257725715637,
-0.041914284229278564,
0.07861702889204025,
0.031388796865940094,
-0.06011905148625374,
-0.10507958382368088,
-0.0395471528172493,
-0.020360810682177544,
-0.030250510200858116,
-0.08271167427301407,
-0.04420515522360802,
-0.00008440949022769928,
-0.0070740943774580956,
-0.10556615144014359,
-0.09905070811510086,
-0.030562972649931908,
-0.09077098220586777,
0.1315203458070755,
0.031189637258648872,
0.008437363430857658,
-0.017725491896271706,
0.04469739645719528,
0.03862343356013298,
-0.09234913438558578,
-0.08329449594020844,
-0.049250997602939606,
-0.17756429314613342,
-0.000026246199922752567,
-0.06752341985702515,
-0.1038871482014656,
0.14014334976673126,
0.1452482044696808,
-0.10437770932912827,
0.08127932995557785,
-0.07433799654245377,
0.02778579853475094,
0.03647345304489136,
0.17062963545322418,
-0.04023591801524162,
-0.05403643101453781,
-0.03493819758296013,
-0.01869022473692894,
-0.04751437529921532,
-0.042742013931274414,
-0.06201041117310524,
-0.007839540019631386,
0.014573791064321995,
0.08797801285982132,
0.06864605098962784,
-0.005230969749391079,
-0.06683056801557541,
-0.0077841742895543575,
0.11010117083787918,
-0.1530088633298874,
0.0668148398399353,
0.1035509780049324,
-0.012394200079143047,
0.036371052265167236,
-0.006463330239057541,
-0.004836292006075382,
-0.08201014250516891,
-0.05330132693052292,
0.02584553137421608,
0.014810417778789997,
-0.024555550888180733,
-0.12741006910800934,
0.037850815802812576,
-0.0467623770236969,
-0.07271194458007812,
-0.14679887890815735,
-0.027461150661110878,
-0.09109067916870117,
0.007344856858253479,
-0.04464856907725334,
0.12073694914579391,
-0.08513004332780838,
-0.023681746795773506,
0.003387745004147291,
-0.012956389226019382,
-0.040875393897295,
-0.039982445538043976,
0.031103895977139473,
-0.021604375913739204,
0.11872535198926926,
0.02156260423362255,
0.0038149277679622173,
-0.05795406550168991,
0.027572689577937126,
-0.1085459515452385,
0.1901845633983612,
-0.12201343476772308,
-0.030724475160241127,
-0.13061438500881195,
-0.027132099494338036,
-0.007406418677419424,
0.03995101898908615,
0.049024857580661774,
0.10380596667528152,
-0.25433245301246643,
-0.05503278970718384,
0.2706061005592346,
-0.08564601838588715,
-0.08732039481401443,
0.16669869422912598,
0.023324808105826378,
-0.04101104289293289,
0.06011546030640602,
0.14075854420661926,
0.15664860606193542,
-0.20267632603645325,
-0.024536486715078354,
0.02930363081395626,
0.001841402961872518,
0.13787807524204254,
0.07581299543380737,
-0.11628716439008713,
0.025125473737716675,
-0.004760188516229391,
-0.028884168714284897,
0.005533802788704634,
-0.013409881852567196,
-0.05337456241250038,
0.022204840555787086,
-0.00905644241720438,
0.0972009152173996,
-0.028314102441072464,
-0.058027856051921844,
-0.05162292718887329,
-0.0651567205786705,
-0.06703392416238785,
0.07055127620697021,
-0.0631081610918045,
0.05839710310101509,
-0.13101086020469666,
0.0674305111169815,
-0.0023271553218364716,
0.03959106281399727,
-0.17785651981830597,
-0.03430524840950966,
0.01803353801369667,
-0.06577293574810028,
0.06313513964414597,
0.07934202998876572,
0.0506935641169548,
0.013056213036179543,
0.046426039189100266,
-0.016362076625227928,
0.030154040083289146,
0.0011032962938770652,
-0.006373050156980753,
-0.09538836032152176,
-0.0174826979637146,
-0.08203812688589096,
0.0876801609992981,
-0.08679139614105225,
-0.021609395742416382,
0.1265655905008316,
0.11579174548387527,
0.019358664751052856,
0.012986207380890846,
-0.01697229966521263,
0.07538028806447983,
-0.002533417195081711,
-0.0071474239230155945,
-0.00023984190193004906,
-0.03489522263407707,
-0.07036596536636353,
0.09904595464468002,
-0.08174780756235123,
-0.1059001162648201,
0.0711490660905838,
-0.027351418510079384,
-0.059609271585941315,
0.014850513078272343,
-0.01923997327685356,
0.013413449749350548,
-0.060133013874292374,
-0.0453207828104496,
0.19202497601509094,
0.08748044073581696,
0.09496975690126419,
-0.0657091960310936,
-0.03776029869914055,
0.014651362784206867,
-0.06637579947710037,
-0.04369894042611122,
0.11587059497833252,
-0.03290819004178047,
-0.11551185697317123,
0.025762272998690605,
0.09353357553482056,
-0.006818267982453108,
0.14941446483135223,
-0.0017033900367096066,
-0.08961609750986099,
-0.053158968687057495,
0.06592642515897751,
-0.0030499885324388742,
0.01888749562203884,
0.019852977246046066,
0.00018594853463582695,
0.029792936518788338,
0.03378785401582718,
0.03464765474200249,
-0.06761892139911652,
0.06309930980205536,
0.031850021332502365,
-0.1034407764673233,
0.055054452270269394,
0.048017024993896484,
0.029034337028861046,
0.0281243734061718,
-0.0016692959470674396,
-0.0608581118285656,
-0.052855394780635834,
-0.05440391227602959,
-0.10732783377170563,
0.1667858362197876,
-0.14854441583156586,
-0.35812801122665405,
-0.12138799577951431,
-0.021853003650903702,
-0.01463191956281662,
0.0019323956221342087,
0.08153658360242844,
-0.10845606029033661,
-0.040151067078113556,
-0.06998986005783081,
-0.013802115805447102,
0.009798690676689148,
-0.0302580613642931,
0.03571029379963875,
-0.0005868133157491684,
0.014017095789313316,
-0.08469820767641068,
0.008817765861749649,
0.036924097687006,
-0.014561920426785946,
0.03140182048082352,
0.03693089261651039,
0.08624709397554398,
0.11764895170927048,
-0.004081609193235636,
0.03911030292510986,
-0.028784003108739853,
0.20948587357997894,
-0.11464180052280426,
0.06466048955917358,
0.08873224258422852,
-0.03363426402211189,
0.04873565584421158,
0.12831158936023712,
-0.008808470331132412,
-0.051362328231334686,
0.00831963587552309,
-0.0026325006037950516,
0.003121624467894435,
-0.24019749462604523,
-0.08640822023153305,
-0.042586155235767365,
0.04120863229036331,
0.026111915707588196,
0.029740391299128532,
0.015330655500292778,
-0.03619358688592911,
-0.02983815409243107,
-0.07947253435850143,
0.06859106570482254,
0.021058039739727974,
0.16424904763698578,
-0.026799524202942848,
0.05581464245915413,
-0.04350300505757332,
-0.0379394106566906,
0.100868821144104,
0.051105279475450516,
0.055703602731227875,
0.0691707655787468,
0.16393177211284637,
0.09056905657052994,
0.05746116861701012,
-0.0009339063544757664,
-0.01113439816981554,
-0.017623748630285263,
0.03263290971517563,
0.03798433765769005,
-0.06971261650323868,
0.03230205550789833,
0.03880327567458153,
0.15763415396213531,
-0.1270798295736313,
-0.012313018552958965,
-0.01009785383939743,
0.10316908359527588,
0.09637657552957535,
0.1434922218322754,
-0.12625624239444733,
-0.0007233548094518483,
0.0015529857482761145,
-0.013254188932478428,
-0.030064037069678307,
-0.0007721036672592163,
0.1450926512479782,
-0.09172924607992172,
0.06884285062551498,
0.007691640406847,
0.07894913852214813,
-0.08368545025587082,
-0.008399342186748981,
-0.019547080621123314,
0.06233536824584007,
0.022774713113904,
0.06379610300064087,
-0.25785887241363525,
0.118535615503788,
0.026671912521123886,
0.04059028625488281,
-0.01274122204631567,
0.021372856572270393,
-0.008078533224761486,
-0.017359141260385513,
0.14367671310901642,
-0.0018343800911679864,
-0.11808652430772781,
-0.12088777124881744,
-0.08113165199756622,
0.04366179183125496,
0.13689623773097992,
-0.008627292700111866,
0.07456601411104202,
0.017207371070981026,
-0.06740373373031616,
-0.08737925440073013,
-0.034595828503370285,
-0.13764283061027527,
-0.10916876792907715,
0.04486554116010666,
0.047616295516490936,
0.062210191041231155,
-0.012279309332370758,
-0.005488788243383169,
-0.18350563943386078,
0.09715411812067032,
-0.18045340478420258,
-0.041521113365888596,
-0.1134854331612587,
-0.017649412155151367,
0.141078382730484,
-0.10650420933961868,
0.03508006036281586,
0.013505258597433567,
0.10320595651865005,
-0.05483080819249153,
-0.07550191879272461,
0.047286152839660645,
-0.06543184071779251,
-0.11222604662179947,
-0.028613895177841187,
0.11071804910898209,
0.08534163981676102,
0.03217543289065361,
0.09063124656677246,
0.04401717707514763,
-0.03332001715898514,
-0.09802889823913574,
0.02530210092663765,
0.037545450031757355,
-0.05073830112814903,
0.05860883742570877,
0.011676190420985222,
-0.15871058404445648,
-0.0977596566081047,
0.0052118911407887936,
0.16427397727966309,
0.18390533328056335,
-0.10342887789011002,
0.13790443539619446,
0.19961698353290558,
-0.07551021873950958,
-0.20872703194618225,
-0.11847584694623947,
0.04852967709302902,
0.07170022279024124,
-0.04267353564500809,
-0.1633892059326172,
0.0004295271646697074,
0.0366143062710762,
-0.004328323528170586,
0.05702425166964531,
-0.25955983996391296,
-0.09877282381057739,
0.12283316999673843,
-0.019487973302602768,
-0.05864276736974716,
-0.07720112800598145,
-0.09390099346637726,
-0.06492402404546738,
-0.10465280711650848,
0.03449014574289322,
-0.16208945214748383,
0.04848692566156387,
0.06128981336951256,
-0.03494852036237717,
0.010674506425857544,
-0.0023967958986759186,
0.12836310267448425,
0.013194717466831207,
-0.014837123453617096,
-0.026820942759513855,
0.09655942022800446,
-0.04484986513853073,
-0.0196838341653347,
0.09272968024015427,
-0.05252201110124588,
0.05288822576403618,
-0.05158989876508713,
-0.036034028977155685,
-0.04932345822453499,
0.07718855142593384,
-0.016247017309069633,
-0.00297719007357955,
-0.05888604372739792,
0.007508269045501947,
0.0401143953204155,
0.004469557665288448,
0.003963977564126253,
-0.09604769200086594,
0.03231753036379814,
0.2015226185321808,
0.10042449086904526,
0.052063703536987305,
-0.04240623116493225,
0.0028760817367583513,
-0.04184591770172119,
0.04828634858131409,
-0.03542967513203621,
0.06479635089635849,
0.12309520691633224,
-0.012324778363108635,
0.1404699832201004,
-0.014078200794756413,
-0.13144120573997498,
0.046955082565546036,
0.06751666218042374,
-0.08247929066419601,
-0.16242749989032745,
0.0056766667403280735,
-0.08773965388536453,
0.011372668668627739,
0.010338084772229195,
0.1843525618314743,
0.03845805674791336,
-0.028821047395467758,
-0.02435670606791973,
0.035963740199804306,
-0.04670484736561775,
0.16077199578285217,
0.05879970267415047,
0.07356029003858566,
-0.07484254986047745,
0.047447800636291504,
0.08041666448116302,
-0.10220979154109955,
0.04661071300506592,
0.12714451551437378,
-0.10468681156635284,
-0.10361284017562866,
-0.042795948684215546,
0.10193298757076263,
-0.009834789671003819,
-0.07421793043613434,
-0.020814191550016403,
-0.08498712629079819,
0.032664667814970016,
0.21424821019172668,
0.023741500452160835,
0.055371660739183426,
0.041808657348155975,
-0.05441590026021004,
-0.005194663070142269,
0.11944323033094406,
0.023995449766516685,
-0.013184009119868279,
-0.04361473768949509,
0.10083520412445068,
-0.002496524481102824,
0.10174231976270676,
-0.028386155143380165,
-0.048551976680755615,
-0.09979253262281418,
0.030838916078209877,
-0.17989572882652283,
0.037370242178440094,
-0.06591521948575974,
-0.012632093392312527,
0.03233163058757782,
-0.028073592111468315,
0.035963211208581924,
0.016356993466615677,
-0.08285318315029144,
-0.027406109496951103,
-0.053179770708084106,
0.05822017788887024,
-0.1777264028787613,
-0.006241906899958849,
0.02931644581258297,
-0.050569016486406326,
0.07672277837991714,
0.01787734217941761,
-0.04703554883599281,
0.0497644767165184,
-0.15780231356620789,
-0.03790498152375221,
0.006830983329564333,
0.06639164686203003,
0.0520286038517952,
-0.09184911102056503,
0.024048449471592903,
0.03436996787786484,
0.007828515022993088,
0.01405924279242754,
0.07145801931619644,
-0.063740573823452,
0.04198658466339111,
-0.10001541674137115,
-0.011867337860167027,
-0.02091822400689125,
0.04547300562262535,
0.0419410765171051,
0.08398082107305527,
0.09501510858535767,
-0.10741256177425385,
0.12590499222278595,
-0.0915345624089241,
-0.03374375030398369,
0.0001224276056746021,
0.012075244449079037,
0.030417514964938164,
-0.08275915682315826,
0.10132499784231186,
-0.03480980917811394,
0.10285914689302444,
-0.006828564219176769,
0.1161479577422142,
-0.016162754967808723,
-0.1690666526556015,
-0.11492182314395905,
0.0638880506157875,
0.09080713242292404,
0.0621609166264534,
-0.009891999885439873,
-0.014791928231716156,
-0.03125539422035217,
0.0008171472582034767,
0.05503231659531593,
0.050042349845170975,
0.1405264437198639,
0.09289942681789398,
0.06998606026172638,
0.06970515102148056,
-0.08540085703134537,
0.013308828696608543,
-0.010979322716593742,
-0.13379015028476715,
0.04780541732907295,
-0.062115248292684555,
0.10679948329925537,
0.05697790905833244,
-0.1476023644208908,
0.05571715161204338,
0.029011936858296394,
-0.07696837931871414,
-0.1194305345416069,
-0.07207614928483963,
-0.09782498329877853,
-0.08334653079509735,
0.01321612112224102,
-0.08543034642934799,
0.0382537916302681,
-0.03184998035430908,
0.07318302243947983,
-0.014020385220646858,
0.10209392011165619,
0.000008931649972510058,
-0.09868253767490387,
0.08115001767873764,
-0.013042578473687172,
0.06975666433572769,
0.0823410376906395,
0.009266501292586327,
0.07958611100912094,
0.03061218373477459,
0.060062166303396225,
0.05890067294239998,
-0.02065630815923214,
0.056376863270998,
-0.06832323223352432,
-0.07877635955810547,
0.014786720275878906,
0.016196567565202713,
0.02670922316610813,
0.14004531502723694,
0.12534388899803162,
-0.03834565356373787,
0.0025355867110192776,
0.0873466357588768,
-0.047604408115148544,
-0.1377786099910736,
-0.1627591997385025,
0.09591805934906006,
0.012042769230902195,
0.10991387814283371,
-0.019097039476037025,
-0.10513738542795181,
-0.047385502606630325,
0.20506148040294647,
0.10560544580221176,
-0.02410927601158619,
0.041760414838790894,
0.0618414469063282,
0.026589835062623024,
0.027282152324914932,
0.02616456151008606,
0.06389082223176956,
0.2597506642341614,
-0.015172888524830341,
-0.04349111393094063,
-0.030070142820477486,
-0.07549215853214264,
-0.06912989169359207,
-0.018274357542395592,
-0.0901445746421814,
-0.03783116117119789,
-0.013517710380256176,
0.14190234243869781,
-0.08547400683164597,
-0.14112411439418793,
0.0669541135430336,
-0.04137013480067253,
-0.07451530545949936,
0.007227928843349218,
0.09681788831949234,
0.05915439501404762,
0.011880660429596901,
0.012004471383988857,
-0.04656330868601799,
0.21465103328227997,
-0.026477303355932236,
-0.008153623901307583,
-0.012550902552902699,
-0.015435486100614071,
-0.1543595939874649,
0.08812721818685532,
-0.013010719791054726,
0.16734613478183746,
0.09437821805477142,
0.057464633136987686,
-0.04743439704179764,
0.100941501557827,
0.06588654220104218,
-0.061931733042001724,
0.07458656281232834,
0.08953000605106354,
-0.0002701921039260924,
0.07490580528974533,
0.11192410439252853,
-0.07314746826887131,
0.07195992022752762,
0.03810456022620201,
0.010336174629628658,
-0.08064214140176773,
0.04372384026646614,
-0.08597464114427567,
0.10128580033779144,
0.13158032298088074,
-0.019201675429940224,
0.032298505306243896,
-0.012747266329824924,
0.033856187015771866,
-0.03401193395256996,
0.04807879775762558,
-0.06268299371004105,
-0.13880643248558044,
0.014221245422959328,
-0.011335722170770168,
0.06009984761476517,
-0.17780525982379913,
-0.003336952766403556,
0.01933368667960167,
0.009054303169250488,
0.025368232280015945,
0.11304157227277756,
0.030886340886354446,
0.01552070677280426,
-0.02790527231991291,
-0.08167854696512222,
0.04307818412780762,
0.10222376883029938,
-0.12330518662929535,
-0.04541124030947685
] |
null | null |
transformers
|
# Wav2Vec2-XLS-R-1B
[Facebook's Wav2Vec2 XLS-R](https://ai.facebook.com/blog/xls-r-self-supervised-speech-processing-for-128-languages) counting **1 billion** parameters.

XLS-R is Facebook AI's large-scale multilingual pretrained model for speech (the "XLM-R for Speech"). It is pretrained on 436k hours of unlabeled speech, including VoxPopuli, MLS, CommonVoice, BABEL, and VoxLingua107. It uses the wav2vec 2.0 objective, in 128 languages. When using the model make sure that your speech input is sampled at 16kHz.
**Note**: This model should be fine-tuned on a downstream task, like Automatic Speech Recognition, Translation, or Classification. Check out [**this blog**](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for more information about ASR.
[XLS-R Paper](https://arxiv.org/abs/2111.09296)
**Abstract**
This paper presents XLS-R, a large-scale model for cross-lingual speech representation learning based on wav2vec 2.0. We train models with up to 2B parameters on 436K hours of publicly available speech audio in 128 languages, an order of magnitude more public data than the largest known prior work. Our evaluation covers a wide range of tasks, domains, data regimes and languages, both high and low-resource. On the CoVoST-2 speech translation benchmark, we improve the previous state of the art by an average of 7.4 BLEU over 21 translation directions into English. For speech recognition, XLS-R improves over the best known prior work on BABEL, MLS, CommonVoice as well as VoxPopuli, lowering error rates by 20%-33% relative on average. XLS-R also sets a new state of the art on VoxLingua107 language identification. Moreover, we show that with sufficient model size, cross-lingual pretraining can outperform English-only pretraining when translating English speech into other languages, a setting which favors monolingual pretraining. We hope XLS-R can help to improve speech processing tasks for many more languages of the world.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
See [this google colab](https://colab.research.google.com/github/patrickvonplaten/notebooks/blob/master/Fine_Tune_XLS_R_on_Common_Voice.ipynb) for more information on how to fine-tune the model.
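As a quick sanity check before fine-tuning, the pretrained checkpoint can be loaded and run on a 16 kHz waveform to extract hidden states. This is only a sketch, not a fine-tuning recipe; it assumes the checkpoint ships a feature-extractor config and uses a small dummy dataset for the audio:

```python
import torch
from datasets import load_dataset
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

# load the pretrained (not fine-tuned) XLS-R 1B checkpoint
feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained("facebook/wav2vec2-xls-r-1b")
model = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-xls-r-1b")

# any 16 kHz speech sample works; here we take one from a small dummy dataset
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
inputs = feature_extractor(ds[0]["audio"]["array"], sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    hidden_states = model(**inputs).last_hidden_state  # shape: (batch, frames, hidden_size)
```

For an actual downstream task such as ASR, you would instead load the checkpoint into a task-specific head (e.g. `Wav2Vec2ForCTC`) and fine-tune it on labeled data, as described in the blog post and notebook linked above.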
You can find other pretrained XLS-R models with different numbers of parameters:
* [300M parameters version](https://huggingface.co/facebook/wav2vec2-xls-r-300m)
* [1B parameters version](https://huggingface.co/facebook/wav2vec2-xls-r-1b)
* [2B parameters version](https://huggingface.co/facebook/wav2vec2-xls-r-2b)
|
{"language": ["multilingual", "ab", "af", "sq", "am", "ar", "hy", "as", "az", "ba", "eu", "be", "bn", "bs", "br", "bg", "my", "yue", "ca", "ceb", "km", "zh", "cv", "hr", "cs", "da", "dv", "nl", "en", "eo", "et", "fo", "fi", "fr", "gl", "lg", "ka", "de", "el", "gn", "gu", "ht", "cnh", "ha", "haw", "he", "hi", "hu", "is", "id", "ia", "ga", "it", "ja", "jv", "kb", "kn", "kk", "rw", "ky", "ko", "ku", "lo", "la", "lv", "ln", "lt", "lm", "mk", "mg", "ms", "ml", "mt", "gv", "mi", "mr", "mn", "ne", false, "nn", "oc", "or", "ps", "fa", "pl", "pt", "pa", "ro", "rm", "rm", "ru", "sah", "sa", "sco", "sr", "sn", "sd", "si", "sk", "sl", "so", "hsb", "es", "su", "sw", "sv", "tl", "tg", "ta", "tt", "te", "th", "bo", "tp", "tr", "tk", "uk", "ur", "uz", "vi", "vot", "war", "cy", "yi", "yo", "zu"], "license": "apache-2.0", "tags": ["speech", "xls_r", "xls_r_pretrained"], "datasets": ["common_voice", "multilingual_librispeech"], "language_bcp47": ["zh-HK", "zh-TW", "fy-NL"]}
| null |
facebook/wav2vec2-xls-r-1b
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"speech",
"xls_r",
"xls_r_pretrained",
"multilingual",
"ab",
"af",
"sq",
"am",
"ar",
"hy",
"as",
"az",
"ba",
"eu",
"be",
"bn",
"bs",
"br",
"bg",
"my",
"yue",
"ca",
"ceb",
"km",
"zh",
"cv",
"hr",
"cs",
"da",
"dv",
"nl",
"en",
"eo",
"et",
"fo",
"fi",
"fr",
"gl",
"lg",
"ka",
"de",
"el",
"gn",
"gu",
"ht",
"cnh",
"ha",
"haw",
"he",
"hi",
"hu",
"is",
"id",
"ia",
"ga",
"it",
"ja",
"jv",
"kb",
"kn",
"kk",
"rw",
"ky",
"ko",
"ku",
"lo",
"la",
"lv",
"ln",
"lt",
"lm",
"mk",
"mg",
"ms",
"ml",
"mt",
"gv",
"mi",
"mr",
"mn",
"ne",
"no",
"nn",
"oc",
"or",
"ps",
"fa",
"pl",
"pt",
"pa",
"ro",
"rm",
"ru",
"sah",
"sa",
"sco",
"sr",
"sn",
"sd",
"si",
"sk",
"sl",
"so",
"hsb",
"es",
"su",
"sw",
"sv",
"tl",
"tg",
"ta",
"tt",
"te",
"th",
"bo",
"tp",
"tr",
"tk",
"uk",
"ur",
"uz",
"vi",
"vot",
"war",
"cy",
"yi",
"yo",
"zu",
"dataset:common_voice",
"dataset:multilingual_librispeech",
"arxiv:2111.09296",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2111.09296"
] |
[
"multilingual",
"ab",
"af",
"sq",
"am",
"ar",
"hy",
"as",
"az",
"ba",
"eu",
"be",
"bn",
"bs",
"br",
"bg",
"my",
"yue",
"ca",
"ceb",
"km",
"zh",
"cv",
"hr",
"cs",
"da",
"dv",
"nl",
"en",
"eo",
"et",
"fo",
"fi",
"fr",
"gl",
"lg",
"ka",
"de",
"el",
"gn",
"gu",
"ht",
"cnh",
"ha",
"haw",
"he",
"hi",
"hu",
"is",
"id",
"ia",
"ga",
"it",
"ja",
"jv",
"kb",
"kn",
"kk",
"rw",
"ky",
"ko",
"ku",
"lo",
"la",
"lv",
"ln",
"lt",
"lm",
"mk",
"mg",
"ms",
"ml",
"mt",
"gv",
"mi",
"mr",
"mn",
"ne",
"no",
"nn",
"oc",
"or",
"ps",
"fa",
"pl",
"pt",
"pa",
"ro",
"rm",
"rm",
"ru",
"sah",
"sa",
"sco",
"sr",
"sn",
"sd",
"si",
"sk",
"sl",
"so",
"hsb",
"es",
"su",
"sw",
"sv",
"tl",
"tg",
"ta",
"tt",
"te",
"th",
"bo",
"tp",
"tr",
"tk",
"uk",
"ur",
"uz",
"vi",
"vot",
"war",
"cy",
"yi",
"yo",
"zu"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #speech #xls_r #xls_r_pretrained #multilingual #ab #af #sq #am #ar #hy #as #az #ba #eu #be #bn #bs #br #bg #my #yue #ca #ceb #km #zh #cv #hr #cs #da #dv #nl #en #eo #et #fo #fi #fr #gl #lg #ka #de #el #gn #gu #ht #cnh #ha #haw #he #hi #hu #is #id #ia #ga #it #ja #jv #kb #kn #kk #rw #ky #ko #ku #lo #la #lv #ln #lt #lm #mk #mg #ms #ml #mt #gv #mi #mr #mn #ne #no #nn #oc #or #ps #fa #pl #pt #pa #ro #rm #ru #sah #sa #sco #sr #sn #sd #si #sk #sl #so #hsb #es #su #sw #sv #tl #tg #ta #tt #te #th #bo #tp #tr #tk #uk #ur #uz #vi #vot #war #cy #yi #yo #zu #dataset-common_voice #dataset-multilingual_librispeech #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-XLS-R-1B
Facebook's Wav2Vec2 XLS-R counting 1 billion parameters.
!model image
XLS-R is Facebook AI's large-scale multilingual pretrained model for speech (the "XLM-R for Speech"). It is pretrained on 436k hours of unlabeled speech, including VoxPopuli, MLS, CommonVoice, BABEL, and VoxLingua107. It uses the wav2vec 2.0 objective, in 128 languages. When using the model make sure that your speech input is sampled at 16kHz.
Note: This model should be fine-tuned on a downstream task, like Automatic Speech Recognition, Translation, or Classification. Check out this blog for more information about ASR.
XLS-R Paper
Abstract
This paper presents XLS-R, a large-scale model for cross-lingual speech representation learning based on wav2vec 2.0. We train models with up to 2B parameters on 436K hours of publicly available speech audio in 128 languages, an order of magnitude more public data than the largest known prior work. Our evaluation covers a wide range of tasks, domains, data regimes and languages, both high and low-resource. On the CoVoST-2 speech translation benchmark, we improve the previous state of the art by an average of 7.4 BLEU over 21 translation directions into English. For speech recognition, XLS-R improves over the best known prior work on BABEL, MLS, CommonVoice as well as VoxPopuli, lowering error rates by 20%-33% relative on average. XLS-R also sets a new state of the art on VoxLingua107 language identification. Moreover, we show that with sufficient model size, cross-lingual pretraining can outperform English-only pretraining when translating English speech into other languages, a setting which favors monolingual pretraining. We hope XLS-R can help to improve speech processing tasks for many more languages of the world.
The original model can be found under URL
# Usage
See this google colab for more information on how to fine-tune the model.
You can find other pretrained XLS-R models with different numbers of parameters:
* 300M parameters version
* 1B parameters version
* 2B parameters version
|
[
"# Wav2Vec2-XLS-R-1B\n\nFacebook's Wav2Vec2 XLS-R counting 1 billion parameters.\n\n!model image\n\nXLS-R is Facebook AI's large-scale multilingual pretrained model for speech (the \"XLM-R for Speech\"). It is pretrained on 436k hours of unlabeled speech, including VoxPopuli, MLS, CommonVoice, BABEL, and VoxLingua107. It uses the wav2vec 2.0 objective, in 128 languages. When using the model make sure that your speech input is sampled at 16kHz. \n\nNote: This model should be fine-tuned on a downstream task, like Automatic Speech Recognition, Translation, or Classification. Check out this blog for more information about ASR.\n\nXLS-R Paper\n\nAbstract\nThis paper presents XLS-R, a large-scale model for cross-lingual speech representation learning based on wav2vec 2.0. We train models with up to 2B parameters on 436K hours of publicly available speech audio in 128 languages, an order of magnitude more public data than the largest known prior work. Our evaluation covers a wide range of tasks, domains, data regimes and languages, both high and low-resource. On the CoVoST-2 speech translation benchmark, we improve the previous state of the art by an average of 7.4 BLEU over 21 translation directions into English. For speech recognition, XLS-R improves over the best known prior work on BABEL, MLS, CommonVoice as well as VoxPopuli, lowering error rates by 20%-33% relative on average. XLS-R also sets a new state of the art on VoxLingua107 language identification. Moreover, we show that with sufficient model size, cross-lingual pretraining can outperform English-only pretraining when translating English speech into other languages, a setting which favors monolingual pretraining. We hope XLS-R can help to improve speech processing tasks for many more languages of the world.\n\nThe original model can be found under URL",
"# Usage\n\nSee this google colab for more information on how to fine-tune the model.\n\nYou can find other pretrained XLS-R models with different numbers of parameters:\n\n* 300M parameters version\n* 1B version version\n* 2B version version"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #speech #xls_r #xls_r_pretrained #multilingual #ab #af #sq #am #ar #hy #as #az #ba #eu #be #bn #bs #br #bg #my #yue #ca #ceb #km #zh #cv #hr #cs #da #dv #nl #en #eo #et #fo #fi #fr #gl #lg #ka #de #el #gn #gu #ht #cnh #ha #haw #he #hi #hu #is #id #ia #ga #it #ja #jv #kb #kn #kk #rw #ky #ko #ku #lo #la #lv #ln #lt #lm #mk #mg #ms #ml #mt #gv #mi #mr #mn #ne #no #nn #oc #or #ps #fa #pl #pt #pa #ro #rm #ru #sah #sa #sco #sr #sn #sd #si #sk #sl #so #hsb #es #su #sw #sv #tl #tg #ta #tt #te #th #bo #tp #tr #tk #uk #ur #uz #vi #vot #war #cy #yi #yo #zu #dataset-common_voice #dataset-multilingual_librispeech #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-XLS-R-1B\n\nFacebook's Wav2Vec2 XLS-R counting 1 billion parameters.\n\n!model image\n\nXLS-R is Facebook AI's large-scale multilingual pretrained model for speech (the \"XLM-R for Speech\"). It is pretrained on 436k hours of unlabeled speech, including VoxPopuli, MLS, CommonVoice, BABEL, and VoxLingua107. It uses the wav2vec 2.0 objective, in 128 languages. When using the model make sure that your speech input is sampled at 16kHz. \n\nNote: This model should be fine-tuned on a downstream task, like Automatic Speech Recognition, Translation, or Classification. Check out this blog for more information about ASR.\n\nXLS-R Paper\n\nAbstract\nThis paper presents XLS-R, a large-scale model for cross-lingual speech representation learning based on wav2vec 2.0. We train models with up to 2B parameters on 436K hours of publicly available speech audio in 128 languages, an order of magnitude more public data than the largest known prior work. Our evaluation covers a wide range of tasks, domains, data regimes and languages, both high and low-resource. On the CoVoST-2 speech translation benchmark, we improve the previous state of the art by an average of 7.4 BLEU over 21 translation directions into English. For speech recognition, XLS-R improves over the best known prior work on BABEL, MLS, CommonVoice as well as VoxPopuli, lowering error rates by 20%-33% relative on average. XLS-R also sets a new state of the art on VoxLingua107 language identification. Moreover, we show that with sufficient model size, cross-lingual pretraining can outperform English-only pretraining when translating English speech into other languages, a setting which favors monolingual pretraining. We hope XLS-R can help to improve speech processing tasks for many more languages of the world.\n\nThe original model can be found under URL",
"# Usage\n\nSee this google colab for more information on how to fine-tune the model.\n\nYou can find other pretrained XLS-R models with different numbers of parameters:\n\n* 300M parameters version\n* 1B version version\n* 2B version version"
] |
[
364,
477,
55
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #speech #xls_r #xls_r_pretrained #multilingual #ab #af #sq #am #ar #hy #as #az #ba #eu #be #bn #bs #br #bg #my #yue #ca #ceb #km #zh #cv #hr #cs #da #dv #nl #en #eo #et #fo #fi #fr #gl #lg #ka #de #el #gn #gu #ht #cnh #ha #haw #he #hi #hu #is #id #ia #ga #it #ja #jv #kb #kn #kk #rw #ky #ko #ku #lo #la #lv #ln #lt #lm #mk #mg #ms #ml #mt #gv #mi #mr #mn #ne #no #nn #oc #or #ps #fa #pl #pt #pa #ro #rm #ru #sah #sa #sco #sr #sn #sd #si #sk #sl #so #hsb #es #su #sw #sv #tl #tg #ta #tt #te #th #bo #tp #tr #tk #uk #ur #uz #vi #vot #war #cy #yi #yo #zu #dataset-common_voice #dataset-multilingual_librispeech #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n"
] |
[
-0.04758894816040993,
0.046750012785196304,
-0.014598522335290909,
0.007297620642930269,
0.04550313577055931,
0.06332182139158249,
0.05419944226741791,
0.09116444736719131,
0.0727275013923645,
0.120418980717659,
0.11597013473510742,
0.08993609249591827,
0.10850890725851059,
0.030012592673301697,
0.02611636370420456,
-0.23409810662269592,
-0.014968891628086567,
-0.01329061295837164,
-0.050599705427885056,
0.09369145333766937,
0.0228695347905159,
-0.039926063269376755,
0.09142758697271347,
-0.08705759048461914,
0.05855416879057884,
0.025268973782658577,
-0.04975442960858345,
0.006925574969500303,
0.026553384959697723,
0.04827604070305824,
0.005549455527216196,
0.07044541090726852,
0.04715149477124214,
-0.24975258111953735,
0.032285042107105255,
-0.01567848026752472,
-0.03946636989712715,
-0.007352286018431187,
0.019108077511191368,
-0.12647803127765656,
0.16093921661376953,
-0.058167893439531326,
-0.0829150602221489,
0.048358362168073654,
-0.17256440222263336,
-0.1780303716659546,
-0.057935889810323715,
0.11529215425252914,
0.05751940608024597,
0.04282727837562561,
-0.054432131350040436,
0.07651355117559433,
-0.10932482779026031,
0.05207585170865059,
0.20425982773303986,
-0.18100734055042267,
-0.03396864980459213,
0.052614592015743256,
0.03836875036358833,
0.06726958602666855,
-0.10757510364055634,
0.004715980961918831,
0.014004664495587349,
0.018325919285416603,
-0.06147626042366028,
-0.04280159994959831,
0.09236277639865875,
0.05261838808655739,
-0.0792241171002388,
-0.0023866845294833183,
0.12998612225055695,
0.04852217063307762,
0.05794971063733101,
0.07729648798704147,
-0.014261530712246895,
-0.17769774794578552,
-0.036947861313819885,
-0.0072819264605641365,
0.02907484956085682,
0.03221805766224861,
-0.0040543596260249615,
0.10043403506278992,
-0.04863365739583969,
0.027797162532806396,
-0.0013738577254116535,
0.03689282014966011,
0.05448931083083153,
-0.01068132370710373,
0.002564662601798773,
-0.008259820751845837,
0.02728724479675293,
-0.1339804083108902,
0.0193843524903059,
0.03314534202218056,
-0.011473586782813072,
-0.0013692154316231608,
0.08231918513774872,
0.08224035054445267,
0.106025330722332,
0.11278236657381058,
-0.0741412341594696,
0.09792093932628632,
0.053250301629304886,
0.05515800416469574,
0.03846575319766998,
0.030388658866286278,
-0.05339635908603668,
-0.07941558212041855,
-0.11798667907714844,
0.011203414760529995,
-0.039740465581417084,
0.002606295980513096,
-0.0391656793653965,
0.03476659208536148,
-0.0027384550776332617,
0.04721203073859215,
-0.002601043554022908,
0.04684453085064888,
-0.0627133846282959,
0.024724356830120087,
0.010196039453148842,
-0.04707365855574608,
0.03391134366393089,
0.10114587098360062,
0.019277868792414665,
0.10931576043367386,
-0.03982828930020332,
0.015561006963253021,
-0.010288936085999012,
0.06658016890287399,
-0.034604910761117935,
0.055008504539728165,
-0.009356276132166386,
-0.02502163127064705,
0.0725674033164978,
-0.046018797904253006,
0.04567970335483551,
-0.08267710357904434,
-0.028219234198331833,
-0.05670318007469177,
-0.0090637831017375,
-0.07997855544090271,
-0.0053685917519032955,
-0.08900962024927139,
-0.1086217612028122,
-0.002038841601461172,
0.012349596247076988,
0.052931562066078186,
-0.0727030411362648,
0.08642856776714325,
0.009818961843848228,
0.08657389879226685,
0.060204003006219864,
0.026761578395962715,
0.0009941438911482692,
0.07027294486761093,
-0.05940583348274231,
0.06373874843120575,
-0.08054359257221222,
0.03210092708468437,
-0.0985342487692833,
-0.09965771436691284,
-0.10359422862529755,
0.017799248918890953,
0.0003231614828109741,
0.17196089029312134,
-0.1537889987230301,
-0.09573788195848465,
0.26548218727111816,
-0.03543860465288162,
-0.03795851394534111,
0.12055743485689163,
0.05553283542394638,
-0.018209131434559822,
0.04342658445239067,
0.14339213073253632,
0.023143338039517403,
-0.09280882775783539,
-0.09612368792295456,
0.03393334895372391,
0.06252431869506836,
0.10833461582660675,
0.10885844379663467,
0.01866021566092968,
0.1016145795583725,
0.025084640830755234,
0.01778208650648594,
0.06347259134054184,
-0.08169955015182495,
-0.08290635794401169,
0.0662018284201622,
-0.04978432506322861,
0.036164697259664536,
0.10354362428188324,
-0.025877490639686584,
-0.051357705146074295,
-0.03679794445633888,
-0.11615405976772308,
0.09353452920913696,
0.003735500853508711,
-0.009119843132793903,
-0.09966672956943512,
-0.0024344315752387047,
0.06647340953350067,
0.02288619615137577,
-0.04526057466864586,
0.09622130542993546,
-0.053359005600214005,
0.11819121986627579,
0.0636710673570633,
0.07794049382209778,
0.10187273472547531,
-0.018218638375401497,
-0.07862292975187302,
-0.041916824877262115,
0.09991682320833206,
0.012974705547094345,
-0.0592535175383091,
-0.22189302742481232,
0.05805857852101326,
-0.021290823817253113,
0.09959299862384796,
-0.173395037651062,
0.03262442350387573,
0.1390991061925888,
0.1537192165851593,
0.0032056004274636507,
-0.011890897527337074,
-0.007739355321973562,
0.08227310329675674,
0.01825963705778122,
-0.015648340806365013,
0.033745236694812775,
-0.030913399532437325,
-0.0483984649181366,
0.02221750095486641,
-0.07240423560142517,
0.11462175101041794,
0.10714226216077805,
-0.0774630531668663,
-0.07751689106225967,
0.10570073872804642,
-0.021646583452820778,
-0.019964344799518585,
0.11689845472574234,
-0.0033914693631231785,
0.10226988047361374,
0.0063048601150512695,
0.020653771236538887,
-0.021763304248452187,
-0.02627391926944256,
0.016238698735833168,
-0.078618124127388,
-0.04888671264052391,
0.19366852939128876,
0.039686936885118484,
-0.14944960176944733,
0.19100140035152435,
0.13074064254760742,
0.046479348093271255,
0.1761941760778427,
-0.02098187804222107,
-0.038460101932287216,
-0.09703352302312851,
0.008853917010128498,
0.006463043857365847,
0.06676548719406128,
-0.1745358407497406,
-0.0192779041826725,
-0.035569097846746445,
-0.012208925560116768,
0.014522350393235683,
-0.07549697160720825,
-0.06783230602741241,
-0.049335066229104996,
-0.04544844850897789,
0.0057989503256976604,
0.05347583442926407,
-0.06442030519247055,
0.0798012763261795,
0.01801256649196148,
-0.03578075021505356,
-0.05564208701252937,
-0.0182326789945364,
-0.07343576848506927,
0.14630886912345886,
-0.1613723635673523,
-0.0870039165019989,
0.06405065953731537,
-0.10177189111709595,
0.058744970709085464,
-0.019216667860746384,
0.0016281373100355268,
-0.14172931015491486,
0.012648227624595165,
0.03568769991397858,
0.0959029421210289,
-0.10252966731786728,
-0.022290745750069618,
-0.010165792889893055,
0.010574684478342533,
-0.04425327852368355,
-0.022789975628256798,
-0.03573226556181908,
0.015208350494503975,
-0.0862061157822609,
0.09914316236972809,
-0.13020643591880798,
0.05319688096642494,
0.11437389999628067,
0.10433543473482132,
0.010324402712285519,
-0.014090786688029766,
0.15116819739341736,
-0.14562298357486725,
0.003910788334906101,
-0.03903409093618393,
-0.007643275894224644,
0.03717243671417236,
0.1350412517786026,
0.04445036128163338,
-0.06503818184137344,
-0.03835732862353325,
0.015962857753038406,
-0.004784753080457449,
-0.1427658051252365,
-0.0325813964009285,
-0.047438591718673706,
0.11782156676054001,
-0.04103660210967064,
0.08624549210071564,
-0.023751046508550644,
0.0010536120971664786,
-0.047525715082883835,
-0.12167119979858398,
-0.008506490848958492,
-0.04549160227179527,
0.0391707718372345,
-0.045567408204078674,
0.013115744106471539,
-0.03516572713851929,
-0.022930510342121124,
0.04655591398477554,
0.08895205706357956,
-0.06574182957410812,
0.05527547001838684,
0.07031182944774628,
0.07370050996541977,
0.1494828313589096,
-0.0243929885327816,
-0.043639298528432846,
0.043996717780828476,
-0.018319150432944298,
0.0061632124707102776,
-0.019752908498048782,
-0.031136393547058105,
0.026026621460914612,
0.15798795223236084,
-0.025608809664845467,
0.035328567028045654,
0.0016783815808594227,
0.12035245448350906,
0.07534033805131912,
0.06748493760824203,
-0.10401834547519684,
-0.029330622404813766,
0.07391085475683212,
0.007402131799608469,
-0.004358747974038124,
0.03399980440735817,
0.03638716787099838,
-0.05962919816374779,
0.11692819744348526,
0.08322925120592117,
0.023530272766947746,
-0.07913004606962204,
0.06384597718715668,
0.007824680767953396,
-0.00374860898591578,
-0.009654784575104713,
0.05424704775214195,
-0.2965855002403259,
0.19174015522003174,
0.020947936922311783,
0.010615861043334007,
-0.008552956394851208,
-0.0559433214366436,
0.03683074191212654,
0.07165589183568954,
0.13723109662532806,
0.0654105469584465,
-0.16263112425804138,
-0.16869181394577026,
-0.011847032234072685,
0.0036132351960986853,
0.1084655374288559,
-0.025796353816986084,
0.04171372205018997,
0.0539994016289711,
-0.042270079255104065,
-0.02768274024128914,
0.009784230962395668,
-0.07511284947395325,
-0.008112223818898201,
0.07865383476018906,
-0.043466437608003616,
0.033007413148880005,
-0.019895141944289207,
-0.03433030843734741,
-0.17768992483615875,
0.011113160289824009,
-0.15309588611125946,
0.013765513896942139,
-0.034572720527648926,
0.01982167549431324,
0.057510241866111755,
-0.11803165823221207,
-0.10935557633638382,
0.043603166937828064,
-0.07333322614431381,
-0.019134515896439552,
0.021817652508616447,
0.08984815329313278,
-0.0452747568488121,
-0.18489821255207062,
-0.003973573446273804,
0.116004578769207,
0.08338615298271179,
0.11872392892837524,
-0.04745079576969147,
0.02563413791358471,
-0.00017659025616012514,
-0.08358296006917953,
0.15851682424545288,
-0.021242165938019753,
-0.028059806674718857,
0.04623347893357277,
-0.008475400507450104,
-0.06448058784008026,
-0.08969849348068237,
-0.07474170625209808,
0.08955075591802597,
0.2982664108276367,
-0.02794746123254299,
0.10638932883739471,
0.0982741042971611,
-0.08372929692268372,
-0.2470681369304657,
-0.11061340570449829,
-0.01849932037293911,
0.022611845284700394,
-0.014536612667143345,
-0.20981232821941376,
-0.037290822714567184,
-0.0007734635728411376,
0.027293013408780098,
-0.011917482130229473,
-0.27352064847946167,
-0.03471164405345917,
0.1081906408071518,
0.0014343776274472475,
0.08150352537631989,
-0.1988932341337204,
-0.03273606672883034,
-0.006223125848919153,
-0.061205051839351654,
-0.08480805903673172,
-0.01567237451672554,
0.04754459857940674,
-0.012480777688324451,
0.01996791735291481,
-0.007350675296038389,
-0.009788032621145248,
0.16906900703907013,
0.055363982915878296,
-0.015854258090257645,
-0.08185693621635437,
-0.112275131046772,
0.011410282924771309,
0.027432221919298172,
-0.020014578476548195,
-0.12136157602071762,
-0.06844186037778854,
-0.05250456929206848,
0.028336280956864357,
-0.1432066410779953,
-0.0038499508518725634,
-0.053854309022426605,
0.01680470071732998,
-0.043834712356328964,
0.07147806882858276,
0.061997462064027786,
0.015688452869653702,
0.07988321781158447,
-0.09488020837306976,
0.10145483165979385,
0.003983052913099527,
0.13370566070079803,
0.05125358700752258,
0.003471267642453313,
-0.032494474202394485,
-0.012622587382793427,
-0.007999869994819164,
-0.09816669672727585,
-0.008034632541239262,
0.1326776146888733,
0.006311078555881977,
0.08053737878799438,
0.0388166643679142,
-0.11542125046253204,
0.005140396766364574,
0.10659848153591156,
-0.08538658916950226,
-0.16917502880096436,
-0.0030268800910562277,
-0.0890909805893898,
-0.02113277278840542,
-0.013151636347174644,
0.09828539192676544,
-0.004258833825588226,
-0.009548707865178585,
0.009000595659017563,
0.08283234387636185,
-0.06984352320432663,
0.1317896991968155,
0.06573610007762909,
0.007921171374619007,
-0.07596545666456223,
0.018116388469934464,
0.008600310422480106,
-0.05593857914209366,
0.01814286783337593,
0.16202816367149353,
-0.03931687772274017,
-0.08665654808282852,
-0.002317215083166957,
0.11540139466524124,
0.09805311262607574,
-0.014468617737293243,
-0.00561782019212842,
-0.12169135361909866,
0.05704066902399063,
0.1739104688167572,
0.02329704351723194,
0.03515571355819702,
0.06376361101865768,
0.03490538150072098,
0.04636535793542862,
0.09959180653095245,
0.06675101816654205,
-0.019476937130093575,
-0.04469487816095352,
0.0639730915427208,
-0.05874791368842125,
0.10523588955402374,
-0.008864161558449268,
-0.013945162296295166,
-0.18875719606876373,
0.055915020406246185,
-0.051443327218294144,
-0.051493141800165176,
-0.1284194439649582,
-0.028333108872175217,
0.01252016332000494,
-0.10935238003730774,
-0.05806810036301613,
-0.051216673105955124,
-0.08376302570104599,
-0.011372096836566925,
0.022658830508589745,
0.12448020279407501,
-0.06087949499487877,
-0.06464768201112747,
0.08071742206811905,
-0.06336674839258194,
0.0726860836148262,
0.11158103495836258,
-0.0033723560627549887,
0.08947547525167465,
-0.15136584639549255,
-0.020136872306466103,
0.03837635740637779,
0.018109142780303955,
-0.026558663696050644,
-0.021494993939995766,
-0.0733703002333641,
-0.06686865538358688,
0.019500169903039932,
0.056125253438949585,
0.023654242977499962,
0.018599938601255417,
0.15732167661190033,
-0.03313099220395088,
-0.08085320144891739,
-0.027194727212190628,
0.04946548864245415,
0.07403677701950073,
-0.013004578649997711,
0.021954303607344627,
-0.08872180432081223,
0.04596378654241562,
-0.10859571397304535,
0.029522854834794998,
0.0036232504062354565,
-0.09073837846517563,
0.012730208225548267,
-0.034900274127721786,
0.08679858595132828,
-0.0240815170109272,
0.06667061150074005,
-0.07957162708044052,
-0.12135081738233566,
0.014396972954273224,
-0.03353375941514969,
-0.07763993740081787,
0.04483365640044212,
0.018808657303452492,
0.052962254732847214,
-0.06260966509580612,
-0.09313981235027313,
0.03413401544094086,
-0.027125364169478416,
-0.03185851871967316,
0.10172892361879349,
0.10287643224000931,
0.15682417154312134,
0.04110949859023094,
-0.012061616405844688,
-0.10464885085821152,
0.030519677326083183,
0.04306397587060928,
-0.1267947256565094,
-0.03761861100792885,
-0.033327169716358185,
0.1723002791404724,
0.14371195435523987,
-0.12686452269554138,
0.025362113490700722,
-0.0958649069070816,
-0.06121267378330231,
-0.09734506160020828,
-0.11149827390909195,
-0.02032903954386711,
-0.04046092554926872,
0.034334179013967514,
-0.08462601155042648,
0.023711245507001877,
0.05536153167486191,
0.055003099143505096,
0.022635193541646004,
0.0749579444527626,
0.05964932218194008,
-0.029332127422094345,
0.02037837915122509,
0.015804825350642204,
-0.027766520157456398,
-0.12268771976232529,
0.06084098666906357,
0.012114623561501503,
-0.04785197228193283,
0.00784006342291832,
0.006400256883352995,
-0.1174456775188446,
-0.0028134838212281466,
-0.08843544125556946,
-0.10307108610868454,
0.0019086425891146064,
0.03586211055517197,
0.01882612146437168,
0.08446840196847916,
0.018749628216028214,
0.0027973330579698086,
0.03613686561584473,
0.05063589662313461,
-0.0379384309053421,
-0.05817336589097977,
-0.054089002311229706,
0.12242035567760468,
-0.05788540840148926,
0.024130774661898613,
-0.005720221437513828,
-0.010565275326371193,
0.045951247215270996,
0.1654592603445053,
0.22614310681819916,
-0.06411873549222946,
0.06531936675310135,
0.01827136240899563,
0.04413125663995743,
0.037272192537784576,
-0.05637273192405701,
0.09209465235471725,
0.14816640317440033,
-0.10727906227111816,
0.04184422641992569,
-0.0515945665538311,
-0.018564878031611443,
-0.042329996824264526,
-0.04453852027654648,
0.031944289803504944,
0.01670663431286812,
-0.04309053346514702,
0.11987631022930145,
-0.19023706018924713,
-0.09840022027492523,
0.019816596060991287,
-0.19020907580852509,
-0.026547834277153015,
-0.031278468668460846,
0.033922936767339706,
0.15247024595737457,
0.07217798382043839,
-0.018671637400984764,
-0.07651641219854355,
0.046709176152944565,
0.047176163643598557,
-0.09028580039739609,
0.06400150060653687,
0.05047034099698067,
-0.09749908745288849,
0.04046724736690521,
-0.07054471224546432,
0.03860846161842346,
0.10197609663009644,
0.001449240604415536,
0.02716143988072872,
0.01903568208217621,
0.07957754284143448,
-0.04702087119221687,
-0.09546404331922531,
0.08421297371387482,
-0.0016376468120142817,
0.04542030394077301,
0.1331513673067093,
-0.02716847136616707,
0.0395800918340683,
0.0850001648068428,
-0.05050958693027496,
0.03210289403796196,
0.11880776286125183,
-0.039885617792606354,
0.06432854384183884,
0.10259599983692169,
-0.009587311185896397,
-0.021405193954706192,
-0.011172259226441383,
-0.00789770670235157,
-0.033448997884988785,
0.05320707708597183,
-0.03739314153790474,
-0.1015140563249588,
-0.019261715933680534,
-0.01634345017373562,
0.08806314319372177,
0.000858892744872719,
-0.06017018482089043,
0.017442673444747925,
0.012204518541693687,
-0.07268474251031876,
0.09057633578777313,
0.07205245643854141,
0.019738230854272842,
-0.05116759613156319,
-0.1648966372013092,
0.0272023007273674,
0.13811902701854706,
-0.09347319602966309,
-0.050105031579732895
] |
null | null |
transformers
|
# Wav2Vec2-XLS-R-2b-21-EN
Facebook's Wav2Vec2 XLS-R fine-tuned for **Speech Translation.**

This is a [SpeechEncoderDecoderModel](https://huggingface.co/transformers/model_doc/speechencoderdecoder.html) model.
The encoder was warm-started from the [**`facebook/wav2vec2-xls-r-2b`**](https://huggingface.co/facebook/wav2vec2-xls-r-2b) checkpoint and
the decoder from the [**`facebook/mbart-large-50`**](https://huggingface.co/facebook/mbart-large-50) checkpoint.
Consequently, the encoder-decoder model was fine-tuned on 21 `{lang}` -> `en` translation pairs of the [Covost2 dataset](https://huggingface.co/datasets/covost2).
The model can translate from the following spoken languages `{lang}` -> `en` (English):
{`fr`, `de`, `es`, `ca`, `it`, `ru`, `zh-CN`, `pt`, `fa`, `et`, `mn`, `nl`, `tr`, `ar`, `sv-SE`, `lv`, `sl`, `ta`, `ja`, `id`, `cy`} -> `en`
For more information, please refer to Section *5.1.2* of the [official XLS-R paper](https://arxiv.org/abs/2111.09296).
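For reference, a minimal sketch of how such a warm-started encoder-decoder can be assembled from the two public checkpoints (this mirrors the description above, but the special-token settings below are assumptions and this is not necessarily the exact configuration used to train the released checkpoint):

```python
from transformers import (
    MBart50Tokenizer,
    SpeechEncoderDecoderModel,
    Wav2Vec2FeatureExtractor,
)

# Compose a speech encoder-decoder from the two pretrained checkpoints.
model = SpeechEncoderDecoderModel.from_encoder_decoder_pretrained(
    "facebook/wav2vec2-xls-r-2b", "facebook/mbart-large-50"
)
feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained("facebook/wav2vec2-xls-r-2b")
tokenizer = MBart50Tokenizer.from_pretrained("facebook/mbart-large-50")

# Assumption: standard mBART-50 special-token wiring for an English target.
model.config.decoder_start_token_id = tokenizer.lang_code_to_id["en_XX"]
model.config.pad_token_id = tokenizer.pad_token_id
model.config.eos_token_id = tokenizer.eos_token_id
```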
## Usage
### Demo
The model can be tested directly on the speech recognition widget on this model card!
Simply record some audio in one of the possible spoken languages or pick an example audio file to see how well the checkpoint can translate the input.
### Example
As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the
transcripts by passing the speech features to the model.
You can use the model directly via the ASR pipeline
```python
from datasets import load_dataset
from transformers import pipeline
# replace following lines to load an audio file of your choice
librispeech_en = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
audio_file = librispeech_en[0]["file"]
asr = pipeline("automatic-speech-recognition", model="facebook/wav2vec2-xls-r-2b-21-to-en", feature_extractor="facebook/wav2vec2-xls-r-2b-21-to-en")
translation = asr(audio_file)
```
or step-by-step as follows:
```python
import torch
from transformers import Speech2Text2Processor, SpeechEncoderDecoderModel
from datasets import load_dataset
model = SpeechEncoderDecoderModel.from_pretrained("facebook/wav2vec2-xls-r-2b-21-to-en")
processor = Speech2Text2Processor.from_pretrained("facebook/wav2vec2-xls-r-2b-21-to-en")
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
# the sampling rate is stored directly under the "audio" field
inputs = processor(ds[0]["audio"]["array"], sampling_rate=ds[0]["audio"]["sampling_rate"], return_tensors="pt")
generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"])
transcription = processor.batch_decode(generated_ids)
```
## Results `{lang}` -> `en`
See the row of **XLS-R (2B)** for the performance on [Covost2](https://huggingface.co/datasets/covost2) for this model.

## More XLS-R models for `{lang}` -> `en` Speech Translation
- [Wav2Vec2-XLS-R-300M-21-EN](https://huggingface.co/facebook/wav2vec2-xls-r-300m-21-to-en)
- [Wav2Vec2-XLS-R-1B-21-EN](https://huggingface.co/facebook/wav2vec2-xls-r-1b-21-to-en)
- [Wav2Vec2-XLS-R-2B-21-EN](https://huggingface.co/facebook/wav2vec2-xls-r-2b-21-to-en)
- [Wav2Vec2-XLS-R-2B-22-16](https://huggingface.co/facebook/wav2vec2-xls-r-2b-22-to-16)
|
{"language": ["multilingual", "fr", "de", "es", "ca", "it", "ru", "zh", "pt", "fa", "et", "mn", "nl", "tr", "ar", "sv", "lv", "sl", "ta", "ja", "id", "cy", "en"], "license": "apache-2.0", "tags": ["speech", "xls_r", "automatic-speech-recognition", "xls_r_translation"], "datasets": ["common_voice", "multilingual_librispeech", "covost2"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Swedish", "src": "https://cdn-media.huggingface.co/speech_samples/cv_swedish_1.mp3"}, {"example_title": "Arabic", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_ar_19058308.mp3"}, {"example_title": "Russian", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_ru_18849022.mp3"}, {"example_title": "German", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_de_17284683.mp3"}, {"example_title": "French", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_fr_17299386.mp3"}, {"example_title": "Indonesian", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_id_19051309.mp3"}, {"example_title": "Italian", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_it_17415776.mp3"}, {"example_title": "Japanese", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_ja_19482488.mp3"}, {"example_title": "Mongolian", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_mn_18565396.mp3"}, {"example_title": "Dutch", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_nl_17691471.mp3"}, {"example_title": "Russian", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_ru_18849022.mp3"}, {"example_title": "Turkish", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_tr_17341280.mp3"}, {"example_title": "Catalan", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_ca_17367522.mp3"}, {"example_title": "English", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_en_18301577.mp3"}, {"example_title": "Dutch", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_nl_17691471.mp3"}]}
|
automatic-speech-recognition
|
facebook/wav2vec2-xls-r-2b-21-to-en
|
[
"transformers",
"pytorch",
"speech-encoder-decoder",
"automatic-speech-recognition",
"speech",
"xls_r",
"xls_r_translation",
"multilingual",
"fr",
"de",
"es",
"ca",
"it",
"ru",
"zh",
"pt",
"fa",
"et",
"mn",
"nl",
"tr",
"ar",
"sv",
"lv",
"sl",
"ta",
"ja",
"id",
"cy",
"en",
"dataset:common_voice",
"dataset:multilingual_librispeech",
"dataset:covost2",
"arxiv:2111.09296",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2111.09296"
] |
[
"multilingual",
"fr",
"de",
"es",
"ca",
"it",
"ru",
"zh",
"pt",
"fa",
"et",
"mn",
"nl",
"tr",
"ar",
"sv",
"lv",
"sl",
"ta",
"ja",
"id",
"cy",
"en"
] |
TAGS
#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #fr #de #es #ca #it #ru #zh #pt #fa #et #mn #nl #tr #ar #sv #lv #sl #ta #ja #id #cy #en #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-XLS-R-2b-21-EN
Facebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.
!model image
This is a SpeechEncoderDecoderModel model.
The encoder was warm-started from the 'facebook/wav2vec2-xls-r-2b' checkpoint and
the decoder from the 'facebook/mbart-large-50' checkpoint.
Consequently, the encoder-decoder model was fine-tuned on 21 '{lang}' -> 'en' translation pairs of the Covost2 dataset.
The model can translate from the following spoken languages '{lang}' -> 'en' (English):
{'fr', 'de', 'es', 'ca', 'it', 'ru', 'zh-CN', 'pt', 'fa', 'et', 'mn', 'nl', 'tr', 'ar', 'sv-SE', 'lv', 'sl', 'ta', 'ja', 'id', 'cy'} -> 'en'
For more information, please refer to Section *5.1.2* of the official XLS-R paper.
## Usage
### Demo
The model can be tested directly on the speech recognition widget on this model card!
Simply record some audio in one of the possible spoken languages or pick an example audio file to see how well the checkpoint can translate the input.
### Example
As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the
transcripts by passing the speech features to the model.
You can use the model directly via the ASR pipeline
or step-by-step as follows:
## Results '{lang}' -> 'en'
See the row of XLS-R (2B) for the performance on Covost2 for this model.
!results image
## More XLS-R models for '{lang}' -> 'en' Speech Translation
- Wav2Vec2-XLS-R-300M-21-EN
- Wav2Vec2-XLS-R-1B-21-EN
- Wav2Vec2-XLS-R-2B-21-EN
- Wav2Vec2-XLS-R-2B-22-16
|
[
"# Wav2Vec2-XLS-R-2b-21-EN\n\nFacebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.\n\n!model image\n\nThis is a SpeechEncoderDecoderModel model. \nThe encoder was warm-started from the 'facebook/wav2vec2-xls-r-2b' checkpoint and\nthe decoder from the 'facebook/mbart-large-50' checkpoint.\nConsequently, the encoder-decoder model was fine-tuned on 21 '{lang}' -> 'en' translation pairs of the Covost2 dataset.\n\nThe model can translate from the following spoken languages '{lang}' -> 'en' (English):\n\n{'fr', 'de', 'es', 'ca', 'it', 'ru', 'zh-CN', 'pt', 'fa', 'et', 'mn', 'nl', 'tr', 'ar', 'sv-SE', 'lv', 'sl', 'ta', 'ja', 'id', 'cy'} -> 'en'\n\nFor more information, please refer to Section *5.1.2* of the official XLS-R paper.",
"## Usage",
"### Demo\n\nThe model can be tested directly on the speech recognition widget on this model card! \nSimple record some audio in one of the possible spoken languages or pick an example audio file to see how well the checkpoint can translate the input.",
"### Example \n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline\n\n\n\nor step-by-step as follows:",
"## Results '{lang}' -> 'en'\n\nSee the row of XLS-R (2B) for the performance on Covost2 for this model.\n\n!results image",
"## More XLS-R models for '{lang}' -> 'en' Speech Translation\n\n- Wav2Vec2-XLS-R-300M-21-EN\n- Wav2Vec2-XLS-R-1B-21-EN\n- Wav2Vec2-XLS-R-2B-21-EN\n- Wav2Vec2-XLS-R-2B-22-16"
] |
[
"TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #fr #de #es #ca #it #ru #zh #pt #fa #et #mn #nl #tr #ar #sv #lv #sl #ta #ja #id #cy #en #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-XLS-R-2b-21-EN\n\nFacebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.\n\n!model image\n\nThis is a SpeechEncoderDecoderModel model. \nThe encoder was warm-started from the 'facebook/wav2vec2-xls-r-2b' checkpoint and\nthe decoder from the 'facebook/mbart-large-50' checkpoint.\nConsequently, the encoder-decoder model was fine-tuned on 21 '{lang}' -> 'en' translation pairs of the Covost2 dataset.\n\nThe model can translate from the following spoken languages '{lang}' -> 'en' (English):\n\n{'fr', 'de', 'es', 'ca', 'it', 'ru', 'zh-CN', 'pt', 'fa', 'et', 'mn', 'nl', 'tr', 'ar', 'sv-SE', 'lv', 'sl', 'ta', 'ja', 'id', 'cy'} -> 'en'\n\nFor more information, please refer to Section *5.1.2* of the official XLS-R paper.",
"## Usage",
"### Demo\n\nThe model can be tested directly on the speech recognition widget on this model card! \nSimple record some audio in one of the possible spoken languages or pick an example audio file to see how well the checkpoint can translate the input.",
"### Example \n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline\n\n\n\nor step-by-step as follows:",
"## Results '{lang}' -> 'en'\n\nSee the row of XLS-R (2B) for the performance on Covost2 for this model.\n\n!results image",
"## More XLS-R models for '{lang}' -> 'en' Speech Translation\n\n- Wav2Vec2-XLS-R-300M-21-EN\n- Wav2Vec2-XLS-R-1B-21-EN\n- Wav2Vec2-XLS-R-2B-21-EN\n- Wav2Vec2-XLS-R-2B-22-16"
] |
[
152,
280,
3,
52,
66,
39,
82
] |
[
"passage: TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #fr #de #es #ca #it #ru #zh #pt #fa #et #mn #nl #tr #ar #sv #lv #sl #ta #ja #id #cy #en #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# Wav2Vec2-XLS-R-2b-21-EN\n\nFacebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.\n\n!model image\n\nThis is a SpeechEncoderDecoderModel model. \nThe encoder was warm-started from the 'facebook/wav2vec2-xls-r-2b' checkpoint and\nthe decoder from the 'facebook/mbart-large-50' checkpoint.\nConsequently, the encoder-decoder model was fine-tuned on 21 '{lang}' -> 'en' translation pairs of the Covost2 dataset.\n\nThe model can translate from the following spoken languages '{lang}' -> 'en' (English):\n\n{'fr', 'de', 'es', 'ca', 'it', 'ru', 'zh-CN', 'pt', 'fa', 'et', 'mn', 'nl', 'tr', 'ar', 'sv-SE', 'lv', 'sl', 'ta', 'ja', 'id', 'cy'} -> 'en'\n\nFor more information, please refer to Section *5.1.2* of the official XLS-R paper.## Usage### Demo\n\nThe model can be tested directly on the speech recognition widget on this model card! \nSimple record some audio in one of the possible spoken languages or pick an example audio file to see how well the checkpoint can translate the input."
] |
[
-0.07017840445041656,
0.04909767955541611,
-0.006031119730323553,
0.020541569218039513,
0.020089363679289818,
-0.01933172158896923,
0.028582174330949783,
0.10275153815746307,
0.04552898555994034,
0.13754813373088837,
-0.011960349045693874,
0.07071655988693237,
0.07494030892848969,
0.1315261721611023,
-0.018951181322336197,
-0.15216824412345886,
0.05442282557487488,
-0.1092434972524643,
0.03472798690199852,
0.0696573331952095,
0.08750438690185547,
-0.07386237382888794,
0.029761353507637978,
-0.01369683351367712,
-0.00499483710154891,
0.01586069166660309,
-0.008972173556685448,
-0.06360957771539688,
0.04467477276921272,
0.09774181991815567,
0.0415322445333004,
0.07307866960763931,
0.05895007774233818,
-0.2851034104824066,
0.02247682958841324,
0.07131995260715485,
-0.008976330980658531,
0.010028082877397537,
0.13616397976875305,
-0.06607931107282639,
0.03586196154356003,
-0.0628194659948349,
-0.019809357821941376,
0.09003932774066925,
-0.06932919472455978,
-0.25064757466316223,
-0.06429959088563919,
0.06778499484062195,
0.07462677359580994,
0.0384490080177784,
-0.07443247735500336,
-0.0006341415573842824,
-0.00827457383275032,
0.07612521201372147,
0.13625839352607727,
-0.2021411955356598,
-0.006117249373346567,
-0.0344938300549984,
0.08586355298757553,
0.02850727178156376,
-0.0669512078166008,
0.05549236387014389,
0.006116058211773634,
-0.008758092299103737,
-0.021474702283740044,
-0.046352822333574295,
-0.026341821998357773,
-0.02950225956737995,
-0.13202650845050812,
-0.03737623989582062,
0.0571105070412159,
0.06930248439311981,
-0.04631267488002777,
-0.13419784605503082,
-0.03578880429267883,
-0.036700498312711716,
-0.035666752606630325,
-0.07502572238445282,
-0.04972623288631439,
0.003545125015079975,
0.019933486357331276,
-0.09091204404830933,
-0.10478243231773376,
0.007623941637575626,
-0.05700840428471565,
0.13089364767074585,
0.006607747636735439,
0.02232319302856922,
0.03134339302778244,
0.041435807943344116,
0.008505807258188725,
-0.08959779143333435,
-0.014713605865836143,
-0.04651902988553047,
-0.1909170299768448,
0.0034928524401038885,
-0.01953786239027977,
-0.035919368267059326,
0.11789756268262863,
0.11675425618886948,
-0.036214422434568405,
0.07053700089454651,
-0.08532845228910446,
0.018206968903541565,
0.0407637283205986,
0.10923507809638977,
-0.10122088342905045,
-0.057934846729040146,
-0.0367727167904377,
-0.010392457246780396,
-0.03023611754179001,
-0.03839289769530296,
-0.06505977362394333,
-0.012756234966218472,
0.023102160543203354,
0.06541584432125092,
0.07719200849533081,
-0.004975954070687294,
-0.06726676970720291,
-0.046854596585035324,
0.11977844685316086,
-0.14019708335399628,
0.05607607960700989,
0.12883852422237396,
-0.030391734093427658,
0.06763023883104324,
-0.002024903893470764,
0.03604227676987648,
-0.07540971040725708,
-0.011200944893062115,
0.03588317334651947,
0.015963193029165268,
-0.015018092468380928,
-0.10627274960279465,
0.029859958216547966,
-0.04997607693076134,
-0.07325375825166702,
-0.11248204112052917,
-0.007117955479770899,
-0.07391821593046188,
0.01863805018365383,
-0.07320700585842133,
0.03977221995592117,
-0.05788077414035797,
-0.03782243654131889,
0.029228122904896736,
-0.019093742594122887,
-0.00989610143005848,
-0.039380114525556564,
0.033832479268312454,
-0.04265756160020828,
0.09047500789165497,
0.03327624127268791,
0.03465666621923447,
-0.010496935807168484,
0.022344399243593216,
-0.1568000316619873,
0.21582581102848053,
-0.1263071745634079,
-0.03191658854484558,
-0.14447782933712006,
-0.041628215461969376,
0.013937528245151043,
0.030172480270266533,
0.005389880388975143,
0.06067560985684395,
-0.22741366922855377,
-0.07114467024803162,
0.21637435257434845,
-0.07062645256519318,
-0.03643923997879028,
0.15604268014431,
0.0026568956673145294,
-0.03797168657183647,
0.06965304911136627,
0.15254658460617065,
0.15412971377372742,
-0.20093794167041779,
-0.01843475177884102,
-0.0008019160595722497,
-0.007679385133087635,
0.18071793019771576,
0.08726324886083603,
-0.09136654436588287,
0.0522770881652832,
0.0023527266457676888,
-0.016690682619810104,
-0.015810903161764145,
-0.014858094044029713,
-0.03669954836368561,
0.047690000385046005,
-0.03823072090744972,
0.10604696720838547,
-0.030253805220127106,
-0.07531546801328659,
-0.02096027322113514,
-0.09651274234056473,
-0.0015626362292096019,
0.07583050429821014,
-0.043226346373558044,
0.0295683853328228,
-0.10744863748550415,
0.06051658093929291,
0.0033462306018918753,
0.007250994443893433,
-0.17880798876285553,
-0.050409771502017975,
0.008325103670358658,
-0.0790766254067421,
0.08373989164829254,
0.08025491237640381,
0.02281815931200981,
0.0266939215362072,
0.020521875470876694,
-0.010765863582491875,
0.03497779741883278,
0.003529172856360674,
0.021182825788855553,
-0.11852894723415375,
-0.05294343829154968,
-0.06001649796962738,
0.12816232442855835,
-0.0957917720079422,
-0.035424087196588516,
0.11957989633083344,
0.14702346920967102,
0.015552937053143978,
0.004130970221012831,
0.008303576149046421,
0.020761936902999878,
0.040238384157419205,
0.002297323662787676,
0.0008616680279374123,
-0.02781127579510212,
-0.034551627933979034,
0.09858044981956482,
-0.124538853764534,
-0.06507562845945358,
0.06418611854314804,
0.015538167208433151,
-0.0685926079750061,
0.02377854846417904,
-0.03664455562829971,
-0.0010905934032052755,
-0.08258824050426483,
-0.061763521283864975,
0.19352149963378906,
0.08930804580450058,
0.07616613060235977,
-0.06917643547058105,
-0.05201951786875725,
0.02475561387836933,
-0.05117153003811836,
-0.04066254943609238,
0.11548127979040146,
-0.033321306109428406,
-0.1595468372106552,
0.0296053197234869,
0.00009868101187748834,
0.04080071300268173,
0.19290730357170105,
0.0017633726820349693,
-0.09920379519462585,
-0.05516258254647255,
0.049241043627262115,
-0.0028266070876270533,
-0.010534560307860374,
0.05418522283434868,
0.010680506937205791,
0.049618806689977646,
-0.007533091586083174,
0.028176244348287582,
-0.04930367320775986,
0.05291152372956276,
0.019812311977148056,
-0.10360494256019592,
0.0767880380153656,
0.03382766991853714,
0.04720510542392731,
0.04238266497850418,
-0.0009266536217182875,
-0.052738405764102936,
-0.05851992964744568,
-0.048123981803655624,
-0.10357961058616638,
0.09566830843687057,
-0.14490118622779846,
-0.34378498792648315,
-0.13283342123031616,
-0.029950441792607307,
-0.04163485765457153,
0.0025614090263843536,
0.057380810379981995,
-0.0729794129729271,
-0.05223221331834793,
-0.04991649091243744,
0.021787507459521294,
0.004754530265927315,
-0.060527779161930084,
0.022942421957850456,
0.021597711369395256,
0.04959704726934433,
-0.08307582139968872,
0.009303607977926731,
0.021650798618793488,
-0.04709596186876297,
-0.017519788816571236,
0.05445224046707153,
0.04332590848207474,
0.11303966492414474,
0.03204844146966934,
0.03148811310529709,
-0.01723705418407917,
0.18732982873916626,
-0.11173325031995773,
0.07815422117710114,
0.09163940697908401,
-0.0540851354598999,
0.06134669482707977,
0.19440875947475433,
0.011547906324267387,
-0.03931955248117447,
0.00707280682399869,
0.033529091626405716,
0.024153398349881172,
-0.2420729696750641,
-0.11780932545661926,
-0.0278673954308033,
0.029620599001646042,
0.034702181816101074,
0.0341234914958477,
0.04873885586857796,
-0.03722275793552399,
-0.06432463973760605,
-0.0920153334736824,
0.08480623364448547,
0.006253421306610107,
0.14205747842788696,
0.0036525288596749306,
0.06873034685850143,
-0.03037511371076107,
-0.043627288192510605,
0.11507931351661682,
0.009061661548912525,
0.05565578117966652,
0.05336504429578781,
0.1938481330871582,
0.09121286123991013,
0.055252332240343094,
0.02569441869854927,
-0.025895586237311363,
-0.010064720176160336,
0.0371454693377018,
0.03286217153072357,
-0.08155851066112518,
0.037842269986867905,
0.03814280033111572,
0.19463489949703217,
-0.10396087914705276,
0.010323381051421165,
0.002696869196370244,
0.09663544595241547,
0.13201269507408142,
0.11684912443161011,
-0.1095975860953331,
0.0007277365657500923,
0.0011315381852909923,
-0.0449180081486702,
-0.05915052443742752,
-0.022476453334093094,
0.1272396594285965,
-0.09827655553817749,
0.09843093156814575,
0.02682875096797943,
0.08119548857212067,
-0.10425569117069244,
-0.005472083110362291,
0.027158526703715324,
0.06883648037910461,
0.0183139406144619,
0.07346131652593613,
-0.15609028935432434,
0.12036877870559692,
0.023750942200422287,
0.01712808944284916,
-0.0027503524906933308,
0.014757183380424976,
-0.010153485462069511,
-0.044879261404275894,
0.15292920172214508,
-0.005285813007503748,
-0.06442177295684814,
-0.09191394597291946,
-0.12631551921367645,
0.00971162598580122,
0.1401895433664322,
-0.07665155082941055,
0.04905363544821739,
0.007393620442599058,
-0.08511654287576675,
-0.06430785357952118,
0.008683203719556332,
-0.13735756278038025,
-0.12694856524467468,
0.07051564753055573,
0.02599603496491909,
0.07184550911188126,
-0.009037266485393047,
-0.020921804010868073,
-0.16231858730316162,
0.11507956683635712,
-0.12429145723581314,
-0.04443388432264328,
-0.13868415355682373,
-0.003226670902222395,
0.1791359782218933,
-0.09381630271673203,
0.06557381898164749,
0.012086236849427223,
0.1629200428724289,
-0.043799445033073425,
-0.07940961420536041,
0.04574441537261009,
-0.06988824158906937,
-0.121561199426651,
-0.016332879662513733,
0.17042893171310425,
0.11265174299478531,
0.040703482925891876,
0.06930973380804062,
0.0330594964325428,
0.015776552259922028,
-0.09241887927055359,
0.042465005069971085,
0.06652795523405075,
-0.03906276449561119,
0.03088201768696308,
0.02006421610713005,
-0.20997479557991028,
-0.11248825490474701,
-0.017398111522197723,
0.1626702845096588,
0.14150556921958923,
-0.11734535545110703,
0.16973209381103516,
0.19300426542758942,
-0.065482497215271,
-0.18371547758579254,
-0.1291625201702118,
0.13687285780906677,
0.05425921455025673,
-0.04183764010667801,
-0.19087763130664825,
0.05665619298815727,
0.029768001288175583,
-0.011657539755105972,
0.025062374770641327,
-0.2333894819021225,
-0.11192923784255981,
0.12225981801748276,
-0.06577637791633606,
-0.1461114138364792,
-0.0680619403719902,
-0.1143694594502449,
-0.09535183757543564,
-0.11005589365959167,
0.09358576685190201,
-0.16029736399650574,
0.039371784776449203,
0.08912916481494904,
0.020767860114574432,
0.022258592769503593,
0.018695566803216934,
0.11795423179864883,
0.05846076086163521,
-0.027067173272371292,
-0.038895104080438614,
0.05904867872595787,
-0.05712060257792473,
-0.019393442198634148,
0.08894205093383789,
-0.002841721288859844,
0.007281746249645948,
-0.03964195027947426,
-0.05430784076452255,
-0.07719284296035767,
0.07166171818971634,
-0.023132599890232086,
-0.002333150478079915,
-0.025622094050049782,
0.0018093526596203446,
0.07527793198823929,
-0.013654490932822227,
-0.07536861300468445,
-0.09773565083742142,
0.02627638168632984,
0.21212764084339142,
0.11215056478977203,
0.07950706779956818,
-0.10621654242277145,
-0.03819631412625313,
-0.0412275455892086,
0.010904786176979542,
0.007459255401045084,
0.07589256763458252,
0.10017183423042297,
-0.009797935374081135,
0.14253154397010803,
-0.036741454154253006,
-0.12806881964206696,
0.027611155062913895,
0.04903588443994522,
-0.0839383527636528,
-0.1677360087633133,
-0.007634968031197786,
-0.008528363890945911,
-0.005408747121691704,
-0.054359693080186844,
0.16462458670139313,
0.04065052419900894,
-0.05041603371500969,
-0.012227291241288185,
0.04608935862779617,
-0.04675326496362686,
0.12971101701259613,
0.06344980746507645,
0.08149338513612747,
-0.10272597521543503,
0.046618398278951645,
0.0945863127708435,
-0.07112350314855576,
0.0381082184612751,
0.10025075823068619,
-0.08657752722501755,
-0.07700595259666443,
-0.022788668051362038,
0.06282274425029755,
0.01160581037402153,
-0.05045244097709656,
0.03597966209053993,
-0.09499146044254303,
0.04864035174250603,
0.2088025063276291,
0.011983867734670639,
0.026498494669795036,
0.03182055801153183,
-0.04312688112258911,
-0.0015836559468880296,
0.12879478931427002,
0.0647955983877182,
-0.02368178777396679,
-0.08446381986141205,
0.10917837172746658,
0.0056318072602152824,
0.06078227236866951,
-0.01893429271876812,
-0.040540535002946854,
-0.07087704539299011,
0.021458471193909645,
-0.14951658248901367,
0.07678020745515823,
-0.06381462514400482,
-0.012893288396298885,
0.018898438662290573,
-0.042891260236501694,
0.015558361075818539,
0.010276297107338905,
-0.10521211475133896,
-0.037688467651605606,
-0.035460252314805984,
0.06908994913101196,
-0.1885187178850174,
-0.012426599860191345,
0.05601159483194351,
-0.05087340250611305,
0.10089953243732452,
0.05867232382297516,
-0.028002269566059113,
0.04762629419565201,
-0.16546416282653809,
-0.10405085235834122,
0.025044210255146027,
0.05910160765051842,
0.03815142437815666,
-0.14146843552589417,
0.023478444665670395,
0.013979237526655197,
-0.021265609189867973,
-0.020250167697668076,
0.035586584359407425,
-0.07942698150873184,
0.011894104070961475,
-0.08173272758722305,
-0.0121997632086277,
-0.03141408786177635,
0.0481112003326416,
0.07531700283288956,
0.0541442334651947,
0.07071402668952942,
-0.09896634519100189,
0.11809872090816498,
-0.09033021330833435,
-0.012586868368089199,
-0.019727639853954315,
0.0027691388968378305,
0.012493664398789406,
-0.061404310166835785,
0.09095948934555054,
-0.026988785713911057,
0.08998743444681168,
-0.011707795783877373,
0.0655183419585228,
-0.014663576148450375,
-0.13793420791625977,
-0.11619813740253448,
0.0838618203997612,
0.06500933319330215,
0.06267360597848892,
0.0006748916348442435,
0.0020080292597413063,
-0.06945609301328659,
0.002731734188273549,
0.07139138132333755,
0.022907983511686325,
0.12380409985780716,
0.09688226878643036,
-0.011740291491150856,
0.11796250194311142,
-0.13339176774024963,
-0.01938559301197529,
0.03120007924735546,
-0.17955809831619263,
0.03971438482403755,
-0.07691539078950882,
0.10280262678861618,
0.05384001508355141,
-0.13206186890602112,
0.08555236458778381,
0.020891554653644562,
-0.07348185032606125,
-0.11770962178707123,
-0.11172500997781754,
-0.08954464644193649,
-0.046927183866500854,
0.01369471475481987,
-0.08229319751262665,
0.0639941468834877,
-0.010926347225904465,
0.06938308477401733,
0.006181546486914158,
0.08889853209257126,
-0.02655763179063797,
-0.12922534346580505,
0.11568127572536469,
-0.004695916548371315,
0.028735237196087837,
0.048418477177619934,
0.036885324865579605,
0.09173095971345901,
0.061699964106082916,
0.0535917729139328,
0.053333111107349396,
-0.027141382917761803,
0.03466197848320007,
-0.07906488329172134,
-0.07927064597606659,
0.020508047193288803,
-0.021280847489833832,
0.011258482001721859,
0.13480569422245026,
0.09373839199542999,
-0.03871778026223183,
0.0015893972013145685,
0.08549042046070099,
-0.04076491668820381,
-0.15903425216674805,
-0.17029030621051788,
0.021555805578827858,
-0.019591182470321655,
0.0985882505774498,
-0.005312853958457708,
-0.10523992031812668,
-0.041507571935653687,
0.16597285866737366,
0.1356743574142456,
-0.02269812487065792,
0.058377571403980255,
0.05884901434183121,
0.02512735314667225,
0.03190840408205986,
-0.0072613866068422794,
0.02835586853325367,
0.25190719962120056,
-0.0242657158523798,
-0.010096183978021145,
-0.02560298517346382,
-0.10188516974449158,
-0.034595511853694916,
0.03404804691672325,
-0.07890323549509048,
-0.022285837680101395,
-0.022577587515115738,
0.14435380697250366,
-0.11170365661382675,
-0.18273264169692993,
-0.0025450584944337606,
-0.024072328582406044,
-0.08674009144306183,
-0.0117606520652771,
0.06788240373134613,
0.09716324508190155,
0.015554140321910381,
0.01356769073754549,
-0.08080285787582397,
0.22036093473434448,
-0.007961412891745567,
-0.05961258336901665,
0.022812863811850548,
-0.010627992451190948,
-0.12124242633581161,
0.08436606079339981,
-0.0155487060546875,
0.13088873028755188,
0.062028661370277405,
0.029894838109612465,
-0.06509646028280258,
0.07538938522338867,
0.07336952537298203,
-0.07791104912757874,
0.06376146525144577,
0.15557658672332764,
-0.0029067411087453365,
0.07178624719381332,
0.09371589124202728,
-0.09043026715517044,
0.05479135736823082,
0.06169970706105232,
-0.0017042455729097128,
-0.05822061747312546,
0.057473719120025635,
-0.0957210436463356,
0.10056666284799576,
0.11726494133472443,
-0.023999247699975967,
0.015835043042898178,
-0.041864342987537384,
0.02761087566614151,
-0.013677476905286312,
0.06451508402824402,
-0.02585371397435665,
-0.16144078969955444,
0.02122911997139454,
-0.009076851420104504,
0.09167255461215973,
-0.14880776405334473,
-0.026083460077643394,
0.031216179952025414,
-0.0055666835978627205,
0.013155238702893257,
0.11213155835866928,
0.025404080748558044,
0.001806700020097196,
-0.02144980989396572,
-0.02387888915836811,
0.03294305503368378,
0.09541000425815582,
-0.111225925385952,
-0.02184409648180008
] |
null | null |
transformers
|
# Wav2Vec2-XLS-R-2B-22-16 (XLS-R-Any-to-Any)
Facebook's Wav2Vec2 XLS-R fine-tuned for **Speech Translation.**

This is a [SpeechEncoderDecoderModel](https://huggingface.co/transformers/model_doc/speechencoderdecoder.html) model.
The encoder was warm-started from the [**`facebook/wav2vec2-xls-r-2b`**](https://huggingface.co/facebook/wav2vec2-xls-r-2b) checkpoint and
the decoder from the [**`facebook/mbart-large-50`**](https://huggingface.co/facebook/mbart-large-50) checkpoint.
Consequently, the encoder-decoder model was fine-tuned on `{input_lang}` -> `{output_lang}` translation pairs
of the [Covost2 dataset](https://huggingface.co/datasets/covost2).
The model can translate from the following spoken languages `{input_lang}` to the following written languages `{output_lang}`:
`{input_lang}` -> `{output_lang}`
with `{input_lang}` one of:
{`en`, `fr`, `de`, `es`, `ca`, `it`, `ru`, `zh-CN`, `pt`, `fa`, `et`, `mn`, `nl`, `tr`, `ar`, `sv-SE`, `lv`, `sl`, `ta`, `ja`, `id`, `cy`}
and `{output_lang}`:
{`en`, `de`, `tr`, `fa`, `sv-SE`, `mn`, `zh-CN`, `cy`, `ca`, `sl`, `et`, `id`, `ar`, `ta`, `lv`, `ja`}
## Usage
### Demo
The model can be tested on [**this space**](https://huggingface.co/spaces/facebook/XLS-R-2B-22-16).
You can select the target language, record some audio in any of the above mentioned input languages,
and then sit back and see how well the checkpoint can translate the input.
### Example
As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the
transcripts by passing the speech features to the model.
You can use the model directly via the ASR pipeline. By default, the checkpoint will
translate spoken English to written German. To change the written target language,
you need to pass the correct `forced_bos_token_id` to `generate(...)` to condition
the decoder on the correct target language.
To select the correct `forced_bos_token_id` for your chosen language id, please make use
of the following mapping:
```python
MAPPING = {
"en": 250004,
"de": 250003,
"tr": 250023,
"fa": 250029,
"sv": 250042,
"mn": 250037,
"zh": 250025,
"cy": 250007,
"ca": 250005,
"sl": 250052,
"et": 250006,
"id": 250032,
"ar": 250001,
"ta": 250044,
"lv": 250017,
"ja": 250012,
}
```
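These ids appear to correspond to language-code tokens of the mBART-50 decoder vocabulary; as a cross-check, an id can also be looked up from the tokenizer rather than hard-coded (a sketch, assuming the `MBart50Tokenizer` class and its `lang_code_to_id` mapping):

```python
from transformers import MBart50Tokenizer

tokenizer = MBart50Tokenizer.from_pretrained("facebook/mbart-large-50")
# e.g. the Swedish language-code token; expected to match MAPPING["sv"] above
print(tokenizer.lang_code_to_id["sv_SE"])
```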
As an example, if you would like to translate to Swedish, you can do the following:
```python
from datasets import load_dataset
from transformers import pipeline
# select correct `forced_bos_token_id`
forced_bos_token_id = MAPPING["sv"]
# replace following lines to load an audio file of your choice
librispeech_en = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
audio_file = librispeech_en[0]["file"]
asr = pipeline("automatic-speech-recognition", model="facebook/wav2vec2-xls-r-2b-22-to-16", feature_extractor="facebook/wav2vec2-xls-r-2b-22-to-16")
translation = asr(audio_file, forced_bos_token_id=forced_bos_token_id)
```
or step-by-step as follows:
```python
import torch
from transformers import Speech2Text2Processor, SpeechEncoderDecoderModel
from datasets import load_dataset
model = SpeechEncoderDecoderModel.from_pretrained("facebook/wav2vec2-xls-r-2b-22-to-16")
processor = Speech2Text2Processor.from_pretrained("facebook/wav2vec2-xls-r-2b-22-to-16")
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
# select correct `forced_bos_token_id`
forced_bos_token_id = MAPPING["sv"]
# the sampling rate is stored directly under the "audio" field
inputs = processor(ds[0]["audio"]["array"], sampling_rate=ds[0]["audio"]["sampling_rate"], return_tensors="pt")
generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"], forced_bos_token_id=forced_bos_token_id)
transcription = processor.batch_decode(generated_ids)
```
## More XLS-R models for Speech Translation
- [Wav2Vec2-XLS-R-300M-EN-15](https://huggingface.co/facebook/wav2vec2-xls-r-300m-en-to-15)
- [Wav2Vec2-XLS-R-1B-EN-15](https://huggingface.co/facebook/wav2vec2-xls-r-1b-en-to-15)
- [Wav2Vec2-XLS-R-2B-EN-15](https://huggingface.co/facebook/wav2vec2-xls-r-2b-en-to-15)
- [Wav2Vec2-XLS-R-2B-22-16](https://huggingface.co/facebook/wav2vec2-xls-r-2b-22-to-16)
|
{"language": ["multilingual", "fr", "de", "es", "ca", "it", "ru", "zh", "pt", "fa", "et", "mn", "nl", "tr", "ar", "sv", "lv", "sl", "ta", "ja", "id", "cy", "en"], "license": "apache-2.0", "tags": ["speech", "xls_r", "automatic-speech-recognition", "xls_r_translation"], "datasets": ["common_voice", "multilingual_librispeech", "covost2"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Swedish", "src": "https://cdn-media.huggingface.co/speech_samples/cv_swedish_1.mp3"}, {"example_title": "Arabic", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_ar_19058308.mp3"}, {"example_title": "Russian", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_ru_18849022.mp3"}, {"example_title": "German", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_de_17284683.mp3"}, {"example_title": "French", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_fr_17299386.mp3"}, {"example_title": "Indonesian", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_id_19051309.mp3"}, {"example_title": "Italian", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_it_17415776.mp3"}, {"example_title": "Japanese", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_ja_19482488.mp3"}, {"example_title": "Mongolian", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_mn_18565396.mp3"}, {"example_title": "Dutch", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_nl_17691471.mp3"}, {"example_title": "Russian", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_ru_18849022.mp3"}, {"example_title": "Turkish", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_tr_17341280.mp3"}, {"example_title": "Catalan", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_ca_17367522.mp3"}, {"example_title": "English", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_en_18301577.mp3"}, {"example_title": "Dutch", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_nl_17691471.mp3"}]}
|
automatic-speech-recognition
|
facebook/wav2vec2-xls-r-2b-22-to-16
|
[
"transformers",
"pytorch",
"speech-encoder-decoder",
"automatic-speech-recognition",
"speech",
"xls_r",
"xls_r_translation",
"multilingual",
"fr",
"de",
"es",
"ca",
"it",
"ru",
"zh",
"pt",
"fa",
"et",
"mn",
"nl",
"tr",
"ar",
"sv",
"lv",
"sl",
"ta",
"ja",
"id",
"cy",
"en",
"dataset:common_voice",
"dataset:multilingual_librispeech",
"dataset:covost2",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"multilingual",
"fr",
"de",
"es",
"ca",
"it",
"ru",
"zh",
"pt",
"fa",
"et",
"mn",
"nl",
"tr",
"ar",
"sv",
"lv",
"sl",
"ta",
"ja",
"id",
"cy",
"en"
] |
TAGS
#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #fr #de #es #ca #it #ru #zh #pt #fa #et #mn #nl #tr #ar #sv #lv #sl #ta #ja #id #cy #en #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-XLS-R-2B-22-16 (XLS-R-Any-to-Any)
Facebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.
!model image
This is a SpeechEncoderDecoderModel model.
The encoder was warm-started from the 'facebook/wav2vec2-xls-r-2b' checkpoint and
the decoder from the 'facebook/mbart-large-50' checkpoint.
Consequently, the encoder-decoder model was fine-tuned on '{input_lang}' -> '{output_lang}' translation pairs
of the Covost2 dataset.
The model can translate from the following spoken languages '{input_lang}' to the following written languages '{output_lang}':
'{input_lang}' -> '{output_lang}'
with '{input_lang}' one of:
{'en', 'fr', 'de', 'es', 'ca', 'it', 'ru', 'zh-CN', 'pt', 'fa', 'et', 'mn', 'nl', 'tr', 'ar', 'sv-SE', 'lv', 'sl', 'ta', 'ja', 'id', 'cy'}
and '{output_lang}':
{'en', 'de', 'tr', 'fa', 'sv-SE', 'mn', 'zh-CN', 'cy', 'ca', 'sl', 'et', 'id', 'ar', 'ta', 'lv', 'ja'}
## Usage
### Demo
The model can be tested on this space.
You can select the target language, record some audio in any of the above mentioned input languages,
and then sit back and see how well the checkpoint can translate the input.
### Example
As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the
transcripts by passing the speech features to the model.
You can use the model directly via the ASR pipeline. By default, the checkpoint will
translate spoken English to written German. To change the written target language,
you need to pass the correct 'forced_bos_token_id' to 'generate(...)' to condition
the decoder on the correct target language.
To select the correct 'forced_bos_token_id' given your chosen language id, please make use
of the following mapping:
As an example, if you would like to translate to Swedish, you can do the following:
or step-by-step as follows:
## More XLS-R models for '{lang}' -> 'en' Speech Translation
- Wav2Vec2-XLS-R-300M-EN-15
- Wav2Vec2-XLS-R-1B-EN-15
- Wav2Vec2-XLS-R-2B-EN-15
- Wav2Vec2-XLS-R-2B-22-16
|
[
"# Wav2Vec2-XLS-R-2B-22-16 (XLS-R-Any-to-Any)\n\nFacebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.\n\n!model image\n\nThis is a SpeechEncoderDecoderModel model. \nThe encoder was warm-started from the 'facebook/wav2vec2-xls-r-2b' checkpoint and\nthe decoder from the 'facebook/mbart-large-50' checkpoint.\nConsequently, the encoder-decoder model was fine-tuned on '{input_lang}' -> '{output_lang}' translation pairs\nof the Covost2 dataset.\n\nThe model can translate from the following spoken languages '{input_lang}' to the following written languages '{output_lang}':\n\n'{input_lang}' -> '{output_lang}'\n\nwith '{input_lang}' one of:\n\n{'en', 'fr', 'de', 'es', 'ca', 'it', 'ru', 'zh-CN', 'pt', 'fa', 'et', 'mn', 'nl', 'tr', 'ar', 'sv-SE', 'lv', 'sl', 'ta', 'ja', 'id', 'cy'}\n\nand '{output_lang}':\n\n{'en', 'de', 'tr', 'fa', 'sv-SE', 'mn', 'zh-CN', 'cy', 'ca', 'sl', 'et', 'id', 'ar', 'ta', 'lv', 'ja'}",
"## Usage",
"### Demo\n\nThe model can be tested on this space. \nYou can select the target language, record some audio in any of the above mentioned input languages, \nand then sit back and see how well the checkpoint can translate the input.",
"### Example \n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline. By default, the checkpoint will \ntranslate spoken English to written German. To change the written target language, \nyou need to pass the correct 'forced_bos_token_id' to 'generate(...)' to condition \nthe decoder on the correct target language. \n\nTo select the correct 'forced_bos_token_id' given your choosen language id, please make use \nof the following mapping:\n\n\n\nAs an example, if you would like to translate to Swedish, you can do the following:\n\n\n\nor step-by-step as follows:",
"## More XLS-R models for '{lang}' -> 'en' Speech Translation\n\n- Wav2Vec2-XLS-R-300M-EN-15\n- Wav2Vec2-XLS-R-1B-EN-15\n- Wav2Vec2-XLS-R-2B-EN-15\n- Wav2Vec2-XLS-R-2B-22-16"
] |
[
"TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #fr #de #es #ca #it #ru #zh #pt #fa #et #mn #nl #tr #ar #sv #lv #sl #ta #ja #id #cy #en #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-XLS-R-2B-22-16 (XLS-R-Any-to-Any)\n\nFacebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.\n\n!model image\n\nThis is a SpeechEncoderDecoderModel model. \nThe encoder was warm-started from the 'facebook/wav2vec2-xls-r-2b' checkpoint and\nthe decoder from the 'facebook/mbart-large-50' checkpoint.\nConsequently, the encoder-decoder model was fine-tuned on '{input_lang}' -> '{output_lang}' translation pairs\nof the Covost2 dataset.\n\nThe model can translate from the following spoken languages '{input_lang}' to the following written languages '{output_lang}':\n\n'{input_lang}' -> '{output_lang}'\n\nwith '{input_lang}' one of:\n\n{'en', 'fr', 'de', 'es', 'ca', 'it', 'ru', 'zh-CN', 'pt', 'fa', 'et', 'mn', 'nl', 'tr', 'ar', 'sv-SE', 'lv', 'sl', 'ta', 'ja', 'id', 'cy'}\n\nand '{output_lang}':\n\n{'en', 'de', 'tr', 'fa', 'sv-SE', 'mn', 'zh-CN', 'cy', 'ca', 'sl', 'et', 'id', 'ar', 'ta', 'lv', 'ja'}",
"## Usage",
"### Demo\n\nThe model can be tested on this space. \nYou can select the target language, record some audio in any of the above mentioned input languages, \nand then sit back and see how well the checkpoint can translate the input.",
"### Example \n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline. By default, the checkpoint will \ntranslate spoken English to written German. To change the written target language, \nyou need to pass the correct 'forced_bos_token_id' to 'generate(...)' to condition \nthe decoder on the correct target language. \n\nTo select the correct 'forced_bos_token_id' given your choosen language id, please make use \nof the following mapping:\n\n\n\nAs an example, if you would like to translate to Swedish, you can do the following:\n\n\n\nor step-by-step as follows:",
"## More XLS-R models for '{lang}' -> 'en' Speech Translation\n\n- Wav2Vec2-XLS-R-300M-EN-15\n- Wav2Vec2-XLS-R-1B-EN-15\n- Wav2Vec2-XLS-R-2B-EN-15\n- Wav2Vec2-XLS-R-2B-22-16"
] |
[
143,
397,
3,
50,
176,
82
] |
[
"passage: TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #fr #de #es #ca #it #ru #zh #pt #fa #et #mn #nl #tr #ar #sv #lv #sl #ta #ja #id #cy #en #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n"
] |
[
-0.088396817445755,
0.024025822058320045,
-0.008195031434297562,
0.009924937970936298,
0.07764125615358353,
-0.007152137346565723,
0.02983078360557556,
0.1245976909995079,
0.08163280040025711,
0.040878113359212875,
0.06919390708208084,
0.11401806771755219,
0.03546397387981415,
0.05465773493051529,
-0.03653685748577118,
-0.23954276740550995,
0.055636439472436905,
-0.015185371041297913,
0.0563824400305748,
0.08907964825630188,
0.093874491751194,
-0.04597529023885727,
0.056464310735464096,
-0.026378512382507324,
-0.005151687655597925,
0.03592477738857269,
0.016118120402097702,
-0.09971863776445389,
0.0999029129743576,
0.07767703384160995,
-0.0022778413258492947,
0.0520453006029129,
0.03902453929185867,
-0.23792386054992676,
0.0213876124471426,
-0.022330112755298615,
-0.03861574828624725,
-0.021906252950429916,
-0.01864326000213623,
-0.0873778685927391,
0.1675214022397995,
-0.011237068101763725,
-0.0620989128947258,
0.07157447934150696,
-0.07675929367542267,
-0.24526850879192352,
-0.030659377574920654,
0.029174083843827248,
-0.057100508362054825,
0.06436258554458618,
-0.06850474327802658,
0.09652118384838104,
-0.09027329832315445,
0.0976896584033966,
0.13604050874710083,
-0.31391072273254395,
-0.006432413123548031,
0.009491697885096073,
0.09911561012268066,
0.09583298116922379,
-0.047913193702697754,
0.09712795168161392,
0.03424149379134178,
0.02426985651254654,
-0.05874878540635109,
-0.08907817304134369,
-0.053868018090724945,
0.017286403104662895,
-0.11702890694141388,
-0.011893607676029205,
0.22793231904506683,
-0.02353544346988201,
0.03222668543457985,
-0.044828690588474274,
0.006103006191551685,
-0.056680187582969666,
-0.039211172610521317,
-0.011456641368567944,
-0.01500712614506483,
0.03872941434383392,
0.020889922976493835,
-0.05959736555814743,
-0.09274255484342575,
-0.01779683120548725,
-0.1388489007949829,
0.2034989297389984,
0.0506679005920887,
0.007043931167572737,
-0.026807954534888268,
0.0009283893741667271,
-0.01192496158182621,
-0.08010761439800262,
-0.02397383563220501,
-0.025245962664484978,
-0.0103384330868721,
0.10094788670539856,
-0.011306412518024445,
0.0009116917499341071,
0.15322057902812958,
0.050647057592868805,
-0.058547623455524445,
0.03952816128730774,
-0.06832998991012573,
0.09653164446353912,
-0.003914169501513243,
0.1282522827386856,
-0.08669662475585938,
-0.06821954250335693,
-0.03536170348525047,
-0.039418160915374756,
0.037695806473493576,
-0.029407169669866562,
-0.1356537789106369,
-0.028757546097040176,
-0.013963371515274048,
0.11412303894758224,
0.019407834857702255,
0.08124465495347977,
-0.019721893593668938,
0.03636814281344414,
-0.04912380874156952,
-0.0774751827120781,
0.03643369674682617,
0.06819324940443039,
0.016863541677594185,
0.13375277817249298,
-0.007469030562788248,
0.010621411725878716,
-0.10060290992259979,
-0.0036666416563093662,
0.012911614961922169,
0.0742332711815834,
0.012638174928724766,
-0.10437200218439102,
0.0253752414137125,
-0.09894473850727081,
-0.006803784053772688,
-0.15008951723575592,
0.003067746991291642,
-0.029173102229833603,
-0.07899843901395798,
-0.04712507873773575,
0.011976308189332485,
-0.11671163886785507,
-0.07460937649011612,
0.040279023349285126,
-0.07059375196695328,
-0.06118498742580414,
-0.08448641002178192,
0.09688276797533035,
-0.01696615107357502,
0.11643971502780914,
-0.11773445457220078,
0.0659245103597641,
-0.029324378818273544,
-0.0200869832187891,
-0.03353105112910271,
0.13507325947284698,
-0.10914164036512375,
-0.008691957220435143,
-0.059156034141778946,
-0.07214438170194626,
-0.06182383373379707,
0.10547156631946564,
-0.03732168301939964,
0.13900603353977203,
-0.22740235924720764,
-0.08844640105962753,
0.23090478777885437,
-0.08438631892204285,
-0.021349526941776276,
0.1664469987154007,
0.051303911954164505,
-0.05709722265601158,
0.06813804805278778,
0.3030608594417572,
-0.009821181185543537,
-0.1276656687259674,
-0.029713844880461693,
0.10311183333396912,
-0.016188576817512512,
-0.002349666552618146,
0.10244861245155334,
-0.038948751986026764,
0.012288873083889484,
0.04268261790275574,
-0.000011204963811906055,
0.05477046221494675,
-0.04436134919524193,
-0.05983202904462814,
0.00710611417889595,
-0.05374012887477875,
0.07865238934755325,
0.0026042924728244543,
-0.00746374623849988,
-0.07444537431001663,
-0.06447015702724457,
-0.06607742607593536,
0.08449456095695496,
-0.054009757936000824,
0.088021419942379,
-0.14540790021419525,
0.10532813519239426,
0.057425666600465775,
0.035498715937137604,
-0.14338113367557526,
0.1326124221086502,
-0.03131479024887085,
0.11272010952234268,
0.13111548125743866,
0.09282995760440826,
0.03891712799668312,
-0.046529147773981094,
-0.03870456665754318,
0.004885233007371426,
0.10748203098773956,
0.026378070935606956,
-0.002430272987112403,
-0.1538434773683548,
0.07821627706289291,
-0.08606278896331787,
0.025263922289013863,
-0.07998840510845184,
-0.003895449684932828,
0.09115754812955856,
0.0972435250878334,
-0.018897883594036102,
0.048728104680776596,
-0.022020816802978516,
0.08113426715135574,
-0.021605798974633217,
0.029388677328824997,
0.04827965050935745,
-0.036064211279153824,
-0.09240688383579254,
0.2138536125421524,
-0.10710079222917557,
0.1951080560684204,
0.18463821709156036,
-0.12185057997703552,
0.025823350995779037,
0.060734301805496216,
0.01803644932806492,
0.027817508205771446,
0.08522161841392517,
-0.070248082280159,
0.16046403348445892,
-0.029229484498500824,
0.11862457543611526,
-0.06188606843352318,
0.015010682865977287,
0.008646686561405659,
-0.04739993065595627,
-0.06848739832639694,
0.15849964320659637,
0.029787922278046608,
-0.0647357925772667,
0.1463788002729416,
0.1343315690755844,
0.0036876348312944174,
0.19798244535923004,
-0.022096851840615273,
-0.014905494637787342,
-0.004185727797448635,
0.0011204793117940426,
-0.03748312592506409,
0.0562184639275074,
-0.19506292045116425,
-0.07391706854104996,
0.024386350065469742,
0.005128361750394106,
0.06614935398101807,
-0.11238012462854385,
-0.02798963338136673,
-0.031759340316057205,
-0.10021162778139114,
-0.03272876515984535,
0.07941124588251114,
-0.0030187307856976986,
0.10999823361635208,
-0.06579633057117462,
-0.13321822881698608,
-0.01520082913339138,
-0.04138937219977379,
-0.06569528579711914,
0.11623469740152359,
-0.19993869960308075,
-0.31777095794677734,
-0.0715126320719719,
-0.0862419456243515,
-0.028421444818377495,
0.021497080102562904,
0.12175850570201874,
-0.1347496062517166,
-0.013955888338387012,
-0.03993329033255577,
0.10017400979995728,
-0.10759120434522629,
-0.024866675958037376,
-0.08228780329227448,
-0.008833426050841808,
-0.04790498688817024,
-0.11224779486656189,
-0.034878384321928024,
-0.03431607410311699,
-0.009470014832913876,
0.07636688649654388,
-0.06604528427124023,
0.08471869677305222,
0.15908488631248474,
0.06012938544154167,
0.035694219172000885,
-0.06704498082399368,
0.07743953913450241,
-0.09884548932313919,
-0.07667989283800125,
0.09248682856559753,
-0.005942463409155607,
0.013919173739850521,
0.15169930458068848,
0.01823403127491474,
-0.042158499360084534,
-0.03003832697868347,
-0.03397173807024956,
-0.0526602603495121,
-0.1947859674692154,
-0.16618478298187256,
-0.0978350043296814,
0.05452284589409828,
-0.08221489191055298,
0.062180083245038986,
0.02366287261247635,
-0.06458134204149246,
-0.007038713898509741,
-0.10235206037759781,
0.008086069487035275,
0.006449392531067133,
0.28563445806503296,
-0.09563732147216797,
0.11313100904226303,
-0.06668615341186523,
-0.06939425319433212,
0.09278644621372223,
0.07288344204425812,
0.03188313543796539,
0.068454809486866,
0.08634688705205917,
0.049331165850162506,
0.1388475000858307,
0.07206544280052185,
-0.011043989099562168,
0.040878474712371826,
0.010460109449923038,
-0.008931954391300678,
-0.054994843900203705,
-0.023168770596385002,
0.02667047828435898,
0.25690707564353943,
-0.08712691813707352,
-0.02591194212436676,
-0.07351472973823547,
0.11112440377473831,
0.061410728842020035,
0.052804719656705856,
-0.09835570305585861,
0.02118074707686901,
0.04684257134795189,
-0.003081441158428788,
-0.04082112014293671,
0.11755133420228958,
0.14610125124454498,
-0.05666667968034744,
0.08544185012578964,
0.06411775201559067,
0.04785337299108505,
-0.0816589891910553,
0.0842452421784401,
-0.08216694742441177,
-0.043180081993341446,
0.028864435851573944,
0.050624292343854904,
-0.29373985528945923,
0.23052771389484406,
0.017626045271754265,
-0.021997107192873955,
-0.030816782265901566,
-0.01987459696829319,
0.04665824770927429,
0.12411453574895859,
0.1693301796913147,
0.014448185451328754,
-0.08828993141651154,
-0.13941384851932526,
-0.02137727662920952,
0.020956123247742653,
0.13800324499607086,
0.05654832720756531,
-0.007232370786368847,
-0.01289288979023695,
-0.03392506763339043,
-0.016007492318749428,
-0.013070158660411835,
-0.08274199068546295,
-0.13839761912822723,
0.05849273130297661,
0.18050619959831238,
0.010729621164500713,
-0.0001668126496952027,
-0.08006532490253448,
-0.1867859959602356,
0.05939510092139244,
-0.15033870935440063,
-0.018888331949710846,
-0.07425132393836975,
-0.09456516802310944,
0.12053357064723969,
-0.08387532085180283,
0.0003081752802245319,
0.010216615162789822,
-0.045187704265117645,
-0.06883493065834045,
-0.04803645610809326,
0.1049083024263382,
-0.07161353528499603,
-0.04953649267554283,
0.007018435746431351,
0.20309282839298248,
0.0013041014317423105,
0.09994885325431824,
0.023705262690782547,
0.020218390971422195,
-0.02710947021842003,
-0.07770837098360062,
0.07097644358873367,
0.07402065396308899,
-0.04623318836092949,
0.11134020239114761,
-0.04822342470288277,
-0.1989050656557083,
-0.07458996772766113,
-0.0373302660882473,
0.24015961587429047,
0.17282170057296753,
-0.08930090069770813,
0.14846190810203552,
0.16722814738750458,
-0.07025478035211563,
-0.3139357268810272,
-0.1474905014038086,
-0.09788456559181213,
0.04508572816848755,
-0.05575866997241974,
-0.1546456664800644,
-0.04216568544507027,
-0.04267919808626175,
-0.025698602199554443,
0.045872099697589874,
-0.25990524888038635,
-0.08376867324113846,
0.1966756284236908,
-0.064975805580616,
0.16084879636764526,
-0.12359830737113953,
-0.10154608637094498,
-0.03276240825653076,
-0.01845027692615986,
0.030813738703727722,
-0.06653366982936859,
0.1192825436592102,
0.01782604120671749,
0.029537087306380272,
0.040077872574329376,
0.02467464841902256,
0.13574430346488953,
0.04844177886843681,
-0.04637465998530388,
-0.06479614973068237,
-0.08823254704475403,
0.005625919904559851,
0.04824841767549515,
0.04964408278465271,
-0.15935781598091125,
-0.013131058774888515,
-0.10387463122606277,
-0.026498863473534584,
-0.08820813149213791,
0.06709982454776764,
0.020801259204745293,
-0.020793383941054344,
-0.06662115454673767,
-0.011972969397902489,
-0.006366388406604528,
0.01432086806744337,
0.15121741592884064,
-0.13594210147857666,
0.0727541446685791,
0.1633070409297943,
0.16165396571159363,
-0.038734447211027145,
0.049073271453380585,
-0.04124779999256134,
-0.0540422759950161,
0.053527552634477615,
-0.035410452634096146,
-0.001225121202878654,
0.1166464239358902,
-0.006919504143297672,
0.08129196614027023,
0.03131309524178505,
-0.07304240018129349,
0.06899851560592651,
0.08166766166687012,
-0.06535845994949341,
-0.1931646466255188,
-0.005248780362308025,
-0.08172228187322617,
0.11147355288267136,
0.09431985765695572,
0.18188278377056122,
-0.025483978912234306,
0.0019162449752911925,
-0.04840774089097977,
0.018702447414398193,
-0.06824277341365814,
0.14132964611053467,
0.08112998306751251,
0.01478505413979292,
-0.14447356760501862,
0.05786510556936264,
-0.009590932168066502,
-0.09930621087551117,
0.018941635265946388,
0.1058453768491745,
-0.06887850165367126,
-0.13711893558502197,
-0.12723863124847412,
0.010869914665818214,
-0.02443573996424675,
-0.09859419614076614,
-0.023517517372965813,
-0.16913308203220367,
0.06365341693162918,
0.1892535537481308,
0.042428646236658096,
0.03943372517824173,
-0.07132229208946228,
-0.05048278719186783,
0.06296375393867493,
0.036869462579488754,
0.012990090996026993,
-0.030375685542821884,
-0.07541783154010773,
0.08087608218193054,
0.0029306390788406134,
0.1409943699836731,
-0.0588531494140625,
-0.06454252451658249,
-0.11713848263025284,
0.05133233591914177,
-0.1173081025481224,
0.00003772639320231974,
-0.07194382697343826,
-0.007554948795586824,
0.03154435753822327,
-0.13212601840496063,
-0.041157130151987076,
-0.006173362955451012,
-0.10958176851272583,
0.0020658287685364485,
-0.0027791799511760473,
0.11852552741765976,
-0.10758079588413239,
-0.01786252297461033,
0.03298978880047798,
-0.007156712468713522,
0.14303843677043915,
0.13656173646450043,
-0.10573616623878479,
0.14108514785766602,
-0.1582714021205902,
-0.08507490903139114,
0.11541596055030823,
0.059876345098018646,
0.019683584570884705,
0.04457917436957359,
-0.003825179999694228,
0.07610203325748444,
0.03535599261522293,
0.039662353694438934,
0.0815567597746849,
-0.0583772286772728,
0.0947122573852539,
-0.11851900815963745,
-0.07686873525381088,
0.008702294901013374,
0.02525828406214714,
0.10610472410917282,
0.06730563193559647,
0.09397385269403458,
-0.08519929647445679,
0.02467835694551468,
-0.029940547421574593,
0.045894794166088104,
-0.04288803040981293,
-0.12281680107116699,
-0.054848942905664444,
-0.056171003729104996,
0.07102077454328537,
0.011101781390607357,
0.18706247210502625,
0.020694971084594727,
-0.02482507936656475,
-0.007946680299937725,
-0.005141273140907288,
-0.07450226694345474,
0.026665933430194855,
0.1470624804496765,
0.06319846212863922,
-0.03306559845805168,
-0.07634677737951279,
-0.022874435409903526,
0.03776121139526367,
0.07911975681781769,
0.0036724198143929243,
0.13873770833015442,
0.18523137271404266,
0.10875219106674194,
0.07984056323766708,
-0.04700064659118652,
-0.035019055008888245,
-0.03415519744157791,
-0.04148178547620773,
0.0059705558232963085,
-0.09561119973659515,
0.18137969076633453,
0.1153835654258728,
-0.035597123205661774,
0.048775479197502136,
-0.02459757961332798,
-0.032373033463954926,
-0.15672925114631653,
-0.08694818615913391,
-0.0620400495827198,
-0.10457678139209747,
-0.007243948057293892,
-0.07965196669101715,
0.06635496765375137,
0.011457462795078754,
0.07895547151565552,
-0.01920485310256481,
0.04268263280391693,
-0.04372800886631012,
-0.0970241129398346,
0.1013408675789833,
-0.029191795736551285,
0.053187962621450424,
-0.07708558440208435,
0.0012529511004686356,
0.034071508795022964,
-0.012033606879413128,
0.01641961932182312,
0.056798532605171204,
-0.07943368703126907,
-0.00411267438903451,
-0.14751191437244415,
-0.051124457269907,
-0.01625988259911537,
0.04628301411867142,
0.04563719406723976,
0.16077566146850586,
0.09175834059715271,
-0.10566390305757523,
0.04591251537203789,
0.1447158008813858,
-0.06499302387237549,
-0.12630592286586761,
-0.02607429027557373,
0.1363655924797058,
0.020693883299827576,
0.11308397352695465,
-0.05492117628455162,
-0.03969450667500496,
-0.06824825704097748,
0.2080475091934204,
0.32458585500717163,
-0.07250738143920898,
0.06704859435558319,
0.002010915195569396,
0.047390166670084,
0.030862562358379364,
0.00770637858659029,
0.13521577417850494,
0.2520180642604828,
0.017376765608787537,
-0.012748938985168934,
-0.09172628074884415,
-0.015458749607205391,
-0.07206747680902481,
0.016815457493066788,
-0.035946108400821686,
-0.0859406366944313,
-0.022792058065533638,
0.1079864650964737,
-0.17693118751049042,
-0.025169391185045242,
-0.05491669476032257,
-0.17284195125102997,
-0.06022852286696434,
0.005726202856749296,
0.0859091728925705,
0.14922019839286804,
0.0073393769562244415,
-0.031488921493291855,
-0.08007238060235977,
0.028981225565075874,
0.015325640328228474,
-0.18434330821037292,
0.03927035257220268,
0.011898157186806202,
-0.12538228929042816,
0.024287346750497818,
-0.03099130094051361,
0.0712055191397667,
0.06637218594551086,
0.10266034305095673,
0.031409647315740585,
0.14645712077617645,
0.03808462619781494,
-0.05643090605735779,
0.04318089410662651,
0.06332088261842728,
0.0062770359218120575,
0.10089164227247238,
0.09063538908958435,
-0.03536597266793251,
0.10523486882448196,
0.021023355424404144,
-0.07425390183925629,
-0.03093280829489231,
0.04953913763165474,
-0.08883795142173767,
0.054291706532239914,
0.043786343187093735,
0.007406802382320166,
-0.04717559739947319,
-0.01764974184334278,
-0.041508033871650696,
-0.03378712385892868,
-0.015766823664307594,
-0.048717718571424484,
-0.10522529482841492,
-0.054054878652095795,
-0.06143154576420784,
0.051416490226984024,
-0.13712908327579498,
-0.00982820987701416,
-0.062037479132413864,
0.05594467744231224,
-0.062196265906095505,
0.07050541788339615,
0.07187610119581223,
-0.026928827166557312,
-0.00011131233623018488,
-0.15942800045013428,
0.10011947154998779,
0.13286712765693665,
-0.11421165615320206,
-0.028748992830514908
] |
null | null |
transformers
|
# Wav2Vec2-XLS-R-2B-EN-15
Facebook's Wav2Vec2 XLS-R fine-tuned for **Speech Translation.**

This is a [SpeechEncoderDecoderModel](https://huggingface.co/transformers/model_doc/speechencoderdecoder.html) model.
The encoder was warm-started from the [**`facebook/wav2vec2-xls-r-2b`**](https://huggingface.co/facebook/wav2vec2-xls-r-2b) checkpoint and
the decoder from the [**`facebook/mbart-large-50`**](https://huggingface.co/facebook/mbart-large-50) checkpoint.
Consequently, the encoder-decoder model was fine-tuned on 15 `en` -> `{lang}` translation pairs of the [Covost2 dataset](https://huggingface.co/datasets/covost2).
The model can translate from spoken `en` (English) to the following written languages `{lang}`:
`en` -> {`de`, `tr`, `fa`, `sv-SE`, `mn`, `zh-CN`, `cy`, `ca`, `sl`, `et`, `id`, `ar`, `ta`, `lv`, `ja`}
For more information, please refer to Section *5.1.1* of the [official XLS-R paper](https://arxiv.org/abs/2111.09296).
## Usage
### Demo
The model can be tested on [**this space**](https://huggingface.co/spaces/facebook/XLS-R-2B-EN-15).
You can select the target language, record some audio in English,
and then sit back and see how well the checkpoint can translate the input.
### Example
As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the
transcripts by passing the speech features to the model.
You can use the model directly via the ASR pipeline. By default, the checkpoint will
translate spoken English to written German. To change the written target language,
you need to pass the correct `forced_bos_token_id` to `generate(...)` to condition
the decoder on the correct target language.
To select the correct `forced_bos_token_id` given your chosen language id, please make use
of the following mapping:
```python
MAPPING = {
"de": 250003,
"tr": 250023,
"fa": 250029,
"sv": 250042,
"mn": 250037,
"zh": 250025,
"cy": 250007,
"ca": 250005,
"sl": 250052,
"et": 250006,
"id": 250032,
"ar": 250001,
"ta": 250044,
"lv": 250017,
"ja": 250012,
}
```
As an example, if you would like to translate to Swedish, you can do the following:
```python
from datasets import load_dataset
from transformers import pipeline
# select correct `forced_bos_token_id`
forced_bos_token_id = MAPPING["sv"]
# replace following lines to load an audio file of your choice
librispeech_en = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
audio_file = librispeech_en[0]["file"]
asr = pipeline("automatic-speech-recognition", model="facebook/wav2vec2-xls-r-2b-en-to-15", feature_extractor="facebook/wav2vec2-xls-r-2b-en-to-15")
translation = asr(audio_file, forced_bos_token_id=forced_bos_token_id)
```
or step-by-step as follows:
```python
import torch
from transformers import Speech2Text2Processor, SpeechEncoderDecoderModel
from datasets import load_dataset
model = SpeechEncoderDecoderModel.from_pretrained("facebook/wav2vec2-xls-r-2b-en-to-15")
processor = Speech2Text2Processor.from_pretrained("facebook/wav2vec2-xls-r-2b-en-to-15")
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
# select correct `forced_bos_token_id`
forced_bos_token_id = MAPPING["sv"]
inputs = processor(ds[0]["audio"]["array"], sampling_rate=ds[0]["audio"]["sampling_rate"], return_tensors="pt")
generated_ids = model.generate(inputs["input_values"], attention_mask=inputs["attention_mask"], forced_bos_token_id=forced_bos_token_id)
transcription = processor.batch_decode(generated_ids)
```
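Both snippets above use the dummy LibriSpeech sample, which already comes at 16kHz. If you load your own recording, the encoder still expects 16kHz mono audio; the sketch below is an illustrative addition (not part of the original card) that resamples a hypothetical local file `my_recording.wav` with `torchaudio` before running the ASR pipeline, forcing Swedish as the target language:
```python
import torchaudio
from transformers import pipeline

# hypothetical local file -- replace with your own recording
waveform, sample_rate = torchaudio.load("my_recording.wav")

# resample to the 16kHz input the XLS-R encoder expects
waveform = torchaudio.functional.resample(waveform, orig_freq=sample_rate, new_freq=16_000)
waveform = waveform.mean(dim=0)  # downmix to mono if the file is stereo

asr = pipeline(
    "automatic-speech-recognition",
    model="facebook/wav2vec2-xls-r-2b-en-to-15",
    feature_extractor="facebook/wav2vec2-xls-r-2b-en-to-15",
)
# 250042 is MAPPING["sv"] from the table above
translation = asr(waveform.numpy(), forced_bos_token_id=250042)
```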
## Results `en` -> `{lang}`
See the row of **XLS-R (2B)** for the performance on [Covost2](https://huggingface.co/datasets/covost2) for this model.

## More XLS-R models for `{lang}` -> `en` Speech Translation
- [Wav2Vec2-XLS-R-300M-EN-15](https://huggingface.co/facebook/wav2vec2-xls-r-300m-en-to-15)
- [Wav2Vec2-XLS-R-1B-EN-15](https://huggingface.co/facebook/wav2vec2-xls-r-1b-en-to-15)
- [Wav2Vec2-XLS-R-2B-EN-15](https://huggingface.co/facebook/wav2vec2-xls-r-2b-en-to-15)
- [Wav2Vec2-XLS-R-2B-22-16](https://huggingface.co/facebook/wav2vec2-xls-r-2b-22-to-16)
|
{"language": ["multilingual", "en", "de", "tr", "fa", "sv", "mn", "zh", "cy", "ca", "sl", "et", "id", "ar", "ta", "lv", "ja"], "license": "apache-2.0", "tags": ["speech", "xls_r", "automatic-speech-recognition", "xls_r_translation"], "datasets": ["common_voice", "multilingual_librispeech", "covost2"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "English", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_en_18301577.mp3"}]}
|
automatic-speech-recognition
|
facebook/wav2vec2-xls-r-2b-en-to-15
|
[
"transformers",
"pytorch",
"speech-encoder-decoder",
"automatic-speech-recognition",
"speech",
"xls_r",
"xls_r_translation",
"multilingual",
"en",
"de",
"tr",
"fa",
"sv",
"mn",
"zh",
"cy",
"ca",
"sl",
"et",
"id",
"ar",
"ta",
"lv",
"ja",
"dataset:common_voice",
"dataset:multilingual_librispeech",
"dataset:covost2",
"arxiv:2111.09296",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2111.09296"
] |
[
"multilingual",
"en",
"de",
"tr",
"fa",
"sv",
"mn",
"zh",
"cy",
"ca",
"sl",
"et",
"id",
"ar",
"ta",
"lv",
"ja"
] |
TAGS
#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #en #de #tr #fa #sv #mn #zh #cy #ca #sl #et #id #ar #ta #lv #ja #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-XLS-R-2B-EN-15
Facebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.
!model image
This is a SpeechEncoderDecoderModel model.
The encoder was warm-started from the 'facebook/wav2vec2-xls-r-2b' checkpoint and
the decoder from the 'facebook/mbart-large-50' checkpoint.
Consequently, the encoder-decoder model was fine-tuned on 15 'en' -> '{lang}' translation pairs of the Covost2 dataset.
The model can translate from spoken 'en' (English) to the following written languages '{lang}':
'en' -> {'de', 'tr', 'fa', 'sv-SE', 'mn', 'zh-CN', 'cy', 'ca', 'sl', 'et', 'id', 'ar', 'ta', 'lv', 'ja'}
For more information, please refer to Section *5.1.1* of the official XLS-R paper.
## Usage
### Demo
The model can be tested on this space.
You can select the target language, record some audio in English,
and then sit back and see how well the checkpoint can translate the input.
### Example
As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the
transcripts by passing the speech features to the model.
You can use the model directly via the ASR pipeline. By default, the checkpoint will
translate spoken English to written German. To change the written target language,
you need to pass the correct 'forced_bos_token_id' to 'generate(...)' to condition
the decoder on the correct target language.
To select the correct 'forced_bos_token_id' given your chosen language id, please make use
of the following mapping:
As an example, if you would like to translate to Swedish, you can do the following:
or step-by-step as follows:
## Results 'en' -> '{lang}'
See the row of XLS-R (2B) for the performance on Covost2 for this model.
!results image
## More XLS-R models for '{lang}' -> 'en' Speech Translation
- Wav2Vec2-XLS-R-300M-EN-15
- Wav2Vec2-XLS-R-1B-EN-15
- Wav2Vec2-XLS-R-2B-EN-15
- Wav2Vec2-XLS-R-2B-22-16
|
[
"# Wav2Vec2-XLS-R-2B-EN-15\n\nFacebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.\n\n!model image\n\nThis is a SpeechEncoderDecoderModel model. \nThe encoder was warm-started from the 'facebook/wav2vec2-xls-r-2b' checkpoint and\nthe decoder from the 'facebook/mbart-large-50' checkpoint.\nConsequently, the encoder-decoder model was fine-tuned on 15 'en' -> '{lang}' translation pairs of the Covost2 dataset.\n\nThe model can translate from spoken 'en' (Engish) to the following written languages '{lang}':\n\n'en' -> {'de', 'tr', 'fa', 'sv-SE', 'mn', 'zh-CN', 'cy', 'ca', 'sl', 'et', 'id', 'ar', 'ta', 'lv', 'ja'}\n\nFor more information, please refer to Section *5.1.1* of the official XLS-R paper.",
"## Usage",
"### Demo\n\nThe model can be tested on this space. \nYou can select the target language, record some audio in English, \nand then sit back and see how well the checkpoint can translate the input.",
"### Example \n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline. By default, the checkpoint will \ntranslate spoken English to written German. To change the written target language, \nyou need to pass the correct 'forced_bos_token_id' to 'generate(...)' to condition \nthe decoder on the correct target language. \n\nTo select the correct 'forced_bos_token_id' given your choosen language id, please make use \nof the following mapping:\n\n\n\nAs an example, if you would like to translate to Swedish, you can do the following:\n\n\n\nor step-by-step as follows:",
"## Results 'en' -> '{lang}'\n\nSee the row of XLS-R (2B) for the performance on Covost2 for this model.\n\n!results image",
"## More XLS-R models for '{lang}' -> 'en' Speech Translation\n\n- Wav2Vec2-XLS-R-300M-EN-15\n- Wav2Vec2-XLS-R-1B-EN-15\n- Wav2Vec2-XLS-R-2B-EN-15\n- Wav2Vec2-XLS-R-2B-22-16"
] |
[
"TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #en #de #tr #fa #sv #mn #zh #cy #ca #sl #et #id #ar #ta #lv #ja #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-XLS-R-2B-EN-15\n\nFacebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.\n\n!model image\n\nThis is a SpeechEncoderDecoderModel model. \nThe encoder was warm-started from the 'facebook/wav2vec2-xls-r-2b' checkpoint and\nthe decoder from the 'facebook/mbart-large-50' checkpoint.\nConsequently, the encoder-decoder model was fine-tuned on 15 'en' -> '{lang}' translation pairs of the Covost2 dataset.\n\nThe model can translate from spoken 'en' (Engish) to the following written languages '{lang}':\n\n'en' -> {'de', 'tr', 'fa', 'sv-SE', 'mn', 'zh-CN', 'cy', 'ca', 'sl', 'et', 'id', 'ar', 'ta', 'lv', 'ja'}\n\nFor more information, please refer to Section *5.1.1* of the official XLS-R paper.",
"## Usage",
"### Demo\n\nThe model can be tested on this space. \nYou can select the target language, record some audio in English, \nand then sit back and see how well the checkpoint can translate the input.",
"### Example \n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline. By default, the checkpoint will \ntranslate spoken English to written German. To change the written target language, \nyou need to pass the correct 'forced_bos_token_id' to 'generate(...)' to condition \nthe decoder on the correct target language. \n\nTo select the correct 'forced_bos_token_id' given your choosen language id, please make use \nof the following mapping:\n\n\n\nAs an example, if you would like to translate to Swedish, you can do the following:\n\n\n\nor step-by-step as follows:",
"## Results 'en' -> '{lang}'\n\nSee the row of XLS-R (2B) for the performance on Covost2 for this model.\n\n!results image",
"## More XLS-R models for '{lang}' -> 'en' Speech Translation\n\n- Wav2Vec2-XLS-R-300M-EN-15\n- Wav2Vec2-XLS-R-1B-EN-15\n- Wav2Vec2-XLS-R-2B-EN-15\n- Wav2Vec2-XLS-R-2B-22-16"
] |
[
140,
258,
3,
43,
176,
39,
82
] |
[
"passage: TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #en #de #tr #fa #sv #mn #zh #cy #ca #sl #et #id #ar #ta #lv #ja #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# Wav2Vec2-XLS-R-2B-EN-15\n\nFacebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.\n\n!model image\n\nThis is a SpeechEncoderDecoderModel model. \nThe encoder was warm-started from the 'facebook/wav2vec2-xls-r-2b' checkpoint and\nthe decoder from the 'facebook/mbart-large-50' checkpoint.\nConsequently, the encoder-decoder model was fine-tuned on 15 'en' -> '{lang}' translation pairs of the Covost2 dataset.\n\nThe model can translate from spoken 'en' (Engish) to the following written languages '{lang}':\n\n'en' -> {'de', 'tr', 'fa', 'sv-SE', 'mn', 'zh-CN', 'cy', 'ca', 'sl', 'et', 'id', 'ar', 'ta', 'lv', 'ja'}\n\nFor more information, please refer to Section *5.1.1* of the official XLS-R paper.## Usage### Demo\n\nThe model can be tested on this space. \nYou can select the target language, record some audio in English, \nand then sit back and see how well the checkpoint can translate the input."
] |
[
-0.08983278274536133,
0.06371909379959106,
-0.0059128268621861935,
0.025712544098496437,
0.009200723841786385,
-0.0019482554635033011,
0.03794090822339058,
0.10953926295042038,
0.07082179933786392,
0.11237751692533493,
0.010309786535799503,
0.0433921292424202,
0.10024558752775192,
0.15244147181510925,
-0.030241740867495537,
-0.12557759881019592,
0.04033678025007248,
-0.0844869464635849,
-0.0036294374149292707,
0.07403707504272461,
0.08265102654695511,
-0.09090612828731537,
0.03634800389409065,
-0.03402194008231163,
0.012345082126557827,
0.005740313790738583,
-0.024998340755701065,
-0.061197537928819656,
0.07044711709022522,
0.08657646924257278,
0.07383499294519424,
0.08973570913076401,
0.04237598553299904,
-0.2531070411205292,
0.02683640830218792,
0.07084298133850098,
-0.01758188009262085,
0.005147841293364763,
0.09339109808206558,
-0.018398582935333252,
0.059119291603565216,
-0.07232284545898438,
-0.013677685521543026,
0.06059565395116806,
-0.06420331448316574,
-0.2711035907268524,
-0.09418509900569916,
-0.011518009006977081,
0.10036953538656235,
0.05080917850136757,
-0.06174221262335777,
0.055067114531993866,
-0.020802637562155724,
0.08787714689970016,
0.09817390888929367,
-0.26715198159217834,
-0.003691509598866105,
0.02327435277402401,
0.11393005400896072,
0.05071736127138138,
-0.04700934514403343,
0.03903280571103096,
0.020642975345253944,
0.010708451271057129,
-0.0371512770652771,
-0.074516162276268,
-0.03335461765527725,
-0.010581924580037594,
-0.14960215985774994,
-0.04123459383845329,
0.07564728707075119,
0.030981985852122307,
-0.061398155987262726,
-0.10314680635929108,
-0.03969716653227806,
-0.013378383591771126,
-0.02865123189985752,
-0.08596621453762054,
-0.043712131679058075,
0.0005718402680940926,
-0.008627096191048622,
-0.1098281666636467,
-0.10015746206045151,
-0.03349711373448372,
-0.09188796579837799,
0.13220158219337463,
0.03184199333190918,
0.008884971030056477,
-0.01912556029856205,
0.04498758539557457,
0.035918015986680984,
-0.09101702272891998,
-0.08367633074522018,
-0.05024699121713638,
-0.17686311900615692,
0.0024215434677898884,
-0.06851755082607269,
-0.10239549726247787,
0.13952867686748505,
0.14606210589408875,
-0.10357989370822906,
0.08062966912984848,
-0.07560738176107407,
0.02878960594534874,
0.03382208198308945,
0.16785259544849396,
-0.03902613744139671,
-0.05606424808502197,
-0.03455846384167671,
-0.019539188593626022,
-0.045020438730716705,
-0.04174014553427696,
-0.06428210437297821,
-0.011735101230442524,
0.01402720995247364,
0.08790288865566254,
0.06991543620824814,
-0.00884279701858759,
-0.0681174173951149,
-0.008260323666036129,
0.10837436467409134,
-0.15320463478565216,
0.0686100646853447,
0.10308170318603516,
-0.013035484589636326,
0.03441907465457916,
-0.007354164961725473,
-0.0019128897693008184,
-0.08120217174291611,
-0.05415426939725876,
0.028950298205018044,
0.014681813307106495,
-0.023021845147013664,
-0.12671469151973724,
0.03938734158873558,
-0.044039852917194366,
-0.07137201726436615,
-0.14726249873638153,
-0.02783934585750103,
-0.08731036633253098,
0.0061558750458061695,
-0.044927727431058884,
0.12055276334285736,
-0.0840153843164444,
-0.022339576855301857,
0.0025897473096847534,
-0.014535625465214252,
-0.04259185865521431,
-0.03896108269691467,
0.03173694759607315,
-0.02049156464636326,
0.11905932426452637,
0.018026050180196762,
0.0014376214239746332,
-0.05686112120747566,
0.0285031758248806,
-0.10694597661495209,
0.18768812716007233,
-0.11860775202512741,
-0.03393446281552315,
-0.13067437708377838,
-0.02775433659553528,
-0.006219483911991119,
0.04098745808005333,
0.04987914487719536,
0.10306688398122787,
-0.25528883934020996,
-0.05504348501563072,
0.27487847208976746,
-0.0849534124135971,
-0.08601260185241699,
0.16738493740558624,
0.024525178596377373,
-0.040720079094171524,
0.061445172876119614,
0.1408853828907013,
0.1566258817911148,
-0.20144473016262054,
-0.02180810086429119,
0.03030458278954029,
0.0006280929665081203,
0.13852037489414215,
0.07752615958452225,
-0.11538238078355789,
0.025101670995354652,
-0.005724326241761446,
-0.02972463145852089,
0.0042624385096132755,
-0.011038008145987988,
-0.053807634860277176,
0.023640606552362442,
-0.00896360632032156,
0.10348474979400635,
-0.029120774939656258,
-0.05812942981719971,
-0.052903831005096436,
-0.06503065675497055,
-0.06254291534423828,
0.07131834328174591,
-0.06281019002199173,
0.060247354209423065,
-0.1326879858970642,
0.06680066883563995,
-0.0006333125056698918,
0.03797909989953041,
-0.1754901111125946,
-0.03039640560746193,
0.01917639747262001,
-0.0655430480837822,
0.06140973046422005,
0.0800841897726059,
0.05211018770933151,
0.014845386147499084,
0.04478525370359421,
-0.01779860258102417,
0.027186142280697823,
0.0007423859788104892,
-0.007464217487722635,
-0.09395517408847809,
-0.01811499334871769,
-0.08175931125879288,
0.08721597492694855,
-0.08711511641740799,
-0.02106226608157158,
0.12895701825618744,
0.11878129094839096,
0.019322220236063004,
0.014489872381091118,
-0.01925453543663025,
0.07373887300491333,
-0.00321819051168859,
-0.005903088487684727,
0.0009362868149764836,
-0.0357380136847496,
-0.07139438390731812,
0.09917216747999191,
-0.07944413274526596,
-0.10460444539785385,
0.07044064998626709,
-0.02388017252087593,
-0.05547842010855675,
0.014823307283222675,
-0.019055182114243507,
0.011808332987129688,
-0.05768651142716408,
-0.04772654175758362,
0.1899108588695526,
0.08738814294338226,
0.09314326196908951,
-0.06705088168382645,
-0.038271453231573105,
0.012601186521351337,
-0.06814823299646378,
-0.043872229754924774,
0.11264693737030029,
-0.03516092151403427,
-0.11921707540750504,
0.024247854948043823,
0.09656567126512527,
-0.008602271787822247,
0.1508609503507614,
0.0013480173656716943,
-0.08961338549852371,
-0.055296890437603,
0.06699895113706589,
-0.002903254935517907,
0.017897840589284897,
0.023581329733133316,
0.0012096663704141974,
0.028547998517751694,
0.0333588533103466,
0.0347275473177433,
-0.06882146745920181,
0.06305935233831406,
0.030833134427666664,
-0.1039590835571289,
0.056407783180475235,
0.0473521463572979,
0.027682730928063393,
0.029266348108649254,
0.0007822758634574711,
-0.06011292710900307,
-0.05052146688103676,
-0.055078692734241486,
-0.10886267572641373,
0.16515935957431793,
-0.14951099455356598,
-0.35850954055786133,
-0.12218744307756424,
-0.021630005910992622,
-0.013639569282531738,
0.002847943687811494,
0.08264493942260742,
-0.1061142161488533,
-0.03821653872728348,
-0.07002709060907364,
-0.016309166327118874,
0.009669792838394642,
-0.02887272834777832,
0.03440256044268608,
0.0026458001229912043,
0.0164510328322649,
-0.08316975831985474,
0.00925486721098423,
0.03770595043897629,
-0.01320461742579937,
0.03053666464984417,
0.03564481809735298,
0.0853918120265007,
0.11532268673181534,
-0.004252132494002581,
0.036573316901922226,
-0.028239339590072632,
0.20627643167972565,
-0.11231456696987152,
0.06582871824502945,
0.09122595936059952,
-0.036923255771398544,
0.05001984164118767,
0.1298225373029709,
-0.011088313534855843,
-0.05020381882786751,
0.008678940124809742,
-0.004336063284426928,
0.003950583282858133,
-0.24181535840034485,
-0.08489824831485748,
-0.04328739270567894,
0.041324108839035034,
0.02507733181118965,
0.0277701485902071,
0.013737284578382969,
-0.03777865692973137,
-0.030748529359698296,
-0.07742786407470703,
0.07041697204113007,
0.02096221223473549,
0.16228516399860382,
-0.026916662231087685,
0.055756039917469025,
-0.04245621711015701,
-0.03722911328077316,
0.10275614261627197,
0.05198610574007034,
0.05304580554366112,
0.06865447014570236,
0.1594984233379364,
0.09011923521757126,
0.05912664160132408,
-0.001489643007516861,
-0.01250962819904089,
-0.0170257780700922,
0.033579085022211075,
0.03713438659906387,
-0.06924455612897873,
0.031030923128128052,
0.04160820320248604,
0.16188618540763855,
-0.1254788190126419,
-0.012053271755576134,
-0.009071688167750835,
0.10335605591535568,
0.09460978209972382,
0.14762268960475922,
-0.12520408630371094,
-0.00022315898968372494,
0.004636566620320082,
-0.013213555328547955,
-0.030139634385704994,
-0.0009789770701900125,
0.15130051970481873,
-0.09304209798574448,
0.06937868148088455,
0.008511724881827831,
0.07652045786380768,
-0.08098030090332031,
-0.008448544889688492,
-0.021806972101330757,
0.05885133519768715,
0.02516123652458191,
0.06573297083377838,
-0.25723838806152344,
0.11645351350307465,
0.0269318837672472,
0.04037192091345787,
-0.010804946534335613,
0.022355224937200546,
-0.00789753720164299,
-0.019800813868641853,
0.1448686271905899,
-0.0021886846516281366,
-0.12205357104539871,
-0.1175878718495369,
-0.0807136669754982,
0.04347160831093788,
0.1380123645067215,
-0.009519064798951149,
0.0743701234459877,
0.01640598103404045,
-0.06753657758235931,
-0.08731279522180557,
-0.03523773327469826,
-0.1382535845041275,
-0.10783793777227402,
0.045418329536914825,
0.045309100300073624,
0.06008946895599365,
-0.013973508961498737,
-0.0055240788497030735,
-0.17514702677726746,
0.10207740217447281,
-0.18660736083984375,
-0.04260479658842087,
-0.1119837537407875,
-0.019012095406651497,
0.1399787813425064,
-0.10746528953313828,
0.03511391952633858,
0.01723545230925083,
0.10668035596609116,
-0.053820282220840454,
-0.0760793685913086,
0.04841717705130577,
-0.06478361040353775,
-0.1145666167140007,
-0.028593331575393677,
0.11522414535284042,
0.0857803225517273,
0.033148590475320816,
0.09274452179670334,
0.042628709226846695,
-0.02891087904572487,
-0.09624844044446945,
0.02637356147170067,
0.04032411426305771,
-0.04724204167723656,
0.05755273252725601,
0.011610463261604309,
-0.15659530460834503,
-0.09710918366909027,
0.005385398864746094,
0.16673286259174347,
0.1862792670726776,
-0.10284522920846939,
0.14098362624645233,
0.2001083642244339,
-0.07639838755130768,
-0.20927079021930695,
-0.12096459418535233,
0.04914373904466629,
0.07289380580186844,
-0.038823578506708145,
-0.16181005537509918,
-0.0022359138820320368,
0.03528504818677902,
-0.0048415022902190685,
0.05062423273921013,
-0.25919514894485474,
-0.09926625341176987,
0.11935926973819733,
-0.01923735998570919,
-0.06094640865921974,
-0.07382135093212128,
-0.09517543762922287,
-0.06680480390787125,
-0.10589258372783661,
0.03587714955210686,
-0.1658482551574707,
0.045921843498945236,
0.06024400517344475,
-0.034132543951272964,
0.010878017172217369,
-0.0012143112253397703,
0.13100601732730865,
0.008842413313686848,
-0.01727648451924324,
-0.02758726477622986,
0.09406094998121262,
-0.044553838670253754,
-0.020957274362444878,
0.08939777314662933,
-0.055349092930555344,
0.052456002682447433,
-0.05279579758644104,
-0.03493013605475426,
-0.05009227246046066,
0.0769955962896347,
-0.01734444871544838,
-0.004024727735668421,
-0.059531185775995255,
0.0068039847537875175,
0.040518198162317276,
0.0063841380178928375,
0.0006764422287233174,
-0.09419805556535721,
0.03127463907003403,
0.20372286438941956,
0.10354837030172348,
0.057219844311475754,
-0.03945600986480713,
0.0018938434077426791,
-0.04167093709111214,
0.04789993539452553,
-0.03602459654211998,
0.06726919114589691,
0.1228095218539238,
-0.01320952083915472,
0.13992659747600555,
-0.014433641918003559,
-0.13335277140140533,
0.048124589025974274,
0.06857254356145859,
-0.08365380764007568,
-0.1595020741224289,
0.005378090776503086,
-0.0848611444234848,
0.010195457376539707,
0.009539906866848469,
0.18631727993488312,
0.039267946034669876,
-0.030393868684768677,
-0.02382742054760456,
0.03763839229941368,
-0.0512772835791111,
0.16145062446594238,
0.05862552672624588,
0.07245753705501556,
-0.07293283194303513,
0.04587996006011963,
0.07670208066701889,
-0.09864471107721329,
0.04764183238148689,
0.12647408246994019,
-0.10286063700914383,
-0.10269670188426971,
-0.04203151911497116,
0.09678305685520172,
-0.006759342271834612,
-0.07667894661426544,
-0.019855400547385216,
-0.08330071717500687,
0.031067103147506714,
0.20824725925922394,
0.02381165511906147,
0.055823955684900284,
0.041889939457178116,
-0.05388827994465828,
-0.0027396560180932283,
0.11968802660703659,
0.025792496278882027,
-0.012486608698964119,
-0.04520190507173538,
0.10165027529001236,
-0.003240104764699936,
0.10232619196176529,
-0.028958721086382866,
-0.04639995098114014,
-0.10066598653793335,
0.031038334593176842,
-0.18315662443637848,
0.03855603188276291,
-0.06543327867984772,
-0.013068152591586113,
0.0312950499355793,
-0.030056465417146683,
0.03444774076342583,
0.016112318262457848,
-0.08307913690805435,
-0.0272311270236969,
-0.05449090525507927,
0.05720749869942665,
-0.17792783677577972,
-0.007214762270450592,
0.031411416828632355,
-0.0516081340610981,
0.07664936780929565,
0.018553750589489937,
-0.04418397694826126,
0.053434133529663086,
-0.15353044867515564,
-0.04050126671791077,
0.008166483603417873,
0.0674077570438385,
0.05071254447102547,
-0.09155302494764328,
0.022559087723493576,
0.035947974771261215,
0.006934067700058222,
0.014208775945007801,
0.07361270487308502,
-0.0622485876083374,
0.04056761786341667,
-0.10358548164367676,
-0.014313491992652416,
-0.021827328950166702,
0.0450383797287941,
0.04178529232740402,
0.08620195090770721,
0.09587698429822922,
-0.1074291542172432,
0.12490097433328629,
-0.08990674465894699,
-0.0334196463227272,
-0.000595369900111109,
0.013262293301522732,
0.031515974551439285,
-0.08222515136003494,
0.09991896152496338,
-0.034174587577581406,
0.10054661333560944,
-0.011226003058254719,
0.11663921922445297,
-0.01609041914343834,
-0.17391034960746765,
-0.11211291700601578,
0.06417151540517807,
0.09010078012943268,
0.06039101257920265,
-0.010205400176346302,
-0.016787303611636162,
-0.032976943999528885,
0.0012360637774690986,
0.05249997600913048,
0.05198579654097557,
0.1404278725385666,
0.09373413026332855,
0.07180272042751312,
0.07077904045581818,
-0.08155352622270584,
0.007921352982521057,
-0.010208669118583202,
-0.13241071999073029,
0.04531032592058182,
-0.059918779879808426,
0.10864952951669693,
0.05918675661087036,
-0.1511290818452835,
0.052840471267700195,
0.02890178933739662,
-0.07709753513336182,
-0.12029104679822922,
-0.0708574652671814,
-0.0989559069275856,
-0.08202648162841797,
0.011603721417486668,
-0.08430258184671402,
0.03735688701272011,
-0.033300869166851044,
0.0712236613035202,
-0.015143999829888344,
0.09909840673208237,
-0.0007311524823307991,
-0.09887868911027908,
0.07940955460071564,
-0.013285500928759575,
0.06903337687253952,
0.08867247402667999,
0.008517990820109844,
0.08005605638027191,
0.03409268707036972,
0.061439115554094315,
0.05763467773795128,
-0.020902130752801895,
0.05436472222208977,
-0.06827633827924728,
-0.07882519066333771,
0.014192106202244759,
0.01673426851630211,
0.02751430869102478,
0.13491572439670563,
0.1269455999135971,
-0.039322879165410995,
0.0033152273390442133,
0.08910669386386871,
-0.0503402017056942,
-0.13631078600883484,
-0.1609778255224228,
0.09725409001111984,
0.012616091407835484,
0.11256146430969238,
-0.018971819430589676,
-0.10532547533512115,
-0.04910582676529884,
0.20208467543125153,
0.10393655300140381,
-0.024846794083714485,
0.04152649641036987,
0.06216691806912422,
0.02726738341152668,
0.026998667046427727,
0.028157828375697136,
0.06332847476005554,
0.26037755608558655,
-0.012961274944245815,
-0.044855859130620956,
-0.02842000499367714,
-0.07545173168182373,
-0.06981872022151947,
-0.01511742826551199,
-0.09277547895908356,
-0.0380672886967659,
-0.01341968309134245,
0.14235086739063263,
-0.08285050839185715,
-0.14386646449565887,
0.06501783430576324,
-0.04196128621697426,
-0.0756000429391861,
0.010698594152927399,
0.10109249502420425,
0.05734071508049965,
0.013143109157681465,
0.009893708862364292,
-0.046105269342660904,
0.2142253816127777,
-0.025361889973282814,
-0.010595522820949554,
-0.008268212899565697,
-0.0141844367608428,
-0.15532861649990082,
0.08783011883497238,
-0.012679936364293098,
0.17213252186775208,
0.09326106309890747,
0.05687420442700386,
-0.046627361327409744,
0.09924405813217163,
0.0673750787973404,
-0.06311442703008652,
0.07516243308782578,
0.09234724938869476,
-0.0018412588397040963,
0.07357323169708252,
0.11592841148376465,
-0.07265545427799225,
0.06949077546596527,
0.0381791777908802,
0.009081623516976833,
-0.08253375440835953,
0.04449678212404251,
-0.08805821090936661,
0.0991029366850853,
0.12944647669792175,
-0.021546581760048866,
0.03281065449118614,
-0.011836456134915352,
0.034765731543302536,
-0.033980902284383774,
0.050036150962114334,
-0.06089767441153526,
-0.1378505676984787,
0.01299200113862753,
-0.013565922155976295,
0.06181526556611061,
-0.17674469947814941,
-0.004105757921934128,
0.01860995776951313,
0.009505216963589191,
0.023219287395477295,
0.11567138135433197,
0.029135921970009804,
0.012579033151268959,
-0.027267972007393837,
-0.08756134659051895,
0.040626369416713715,
0.1008765697479248,
-0.12369690835475922,
-0.046185195446014404
] |
null | null |
transformers
|
# Wav2Vec2-XLS-R-2B
[Facebook's Wav2Vec2 XLS-R](https://ai.facebook.com/blog/xls-r-self-supervised-speech-processing-for-128-languages) counting **2 billion** parameters.

XLS-R is Facebook AI's large-scale multilingual pretrained model for speech (the "XLM-R for Speech"). It is pretrained on 436k hours of unlabeled speech, including VoxPopuli, MLS, CommonVoice, BABEL, and VoxLingua107. It uses the wav2vec 2.0 objective, in 128 languages. When using the model make sure that your speech input is sampled at 16kHz.
**Note**: This model should be fine-tuned on a downstream task, like Automatic Speech Recognition, Translation, or Classification. Check out [**this blog**](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for more information about ASR.
[XLS-R Paper](https://arxiv.org/abs/2111.09296)
Authors: Arun Babu, Changhan Wang, Andros Tjandra, Kushal Lakhotia, Qiantong Xu, Naman Goyal, Kritika Singh, Patrick von Platen, Yatharth Saraf, Juan Pino, Alexei Baevski, Alexis Conneau, Michael Auli
**Abstract**
This paper presents XLS-R, a large-scale model for cross-lingual speech representation learning based on wav2vec 2.0. We train models with up to 2B parameters on 436K hours of publicly available speech audio in 128 languages, an order of magnitude more public data than the largest known prior work. Our evaluation covers a wide range of tasks, domains, data regimes and languages, both high and low-resource. On the CoVoST-2 speech translation benchmark, we improve the previous state of the art by an average of 7.4 BLEU over 21 translation directions into English. For speech recognition, XLS-R improves over the best known prior work on BABEL, MLS, CommonVoice as well as VoxPopuli, lowering error rates by 20%-33% relative on average. XLS-R also sets a new state of the art on VoxLingua107 language identification. Moreover, we show that with sufficient model size, cross-lingual pretraining can outperform English-only pretraining when translating English speech into other languages, a setting which favors monolingual pretraining. We hope XLS-R can help to improve speech processing tasks for many more languages of the world.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
See [this google colab](https://colab.research.google.com/github/patrickvonplaten/notebooks/blob/master/Fine_Tune_XLS_R_on_Common_Voice.ipynb) for more information on how to fine-tune the model.
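If you only want to inspect the pretrained representations before any fine-tuning, the following minimal sketch (an illustrative addition, not part of the original card) loads the checkpoint as a plain encoder with `Wav2Vec2FeatureExtractor` and `Wav2Vec2Model` and extracts hidden states from a 16kHz sample:
```python
import torch
from datasets import load_dataset
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

# load the pretrained (not fine-tuned) XLS-R checkpoint as a plain encoder
feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained("facebook/wav2vec2-xls-r-2b")
model = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-xls-r-2b")

# any 16kHz mono waveform works; here we reuse the dummy LibriSpeech split
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
inputs = feature_extractor(ds[0]["audio"]["array"], sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    hidden_states = model(inputs.input_values).last_hidden_state  # (batch, frames, hidden_size)
```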
You can find other pretrained XLS-R models with different numbers of parameters:
* [300M parameters version](https://huggingface.co/facebook/wav2vec2-xls-r-300m)
* [1B parameters version](https://huggingface.co/facebook/wav2vec2-xls-r-1b)
* [2B parameters version](https://huggingface.co/facebook/wav2vec2-xls-r-2b)
|
{"language": ["multilingual", "ab", "af", "sq", "am", "ar", "hy", "as", "az", "ba", "eu", "be", "bn", "bs", "br", "bg", "my", "yue", "ca", "ceb", "km", "zh", "cv", "hr", "cs", "da", "dv", "nl", "en", "eo", "et", "fo", "fi", "fr", "gl", "lg", "ka", "de", "el", "gn", "gu", "ht", "cnh", "ha", "haw", "he", "hi", "hu", "is", "id", "ia", "ga", "it", "ja", "jv", "kb", "kn", "kk", "rw", "ky", "ko", "ku", "lo", "la", "lv", "ln", "lt", "lm", "mk", "mg", "ms", "ml", "mt", "gv", "mi", "mr", "mn", "ne", false, "nn", "oc", "or", "ps", "fa", "pl", "pt", "pa", "ro", "rm", "rm", "ru", "sah", "sa", "sco", "sr", "sn", "sd", "si", "sk", "sl", "so", "hsb", "es", "su", "sw", "sv", "tl", "tg", "ta", "tt", "te", "th", "bo", "tp", "tr", "tk", "uk", "ur", "uz", "vi", "vot", "war", "cy", "yi", "yo", "zu"], "license": "apache-2.0", "tags": ["speech", "xls_r", "xls_r_pretrained"], "datasets": ["common_voice", "multilingual_librispeech"], "language_bcp47": ["zh-HK", "zh-TW", "fy-NL"]}
| null |
facebook/wav2vec2-xls-r-2b
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"speech",
"xls_r",
"xls_r_pretrained",
"multilingual",
"ab",
"af",
"sq",
"am",
"ar",
"hy",
"as",
"az",
"ba",
"eu",
"be",
"bn",
"bs",
"br",
"bg",
"my",
"yue",
"ca",
"ceb",
"km",
"zh",
"cv",
"hr",
"cs",
"da",
"dv",
"nl",
"en",
"eo",
"et",
"fo",
"fi",
"fr",
"gl",
"lg",
"ka",
"de",
"el",
"gn",
"gu",
"ht",
"cnh",
"ha",
"haw",
"he",
"hi",
"hu",
"is",
"id",
"ia",
"ga",
"it",
"ja",
"jv",
"kb",
"kn",
"kk",
"rw",
"ky",
"ko",
"ku",
"lo",
"la",
"lv",
"ln",
"lt",
"lm",
"mk",
"mg",
"ms",
"ml",
"mt",
"gv",
"mi",
"mr",
"mn",
"ne",
"no",
"nn",
"oc",
"or",
"ps",
"fa",
"pl",
"pt",
"pa",
"ro",
"rm",
"ru",
"sah",
"sa",
"sco",
"sr",
"sn",
"sd",
"si",
"sk",
"sl",
"so",
"hsb",
"es",
"su",
"sw",
"sv",
"tl",
"tg",
"ta",
"tt",
"te",
"th",
"bo",
"tp",
"tr",
"tk",
"uk",
"ur",
"uz",
"vi",
"vot",
"war",
"cy",
"yi",
"yo",
"zu",
"dataset:common_voice",
"dataset:multilingual_librispeech",
"arxiv:2111.09296",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2111.09296"
] |
[
"multilingual",
"ab",
"af",
"sq",
"am",
"ar",
"hy",
"as",
"az",
"ba",
"eu",
"be",
"bn",
"bs",
"br",
"bg",
"my",
"yue",
"ca",
"ceb",
"km",
"zh",
"cv",
"hr",
"cs",
"da",
"dv",
"nl",
"en",
"eo",
"et",
"fo",
"fi",
"fr",
"gl",
"lg",
"ka",
"de",
"el",
"gn",
"gu",
"ht",
"cnh",
"ha",
"haw",
"he",
"hi",
"hu",
"is",
"id",
"ia",
"ga",
"it",
"ja",
"jv",
"kb",
"kn",
"kk",
"rw",
"ky",
"ko",
"ku",
"lo",
"la",
"lv",
"ln",
"lt",
"lm",
"mk",
"mg",
"ms",
"ml",
"mt",
"gv",
"mi",
"mr",
"mn",
"ne",
"no",
"nn",
"oc",
"or",
"ps",
"fa",
"pl",
"pt",
"pa",
"ro",
"rm",
"rm",
"ru",
"sah",
"sa",
"sco",
"sr",
"sn",
"sd",
"si",
"sk",
"sl",
"so",
"hsb",
"es",
"su",
"sw",
"sv",
"tl",
"tg",
"ta",
"tt",
"te",
"th",
"bo",
"tp",
"tr",
"tk",
"uk",
"ur",
"uz",
"vi",
"vot",
"war",
"cy",
"yi",
"yo",
"zu"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #speech #xls_r #xls_r_pretrained #multilingual #ab #af #sq #am #ar #hy #as #az #ba #eu #be #bn #bs #br #bg #my #yue #ca #ceb #km #zh #cv #hr #cs #da #dv #nl #en #eo #et #fo #fi #fr #gl #lg #ka #de #el #gn #gu #ht #cnh #ha #haw #he #hi #hu #is #id #ia #ga #it #ja #jv #kb #kn #kk #rw #ky #ko #ku #lo #la #lv #ln #lt #lm #mk #mg #ms #ml #mt #gv #mi #mr #mn #ne #no #nn #oc #or #ps #fa #pl #pt #pa #ro #rm #ru #sah #sa #sco #sr #sn #sd #si #sk #sl #so #hsb #es #su #sw #sv #tl #tg #ta #tt #te #th #bo #tp #tr #tk #uk #ur #uz #vi #vot #war #cy #yi #yo #zu #dataset-common_voice #dataset-multilingual_librispeech #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-XLS-R-2B
Facebook's Wav2Vec2 XLS-R counting 2 billion parameters.
!model image
XLS-R is Facebook AI's large-scale multilingual pretrained model for speech (the "XLM-R for Speech"). It is pretrained on 436k hours of unlabeled speech, including VoxPopuli, MLS, CommonVoice, BABEL, and VoxLingua107. It uses the wav2vec 2.0 objective, in 128 languages. When using the model make sure that your speech input is sampled at 16kHz.
Note: This model should be fine-tuned on a downstream task, like Automatic Speech Recognition, Translation, or Classification. Check out this blog for more information about ASR.
XLS-R Paper
Authors: Arun Babu, Changhan Wang, Andros Tjandra, Kushal Lakhotia, Qiantong Xu, Naman Goyal, Kritika Singh, Patrick von Platen, Yatharth Saraf, Juan Pino, Alexei Baevski, Alexis Conneau, Michael Auli
Abstract
This paper presents XLS-R, a large-scale model for cross-lingual speech representation learning based on wav2vec 2.0. We train models with up to 2B parameters on 436K hours of publicly available speech audio in 128 languages, an order of magnitude more public data than the largest known prior work. Our evaluation covers a wide range of tasks, domains, data regimes and languages, both high and low-resource. On the CoVoST-2 speech translation benchmark, we improve the previous state of the art by an average of 7.4 BLEU over 21 translation directions into English. For speech recognition, XLS-R improves over the best known prior work on BABEL, MLS, CommonVoice as well as VoxPopuli, lowering error rates by 20%-33% relative on average. XLS-R also sets a new state of the art on VoxLingua107 language identification. Moreover, we show that with sufficient model size, cross-lingual pretraining can outperform English-only pretraining when translating English speech into other languages, a setting which favors monolingual pretraining. We hope XLS-R can help to improve speech processing tasks for many more languages of the world.
The original model can be found under URL
# Usage
See this google colab for more information on how to fine-tune the model.
You can find other pretrained XLS-R models with different numbers of parameters:
* 300M parameters version
* 1B parameters version
* 2B parameters version
|
[
"# Wav2Vec2-XLS-R-2B\n\nFacebook's Wav2Vec2 XLS-R counting 2 billion parameters.\n\n!model image\n\nXLS-R is Facebook AI's large-scale multilingual pretrained model for speech (the \"XLM-R for Speech\"). It is pretrained on 436k hours of unlabeled speech, including VoxPopuli, MLS, CommonVoice, BABEL, and VoxLingua107. It uses the wav2vec 2.0 objective, in 128 languages. When using the model make sure that your speech input is sampled at 16kHz. \n\nNote: This model should be fine-tuned on a downstream task, like Automatic Speech Recognition, Translation, or Classification. Check out this blog for more information about ASR.\n\nXLS-R Paper\n\nAuthors: Arun Babu, Changhan Wang, Andros Tjandra, Kushal Lakhotia, Qiantong Xu, Naman Goyal, Kritika Singh, Patrick von Platen, Yatharth Saraf, Juan Pino, Alexei Baevski, Alexis Conneau, Michael Auli\n\nAbstract\nThis paper presents XLS-R, a large-scale model for cross-lingual speech representation learning based on wav2vec 2.0. We train models with up to 2B parameters on 436K hours of publicly available speech audio in 128 languages, an order of magnitude more public data than the largest known prior work. Our evaluation covers a wide range of tasks, domains, data regimes and languages, both high and low-resource. On the CoVoST-2 speech translation benchmark, we improve the previous state of the art by an average of 7.4 BLEU over 21 translation directions into English. For speech recognition, XLS-R improves over the best known prior work on BABEL, MLS, CommonVoice as well as VoxPopuli, lowering error rates by 20%-33% relative on average. XLS-R also sets a new state of the art on VoxLingua107 language identification. Moreover, we show that with sufficient model size, cross-lingual pretraining can outperform English-only pretraining when translating English speech into other languages, a setting which favors monolingual pretraining. We hope XLS-R can help to improve speech processing tasks for many more languages of the world.\n\nThe original model can be found under URL",
"# Usage\n\nSee this google colab for more information on how to fine-tune the model.\n\nYou can find other pretrained XLS-R models with different numbers of parameters:\n\n* 300M parameters version\n* 1B version version\n* 2B version version"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #speech #xls_r #xls_r_pretrained #multilingual #ab #af #sq #am #ar #hy #as #az #ba #eu #be #bn #bs #br #bg #my #yue #ca #ceb #km #zh #cv #hr #cs #da #dv #nl #en #eo #et #fo #fi #fr #gl #lg #ka #de #el #gn #gu #ht #cnh #ha #haw #he #hi #hu #is #id #ia #ga #it #ja #jv #kb #kn #kk #rw #ky #ko #ku #lo #la #lv #ln #lt #lm #mk #mg #ms #ml #mt #gv #mi #mr #mn #ne #no #nn #oc #or #ps #fa #pl #pt #pa #ro #rm #ru #sah #sa #sco #sr #sn #sd #si #sk #sl #so #hsb #es #su #sw #sv #tl #tg #ta #tt #te #th #bo #tp #tr #tk #uk #ur #uz #vi #vot #war #cy #yi #yo #zu #dataset-common_voice #dataset-multilingual_librispeech #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-XLS-R-2B\n\nFacebook's Wav2Vec2 XLS-R counting 2 billion parameters.\n\n!model image\n\nXLS-R is Facebook AI's large-scale multilingual pretrained model for speech (the \"XLM-R for Speech\"). It is pretrained on 436k hours of unlabeled speech, including VoxPopuli, MLS, CommonVoice, BABEL, and VoxLingua107. It uses the wav2vec 2.0 objective, in 128 languages. When using the model make sure that your speech input is sampled at 16kHz. \n\nNote: This model should be fine-tuned on a downstream task, like Automatic Speech Recognition, Translation, or Classification. Check out this blog for more information about ASR.\n\nXLS-R Paper\n\nAuthors: Arun Babu, Changhan Wang, Andros Tjandra, Kushal Lakhotia, Qiantong Xu, Naman Goyal, Kritika Singh, Patrick von Platen, Yatharth Saraf, Juan Pino, Alexei Baevski, Alexis Conneau, Michael Auli\n\nAbstract\nThis paper presents XLS-R, a large-scale model for cross-lingual speech representation learning based on wav2vec 2.0. We train models with up to 2B parameters on 436K hours of publicly available speech audio in 128 languages, an order of magnitude more public data than the largest known prior work. Our evaluation covers a wide range of tasks, domains, data regimes and languages, both high and low-resource. On the CoVoST-2 speech translation benchmark, we improve the previous state of the art by an average of 7.4 BLEU over 21 translation directions into English. For speech recognition, XLS-R improves over the best known prior work on BABEL, MLS, CommonVoice as well as VoxPopuli, lowering error rates by 20%-33% relative on average. XLS-R also sets a new state of the art on VoxLingua107 language identification. Moreover, we show that with sufficient model size, cross-lingual pretraining can outperform English-only pretraining when translating English speech into other languages, a setting which favors monolingual pretraining. We hope XLS-R can help to improve speech processing tasks for many more languages of the world.\n\nThe original model can be found under URL",
"# Usage\n\nSee this google colab for more information on how to fine-tune the model.\n\nYou can find other pretrained XLS-R models with different numbers of parameters:\n\n* 300M parameters version\n* 1B version version\n* 2B version version"
] |
[
364,
542,
55
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #speech #xls_r #xls_r_pretrained #multilingual #ab #af #sq #am #ar #hy #as #az #ba #eu #be #bn #bs #br #bg #my #yue #ca #ceb #km #zh #cv #hr #cs #da #dv #nl #en #eo #et #fo #fi #fr #gl #lg #ka #de #el #gn #gu #ht #cnh #ha #haw #he #hi #hu #is #id #ia #ga #it #ja #jv #kb #kn #kk #rw #ky #ko #ku #lo #la #lv #ln #lt #lm #mk #mg #ms #ml #mt #gv #mi #mr #mn #ne #no #nn #oc #or #ps #fa #pl #pt #pa #ro #rm #ru #sah #sa #sco #sr #sn #sd #si #sk #sl #so #hsb #es #su #sw #sv #tl #tg #ta #tt #te #th #bo #tp #tr #tk #uk #ur #uz #vi #vot #war #cy #yi #yo #zu #dataset-common_voice #dataset-multilingual_librispeech #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n"
] |
[
-0.04758894816040993,
0.046750012785196304,
-0.014598522335290909,
0.007297620642930269,
0.04550313577055931,
0.06332182139158249,
0.05419944226741791,
0.09116444736719131,
0.0727275013923645,
0.120418980717659,
0.11597013473510742,
0.08993609249591827,
0.10850890725851059,
0.030012592673301697,
0.02611636370420456,
-0.23409810662269592,
-0.014968891628086567,
-0.01329061295837164,
-0.050599705427885056,
0.09369145333766937,
0.0228695347905159,
-0.039926063269376755,
0.09142758697271347,
-0.08705759048461914,
0.05855416879057884,
0.025268973782658577,
-0.04975442960858345,
0.006925574969500303,
0.026553384959697723,
0.04827604070305824,
0.005549455527216196,
0.07044541090726852,
0.04715149477124214,
-0.24975258111953735,
0.032285042107105255,
-0.01567848026752472,
-0.03946636989712715,
-0.007352286018431187,
0.019108077511191368,
-0.12647803127765656,
0.16093921661376953,
-0.058167893439531326,
-0.0829150602221489,
0.048358362168073654,
-0.17256440222263336,
-0.1780303716659546,
-0.057935889810323715,
0.11529215425252914,
0.05751940608024597,
0.04282727837562561,
-0.054432131350040436,
0.07651355117559433,
-0.10932482779026031,
0.05207585170865059,
0.20425982773303986,
-0.18100734055042267,
-0.03396864980459213,
0.052614592015743256,
0.03836875036358833,
0.06726958602666855,
-0.10757510364055634,
0.004715980961918831,
0.014004664495587349,
0.018325919285416603,
-0.06147626042366028,
-0.04280159994959831,
0.09236277639865875,
0.05261838808655739,
-0.0792241171002388,
-0.0023866845294833183,
0.12998612225055695,
0.04852217063307762,
0.05794971063733101,
0.07729648798704147,
-0.014261530712246895,
-0.17769774794578552,
-0.036947861313819885,
-0.0072819264605641365,
0.02907484956085682,
0.03221805766224861,
-0.0040543596260249615,
0.10043403506278992,
-0.04863365739583969,
0.027797162532806396,
-0.0013738577254116535,
0.03689282014966011,
0.05448931083083153,
-0.01068132370710373,
0.002564662601798773,
-0.008259820751845837,
0.02728724479675293,
-0.1339804083108902,
0.0193843524903059,
0.03314534202218056,
-0.011473586782813072,
-0.0013692154316231608,
0.08231918513774872,
0.08224035054445267,
0.106025330722332,
0.11278236657381058,
-0.0741412341594696,
0.09792093932628632,
0.053250301629304886,
0.05515800416469574,
0.03846575319766998,
0.030388658866286278,
-0.05339635908603668,
-0.07941558212041855,
-0.11798667907714844,
0.011203414760529995,
-0.039740465581417084,
0.002606295980513096,
-0.0391656793653965,
0.03476659208536148,
-0.0027384550776332617,
0.04721203073859215,
-0.002601043554022908,
0.04684453085064888,
-0.0627133846282959,
0.024724356830120087,
0.010196039453148842,
-0.04707365855574608,
0.03391134366393089,
0.10114587098360062,
0.019277868792414665,
0.10931576043367386,
-0.03982828930020332,
0.015561006963253021,
-0.010288936085999012,
0.06658016890287399,
-0.034604910761117935,
0.055008504539728165,
-0.009356276132166386,
-0.02502163127064705,
0.0725674033164978,
-0.046018797904253006,
0.04567970335483551,
-0.08267710357904434,
-0.028219234198331833,
-0.05670318007469177,
-0.0090637831017375,
-0.07997855544090271,
-0.0053685917519032955,
-0.08900962024927139,
-0.1086217612028122,
-0.002038841601461172,
0.012349596247076988,
0.052931562066078186,
-0.0727030411362648,
0.08642856776714325,
0.009818961843848228,
0.08657389879226685,
0.060204003006219864,
0.026761578395962715,
0.0009941438911482692,
0.07027294486761093,
-0.05940583348274231,
0.06373874843120575,
-0.08054359257221222,
0.03210092708468437,
-0.0985342487692833,
-0.09965771436691284,
-0.10359422862529755,
0.017799248918890953,
0.0003231614828109741,
0.17196089029312134,
-0.1537889987230301,
-0.09573788195848465,
0.26548218727111816,
-0.03543860465288162,
-0.03795851394534111,
0.12055743485689163,
0.05553283542394638,
-0.018209131434559822,
0.04342658445239067,
0.14339213073253632,
0.023143338039517403,
-0.09280882775783539,
-0.09612368792295456,
0.03393334895372391,
0.06252431869506836,
0.10833461582660675,
0.10885844379663467,
0.01866021566092968,
0.1016145795583725,
0.025084640830755234,
0.01778208650648594,
0.06347259134054184,
-0.08169955015182495,
-0.08290635794401169,
0.0662018284201622,
-0.04978432506322861,
0.036164697259664536,
0.10354362428188324,
-0.025877490639686584,
-0.051357705146074295,
-0.03679794445633888,
-0.11615405976772308,
0.09353452920913696,
0.003735500853508711,
-0.009119843132793903,
-0.09966672956943512,
-0.0024344315752387047,
0.06647340953350067,
0.02288619615137577,
-0.04526057466864586,
0.09622130542993546,
-0.053359005600214005,
0.11819121986627579,
0.0636710673570633,
0.07794049382209778,
0.10187273472547531,
-0.018218638375401497,
-0.07862292975187302,
-0.041916824877262115,
0.09991682320833206,
0.012974705547094345,
-0.0592535175383091,
-0.22189302742481232,
0.05805857852101326,
-0.021290823817253113,
0.09959299862384796,
-0.173395037651062,
0.03262442350387573,
0.1390991061925888,
0.1537192165851593,
0.0032056004274636507,
-0.011890897527337074,
-0.007739355321973562,
0.08227310329675674,
0.01825963705778122,
-0.015648340806365013,
0.033745236694812775,
-0.030913399532437325,
-0.0483984649181366,
0.02221750095486641,
-0.07240423560142517,
0.11462175101041794,
0.10714226216077805,
-0.0774630531668663,
-0.07751689106225967,
0.10570073872804642,
-0.021646583452820778,
-0.019964344799518585,
0.11689845472574234,
-0.0033914693631231785,
0.10226988047361374,
0.0063048601150512695,
0.020653771236538887,
-0.021763304248452187,
-0.02627391926944256,
0.016238698735833168,
-0.078618124127388,
-0.04888671264052391,
0.19366852939128876,
0.039686936885118484,
-0.14944960176944733,
0.19100140035152435,
0.13074064254760742,
0.046479348093271255,
0.1761941760778427,
-0.02098187804222107,
-0.038460101932287216,
-0.09703352302312851,
0.008853917010128498,
0.006463043857365847,
0.06676548719406128,
-0.1745358407497406,
-0.0192779041826725,
-0.035569097846746445,
-0.012208925560116768,
0.014522350393235683,
-0.07549697160720825,
-0.06783230602741241,
-0.049335066229104996,
-0.04544844850897789,
0.0057989503256976604,
0.05347583442926407,
-0.06442030519247055,
0.0798012763261795,
0.01801256649196148,
-0.03578075021505356,
-0.05564208701252937,
-0.0182326789945364,
-0.07343576848506927,
0.14630886912345886,
-0.1613723635673523,
-0.0870039165019989,
0.06405065953731537,
-0.10177189111709595,
0.058744970709085464,
-0.019216667860746384,
0.0016281373100355268,
-0.14172931015491486,
0.012648227624595165,
0.03568769991397858,
0.0959029421210289,
-0.10252966731786728,
-0.022290745750069618,
-0.010165792889893055,
0.010574684478342533,
-0.04425327852368355,
-0.022789975628256798,
-0.03573226556181908,
0.015208350494503975,
-0.0862061157822609,
0.09914316236972809,
-0.13020643591880798,
0.05319688096642494,
0.11437389999628067,
0.10433543473482132,
0.010324402712285519,
-0.014090786688029766,
0.15116819739341736,
-0.14562298357486725,
0.003910788334906101,
-0.03903409093618393,
-0.007643275894224644,
0.03717243671417236,
0.1350412517786026,
0.04445036128163338,
-0.06503818184137344,
-0.03835732862353325,
0.015962857753038406,
-0.004784753080457449,
-0.1427658051252365,
-0.0325813964009285,
-0.047438591718673706,
0.11782156676054001,
-0.04103660210967064,
0.08624549210071564,
-0.023751046508550644,
0.0010536120971664786,
-0.047525715082883835,
-0.12167119979858398,
-0.008506490848958492,
-0.04549160227179527,
0.0391707718372345,
-0.045567408204078674,
0.013115744106471539,
-0.03516572713851929,
-0.022930510342121124,
0.04655591398477554,
0.08895205706357956,
-0.06574182957410812,
0.05527547001838684,
0.07031182944774628,
0.07370050996541977,
0.1494828313589096,
-0.0243929885327816,
-0.043639298528432846,
0.043996717780828476,
-0.018319150432944298,
0.0061632124707102776,
-0.019752908498048782,
-0.031136393547058105,
0.026026621460914612,
0.15798795223236084,
-0.025608809664845467,
0.035328567028045654,
0.0016783815808594227,
0.12035245448350906,
0.07534033805131912,
0.06748493760824203,
-0.10401834547519684,
-0.029330622404813766,
0.07391085475683212,
0.007402131799608469,
-0.004358747974038124,
0.03399980440735817,
0.03638716787099838,
-0.05962919816374779,
0.11692819744348526,
0.08322925120592117,
0.023530272766947746,
-0.07913004606962204,
0.06384597718715668,
0.007824680767953396,
-0.00374860898591578,
-0.009654784575104713,
0.05424704775214195,
-0.2965855002403259,
0.19174015522003174,
0.020947936922311783,
0.010615861043334007,
-0.008552956394851208,
-0.0559433214366436,
0.03683074191212654,
0.07165589183568954,
0.13723109662532806,
0.0654105469584465,
-0.16263112425804138,
-0.16869181394577026,
-0.011847032234072685,
0.0036132351960986853,
0.1084655374288559,
-0.025796353816986084,
0.04171372205018997,
0.0539994016289711,
-0.042270079255104065,
-0.02768274024128914,
0.009784230962395668,
-0.07511284947395325,
-0.008112223818898201,
0.07865383476018906,
-0.043466437608003616,
0.033007413148880005,
-0.019895141944289207,
-0.03433030843734741,
-0.17768992483615875,
0.011113160289824009,
-0.15309588611125946,
0.013765513896942139,
-0.034572720527648926,
0.01982167549431324,
0.057510241866111755,
-0.11803165823221207,
-0.10935557633638382,
0.043603166937828064,
-0.07333322614431381,
-0.019134515896439552,
0.021817652508616447,
0.08984815329313278,
-0.0452747568488121,
-0.18489821255207062,
-0.003973573446273804,
0.116004578769207,
0.08338615298271179,
0.11872392892837524,
-0.04745079576969147,
0.02563413791358471,
-0.00017659025616012514,
-0.08358296006917953,
0.15851682424545288,
-0.021242165938019753,
-0.028059806674718857,
0.04623347893357277,
-0.008475400507450104,
-0.06448058784008026,
-0.08969849348068237,
-0.07474170625209808,
0.08955075591802597,
0.2982664108276367,
-0.02794746123254299,
0.10638932883739471,
0.0982741042971611,
-0.08372929692268372,
-0.2470681369304657,
-0.11061340570449829,
-0.01849932037293911,
0.022611845284700394,
-0.014536612667143345,
-0.20981232821941376,
-0.037290822714567184,
-0.0007734635728411376,
0.027293013408780098,
-0.011917482130229473,
-0.27352064847946167,
-0.03471164405345917,
0.1081906408071518,
0.0014343776274472475,
0.08150352537631989,
-0.1988932341337204,
-0.03273606672883034,
-0.006223125848919153,
-0.061205051839351654,
-0.08480805903673172,
-0.01567237451672554,
0.04754459857940674,
-0.012480777688324451,
0.01996791735291481,
-0.007350675296038389,
-0.009788032621145248,
0.16906900703907013,
0.055363982915878296,
-0.015854258090257645,
-0.08185693621635437,
-0.112275131046772,
0.011410282924771309,
0.027432221919298172,
-0.020014578476548195,
-0.12136157602071762,
-0.06844186037778854,
-0.05250456929206848,
0.028336280956864357,
-0.1432066410779953,
-0.0038499508518725634,
-0.053854309022426605,
0.01680470071732998,
-0.043834712356328964,
0.07147806882858276,
0.061997462064027786,
0.015688452869653702,
0.07988321781158447,
-0.09488020837306976,
0.10145483165979385,
0.003983052913099527,
0.13370566070079803,
0.05125358700752258,
0.003471267642453313,
-0.032494474202394485,
-0.012622587382793427,
-0.007999869994819164,
-0.09816669672727585,
-0.008034632541239262,
0.1326776146888733,
0.006311078555881977,
0.08053737878799438,
0.0388166643679142,
-0.11542125046253204,
0.005140396766364574,
0.10659848153591156,
-0.08538658916950226,
-0.16917502880096436,
-0.0030268800910562277,
-0.0890909805893898,
-0.02113277278840542,
-0.013151636347174644,
0.09828539192676544,
-0.004258833825588226,
-0.009548707865178585,
0.009000595659017563,
0.08283234387636185,
-0.06984352320432663,
0.1317896991968155,
0.06573610007762909,
0.007921171374619007,
-0.07596545666456223,
0.018116388469934464,
0.008600310422480106,
-0.05593857914209366,
0.01814286783337593,
0.16202816367149353,
-0.03931687772274017,
-0.08665654808282852,
-0.002317215083166957,
0.11540139466524124,
0.09805311262607574,
-0.014468617737293243,
-0.00561782019212842,
-0.12169135361909866,
0.05704066902399063,
0.1739104688167572,
0.02329704351723194,
0.03515571355819702,
0.06376361101865768,
0.03490538150072098,
0.04636535793542862,
0.09959180653095245,
0.06675101816654205,
-0.019476937130093575,
-0.04469487816095352,
0.0639730915427208,
-0.05874791368842125,
0.10523588955402374,
-0.008864161558449268,
-0.013945162296295166,
-0.18875719606876373,
0.055915020406246185,
-0.051443327218294144,
-0.051493141800165176,
-0.1284194439649582,
-0.028333108872175217,
0.01252016332000494,
-0.10935238003730774,
-0.05806810036301613,
-0.051216673105955124,
-0.08376302570104599,
-0.011372096836566925,
0.022658830508589745,
0.12448020279407501,
-0.06087949499487877,
-0.06464768201112747,
0.08071742206811905,
-0.06336674839258194,
0.0726860836148262,
0.11158103495836258,
-0.0033723560627549887,
0.08947547525167465,
-0.15136584639549255,
-0.020136872306466103,
0.03837635740637779,
0.018109142780303955,
-0.026558663696050644,
-0.021494993939995766,
-0.0733703002333641,
-0.06686865538358688,
0.019500169903039932,
0.056125253438949585,
0.023654242977499962,
0.018599938601255417,
0.15732167661190033,
-0.03313099220395088,
-0.08085320144891739,
-0.027194727212190628,
0.04946548864245415,
0.07403677701950073,
-0.013004578649997711,
0.021954303607344627,
-0.08872180432081223,
0.04596378654241562,
-0.10859571397304535,
0.029522854834794998,
0.0036232504062354565,
-0.09073837846517563,
0.012730208225548267,
-0.034900274127721786,
0.08679858595132828,
-0.0240815170109272,
0.06667061150074005,
-0.07957162708044052,
-0.12135081738233566,
0.014396972954273224,
-0.03353375941514969,
-0.07763993740081787,
0.04483365640044212,
0.018808657303452492,
0.052962254732847214,
-0.06260966509580612,
-0.09313981235027313,
0.03413401544094086,
-0.027125364169478416,
-0.03185851871967316,
0.10172892361879349,
0.10287643224000931,
0.15682417154312134,
0.04110949859023094,
-0.012061616405844688,
-0.10464885085821152,
0.030519677326083183,
0.04306397587060928,
-0.1267947256565094,
-0.03761861100792885,
-0.033327169716358185,
0.1723002791404724,
0.14371195435523987,
-0.12686452269554138,
0.025362113490700722,
-0.0958649069070816,
-0.06121267378330231,
-0.09734506160020828,
-0.11149827390909195,
-0.02032903954386711,
-0.04046092554926872,
0.034334179013967514,
-0.08462601155042648,
0.023711245507001877,
0.05536153167486191,
0.055003099143505096,
0.022635193541646004,
0.0749579444527626,
0.05964932218194008,
-0.029332127422094345,
0.02037837915122509,
0.015804825350642204,
-0.027766520157456398,
-0.12268771976232529,
0.06084098666906357,
0.012114623561501503,
-0.04785197228193283,
0.00784006342291832,
0.006400256883352995,
-0.1174456775188446,
-0.0028134838212281466,
-0.08843544125556946,
-0.10307108610868454,
0.0019086425891146064,
0.03586211055517197,
0.01882612146437168,
0.08446840196847916,
0.018749628216028214,
0.0027973330579698086,
0.03613686561584473,
0.05063589662313461,
-0.0379384309053421,
-0.05817336589097977,
-0.054089002311229706,
0.12242035567760468,
-0.05788540840148926,
0.024130774661898613,
-0.005720221437513828,
-0.010565275326371193,
0.045951247215270996,
0.1654592603445053,
0.22614310681819916,
-0.06411873549222946,
0.06531936675310135,
0.01827136240899563,
0.04413125663995743,
0.037272192537784576,
-0.05637273192405701,
0.09209465235471725,
0.14816640317440033,
-0.10727906227111816,
0.04184422641992569,
-0.0515945665538311,
-0.018564878031611443,
-0.042329996824264526,
-0.04453852027654648,
0.031944289803504944,
0.01670663431286812,
-0.04309053346514702,
0.11987631022930145,
-0.19023706018924713,
-0.09840022027492523,
0.019816596060991287,
-0.19020907580852509,
-0.026547834277153015,
-0.031278468668460846,
0.033922936767339706,
0.15247024595737457,
0.07217798382043839,
-0.018671637400984764,
-0.07651641219854355,
0.046709176152944565,
0.047176163643598557,
-0.09028580039739609,
0.06400150060653687,
0.05047034099698067,
-0.09749908745288849,
0.04046724736690521,
-0.07054471224546432,
0.03860846161842346,
0.10197609663009644,
0.001449240604415536,
0.02716143988072872,
0.01903568208217621,
0.07957754284143448,
-0.04702087119221687,
-0.09546404331922531,
0.08421297371387482,
-0.0016376468120142817,
0.04542030394077301,
0.1331513673067093,
-0.02716847136616707,
0.0395800918340683,
0.0850001648068428,
-0.05050958693027496,
0.03210289403796196,
0.11880776286125183,
-0.039885617792606354,
0.06432854384183884,
0.10259599983692169,
-0.009587311185896397,
-0.021405193954706192,
-0.011172259226441383,
-0.00789770670235157,
-0.033448997884988785,
0.05320707708597183,
-0.03739314153790474,
-0.1015140563249588,
-0.019261715933680534,
-0.01634345017373562,
0.08806314319372177,
0.000858892744872719,
-0.06017018482089043,
0.017442673444747925,
0.012204518541693687,
-0.07268474251031876,
0.09057633578777313,
0.07205245643854141,
0.019738230854272842,
-0.05116759613156319,
-0.1648966372013092,
0.0272023007273674,
0.13811902701854706,
-0.09347319602966309,
-0.050105031579732895
] |
null | null |
transformers
|
# Wav2Vec2-XLS-R-300M-21-EN
Facebook's Wav2Vec2 XLS-R fine-tuned for **Speech Translation.**

This is a [SpeechEncoderDecoderModel](https://huggingface.co/transformers/model_doc/speechencoderdecoder.html) model.
The encoder was warm-started from the [**`facebook/wav2vec2-xls-r-300m`**](https://huggingface.co/facebook/wav2vec2-xls-r-300m) checkpoint and
the decoder from the [**`facebook/mbart-large-50`**](https://huggingface.co/facebook/mbart-large-50) checkpoint.
Consequently, the encoder-decoder model was fine-tuned on 21 `{lang}` -> `en` translation pairs of the [Covost2 dataset](https://huggingface.co/datasets/covost2).
The model can translate from the following spoken languages `{lang}` -> `en` (English):
{`fr`, `de`, `es`, `ca`, `it`, `ru`, `zh-CN`, `pt`, `fa`, `et`, `mn`, `nl`, `tr`, `ar`, `sv-SE`, `lv`, `sl`, `ta`, `ja`, `id`, `cy`} -> `en`
For more information, please refer to Section *5.1.2* of the [official XLS-R paper](https://arxiv.org/abs/2111.09296).
## Usage
### Demo
The model can be tested directly on the speech recognition widget on this model card!
Simply record some audio in one of the possible spoken languages or pick an example audio file to see how well the checkpoint can translate the input.
### Example
As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the
transcripts by passing the speech features to the model.
You can use the model directly via the ASR pipeline
```python
from datasets import load_dataset
from transformers import pipeline
# replace following lines to load an audio file of your choice
librispeech_en = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
audio_file = librispeech_en[0]["file"]
asr = pipeline("automatic-speech-recognition", model="facebook/wav2vec2-xls-r-300m-21-to-en", feature_extractor="facebook/wav2vec2-xls-r-300m-21-to-en")
translation = asr(audio_file)
```
or step-by-step as follows:
```python
import torch
from transformers import Speech2Text2Processor, SpeechEncoderDecoderModel
from datasets import load_dataset
model = SpeechEncoderDecoderModel.from_pretrained("facebook/wav2vec2-xls-r-300m-21-to-en")
processor = Speech2Text2Processor.from_pretrained("facebook/wav2vec2-xls-r-300m-21-to-en")
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
inputs = processor(ds[0]["audio"]["array"], sampling_rate=ds[0]["audio"]["sampling_rate"], return_tensors="pt")
generated_ids = model.generate(inputs["input_values"], attention_mask=inputs["attention_mask"])
transcription = processor.batch_decode(generated_ids)
```
## Results `{lang}` -> `en`
See the row of **XLS-R (0.3B)** for the performance on [Covost2](https://huggingface.co/datasets/covost2) for this model.

## More XLS-R models for `{lang}` -> `en` Speech Translation
- [Wav2Vec2-XLS-R-300M-21-EN](https://huggingface.co/facebook/wav2vec2-xls-r-300m-21-to-en)
- [Wav2Vec2-XLS-R-1B-21-EN](https://huggingface.co/facebook/wav2vec2-xls-r-1b-21-to-en)
- [Wav2Vec2-XLS-R-2B-21-EN](https://huggingface.co/facebook/wav2vec2-xls-r-2b-21-to-en)
- [Wav2Vec2-XLS-R-2B-22-16](https://huggingface.co/facebook/wav2vec2-xls-r-2b-22-to-16)
|
{"language": ["multilingual", "fr", "de", "es", "ca", "it", "ru", "zh", "pt", "fa", "et", "mn", "nl", "tr", "ar", "sv", "lv", "sl", "ta", "ja", "id", "cy", "en"], "license": "apache-2.0", "tags": ["speech", "xls_r", "automatic-speech-recognition", "xls_r_translation"], "datasets": ["common_voice", "multilingual_librispeech", "covost2"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Swedish", "src": "https://cdn-media.huggingface.co/speech_samples/cv_swedish_1.mp3"}, {"example_title": "Arabic", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_ar_19058308.mp3"}, {"example_title": "Russian", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_ru_18849022.mp3"}, {"example_title": "German", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_de_17284683.mp3"}, {"example_title": "French", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_fr_17299386.mp3"}, {"example_title": "Indonesian", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_id_19051309.mp3"}, {"example_title": "Italian", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_it_17415776.mp3"}, {"example_title": "Japanese", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_ja_19482488.mp3"}, {"example_title": "Mongolian", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_mn_18565396.mp3"}, {"example_title": "Dutch", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_nl_17691471.mp3"}, {"example_title": "Russian", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_ru_18849022.mp3"}, {"example_title": "Turkish", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_tr_17341280.mp3"}, {"example_title": "Catalan", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_ca_17367522.mp3"}, {"example_title": "English", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_en_18301577.mp3"}, {"example_title": "Dutch", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_nl_17691471.mp3"}]}
|
automatic-speech-recognition
|
facebook/wav2vec2-xls-r-300m-21-to-en
|
[
"transformers",
"pytorch",
"speech-encoder-decoder",
"automatic-speech-recognition",
"speech",
"xls_r",
"xls_r_translation",
"multilingual",
"fr",
"de",
"es",
"ca",
"it",
"ru",
"zh",
"pt",
"fa",
"et",
"mn",
"nl",
"tr",
"ar",
"sv",
"lv",
"sl",
"ta",
"ja",
"id",
"cy",
"en",
"dataset:common_voice",
"dataset:multilingual_librispeech",
"dataset:covost2",
"arxiv:2111.09296",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2111.09296"
] |
[
"multilingual",
"fr",
"de",
"es",
"ca",
"it",
"ru",
"zh",
"pt",
"fa",
"et",
"mn",
"nl",
"tr",
"ar",
"sv",
"lv",
"sl",
"ta",
"ja",
"id",
"cy",
"en"
] |
TAGS
#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #fr #de #es #ca #it #ru #zh #pt #fa #et #mn #nl #tr #ar #sv #lv #sl #ta #ja #id #cy #en #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-XLS-R-300M-21-EN
Facebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.
!model image
This is a SpeechEncoderDecoderModel model.
The encoder was warm-started from the 'facebook/wav2vec2-xls-r-300m' checkpoint and
the decoder from the 'facebook/mbart-large-50' checkpoint.
Consequently, the encoder-decoder model was fine-tuned on 21 '{lang}' -> 'en' translation pairs of the Covost2 dataset.
The model can translate from the following spoken languages '{lang}' -> 'en' (English):
{'fr', 'de', 'es', 'ca', 'it', 'ru', 'zh-CN', 'pt', 'fa', 'et', 'mn', 'nl', 'tr', 'ar', 'sv-SE', 'lv', 'sl', 'ta', 'ja', 'id', 'cy'} -> 'en'
For more information, please refer to Section *5.1.2* of the official XLS-R paper.
## Usage
### Demo
The model can be tested directly on the speech recognition widget on this model card!
Simply record some audio in one of the possible spoken languages or pick an example audio file to see how well the checkpoint can translate the input.
### Example
As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the
transcripts by passing the speech features to the model.
You can use the model directly via the ASR pipeline
or step-by-step as follows:
## Results '{lang}' -> 'en'
See the row of XLS-R (0.3B) for the performance on Covost2 for this model.
!results image
## More XLS-R models for '{lang}' -> 'en' Speech Translation
- Wav2Vec2-XLS-R-300M-21-EN
- Wav2Vec2-XLS-R-1B-21-EN
- Wav2Vec2-XLS-R-2B-21-EN
- Wav2Vec2-XLS-R-2B-22-16
|
[
"# Wav2Vec2-XLS-R-300M-21-EN\n\nFacebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.\n\n!model image\n\nThis is a SpeechEncoderDecoderModel model. \nThe encoder was warm-started from the 'facebook/wav2vec2-xls-r-300m' checkpoint and\nthe decoder from the 'facebook/mbart-large-50' checkpoint.\nConsequently, the encoder-decoder model was fine-tuned on 21 '{lang}' -> 'en' translation pairs of the Covost2 dataset.\n\nThe model can translate from the following spoken languages '{lang}' -> 'en' (English):\n\n{'fr', 'de', 'es', 'ca', 'it', 'ru', 'zh-CN', 'pt', 'fa', 'et', 'mn', 'nl', 'tr', 'ar', 'sv-SE', 'lv', 'sl', 'ta', 'ja', 'id', 'cy'} -> 'en'\n\nFor more information, please refer to Section *5.1.2* of the official XLS-R paper.",
"## Usage",
"### Demo\n\nThe model can be tested directly on the speech recognition widget on this model card! \nSimple record some audio in one of the possible spoken languages or pick an example audio file to see how well the checkpoint can translate the input.",
"### Example \n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline\n\n\n\nor step-by-step as follows:",
"## Results '{lang}' -> 'en'\n\nSee the row of XLS-R (0.3B) for the performance on Covost2 for this model.\n\n!results image",
"## More XLS-R models for '{lang}' -> 'en' Speech Translation\n\n- Wav2Vec2-XLS-R-300M-21-EN\n- Wav2Vec2-XLS-R-1B-21-EN\n- Wav2Vec2-XLS-R-2B-21-EN\n- Wav2Vec2-XLS-R-2B-22-16"
] |
[
"TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #fr #de #es #ca #it #ru #zh #pt #fa #et #mn #nl #tr #ar #sv #lv #sl #ta #ja #id #cy #en #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-XLS-R-300M-21-EN\n\nFacebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.\n\n!model image\n\nThis is a SpeechEncoderDecoderModel model. \nThe encoder was warm-started from the 'facebook/wav2vec2-xls-r-300m' checkpoint and\nthe decoder from the 'facebook/mbart-large-50' checkpoint.\nConsequently, the encoder-decoder model was fine-tuned on 21 '{lang}' -> 'en' translation pairs of the Covost2 dataset.\n\nThe model can translate from the following spoken languages '{lang}' -> 'en' (English):\n\n{'fr', 'de', 'es', 'ca', 'it', 'ru', 'zh-CN', 'pt', 'fa', 'et', 'mn', 'nl', 'tr', 'ar', 'sv-SE', 'lv', 'sl', 'ta', 'ja', 'id', 'cy'} -> 'en'\n\nFor more information, please refer to Section *5.1.2* of the official XLS-R paper.",
"## Usage",
"### Demo\n\nThe model can be tested directly on the speech recognition widget on this model card! \nSimple record some audio in one of the possible spoken languages or pick an example audio file to see how well the checkpoint can translate the input.",
"### Example \n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline\n\n\n\nor step-by-step as follows:",
"## Results '{lang}' -> 'en'\n\nSee the row of XLS-R (0.3B) for the performance on Covost2 for this model.\n\n!results image",
"## More XLS-R models for '{lang}' -> 'en' Speech Translation\n\n- Wav2Vec2-XLS-R-300M-21-EN\n- Wav2Vec2-XLS-R-1B-21-EN\n- Wav2Vec2-XLS-R-2B-21-EN\n- Wav2Vec2-XLS-R-2B-22-16"
] |
[
152,
280,
3,
52,
66,
40,
82
] |
[
"passage: TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #fr #de #es #ca #it #ru #zh #pt #fa #et #mn #nl #tr #ar #sv #lv #sl #ta #ja #id #cy #en #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# Wav2Vec2-XLS-R-300M-21-EN\n\nFacebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.\n\n!model image\n\nThis is a SpeechEncoderDecoderModel model. \nThe encoder was warm-started from the 'facebook/wav2vec2-xls-r-300m' checkpoint and\nthe decoder from the 'facebook/mbart-large-50' checkpoint.\nConsequently, the encoder-decoder model was fine-tuned on 21 '{lang}' -> 'en' translation pairs of the Covost2 dataset.\n\nThe model can translate from the following spoken languages '{lang}' -> 'en' (English):\n\n{'fr', 'de', 'es', 'ca', 'it', 'ru', 'zh-CN', 'pt', 'fa', 'et', 'mn', 'nl', 'tr', 'ar', 'sv-SE', 'lv', 'sl', 'ta', 'ja', 'id', 'cy'} -> 'en'\n\nFor more information, please refer to Section *5.1.2* of the official XLS-R paper.## Usage### Demo\n\nThe model can be tested directly on the speech recognition widget on this model card! \nSimple record some audio in one of the possible spoken languages or pick an example audio file to see how well the checkpoint can translate the input."
] |
[
-0.07269272208213806,
0.03771946206688881,
-0.005541683174669743,
0.021866342052817345,
0.02256869338452816,
-0.01754506304860115,
0.025793658569455147,
0.09876181185245514,
0.04188045114278793,
0.13749729096889496,
-0.007112571969628334,
0.0712776854634285,
0.07575750350952148,
0.12582933902740479,
-0.021605653688311577,
-0.15340964496135712,
0.05517818033695221,
-0.10860460251569748,
0.03572501987218857,
0.07446803897619247,
0.0875253677368164,
-0.07546468079090118,
0.031890470534563065,
-0.017779527232050896,
-0.0027422006241977215,
0.01680544763803482,
-0.01237337477505207,
-0.06566978245973587,
0.042221762239933014,
0.09722664207220078,
0.04075586050748825,
0.06732922047376633,
0.06084155663847923,
-0.2824968993663788,
0.022039363160729408,
0.06961715966463089,
-0.0062951999716460705,
0.008863543160259724,
0.1335848867893219,
-0.06472192704677582,
0.03929285332560539,
-0.06360851228237152,
-0.017076406627893448,
0.09181468188762665,
-0.06957872956991196,
-0.24008934199810028,
-0.06878264248371124,
0.061028093099594116,
0.07594383507966995,
0.04706517234444618,
-0.07472823560237885,
0.009990490041673183,
-0.0075569902546703815,
0.07861382514238358,
0.14843255281448364,
-0.20249435305595398,
-0.0077683767303824425,
-0.02180139347910881,
0.0878896713256836,
0.03097817301750183,
-0.06563270092010498,
0.05684700235724449,
0.005598717834800482,
-0.006482074968516827,
-0.016366509720683098,
-0.045541394501924515,
-0.02386200614273548,
-0.03133567422628403,
-0.13616560399532318,
-0.03511057421565056,
0.0658704861998558,
0.07500293850898743,
-0.044007834047079086,
-0.14096230268478394,
-0.03689628839492798,
-0.05026888847351074,
-0.03624197840690613,
-0.07516760379076004,
-0.04978397488594055,
-0.000825632712803781,
0.015093415044248104,
-0.08339554071426392,
-0.09963331371545792,
0.007735314313322306,
-0.0627162978053093,
0.14098353683948517,
0.004304287955164909,
0.021968461573123932,
0.03292283043265343,
0.039668623358011246,
0.014740761369466782,
-0.09289642423391342,
-0.014401859603822231,
-0.04738209769129753,
-0.1881396621465683,
0.000011014841220458038,
-0.018544742837548256,
-0.04962257295846939,
0.116686150431633,
0.1145610511302948,
-0.040704477578401566,
0.0731227844953537,
-0.07707613706588745,
0.015215168707072735,
0.043076835572719574,
0.11196105182170868,
-0.1095208153128624,
-0.06607745587825775,
-0.03832109645009041,
-0.00735405832529068,
-0.03162522241473198,
-0.0372484028339386,
-0.05887816473841667,
-0.008889381773769855,
0.024349452927708626,
0.0664193257689476,
0.07465965300798416,
-0.0008793683373369277,
-0.06573444604873657,
-0.0482216514647007,
0.1211547702550888,
-0.14292189478874207,
0.051332391798496246,
0.12248681485652924,
-0.02999824285507202,
0.07593198865652084,
-0.004533143248409033,
0.03395780175924301,
-0.07565248012542725,
-0.005345067474991083,
0.03483964502811432,
0.012039022520184517,
-0.019178738817572594,
-0.11130416393280029,
0.02558360993862152,
-0.04284831881523132,
-0.07619869709014893,
-0.11267314851284027,
-0.0067037236876785755,
-0.07641847431659698,
0.020544417202472687,
-0.07086656242609024,
0.04514428228139877,
-0.056488797068595886,
-0.03902789205312729,
0.0306625347584486,
-0.019391175359487534,
-0.005426491145044565,
-0.04130438342690468,
0.03382690250873566,
-0.042634665966033936,
0.0913430005311966,
0.0364442840218544,
0.03405480086803436,
-0.009752935729920864,
0.021326230838894844,
-0.15272016823291779,
0.21453173458576202,
-0.12637187540531158,
-0.027387401089072227,
-0.14371752738952637,
-0.043485164642333984,
0.007425971329212189,
0.0335257351398468,
0.007290687412023544,
0.062206801027059555,
-0.22695805132389069,
-0.06820417940616608,
0.21168819069862366,
-0.0648617222905159,
-0.04147844761610031,
0.15427683293819427,
0.002330745803192258,
-0.044772177934646606,
0.06927241384983063,
0.14303986728191376,
0.14548824727535248,
-0.20317625999450684,
-0.01862959936261177,
-0.003397878725081682,
-0.006579092238098383,
0.1771751046180725,
0.08592423051595688,
-0.09327546507120132,
0.05266840010881424,
0.0025913724675774574,
-0.014905937016010284,
-0.015111841261386871,
-0.01666068658232689,
-0.03653130307793617,
0.04859243705868721,
-0.041644319891929626,
0.09663642197847366,
-0.029966793954372406,
-0.07643026113510132,
-0.024717232212424278,
-0.09550908207893372,
-0.007776788901537657,
0.07560988515615463,
-0.043296217918395996,
0.027121897786855698,
-0.1041959822177887,
0.05572499707341194,
0.0028804042376577854,
0.008258531801402569,
-0.18419331312179565,
-0.05189864709973335,
0.007766540627926588,
-0.07736535370349884,
0.08350121974945068,
0.07406093925237656,
0.02291973866522312,
0.025459984317421913,
0.017545199021697044,
-0.007797710131853819,
0.03866741806268692,
0.0045133731327950954,
0.021184170618653297,
-0.1153828427195549,
-0.053672097623348236,
-0.05840243399143219,
0.11535380035638809,
-0.08413642644882202,
-0.03814348205924034,
0.12263783067464828,
0.14412611722946167,
0.014521214179694653,
0.0017632755916565657,
0.011004743166267872,
0.02455422654747963,
0.04134071245789528,
-0.0017359761986881495,
-0.0016420435858890414,
-0.028350774198770523,
-0.03369821980595589,
0.08786798268556595,
-0.1282576471567154,
-0.06466906517744064,
0.0677778348326683,
0.014628220349550247,
-0.07528388500213623,
0.02146090939640999,
-0.034969598054885864,
0.003071351908147335,
-0.08736108988523483,
-0.06545912474393845,
0.19249138236045837,
0.09169884026050568,
0.07618768513202667,
-0.07015776634216309,
-0.054641637951135635,
0.02213209681212902,
-0.04762472212314606,
-0.044031497091054916,
0.11690549552440643,
-0.029216241091489792,
-0.1641787588596344,
0.030897198244929314,
-0.0007919420022517443,
0.04052433371543884,
0.19363801181316376,
0.0016854879213497043,
-0.09616654366254807,
-0.05575422942638397,
0.045601218938827515,
-0.0019683176651597023,
-0.011551999486982822,
0.06252874433994293,
0.009623671881854534,
0.04986792802810669,
-0.003210319671779871,
0.028356188908219337,
-0.044179294258356094,
0.05225633084774017,
0.023354453966021538,
-0.1010812297463417,
0.08764921873807907,
0.03927474841475487,
0.043175458908081055,
0.04331587255001068,
-0.00005503050488187,
-0.05219992622733116,
-0.06202656403183937,
-0.05090383440256119,
-0.10065903514623642,
0.10097724199295044,
-0.14325125515460968,
-0.34279072284698486,
-0.1274057775735855,
-0.03230800852179527,
-0.049414101988077164,
-0.0017438518116250634,
0.053364839404821396,
-0.07914626598358154,
-0.05712227523326874,
-0.05184170603752136,
0.021594775840640068,
0.0014858903596177697,
-0.06037139147520065,
0.027815094217658043,
0.019794799387454987,
0.04447658732533455,
-0.08708932995796204,
0.005586500745266676,
0.018477918580174446,
-0.046864427626132965,
-0.01898324489593506,
0.05189274251461029,
0.051083896309137344,
0.11178083717823029,
0.03345441073179245,
0.03601602464914322,
-0.015857666730880737,
0.19272162020206451,
-0.1107025071978569,
0.0760936513543129,
0.09272776544094086,
-0.043118011206388474,
0.05850682780146599,
0.19105692207813263,
0.016408290714025497,
-0.04473107308149338,
0.010930630378425121,
0.033641889691352844,
0.021644271910190582,
-0.2391822338104248,
-0.12214092910289764,
-0.03244060277938843,
0.01954924315214157,
0.03512191027402878,
0.03447793424129486,
0.04724659398198128,
-0.03692200034856796,
-0.0648680254817009,
-0.09378845244646072,
0.08236641436815262,
0.008432724513113499,
0.13959582149982452,
0.00749670946970582,
0.0699891448020935,
-0.029859310016036034,
-0.0442831888794899,
0.11387466639280319,
0.008810336701571941,
0.058247160166502,
0.057470161467790604,
0.1924586445093155,
0.0929897129535675,
0.050149764865636826,
0.02552921697497368,
-0.01906624436378479,
-0.01289078313857317,
0.03560502082109451,
0.03343428298830986,
-0.08358535170555115,
0.03534578159451485,
0.03273579850792885,
0.18994225561618805,
-0.09997174888849258,
0.01127893291413784,
0.0020111689809709787,
0.09848770499229431,
0.14032143354415894,
0.10708290338516235,
-0.1153382658958435,
0.0038442956283688545,
0.0009742515394464135,
-0.045183707028627396,
-0.055772941559553146,
-0.01993374153971672,
0.11401458829641342,
-0.09886744618415833,
0.10478588938713074,
0.026514071971178055,
0.08369710296392441,
-0.1093604564666748,
-0.004532046616077423,
0.03262067958712578,
0.07519453763961792,
0.01613432727754116,
0.06739168614149094,
-0.16344907879829407,
0.13186274468898773,
0.025424301624298096,
0.016230814158916473,
-0.006371099967509508,
0.013725082390010357,
-0.009299332275986671,
-0.049883194267749786,
0.15304531157016754,
-0.006112204398959875,
-0.05943794921040535,
-0.09521974623203278,
-0.12128262966871262,
0.008189001120626926,
0.14152850210666656,
-0.07374392449855804,
0.04935256391763687,
0.009751022793352604,
-0.08548444509506226,
-0.06398763507604599,
0.013453596271574497,
-0.14361456036567688,
-0.13173514604568481,
0.06791992485523224,
0.02910993993282318,
0.06711870431900024,
-0.008291901089251041,
-0.021345065906643867,
-0.16955187916755676,
0.1144038513302803,
-0.12064777314662933,
-0.04493236169219017,
-0.14254756271839142,
-0.0011389774736016989,
0.18152156472206116,
-0.09210732579231262,
0.06401395052671432,
0.010032596066594124,
0.16075149178504944,
-0.04611830413341522,
-0.07944809645414352,
0.04937709867954254,
-0.07414525002241135,
-0.11712747812271118,
-0.01616990752518177,
0.16615037620067596,
0.11672207713127136,
0.040456488728523254,
0.06673271209001541,
0.03633401542901993,
0.017704926431179047,
-0.09745343029499054,
0.03521999716758728,
0.06614845991134644,
-0.04150528088212013,
0.03642831742763519,
0.018560729920864105,
-0.21391107141971588,
-0.1120477169752121,
-0.018954021856188774,
0.16096636652946472,
0.1373196393251419,
-0.11442617326974869,
0.16185660660266876,
0.19841453433036804,
-0.06094186007976532,
-0.18146196007728577,
-0.12828142940998077,
0.13487480580806732,
0.05598965659737587,
-0.039573848247528076,
-0.1847572922706604,
0.05793743208050728,
0.03338797390460968,
-0.011054079979658127,
0.014002260752022266,
-0.2403659075498581,
-0.11264297366142273,
0.12473264336585999,
-0.059831369668245316,
-0.14565323293209076,
-0.07252931594848633,
-0.11140626668930054,
-0.10361038148403168,
-0.114597849547863,
0.09021230787038803,
-0.1515585035085678,
0.043669845908880234,
0.08782785385847092,
0.018232036381959915,
0.02044183760881424,
0.015335612930357456,
0.1150602474808693,
0.06215450540184975,
-0.021775776520371437,
-0.040950044989585876,
0.06396783143281937,
-0.05641073361039162,
-0.017536209896206856,
0.09486902505159378,
-0.006629070732742548,
0.009153801947832108,
-0.035287655889987946,
-0.056171271950006485,
-0.07561753690242767,
0.07067297399044037,
-0.021796543151140213,
-0.005178432911634445,
-0.029626892879605293,
0.003730379045009613,
0.07837862521409988,
-0.017577122896909714,
-0.07809446007013321,
-0.09841111302375793,
0.031153229996562004,
0.203058123588562,
0.11801927536725998,
0.0683315172791481,
-0.10945241153240204,
-0.03567027300596237,
-0.04079709202051163,
0.012629594653844833,
0.016230851411819458,
0.06772293150424957,
0.10188218206167221,
-0.009821273386478424,
0.14005349576473236,
-0.036109816282987595,
-0.12721794843673706,
0.02328147366642952,
0.04768862947821617,
-0.08124464750289917,
-0.15775112807750702,
-0.006282359827309847,
-0.0007528617861680686,
-0.005197568330913782,
-0.05541291460394859,
0.15909510850906372,
0.04068714380264282,
-0.048156291246414185,
-0.01414984930306673,
0.04573714733123779,
-0.04416899010539055,
0.1339675486087799,
0.06278780102729797,
0.08368221670389175,
-0.1045653223991394,
0.04560253396630287,
0.09369289875030518,
-0.06601322442293167,
0.03790998086333275,
0.10514393448829651,
-0.08963968604803085,
-0.07547330111265182,
-0.019504684954881668,
0.06037639081478119,
0.004099571146070957,
-0.043986666947603226,
0.037290289998054504,
-0.09913882613182068,
0.04892804101109505,
0.20683737099170685,
0.015272757038474083,
0.024784263223409653,
0.02904963493347168,
-0.0461413599550724,
-0.0005275489529594779,
0.1242084875702858,
0.0641399398446083,
-0.022950462996959686,
-0.0825289934873581,
0.09856703877449036,
0.007191451266407967,
0.05163775384426117,
-0.019005639478564262,
-0.03659844771027565,
-0.07227437198162079,
0.023259393870830536,
-0.13886474072933197,
0.06508840620517731,
-0.0649767518043518,
-0.012691318988800049,
0.01576182246208191,
-0.039722319692373276,
0.012192637659609318,
0.011661574244499207,
-0.1041286513209343,
-0.040435418486595154,
-0.03453781455755234,
0.06611093878746033,
-0.1828252524137497,
-0.011281916871666908,
0.05481773242354393,
-0.051084741950035095,
0.09587474912405014,
0.050720877945423126,
-0.0300427433103323,
0.04498162865638733,
-0.17130103707313538,
-0.10562478750944138,
0.02339422143995762,
0.06332060694694519,
0.037832144647836685,
-0.1393057107925415,
0.024484097957611084,
0.018167398869991302,
-0.01675817184150219,
-0.01672591269016266,
0.029468128457665443,
-0.08215582370758057,
0.011767596006393433,
-0.08338966965675354,
-0.006552486680448055,
-0.032349251210689545,
0.04804914817214012,
0.08962766826152802,
0.051669005304574966,
0.07315743714570999,
-0.09855162352323532,
0.11530040949583054,
-0.09385546296834946,
-0.012242638505995274,
-0.01857510767877102,
-0.0024028599727898836,
0.003828631481155753,
-0.05641402676701546,
0.09695584326982498,
-0.027657872065901756,
0.09966564178466797,
-0.009335020557045937,
0.06379946321249008,
-0.010523422621190548,
-0.1277758777141571,
-0.1113913282752037,
0.08122605830430984,
0.06732384115457535,
0.06519682705402374,
0.002347404370084405,
0.0059219542890787125,
-0.0674365684390068,
0.0012000061105936766,
0.0669771060347557,
0.02913031354546547,
0.12206745147705078,
0.09496321529150009,
-0.01368305366486311,
0.11743634939193726,
-0.1316390186548233,
-0.0063494429923594,
0.020529750734567642,
-0.17823579907417297,
0.03985356539487839,
-0.07533065974712372,
0.09656701982021332,
0.054275404661893845,
-0.13179536163806915,
0.08990626037120819,
0.021236740052700043,
-0.07393956929445267,
-0.11819174885749817,
-0.11181862652301788,
-0.0886838510632515,
-0.04423746094107628,
0.015241794288158417,
-0.08568452298641205,
0.06246763467788696,
-0.008989912457764149,
0.07346892356872559,
0.00982124637812376,
0.09562578797340393,
-0.03176117688417435,
-0.12967248260974884,
0.11846421658992767,
-0.005621135700494051,
0.026911986991763115,
0.04821384325623512,
0.03609812632203102,
0.08923647552728653,
0.05595686286687851,
0.05289508029818535,
0.05340050160884857,
-0.03876049444079399,
0.032597243785858154,
-0.07341350615024567,
-0.07855984568595886,
0.020111529156565666,
-0.026029886677861214,
0.014101570472121239,
0.14504244923591614,
0.09150569885969162,
-0.04062623530626297,
0.0006230682483874261,
0.0786224827170372,
-0.04086486995220184,
-0.16431663930416107,
-0.16835348308086395,
0.01653432659804821,
-0.015022724866867065,
0.09834486991167068,
-0.007862647995352745,
-0.10222101956605911,
-0.04187969118356705,
0.16826093196868896,
0.14083650708198547,
-0.02474880777299404,
0.05717315524816513,
0.05660846084356308,
0.02481538988649845,
0.029884714633226395,
-0.006721316371113062,
0.03232874721288681,
0.255092054605484,
-0.02349516563117504,
-0.005863481666892767,
-0.029226578772068024,
-0.10411357879638672,
-0.033517591655254364,
0.029745083302259445,
-0.0693124532699585,
-0.022335803136229515,
-0.019672289490699768,
0.14907613396644592,
-0.10744494944810867,
-0.17408515512943268,
-0.005615281872451305,
-0.02772897109389305,
-0.08745166659355164,
-0.01194409653544426,
0.05857023969292641,
0.10117106884717941,
0.01462701614946127,
0.01669471152126789,
-0.07820916175842285,
0.21032670140266418,
-0.00702403299510479,
-0.0618995800614357,
0.01446255762130022,
-0.008310899138450623,
-0.11168550699949265,
0.08352937549352646,
-0.01319023035466671,
0.12999601662158966,
0.06484108418226242,
0.025903448462486267,
-0.061633169651031494,
0.08607442677021027,
0.07361453026533127,
-0.07649116218090057,
0.06909498572349548,
0.1514727920293808,
-0.0011709819082170725,
0.07385478168725967,
0.08790373802185059,
-0.0908021628856659,
0.057023268193006516,
0.06004978343844414,
0.004473076667636633,
-0.06072050333023071,
0.06221971660852432,
-0.09321652352809906,
0.1092400774359703,
0.1190296933054924,
-0.02295304648578167,
0.01460531447082758,
-0.042183682322502136,
0.03003641590476036,
-0.015668228268623352,
0.06079509109258652,
-0.028553735464811325,
-0.16837769746780396,
0.02020813338458538,
-0.016834115609526634,
0.08550513535737991,
-0.1453443467617035,
-0.029653476551175117,
0.03043241612613201,
-0.0066809505224227905,
0.012041107751429081,
0.1085980162024498,
0.027430303394794464,
0.0021176189184188843,
-0.02159048430621624,
-0.013114381581544876,
0.03369855508208275,
0.09494755417108536,
-0.11885331571102142,
-0.02060708962380886
] |
null | null |
transformers
|
# Wav2Vec2-XLS-R-300M-EN-15
Facebook's Wav2Vec2 XLS-R fine-tuned for **Speech Translation.**

This is a [SpeechEncoderDecoderModel](https://huggingface.co/transformers/model_doc/speechencoderdecoder.html) model.
The encoder was warm-started from the [**`facebook/wav2vec2-xls-r-300m`**](https://huggingface.co/facebook/wav2vec2-xls-r-300m) checkpoint and
the decoder from the [**`facebook/mbart-large-50`**](https://huggingface.co/facebook/mbart-large-50) checkpoint.
Consequently, the encoder-decoder model was fine-tuned on 15 `en` -> `{lang}` translation pairs of the [Covost2 dataset](https://huggingface.co/datasets/covost2).
The model can translate from spoken `en` (English) to the following written languages `{lang}`:
`en` -> {`de`, `tr`, `fa`, `sv-SE`, `mn`, `zh-CN`, `cy`, `ca`, `sl`, `et`, `id`, `ar`, `ta`, `lv`, `ja`}
For more information, please refer to Section *5.1.1* of the [official XLS-R paper](https://arxiv.org/abs/2111.09296).
## Usage
### Demo
The model can be tested on [**this space**](https://huggingface.co/spaces/facebook/XLS-R-300m-EN-15).
You can select the target language, record some audio in English,
and then sit back and see how well the checkpoint can translate the input.
### Example
As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate
transcripts by passing the speech features to the model.
You can use the model directly via the ASR pipeline. By default, the checkpoint will
translate spoken English to written German. To change the written target language,
you need to pass the correct `forced_bos_token_id` to `generate(...)` to condition
the decoder on the correct target language.
To select the correct `forced_bos_token_id` given your chosen language id, please make use
of the following mapping:
```python
MAPPING = {
"de": 250003,
"tr": 250023,
"fa": 250029,
"sv": 250042,
"mn": 250037,
"zh": 250025,
"cy": 250007,
"ca": 250005,
"sl": 250052,
"et": 250006,
"id": 250032,
"ar": 250001,
"ta": 250044,
"lv": 250017,
"ja": 250012,
}
```
As an example, if you would like to translate to Swedish, you can do the following:
```python
from datasets import load_dataset
from transformers import pipeline
# select correct `forced_bos_token_id`
forced_bos_token_id = MAPPING["sv"]
# replace following lines to load an audio file of your choice
librispeech_en = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
audio_file = librispeech_en[0]["file"]
asr = pipeline("automatic-speech-recognition", model="facebook/wav2vec2-xls-r-300m-en-to-15", feature_extractor="facebook/wav2vec2-xls-r-300m-en-to-15")
translation = asr(audio_file, forced_bos_token_id=forced_bos_token_id)
```
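The pipeline output is a dictionary whose `"text"` field holds the written translation (German by default, or the target language selected via `forced_bos_token_id`).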
or step-by-step as follows:
```python
import torch
from transformers import Speech2Text2Processor, SpeechEncoderDecoderModel
from datasets import load_dataset
model = SpeechEncoderDecoderModel.from_pretrained("facebook/wav2vec2-xls-r-300m-en-to-15")
processor = Speech2Text2Processor.from_pretrained("facebook/wav2vec2-xls-r-300m-en-to-15")
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
# select correct `forced_bos_token_id`
forced_bos_token_id = MAPPING["sv"]
inputs = processor(ds[0]["audio"]["array"], sampling_rate=ds[0]["audio"]["sampling_rate"], return_tensors="pt")
generated_ids = model.generate(inputs["input_values"], attention_mask=inputs["attention_mask"], forced_bos_token_id=forced_bos_token_id)
transcription = processor.batch_decode(generated_ids)
```
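Because the target language is controlled entirely by the forced BOS token, the same loaded model serves all 15 translation directions. As a minimal sketch (reusing `model`, `processor`, `inputs`, and the `MAPPING` dict defined above), translating the same utterance into Japanese instead of Swedish only requires a different `forced_bos_token_id`:
```python
# same audio, different target language: just swap the forced BOS token
generated_ids_ja = model.generate(
    inputs["input_values"],
    attention_mask=inputs["attention_mask"],
    forced_bos_token_id=MAPPING["ja"],
)
transcription_ja = processor.batch_decode(generated_ids_ja, skip_special_tokens=True)
```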
## Results `en` -> `{lang}`
See the row of **XLS-R (0.3B)** for the performance on [Covost2](https://huggingface.co/datasets/covost2) for this model.

## More XLS-R models for `{lang}` -> `en` Speech Translation
- [Wav2Vec2-XLS-R-300M-EN-15](https://huggingface.co/facebook/wav2vec2-xls-r-300m-en-to-15)
- [Wav2Vec2-XLS-R-1B-EN-15](https://huggingface.co/facebook/wav2vec2-xls-r-1b-en-to-15)
- [Wav2Vec2-XLS-R-2B-EN-15](https://huggingface.co/facebook/wav2vec2-xls-r-2b-en-to-15)
- [Wav2Vec2-XLS-R-2B-22-16](https://huggingface.co/facebook/wav2vec2-xls-r-2b-22-to-16)
|
{"language": ["multilingual", "en", "de", "tr", "fa", "sv", "mn", "zh", "cy", "ca", "sl", "et", "id", "ar", "ta", "lv", "ja"], "license": "apache-2.0", "tags": ["speech", "xls_r", "xls_r_translation", "automatic-speech-recognition"], "datasets": ["common_voice", "multilingual_librispeech", "covost2"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "English", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_en_18301577.mp3"}]}
|
automatic-speech-recognition
|
facebook/wav2vec2-xls-r-300m-en-to-15
|
[
"transformers",
"pytorch",
"speech-encoder-decoder",
"automatic-speech-recognition",
"speech",
"xls_r",
"xls_r_translation",
"multilingual",
"en",
"de",
"tr",
"fa",
"sv",
"mn",
"zh",
"cy",
"ca",
"sl",
"et",
"id",
"ar",
"ta",
"lv",
"ja",
"dataset:common_voice",
"dataset:multilingual_librispeech",
"dataset:covost2",
"arxiv:2111.09296",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2111.09296"
] |
[
"multilingual",
"en",
"de",
"tr",
"fa",
"sv",
"mn",
"zh",
"cy",
"ca",
"sl",
"et",
"id",
"ar",
"ta",
"lv",
"ja"
] |
TAGS
#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #en #de #tr #fa #sv #mn #zh #cy #ca #sl #et #id #ar #ta #lv #ja #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-XLS-R-300M-EN-15
Facebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.
!model image
This is a SpeechEncoderDecoderModel model.
The encoder was warm-started from the 'facebook/wav2vec2-xls-r-300m' checkpoint and
the decoder from the 'facebook/mbart-large-50' checkpoint.
Consequently, the encoder-decoder model was fine-tuned on 15 'en' -> '{lang}' translation pairs of the Covost2 dataset.
The model can translate from spoken 'en' (English) to the following written languages '{lang}':
'en' -> {'de', 'tr', 'fa', 'sv-SE', 'mn', 'zh-CN', 'cy', 'ca', 'sl', 'et', 'id', 'ar', 'ta', 'lv', 'ja'}
For more information, please refer to Section *5.1.1* of the official XLS-R paper.
## Usage
### Demo
The model can be tested on this space.
You can select the target language, record some audio in English,
and then sit back and see how well the checkpoint can translate the input.
### Example
As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the
transcripts by passing the speech features to the model.
You can use the model directly via the ASR pipeline. By default, the checkpoint will
translate spoken English to written German. To change the written target language,
you need to pass the correct 'forced_bos_token_id' to 'generate(...)' to condition
the decoder on the correct target language.
To select the correct 'forced_bos_token_id' given your chosen language id, please make use
of the following mapping:
As an example, if you would like to translate to Swedish, you can do the following:
or step-by-step as follows:
## Results 'en' -> '{lang}'
See the row of XLS-R (0.3B) for the performance on Covost2 for this model.
!results image
## More XLS-R models for '{lang}' -> 'en' Speech Translation
- Wav2Vec2-XLS-R-300M-EN-15
- Wav2Vec2-XLS-R-1B-EN-15
- Wav2Vec2-XLS-R-2B-EN-15
- Wav2Vec2-XLS-R-2B-22-16
|
[
"# Wav2Vec2-XLS-R-300M-EN-15\n\nFacebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.\n\n!model image\n\nThis is a SpeechEncoderDecoderModel model. \nThe encoder was warm-started from the 'facebook/wav2vec2-xls-r-300m' checkpoint and\nthe decoder from the 'facebook/mbart-large-50' checkpoint.\nConsequently, the encoder-decoder model was fine-tuned on 15 'en' -> '{lang}' translation pairs of the Covost2 dataset.\n\nThe model can translate from spoken 'en' (Engish) to the following written languages '{lang}':\n\n'en' -> {'de', 'tr', 'fa', 'sv-SE', 'mn', 'zh-CN', 'cy', 'ca', 'sl', 'et', 'id', 'ar', 'ta', 'lv', 'ja'}\n\nFor more information, please refer to Section *5.1.1* of the official XLS-R paper.",
"## Usage",
"### Demo\n\nThe model can be tested on this space. \nYou can select the target language, record some audio in English, \nand then sit back and see how well the checkpoint can translate the input.",
"### Example \n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline. By default, the checkpoint will \ntranslate spoken English to written German. To change the written target language, \nyou need to pass the correct 'forced_bos_token_id' to 'generate(...)' to condition \nthe decoder on the correct target language. \n\nTo select the correct 'forced_bos_token_id' given your choosen language id, please make use \nof the following mapping:\n\n\n\nAs an example, if you would like to translate to Swedish, you can do the following:\n\n\n\nor step-by-step as follows:",
"## Results 'en' -> '{lang}'\n\nSee the row of XLS-R (0.3B) for the performance on Covost2 for this model.\n\n!results image",
"## More XLS-R models for '{lang}' -> 'en' Speech Translation\n\n- Wav2Vec2-XLS-R-300M-EN-15\n- Wav2Vec2-XLS-R-1B-EN-15\n- Wav2Vec2-XLS-R-2B-EN-15\n- Wav2Vec2-XLS-R-2B-22-16"
] |
[
"TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #en #de #tr #fa #sv #mn #zh #cy #ca #sl #et #id #ar #ta #lv #ja #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-XLS-R-300M-EN-15\n\nFacebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.\n\n!model image\n\nThis is a SpeechEncoderDecoderModel model. \nThe encoder was warm-started from the 'facebook/wav2vec2-xls-r-300m' checkpoint and\nthe decoder from the 'facebook/mbart-large-50' checkpoint.\nConsequently, the encoder-decoder model was fine-tuned on 15 'en' -> '{lang}' translation pairs of the Covost2 dataset.\n\nThe model can translate from spoken 'en' (Engish) to the following written languages '{lang}':\n\n'en' -> {'de', 'tr', 'fa', 'sv-SE', 'mn', 'zh-CN', 'cy', 'ca', 'sl', 'et', 'id', 'ar', 'ta', 'lv', 'ja'}\n\nFor more information, please refer to Section *5.1.1* of the official XLS-R paper.",
"## Usage",
"### Demo\n\nThe model can be tested on this space. \nYou can select the target language, record some audio in English, \nand then sit back and see how well the checkpoint can translate the input.",
"### Example \n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline. By default, the checkpoint will \ntranslate spoken English to written German. To change the written target language, \nyou need to pass the correct 'forced_bos_token_id' to 'generate(...)' to condition \nthe decoder on the correct target language. \n\nTo select the correct 'forced_bos_token_id' given your choosen language id, please make use \nof the following mapping:\n\n\n\nAs an example, if you would like to translate to Swedish, you can do the following:\n\n\n\nor step-by-step as follows:",
"## Results 'en' -> '{lang}'\n\nSee the row of XLS-R (0.3B) for the performance on Covost2 for this model.\n\n!results image",
"## More XLS-R models for '{lang}' -> 'en' Speech Translation\n\n- Wav2Vec2-XLS-R-300M-EN-15\n- Wav2Vec2-XLS-R-1B-EN-15\n- Wav2Vec2-XLS-R-2B-EN-15\n- Wav2Vec2-XLS-R-2B-22-16"
] |
[
140,
258,
3,
43,
176,
40,
82
] |
[
"passage: TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #speech #xls_r #xls_r_translation #multilingual #en #de #tr #fa #sv #mn #zh #cy #ca #sl #et #id #ar #ta #lv #ja #dataset-common_voice #dataset-multilingual_librispeech #dataset-covost2 #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# Wav2Vec2-XLS-R-300M-EN-15\n\nFacebook's Wav2Vec2 XLS-R fine-tuned for Speech Translation.\n\n!model image\n\nThis is a SpeechEncoderDecoderModel model. \nThe encoder was warm-started from the 'facebook/wav2vec2-xls-r-300m' checkpoint and\nthe decoder from the 'facebook/mbart-large-50' checkpoint.\nConsequently, the encoder-decoder model was fine-tuned on 15 'en' -> '{lang}' translation pairs of the Covost2 dataset.\n\nThe model can translate from spoken 'en' (Engish) to the following written languages '{lang}':\n\n'en' -> {'de', 'tr', 'fa', 'sv-SE', 'mn', 'zh-CN', 'cy', 'ca', 'sl', 'et', 'id', 'ar', 'ta', 'lv', 'ja'}\n\nFor more information, please refer to Section *5.1.1* of the official XLS-R paper.## Usage### Demo\n\nThe model can be tested on this space. \nYou can select the target language, record some audio in English, \nand then sit back and see how well the checkpoint can translate the input."
] |
[
-0.09546326100826263,
0.054057639092206955,
-0.0054699694737792015,
0.025988714769482613,
0.01465528178960085,
0.00021040493447799236,
0.03583920747041702,
0.10587923973798752,
0.06528456509113312,
0.11247698217630386,
0.012986846268177032,
0.04273058846592903,
0.10096718370914459,
0.14482568204402924,
-0.03127804771065712,
-0.1299830824136734,
0.040700118988752365,
-0.08489803224802017,
-0.0065363128669559956,
0.07778368145227432,
0.08213610202074051,
-0.09310661256313324,
0.04021640494465828,
-0.03702116012573242,
0.015358992852270603,
0.003417180385440588,
-0.027271853759884834,
-0.06396699696779251,
0.06777315586805344,
0.08526808023452759,
0.0713738203048706,
0.08429320901632309,
0.04352390766143799,
-0.24804334342479706,
0.026468873023986816,
0.07015072554349899,
-0.014992445707321167,
0.003990126773715019,
0.09201817214488983,
-0.017194701358675957,
0.06487002223730087,
-0.07532992213964462,
-0.012320335023105145,
0.06283767521381378,
-0.06463561207056046,
-0.2624368965625763,
-0.1000918373465538,
-0.02067134715616703,
0.10179201513528824,
0.05999693647027016,
-0.060999590903520584,
0.0652545839548111,
-0.020038995891809464,
0.08988986909389496,
0.10521911084651947,
-0.2706998288631439,
-0.005758008919656277,
0.036405544728040695,
0.11208482831716537,
0.052602410316467285,
-0.04547753557562828,
0.04334644228219986,
0.018087072297930717,
0.012379258871078491,
-0.03240085765719414,
-0.07442057132720947,
-0.027167370542883873,
-0.012062909081578255,
-0.15351314842700958,
-0.039015572518110275,
0.08391924947500229,
0.03490782529115677,
-0.0584423802793026,
-0.10935705155134201,
-0.040243614464998245,
-0.025284795090556145,
-0.031236330047249794,
-0.0856674313545227,
-0.04271790012717247,
-0.0022424061316996813,
-0.013840780593454838,
-0.10130354017019272,
-0.09443601220846176,
-0.033741261810064316,
-0.09870439022779465,
0.13869307935237885,
0.028815405443310738,
0.0073029650375247,
-0.01801186427474022,
0.042281415313482285,
0.039072275161743164,
-0.09283085912466049,
-0.08300750702619553,
-0.04989340901374817,
-0.17454974353313446,
-0.0006983219063840806,
-0.06844021379947662,
-0.1165495440363884,
0.13938328623771667,
0.1391574889421463,
-0.10659209638834,
0.08283567428588867,
-0.06474875658750534,
0.025120491161942482,
0.03466979041695595,
0.16969463229179382,
-0.04757765680551529,
-0.06386636942625046,
-0.03693464770913124,
-0.019311804324388504,
-0.04632285609841347,
-0.04087793827056885,
-0.05726069211959839,
-0.008050368167459965,
0.015402550809085369,
0.08943242579698563,
0.06839747726917267,
-0.0021099953446537256,
-0.064485564827919,
-0.009572616778314114,
0.10928501188755035,
-0.15634499490261078,
0.0643286481499672,
0.09667617827653885,
-0.01305670291185379,
0.045210592448711395,
-0.010864057578146458,
-0.0025454540736973286,
-0.08169785887002945,
-0.05006609484553337,
0.027496585622429848,
0.012104510329663754,
-0.02790556661784649,
-0.13113652169704437,
0.03605084493756294,
-0.03865313529968262,
-0.07658392935991287,
-0.14863188564777374,
-0.026601532474160194,
-0.0913810282945633,
0.008094226010143757,
-0.04249143600463867,
0.1267998069524765,
-0.08160906285047531,
-0.02291608415544033,
0.005134016275405884,
-0.013588162139058113,
-0.038036100566387177,
-0.04072781279683113,
0.03152231499552727,
-0.022653628140687943,
0.1194818988442421,
0.023040542379021645,
0.0031277406960725784,
-0.05627606436610222,
0.02760690078139305,
-0.1037246435880661,
0.18661387264728546,
-0.11950622498989105,
-0.0271911658346653,
-0.13100291788578033,
-0.030071649700403214,
-0.011166420765221119,
0.04305514320731163,
0.0514724925160408,
0.10647915303707123,
-0.2553277611732483,
-0.052388984709978104,
0.26783254742622375,
-0.08033209294080734,
-0.08985789865255356,
0.1648564636707306,
0.0255522932857275,
-0.04705800488591194,
0.06254755705595016,
0.1314452588558197,
0.15425096452236176,
-0.20319725573062897,
-0.020278390496969223,
0.02734806016087532,
-0.00007690222992096096,
0.13243839144706726,
0.07628469914197922,
-0.11806004494428635,
0.024857619777321815,
-0.0043422142043709755,
-0.0293517354875803,
0.0061613405123353004,
-0.01288693305104971,
-0.05404021963477135,
0.025062475353479385,
-0.01239815540611744,
0.094791941344738,
-0.02706928178668022,
-0.05772322416305542,
-0.05550219863653183,
-0.06201716139912605,
-0.06770206242799759,
0.06961312144994736,
-0.06240100413560867,
0.056908708065748215,
-0.1299382895231247,
0.06355639547109604,
0.000047377976443385705,
0.04009148105978966,
-0.18050959706306458,
-0.033590421080589294,
0.01730792596936226,
-0.06265735626220703,
0.059832386672496796,
0.07450401037931442,
0.05433262139558792,
0.014347115531563759,
0.04316549748182297,
-0.013241173699498177,
0.03363851085305214,
0.0017808297416195273,
-0.008564261719584465,
-0.08983959257602692,
-0.018086284399032593,
-0.07959835976362228,
0.07709672302007675,
-0.07514915615320206,
-0.022873716428875923,
0.13501767814159393,
0.1162586361169815,
0.01898704096674919,
0.012069291435182095,
-0.014919673092663288,
0.07908453047275543,
0.0005959857953712344,
-0.009920543991029263,
0.0014654631959274411,
-0.03534524142742157,
-0.06970473378896713,
0.09152371436357498,
-0.08605784177780151,
-0.10070936381816864,
0.07461307942867279,
-0.027033712714910507,
-0.060049451887607574,
0.011336558498442173,
-0.015921276062726974,
0.016445564106106758,
-0.06099921464920044,
-0.04931647330522537,
0.18885326385498047,
0.08889312297105789,
0.09389321506023407,
-0.06618629395961761,
-0.03920482471585274,
0.010927385650575161,
-0.06280894577503204,
-0.04738515615463257,
0.11436278373003006,
-0.032357219606637955,
-0.11943181604146957,
0.025214379653334618,
0.09622696787118912,
-0.007349213119596243,
0.14989757537841797,
-0.0007492955774068832,
-0.0857209712266922,
-0.05666407570242882,
0.06385543197393417,
-0.0011503116693347692,
0.013490184210240841,
0.02824695035815239,
-0.0022935569286346436,
0.029689880087971687,
0.037864744663238525,
0.033897001296281815,
-0.06546046584844589,
0.061983369290828705,
0.035249531269073486,
-0.10166404396295547,
0.06395988166332245,
0.054642707109451294,
0.023289013653993607,
0.02956359274685383,
0.0006812938372604549,
-0.05824754759669304,
-0.05419367924332619,
-0.056386590003967285,
-0.10641788691282272,
0.16954390704631805,
-0.14899921417236328,
-0.3574826419353485,
-0.11639267951250076,
-0.01954049803316593,
-0.021086717024445534,
-0.0015247623668983579,
0.07729451358318329,
-0.1134914830327034,
-0.04239589720964432,
-0.07132591307163239,
-0.01856774091720581,
0.008021818473935127,
-0.02752840891480446,
0.038302142173051834,
0.0010280570713803172,
0.010955092497169971,
-0.0866963192820549,
0.005310284439474344,
0.03480890765786171,
-0.013930493034422398,
0.030344408005475998,
0.03399025276303291,
0.09295095503330231,
0.11584702879190445,
-0.0029585917945951223,
0.04197346419095993,
-0.026484349742531776,
0.21094810962677002,
-0.11158866435289383,
0.0639922097325325,
0.09507446736097336,
-0.02252267114818096,
0.046912457793951035,
0.1262352019548416,
-0.006043675355613232,
-0.055312272161245346,
0.011106465943157673,
-0.00424383906647563,
0.00017159005801659077,
-0.23735322058200836,
-0.08980021625757217,
-0.048678722232580185,
0.03186330944299698,
0.023886190727353096,
0.02797798439860344,
0.013652273453772068,
-0.035961564630270004,
-0.0302396509796381,
-0.07790198177099228,
0.06834296882152557,
0.02180132456123829,
0.16274118423461914,
-0.02321780100464821,
0.05764481797814369,
-0.0423099584877491,
-0.0375487394630909,
0.1018604040145874,
0.04709431156516075,
0.05887402966618538,
0.07275556027889252,
0.16055803000926971,
0.09062174707651138,
0.0568702407181263,
-0.0012266935082152486,
-0.005008704494684935,
-0.019568422809243202,
0.03241141885519028,
0.038223184645175934,
-0.0726097971200943,
0.027490170672535896,
0.03560277447104454,
0.15800264477729797,
-0.12607626616954803,
-0.013026858679950237,
-0.007316849660128355,
0.10446633398532867,
0.10264348983764648,
0.13589134812355042,
-0.13138659298419952,
0.001292501692660153,
0.003130508353933692,
-0.011318211443722248,
-0.0262859258800745,
0.0009670252329669893,
0.13980436325073242,
-0.09314770251512527,
0.07530971616506577,
0.0060698953457176685,
0.0787971019744873,
-0.08672620356082916,
-0.007916796021163464,
-0.015975257381796837,
0.06431621313095093,
0.02427351102232933,
0.059871070086956024,
-0.2628955543041229,
0.1288314312696457,
0.028474632650613785,
0.039813365787267685,
-0.01495957188308239,
0.019996220245957375,
-0.005835981108248234,
-0.01886131800711155,
0.14340373873710632,
-0.0009567986126057804,
-0.12107361853122711,
-0.12136627733707428,
-0.07455883175134659,
0.04152423515915871,
0.1403816044330597,
-0.006733793765306473,
0.07446348667144775,
0.01835360750555992,
-0.06929723173379898,
-0.0874488577246666,
-0.03180629014968872,
-0.14533625543117523,
-0.11071830242872238,
0.045439671725034714,
0.04943050071597099,
0.05947042256593704,
-0.013151673600077629,
-0.00671002734452486,
-0.18651874363422394,
0.10118641704320908,
-0.1812368780374527,
-0.04331529885530472,
-0.11669275909662247,
-0.020894017070531845,
0.1421041041612625,
-0.10599512606859207,
0.0333859883248806,
0.014813363552093506,
0.10374869406223297,
-0.05425924062728882,
-0.07621859014034271,
0.05274641886353493,
-0.0679192766547203,
-0.11018228530883789,
-0.02667143940925598,
0.11135661602020264,
0.08626711368560791,
0.03294019773602486,
0.09031473100185394,
0.047934968024492264,
-0.02659468911588192,
-0.10267891734838486,
0.020681854337453842,
0.04250894859433174,
-0.04847719892859459,
0.05957290157675743,
0.01059461385011673,
-0.15701326727867126,
-0.09434755891561508,
0.0051236520521342754,
0.16618327796459198,
0.1800791174173355,
-0.10092075914144516,
0.13368377089500427,
0.20239801704883575,
-0.07278712838888168,
-0.20925317704677582,
-0.11836548149585724,
0.04518876597285271,
0.07393331080675125,
-0.03908505663275719,
-0.1571274697780609,
-0.0008279347093775868,
0.039987724274396896,
-0.0035788635723292828,
0.04100853204727173,
-0.2670585811138153,
-0.10106706619262695,
0.11857496201992035,
-0.013456166721880436,
-0.05860525369644165,
-0.07974391430616379,
-0.09324104338884354,
-0.07435600459575653,
-0.11188711225986481,
0.03164876252412796,
-0.16198314726352692,
0.05078280717134476,
0.05955217033624649,
-0.035943832248449326,
0.009702162817120552,
-0.005626906640827656,
0.12991401553153992,
0.013600707054138184,
-0.011602151207625866,
-0.031095638871192932,
0.09510239213705063,
-0.04374762997031212,
-0.01895623281598091,
0.09588252007961273,
-0.05721122771501541,
0.053703803569078445,
-0.048774003982543945,
-0.036550816148519516,
-0.04933222755789757,
0.07565392553806305,
-0.015195516869425774,
-0.0049470835365355015,
-0.06161004304885864,
0.009624813683331013,
0.04305914416909218,
0.0020310620311647654,
-0.0027816302608698606,
-0.0953565314412117,
0.03509778156876564,
0.19635619223117828,
0.10914777219295502,
0.04332711920142174,
-0.042692650109529495,
0.004318174906075001,
-0.03973698616027832,
0.050113048404455185,
-0.029856901615858078,
0.059698957949876785,
0.12604208290576935,
-0.011351720429956913,
0.13681413233280182,
-0.014807739295065403,
-0.1290232241153717,
0.04439511522650719,
0.06677346676588058,
-0.07907957583665848,
-0.15664167702198029,
0.0057799676433205605,
-0.0792558565735817,
0.011904185637831688,
0.01028512418270111,
0.18111129105091095,
0.03821434825658798,
-0.02620314247906208,
-0.025689423084259033,
0.036120060831308365,
-0.047376539558172226,
0.16696558892726898,
0.0565134696662426,
0.07408295571804047,
-0.07615159451961517,
0.04472748190164566,
0.07595421373844147,
-0.09881450235843658,
0.0460519902408123,
0.13070765137672424,
-0.10857866704463959,
-0.10248691588640213,
-0.03920033946633339,
0.09399724006652832,
-0.012533516623079777,
-0.0718502551317215,
-0.019062893465161324,
-0.08840088546276093,
0.031959716230630875,
0.20538851618766785,
0.026531364768743515,
0.05477339029312134,
0.038485076278448105,
-0.05800413712859154,
-0.001305562793277204,
0.11366883665323257,
0.026992321014404297,
-0.01378279086202383,
-0.04437318444252014,
0.09937422722578049,
-0.002316729398444295,
0.09431963413953781,
-0.029613163322210312,
-0.04480505734682083,
-0.09987972676753998,
0.03154134750366211,
-0.17349927127361298,
0.025774994865059853,
-0.06874746829271317,
-0.012713364325463772,
0.028548646718263626,
-0.02738756127655506,
0.03109356388449669,
0.017416298389434814,
-0.08357255160808563,
-0.030010852962732315,
-0.05505942553281784,
0.0540643148124218,
-0.17222902178764343,
-0.006704852916300297,
0.02942192368209362,
-0.04974939301609993,
0.07103715091943741,
0.012279503978788853,
-0.04646042361855507,
0.05071783810853958,
-0.15807165205478668,
-0.04149620980024338,
0.006510377395898104,
0.07034855335950851,
0.05224444344639778,
-0.09087080508470535,
0.02408444881439209,
0.03927425667643547,
0.011517767794430256,
0.01964322105050087,
0.06831194460391998,
-0.06517363339662552,
0.041314054280519485,
-0.10419298708438873,
-0.007285776548087597,
-0.02323056571185589,
0.044344253838062286,
0.05467653274536133,
0.08257011324167252,
0.09972880780696869,
-0.1074277013540268,
0.12203918397426605,
-0.09347768872976303,
-0.03308911994099617,
-0.00027448893524706364,
0.0068437219597399235,
0.02299605682492256,
-0.07759996503591537,
0.10652296990156174,
-0.03485364466905594,
0.10647761076688766,
-0.007359582465142012,
0.11520298570394516,
-0.012291650287806988,
-0.16085216403007507,
-0.10633691400289536,
0.06156473234295845,
0.09222565591335297,
0.06524103134870529,
-0.00831163115799427,
-0.012133784592151642,
-0.027995793148875237,
-0.0007464944501407444,
0.04715270549058914,
0.0572824664413929,
0.13791285455226898,
0.09287606179714203,
0.0700988844037056,
0.06892392039299011,
-0.07927347719669342,
0.018452221527695656,
-0.02019757591187954,
-0.13338421285152435,
0.04771614819765091,
-0.06055061146616936,
0.10019373893737793,
0.059749994426965714,
-0.15139327943325043,
0.05824093148112297,
0.027831438928842545,
-0.07808257639408112,
-0.11956565082073212,
-0.07350762188434601,
-0.09805840998888016,
-0.08117251843214035,
0.012331121601164341,
-0.0859687477350235,
0.03535113111138344,
-0.03160399571061134,
0.07602198421955109,
-0.010235960595309734,
0.10609631985425949,
-0.011054843664169312,
-0.09853550046682358,
0.08387202769517899,
-0.015702109783887863,
0.06714195758104324,
0.0882887989282608,
0.006812721490859985,
0.07603050768375397,
0.02599272131919861,
0.06047656759619713,
0.058470096439123154,
-0.03050820715725422,
0.053392164409160614,
-0.06297259032726288,
-0.07856214046478271,
0.012862916104495525,
0.013501806184649467,
0.029350677505135536,
0.14265939593315125,
0.12497939169406891,
-0.04144221171736717,
0.0026638987474143505,
0.08169985562562943,
-0.05026301369071007,
-0.14145879447460175,
-0.16013595461845398,
0.0930551066994667,
0.017454005777835846,
0.11107742786407471,
-0.020825156942009926,
-0.10180533677339554,
-0.047581981867551804,
0.20617856085300446,
0.111984983086586,
-0.026035722345113754,
0.04007089138031006,
0.05931887403130531,
0.026635756716132164,
0.026215683668851852,
0.029561476781964302,
0.06766489893198013,
0.2610882818698883,
-0.013092735782265663,
-0.040693409740924835,
-0.03293687105178833,
-0.07831010967493057,
-0.07140877097845078,
-0.018667181953787804,
-0.08329988270998001,
-0.037913691252470016,
-0.008433323353528976,
0.14919480681419373,
-0.0789213702082634,
-0.1333363801240921,
0.0638347938656807,
-0.044634316116571426,
-0.07633719593286514,
0.010143079794943333,
0.09017655998468399,
0.06319908797740936,
0.012299426831305027,
0.013584661297500134,
-0.04231373593211174,
0.20617075264453888,
-0.022322867065668106,
-0.014259726740419865,
-0.016659066081047058,
-0.010862812399864197,
-0.14415644109249115,
0.08368123322725296,
-0.011105815880000591,
0.16960753500461578,
0.09677081555128098,
0.053516242653131485,
-0.04459178447723389,
0.10855976492166519,
0.06896968185901642,
-0.06521371752023697,
0.08057459443807602,
0.08508830517530441,
0.0002329908311367035,
0.0742524191737175,
0.10902813822031021,
-0.07127171009778976,
0.07205821573734283,
0.03613884001970291,
0.017236974090337753,
-0.08449876308441162,
0.04989092797040939,
-0.08545295894145966,
0.10701191425323486,
0.1329111009836197,
-0.01882896199822426,
0.031438250094652176,
-0.011674349196255207,
0.036050647497177124,
-0.03718825802206993,
0.042394716292619705,
-0.06609740853309631,
-0.14378340542316437,
0.012676413170993328,
-0.01967500150203705,
0.057615477591753006,
-0.16964571177959442,
-0.007969033904373646,
0.01819128729403019,
0.008306778036057949,
0.022920338436961174,
0.11202510446310043,
0.03258780017495155,
0.011694475077092648,
-0.02662411704659462,
-0.07991104573011398,
0.04355347156524658,
0.10244442522525787,
-0.13090358674526215,
-0.04444750025868416
] |
null | null |
transformers
|
# Wav2Vec2-XLS-R-300M
[Facebook's Wav2Vec2 XLS-R](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) counting **300 million** parameters.

XLS-R is Facebook AI's large-scale multilingual pretrained model for speech (the "XLM-R for Speech"). It is pretrained on 436k hours of unlabeled speech, including VoxPopuli, MLS, CommonVoice, BABEL, and VoxLingua107. It uses the wav2vec 2.0 objective, in 128 languages. When using the model make sure that your speech input is sampled at 16kHz.
**Note**: This model should be fine-tuned on a downstream task, like Automatic Speech Recognition, Translation, or Classification. Check out [**this blog**](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for more information about ASR.
[XLS-R Paper](https://arxiv.org/abs/2111.09296)
Authors: Arun Babu, Changhan Wang, Andros Tjandra, Kushal Lakhotia, Qiantong Xu, Naman Goyal, Kritika Singh, Patrick von Platen, Yatharth Saraf, Juan Pino, Alexei Baevski, Alexis Conneau, Michael Auli
**Abstract**
This paper presents XLS-R, a large-scale model for cross-lingual speech representation learning based on wav2vec 2.0. We train models with up to 2B parameters on 436K hours of publicly available speech audio in 128 languages, an order of magnitude more public data than the largest known prior work. Our evaluation covers a wide range of tasks, domains, data regimes and languages, both high and low-resource. On the CoVoST-2 speech translation benchmark, we improve the previous state of the art by an average of 7.4 BLEU over 21 translation directions into English. For speech recognition, XLS-R improves over the best known prior work on BABEL, MLS, CommonVoice as well as VoxPopuli, lowering error rates by 20%-33% relative on average. XLS-R also sets a new state of the art on VoxLingua107 language identification. Moreover, we show that with sufficient model size, cross-lingual pretraining can outperform English-only pretraining when translating English speech into other languages, a setting which favors monolingual pretraining. We hope XLS-R can help to improve speech processing tasks for many more languages of the world.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
See [this google colab](https://colab.research.google.com/github/patrickvonplaten/notebooks/blob/master/Fine_Tune_XLS_R_on_Common_Voice.ipynb) for more information on how to fine-tune the model.
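As a rough sketch of what warm-starting this checkpoint for ASR fine-tuning can look like (the `vocab_size` and `pad_token_id` values below are placeholders and must be set from the character-level tokenizer you build for your target language):
```python
from transformers import Wav2Vec2ForCTC, Wav2Vec2FeatureExtractor

# Warm-start the pretrained encoder and add a randomly initialized CTC head.
# vocab_size / pad_token_id are placeholders: take them from your own tokenizer.
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-xls-r-300m",
    ctc_loss_reduction="mean",
    pad_token_id=0,   # placeholder
    vocab_size=32,    # placeholder
)
model.freeze_feature_encoder()  # commonly frozen when fine-tuning on small labeled datasets

feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained("facebook/wav2vec2-xls-r-300m")
```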
You can find other pretrained XLS-R models with different numbers of parameters:
* [300M parameters version](https://huggingface.co/facebook/wav2vec2-xls-r-300m)
* [1B parameters version](https://huggingface.co/facebook/wav2vec2-xls-r-1b)
* [2B parameters version](https://huggingface.co/facebook/wav2vec2-xls-r-2b)
|
{"language": ["multilingual", "ab", "af", "sq", "am", "ar", "hy", "as", "az", "ba", "eu", "be", "bn", "bs", "br", "bg", "my", "yue", "ca", "ceb", "km", "zh", "cv", "hr", "cs", "da", "dv", "nl", "en", "eo", "et", "fo", "fi", "fr", "gl", "lg", "ka", "de", "el", "gn", "gu", "ht", "cnh", "ha", "haw", "he", "hi", "hu", "is", "id", "ia", "ga", "it", "ja", "jv", "kb", "kn", "kk", "rw", "ky", "ko", "ku", "lo", "la", "lv", "ln", "lt", "lm", "mk", "mg", "ms", "ml", "mt", "gv", "mi", "mr", "mn", "ne", false, "nn", "oc", "or", "ps", "fa", "pl", "pt", "pa", "ro", "rm", "rm", "ru", "sah", "sa", "sco", "sr", "sn", "sd", "si", "sk", "sl", "so", "hsb", "es", "su", "sw", "sv", "tl", "tg", "ta", "tt", "te", "th", "bo", "tp", "tr", "tk", "uk", "ur", "uz", "vi", "vot", "war", "cy", "yi", "yo", "zu"], "license": "apache-2.0", "tags": ["speech", "xls_r", "xls_r_pretrained"], "datasets": ["common_voice", "multilingual_librispeech"], "language_bcp47": ["zh-HK", "zh-TW", "fy-NL"]}
| null |
facebook/wav2vec2-xls-r-300m
|
[
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"speech",
"xls_r",
"xls_r_pretrained",
"multilingual",
"ab",
"af",
"sq",
"am",
"ar",
"hy",
"as",
"az",
"ba",
"eu",
"be",
"bn",
"bs",
"br",
"bg",
"my",
"yue",
"ca",
"ceb",
"km",
"zh",
"cv",
"hr",
"cs",
"da",
"dv",
"nl",
"en",
"eo",
"et",
"fo",
"fi",
"fr",
"gl",
"lg",
"ka",
"de",
"el",
"gn",
"gu",
"ht",
"cnh",
"ha",
"haw",
"he",
"hi",
"hu",
"is",
"id",
"ia",
"ga",
"it",
"ja",
"jv",
"kb",
"kn",
"kk",
"rw",
"ky",
"ko",
"ku",
"lo",
"la",
"lv",
"ln",
"lt",
"lm",
"mk",
"mg",
"ms",
"ml",
"mt",
"gv",
"mi",
"mr",
"mn",
"ne",
"no",
"nn",
"oc",
"or",
"ps",
"fa",
"pl",
"pt",
"pa",
"ro",
"rm",
"ru",
"sah",
"sa",
"sco",
"sr",
"sn",
"sd",
"si",
"sk",
"sl",
"so",
"hsb",
"es",
"su",
"sw",
"sv",
"tl",
"tg",
"ta",
"tt",
"te",
"th",
"bo",
"tp",
"tr",
"tk",
"uk",
"ur",
"uz",
"vi",
"vot",
"war",
"cy",
"yi",
"yo",
"zu",
"dataset:common_voice",
"dataset:multilingual_librispeech",
"arxiv:2111.09296",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2111.09296"
] |
[
"multilingual",
"ab",
"af",
"sq",
"am",
"ar",
"hy",
"as",
"az",
"ba",
"eu",
"be",
"bn",
"bs",
"br",
"bg",
"my",
"yue",
"ca",
"ceb",
"km",
"zh",
"cv",
"hr",
"cs",
"da",
"dv",
"nl",
"en",
"eo",
"et",
"fo",
"fi",
"fr",
"gl",
"lg",
"ka",
"de",
"el",
"gn",
"gu",
"ht",
"cnh",
"ha",
"haw",
"he",
"hi",
"hu",
"is",
"id",
"ia",
"ga",
"it",
"ja",
"jv",
"kb",
"kn",
"kk",
"rw",
"ky",
"ko",
"ku",
"lo",
"la",
"lv",
"ln",
"lt",
"lm",
"mk",
"mg",
"ms",
"ml",
"mt",
"gv",
"mi",
"mr",
"mn",
"ne",
"no",
"nn",
"oc",
"or",
"ps",
"fa",
"pl",
"pt",
"pa",
"ro",
"rm",
"rm",
"ru",
"sah",
"sa",
"sco",
"sr",
"sn",
"sd",
"si",
"sk",
"sl",
"so",
"hsb",
"es",
"su",
"sw",
"sv",
"tl",
"tg",
"ta",
"tt",
"te",
"th",
"bo",
"tp",
"tr",
"tk",
"uk",
"ur",
"uz",
"vi",
"vot",
"war",
"cy",
"yi",
"yo",
"zu"
] |
TAGS
#transformers #pytorch #wav2vec2 #pretraining #speech #xls_r #xls_r_pretrained #multilingual #ab #af #sq #am #ar #hy #as #az #ba #eu #be #bn #bs #br #bg #my #yue #ca #ceb #km #zh #cv #hr #cs #da #dv #nl #en #eo #et #fo #fi #fr #gl #lg #ka #de #el #gn #gu #ht #cnh #ha #haw #he #hi #hu #is #id #ia #ga #it #ja #jv #kb #kn #kk #rw #ky #ko #ku #lo #la #lv #ln #lt #lm #mk #mg #ms #ml #mt #gv #mi #mr #mn #ne #no #nn #oc #or #ps #fa #pl #pt #pa #ro #rm #ru #sah #sa #sco #sr #sn #sd #si #sk #sl #so #hsb #es #su #sw #sv #tl #tg #ta #tt #te #th #bo #tp #tr #tk #uk #ur #uz #vi #vot #war #cy #yi #yo #zu #dataset-common_voice #dataset-multilingual_librispeech #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-XLS-R-300M
Facebook's Wav2Vec2 XLS-R counting 300 million parameters.
!model image
XLS-R is Facebook AI's large-scale multilingual pretrained model for speech (the "XLM-R for Speech"). It is pretrained on 436k hours of unlabeled speech, including VoxPopuli, MLS, CommonVoice, BABEL, and VoxLingua107. It uses the wav2vec 2.0 objective, in 128 languages. When using the model make sure that your speech input is sampled at 16kHz.
Note: This model should be fine-tuned on a downstream task, like Automatic Speech Recognition, Translation, or Classification. Check out this blog for more information about ASR.
XLS-R Paper
Authors: Arun Babu, Changhan Wang, Andros Tjandra, Kushal Lakhotia, Qiantong Xu, Naman Goyal, Kritika Singh, Patrick von Platen, Yatharth Saraf, Juan Pino, Alexei Baevski, Alexis Conneau, Michael Auli
Abstract
This paper presents XLS-R, a large-scale model for cross-lingual speech representation learning based on wav2vec 2.0. We train models with up to 2B parameters on 436K hours of publicly available speech audio in 128 languages, an order of magnitude more public data than the largest known prior work. Our evaluation covers a wide range of tasks, domains, data regimes and languages, both high and low-resource. On the CoVoST-2 speech translation benchmark, we improve the previous state of the art by an average of 7.4 BLEU over 21 translation directions into English. For speech recognition, XLS-R improves over the best known prior work on BABEL, MLS, CommonVoice as well as VoxPopuli, lowering error rates by 20%-33% relative on average. XLS-R also sets a new state of the art on VoxLingua107 language identification. Moreover, we show that with sufficient model size, cross-lingual pretraining can outperform English-only pretraining when translating English speech into other languages, a setting which favors monolingual pretraining. We hope XLS-R can help to improve speech processing tasks for many more languages of the world.
The original model can be found under URL
# Usage
See this google colab for more information on how to fine-tune the model.
You can find other pretrained XLS-R models with different numbers of parameters:
* 300M parameters version
* 1B parameters version
* 2B parameters version
|
[
"# Wav2Vec2-XLS-R-300M\n\nFacebook's Wav2Vec2 XLS-R counting 300 million parameters.\n\n!model image\n\nXLS-R is Facebook AI's large-scale multilingual pretrained model for speech (the \"XLM-R for Speech\"). It is pretrained on 436k hours of unlabeled speech, including VoxPopuli, MLS, CommonVoice, BABEL, and VoxLingua107. It uses the wav2vec 2.0 objective, in 128 languages. When using the model make sure that your speech input is sampled at 16kHz. \n\nNote: This model should be fine-tuned on a downstream task, like Automatic Speech Recognition, Translation, or Classification. Check out this blog for more information about ASR.\n\nXLS-R Paper\n\nAuthors: Arun Babu, Changhan Wang, Andros Tjandra, Kushal Lakhotia, Qiantong Xu, Naman Goyal, Kritika Singh, Patrick von Platen, Yatharth Saraf, Juan Pino, Alexei Baevski, Alexis Conneau, Michael Auli\n\nAbstract\nThis paper presents XLS-R, a large-scale model for cross-lingual speech representation learning based on wav2vec 2.0. We train models with up to 2B parameters on 436K hours of publicly available speech audio in 128 languages, an order of magnitude more public data than the largest known prior work. Our evaluation covers a wide range of tasks, domains, data regimes and languages, both high and low-resource. On the CoVoST-2 speech translation benchmark, we improve the previous state of the art by an average of 7.4 BLEU over 21 translation directions into English. For speech recognition, XLS-R improves over the best known prior work on BABEL, MLS, CommonVoice as well as VoxPopuli, lowering error rates by 20%-33% relative on average. XLS-R also sets a new state of the art on VoxLingua107 language identification. Moreover, we show that with sufficient model size, cross-lingual pretraining can outperform English-only pretraining when translating English speech into other languages, a setting which favors monolingual pretraining. We hope XLS-R can help to improve speech processing tasks for many more languages of the world.\n\nThe original model can be found under URL",
"# Usage\n\nSee this google colab for more information on how to fine-tune the model.\n\nYou can find other pretrained XLS-R models with different numbers of parameters:\n\n* 300M parameters version\n* 1B version version\n* 2B version version"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #pretraining #speech #xls_r #xls_r_pretrained #multilingual #ab #af #sq #am #ar #hy #as #az #ba #eu #be #bn #bs #br #bg #my #yue #ca #ceb #km #zh #cv #hr #cs #da #dv #nl #en #eo #et #fo #fi #fr #gl #lg #ka #de #el #gn #gu #ht #cnh #ha #haw #he #hi #hu #is #id #ia #ga #it #ja #jv #kb #kn #kk #rw #ky #ko #ku #lo #la #lv #ln #lt #lm #mk #mg #ms #ml #mt #gv #mi #mr #mn #ne #no #nn #oc #or #ps #fa #pl #pt #pa #ro #rm #ru #sah #sa #sco #sr #sn #sd #si #sk #sl #so #hsb #es #su #sw #sv #tl #tg #ta #tt #te #th #bo #tp #tr #tk #uk #ur #uz #vi #vot #war #cy #yi #yo #zu #dataset-common_voice #dataset-multilingual_librispeech #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-XLS-R-300M\n\nFacebook's Wav2Vec2 XLS-R counting 300 million parameters.\n\n!model image\n\nXLS-R is Facebook AI's large-scale multilingual pretrained model for speech (the \"XLM-R for Speech\"). It is pretrained on 436k hours of unlabeled speech, including VoxPopuli, MLS, CommonVoice, BABEL, and VoxLingua107. It uses the wav2vec 2.0 objective, in 128 languages. When using the model make sure that your speech input is sampled at 16kHz. \n\nNote: This model should be fine-tuned on a downstream task, like Automatic Speech Recognition, Translation, or Classification. Check out this blog for more information about ASR.\n\nXLS-R Paper\n\nAuthors: Arun Babu, Changhan Wang, Andros Tjandra, Kushal Lakhotia, Qiantong Xu, Naman Goyal, Kritika Singh, Patrick von Platen, Yatharth Saraf, Juan Pino, Alexei Baevski, Alexis Conneau, Michael Auli\n\nAbstract\nThis paper presents XLS-R, a large-scale model for cross-lingual speech representation learning based on wav2vec 2.0. We train models with up to 2B parameters on 436K hours of publicly available speech audio in 128 languages, an order of magnitude more public data than the largest known prior work. Our evaluation covers a wide range of tasks, domains, data regimes and languages, both high and low-resource. On the CoVoST-2 speech translation benchmark, we improve the previous state of the art by an average of 7.4 BLEU over 21 translation directions into English. For speech recognition, XLS-R improves over the best known prior work on BABEL, MLS, CommonVoice as well as VoxPopuli, lowering error rates by 20%-33% relative on average. XLS-R also sets a new state of the art on VoxLingua107 language identification. Moreover, we show that with sufficient model size, cross-lingual pretraining can outperform English-only pretraining when translating English speech into other languages, a setting which favors monolingual pretraining. We hope XLS-R can help to improve speech processing tasks for many more languages of the world.\n\nThe original model can be found under URL",
"# Usage\n\nSee this google colab for more information on how to fine-tune the model.\n\nYou can find other pretrained XLS-R models with different numbers of parameters:\n\n* 300M parameters version\n* 1B version version\n* 2B version version"
] |
[
364,
542,
55
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #speech #xls_r #xls_r_pretrained #multilingual #ab #af #sq #am #ar #hy #as #az #ba #eu #be #bn #bs #br #bg #my #yue #ca #ceb #km #zh #cv #hr #cs #da #dv #nl #en #eo #et #fo #fi #fr #gl #lg #ka #de #el #gn #gu #ht #cnh #ha #haw #he #hi #hu #is #id #ia #ga #it #ja #jv #kb #kn #kk #rw #ky #ko #ku #lo #la #lv #ln #lt #lm #mk #mg #ms #ml #mt #gv #mi #mr #mn #ne #no #nn #oc #or #ps #fa #pl #pt #pa #ro #rm #ru #sah #sa #sco #sr #sn #sd #si #sk #sl #so #hsb #es #su #sw #sv #tl #tg #ta #tt #te #th #bo #tp #tr #tk #uk #ur #uz #vi #vot #war #cy #yi #yo #zu #dataset-common_voice #dataset-multilingual_librispeech #arxiv-2111.09296 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n"
] |
[
-0.04758894816040993,
0.046750012785196304,
-0.014598522335290909,
0.007297620642930269,
0.04550313577055931,
0.06332182139158249,
0.05419944226741791,
0.09116444736719131,
0.0727275013923645,
0.120418980717659,
0.11597013473510742,
0.08993609249591827,
0.10850890725851059,
0.030012592673301697,
0.02611636370420456,
-0.23409810662269592,
-0.014968891628086567,
-0.01329061295837164,
-0.050599705427885056,
0.09369145333766937,
0.0228695347905159,
-0.039926063269376755,
0.09142758697271347,
-0.08705759048461914,
0.05855416879057884,
0.025268973782658577,
-0.04975442960858345,
0.006925574969500303,
0.026553384959697723,
0.04827604070305824,
0.005549455527216196,
0.07044541090726852,
0.04715149477124214,
-0.24975258111953735,
0.032285042107105255,
-0.01567848026752472,
-0.03946636989712715,
-0.007352286018431187,
0.019108077511191368,
-0.12647803127765656,
0.16093921661376953,
-0.058167893439531326,
-0.0829150602221489,
0.048358362168073654,
-0.17256440222263336,
-0.1780303716659546,
-0.057935889810323715,
0.11529215425252914,
0.05751940608024597,
0.04282727837562561,
-0.054432131350040436,
0.07651355117559433,
-0.10932482779026031,
0.05207585170865059,
0.20425982773303986,
-0.18100734055042267,
-0.03396864980459213,
0.052614592015743256,
0.03836875036358833,
0.06726958602666855,
-0.10757510364055634,
0.004715980961918831,
0.014004664495587349,
0.018325919285416603,
-0.06147626042366028,
-0.04280159994959831,
0.09236277639865875,
0.05261838808655739,
-0.0792241171002388,
-0.0023866845294833183,
0.12998612225055695,
0.04852217063307762,
0.05794971063733101,
0.07729648798704147,
-0.014261530712246895,
-0.17769774794578552,
-0.036947861313819885,
-0.0072819264605641365,
0.02907484956085682,
0.03221805766224861,
-0.0040543596260249615,
0.10043403506278992,
-0.04863365739583969,
0.027797162532806396,
-0.0013738577254116535,
0.03689282014966011,
0.05448931083083153,
-0.01068132370710373,
0.002564662601798773,
-0.008259820751845837,
0.02728724479675293,
-0.1339804083108902,
0.0193843524903059,
0.03314534202218056,
-0.011473586782813072,
-0.0013692154316231608,
0.08231918513774872,
0.08224035054445267,
0.106025330722332,
0.11278236657381058,
-0.0741412341594696,
0.09792093932628632,
0.053250301629304886,
0.05515800416469574,
0.03846575319766998,
0.030388658866286278,
-0.05339635908603668,
-0.07941558212041855,
-0.11798667907714844,
0.011203414760529995,
-0.039740465581417084,
0.002606295980513096,
-0.0391656793653965,
0.03476659208536148,
-0.0027384550776332617,
0.04721203073859215,
-0.002601043554022908,
0.04684453085064888,
-0.0627133846282959,
0.024724356830120087,
0.010196039453148842,
-0.04707365855574608,
0.03391134366393089,
0.10114587098360062,
0.019277868792414665,
0.10931576043367386,
-0.03982828930020332,
0.015561006963253021,
-0.010288936085999012,
0.06658016890287399,
-0.034604910761117935,
0.055008504539728165,
-0.009356276132166386,
-0.02502163127064705,
0.0725674033164978,
-0.046018797904253006,
0.04567970335483551,
-0.08267710357904434,
-0.028219234198331833,
-0.05670318007469177,
-0.0090637831017375,
-0.07997855544090271,
-0.0053685917519032955,
-0.08900962024927139,
-0.1086217612028122,
-0.002038841601461172,
0.012349596247076988,
0.052931562066078186,
-0.0727030411362648,
0.08642856776714325,
0.009818961843848228,
0.08657389879226685,
0.060204003006219864,
0.026761578395962715,
0.0009941438911482692,
0.07027294486761093,
-0.05940583348274231,
0.06373874843120575,
-0.08054359257221222,
0.03210092708468437,
-0.0985342487692833,
-0.09965771436691284,
-0.10359422862529755,
0.017799248918890953,
0.0003231614828109741,
0.17196089029312134,
-0.1537889987230301,
-0.09573788195848465,
0.26548218727111816,
-0.03543860465288162,
-0.03795851394534111,
0.12055743485689163,
0.05553283542394638,
-0.018209131434559822,
0.04342658445239067,
0.14339213073253632,
0.023143338039517403,
-0.09280882775783539,
-0.09612368792295456,
0.03393334895372391,
0.06252431869506836,
0.10833461582660675,
0.10885844379663467,
0.01866021566092968,
0.1016145795583725,
0.025084640830755234,
0.01778208650648594,
0.06347259134054184,
-0.08169955015182495,
-0.08290635794401169,
0.0662018284201622,
-0.04978432506322861,
0.036164697259664536,
0.10354362428188324,
-0.025877490639686584,
-0.051357705146074295,
-0.03679794445633888,
-0.11615405976772308,
0.09353452920913696,
0.003735500853508711,
-0.009119843132793903,
-0.09966672956943512,
-0.0024344315752387047,
0.06647340953350067,
0.02288619615137577,
-0.04526057466864586,
0.09622130542993546,
-0.053359005600214005,
0.11819121986627579,
0.0636710673570633,
0.07794049382209778,
0.10187273472547531,
-0.018218638375401497,
-0.07862292975187302,
-0.041916824877262115,
0.09991682320833206,
0.012974705547094345,
-0.0592535175383091,
-0.22189302742481232,
0.05805857852101326,
-0.021290823817253113,
0.09959299862384796,
-0.173395037651062,
0.03262442350387573,
0.1390991061925888,
0.1537192165851593,
0.0032056004274636507,
-0.011890897527337074,
-0.007739355321973562,
0.08227310329675674,
0.01825963705778122,
-0.015648340806365013,
0.033745236694812775,
-0.030913399532437325,
-0.0483984649181366,
0.02221750095486641,
-0.07240423560142517,
0.11462175101041794,
0.10714226216077805,
-0.0774630531668663,
-0.07751689106225967,
0.10570073872804642,
-0.021646583452820778,
-0.019964344799518585,
0.11689845472574234,
-0.0033914693631231785,
0.10226988047361374,
0.0063048601150512695,
0.020653771236538887,
-0.021763304248452187,
-0.02627391926944256,
0.016238698735833168,
-0.078618124127388,
-0.04888671264052391,
0.19366852939128876,
0.039686936885118484,
-0.14944960176944733,
0.19100140035152435,
0.13074064254760742,
0.046479348093271255,
0.1761941760778427,
-0.02098187804222107,
-0.038460101932287216,
-0.09703352302312851,
0.008853917010128498,
0.006463043857365847,
0.06676548719406128,
-0.1745358407497406,
-0.0192779041826725,
-0.035569097846746445,
-0.012208925560116768,
0.014522350393235683,
-0.07549697160720825,
-0.06783230602741241,
-0.049335066229104996,
-0.04544844850897789,
0.0057989503256976604,
0.05347583442926407,
-0.06442030519247055,
0.0798012763261795,
0.01801256649196148,
-0.03578075021505356,
-0.05564208701252937,
-0.0182326789945364,
-0.07343576848506927,
0.14630886912345886,
-0.1613723635673523,
-0.0870039165019989,
0.06405065953731537,
-0.10177189111709595,
0.058744970709085464,
-0.019216667860746384,
0.0016281373100355268,
-0.14172931015491486,
0.012648227624595165,
0.03568769991397858,
0.0959029421210289,
-0.10252966731786728,
-0.022290745750069618,
-0.010165792889893055,
0.010574684478342533,
-0.04425327852368355,
-0.022789975628256798,
-0.03573226556181908,
0.015208350494503975,
-0.0862061157822609,
0.09914316236972809,
-0.13020643591880798,
0.05319688096642494,
0.11437389999628067,
0.10433543473482132,
0.010324402712285519,
-0.014090786688029766,
0.15116819739341736,
-0.14562298357486725,
0.003910788334906101,
-0.03903409093618393,
-0.007643275894224644,
0.03717243671417236,
0.1350412517786026,
0.04445036128163338,
-0.06503818184137344,
-0.03835732862353325,
0.015962857753038406,
-0.004784753080457449,
-0.1427658051252365,
-0.0325813964009285,
-0.047438591718673706,
0.11782156676054001,
-0.04103660210967064,
0.08624549210071564,
-0.023751046508550644,
0.0010536120971664786,
-0.047525715082883835,
-0.12167119979858398,
-0.008506490848958492,
-0.04549160227179527,
0.0391707718372345,
-0.045567408204078674,
0.013115744106471539,
-0.03516572713851929,
-0.022930510342121124,
0.04655591398477554,
0.08895205706357956,
-0.06574182957410812,
0.05527547001838684,
0.07031182944774628,
0.07370050996541977,
0.1494828313589096,
-0.0243929885327816,
-0.043639298528432846,
0.043996717780828476,
-0.018319150432944298,
0.0061632124707102776,
-0.019752908498048782,
-0.031136393547058105,
0.026026621460914612,
0.15798795223236084,
-0.025608809664845467,
0.035328567028045654,
0.0016783815808594227,
0.12035245448350906,
0.07534033805131912,
0.06748493760824203,
-0.10401834547519684,
-0.029330622404813766,
0.07391085475683212,
0.007402131799608469,
-0.004358747974038124,
0.03399980440735817,
0.03638716787099838,
-0.05962919816374779,
0.11692819744348526,
0.08322925120592117,
0.023530272766947746,
-0.07913004606962204,
0.06384597718715668,
0.007824680767953396,
-0.00374860898591578,
-0.009654784575104713,
0.05424704775214195,
-0.2965855002403259,
0.19174015522003174,
0.020947936922311783,
0.010615861043334007,
-0.008552956394851208,
-0.0559433214366436,
0.03683074191212654,
0.07165589183568954,
0.13723109662532806,
0.0654105469584465,
-0.16263112425804138,
-0.16869181394577026,
-0.011847032234072685,
0.0036132351960986853,
0.1084655374288559,
-0.025796353816986084,
0.04171372205018997,
0.0539994016289711,
-0.042270079255104065,
-0.02768274024128914,
0.009784230962395668,
-0.07511284947395325,
-0.008112223818898201,
0.07865383476018906,
-0.043466437608003616,
0.033007413148880005,
-0.019895141944289207,
-0.03433030843734741,
-0.17768992483615875,
0.011113160289824009,
-0.15309588611125946,
0.013765513896942139,
-0.034572720527648926,
0.01982167549431324,
0.057510241866111755,
-0.11803165823221207,
-0.10935557633638382,
0.043603166937828064,
-0.07333322614431381,
-0.019134515896439552,
0.021817652508616447,
0.08984815329313278,
-0.0452747568488121,
-0.18489821255207062,
-0.003973573446273804,
0.116004578769207,
0.08338615298271179,
0.11872392892837524,
-0.04745079576969147,
0.02563413791358471,
-0.00017659025616012514,
-0.08358296006917953,
0.15851682424545288,
-0.021242165938019753,
-0.028059806674718857,
0.04623347893357277,
-0.008475400507450104,
-0.06448058784008026,
-0.08969849348068237,
-0.07474170625209808,
0.08955075591802597,
0.2982664108276367,
-0.02794746123254299,
0.10638932883739471,
0.0982741042971611,
-0.08372929692268372,
-0.2470681369304657,
-0.11061340570449829,
-0.01849932037293911,
0.022611845284700394,
-0.014536612667143345,
-0.20981232821941376,
-0.037290822714567184,
-0.0007734635728411376,
0.027293013408780098,
-0.011917482130229473,
-0.27352064847946167,
-0.03471164405345917,
0.1081906408071518,
0.0014343776274472475,
0.08150352537631989,
-0.1988932341337204,
-0.03273606672883034,
-0.006223125848919153,
-0.061205051839351654,
-0.08480805903673172,
-0.01567237451672554,
0.04754459857940674,
-0.012480777688324451,
0.01996791735291481,
-0.007350675296038389,
-0.009788032621145248,
0.16906900703907013,
0.055363982915878296,
-0.015854258090257645,
-0.08185693621635437,
-0.112275131046772,
0.011410282924771309,
0.027432221919298172,
-0.020014578476548195,
-0.12136157602071762,
-0.06844186037778854,
-0.05250456929206848,
0.028336280956864357,
-0.1432066410779953,
-0.0038499508518725634,
-0.053854309022426605,
0.01680470071732998,
-0.043834712356328964,
0.07147806882858276,
0.061997462064027786,
0.015688452869653702,
0.07988321781158447,
-0.09488020837306976,
0.10145483165979385,
0.003983052913099527,
0.13370566070079803,
0.05125358700752258,
0.003471267642453313,
-0.032494474202394485,
-0.012622587382793427,
-0.007999869994819164,
-0.09816669672727585,
-0.008034632541239262,
0.1326776146888733,
0.006311078555881977,
0.08053737878799438,
0.0388166643679142,
-0.11542125046253204,
0.005140396766364574,
0.10659848153591156,
-0.08538658916950226,
-0.16917502880096436,
-0.0030268800910562277,
-0.0890909805893898,
-0.02113277278840542,
-0.013151636347174644,
0.09828539192676544,
-0.004258833825588226,
-0.009548707865178585,
0.009000595659017563,
0.08283234387636185,
-0.06984352320432663,
0.1317896991968155,
0.06573610007762909,
0.007921171374619007,
-0.07596545666456223,
0.018116388469934464,
0.008600310422480106,
-0.05593857914209366,
0.01814286783337593,
0.16202816367149353,
-0.03931687772274017,
-0.08665654808282852,
-0.002317215083166957,
0.11540139466524124,
0.09805311262607574,
-0.014468617737293243,
-0.00561782019212842,
-0.12169135361909866,
0.05704066902399063,
0.1739104688167572,
0.02329704351723194,
0.03515571355819702,
0.06376361101865768,
0.03490538150072098,
0.04636535793542862,
0.09959180653095245,
0.06675101816654205,
-0.019476937130093575,
-0.04469487816095352,
0.0639730915427208,
-0.05874791368842125,
0.10523588955402374,
-0.008864161558449268,
-0.013945162296295166,
-0.18875719606876373,
0.055915020406246185,
-0.051443327218294144,
-0.051493141800165176,
-0.1284194439649582,
-0.028333108872175217,
0.01252016332000494,
-0.10935238003730774,
-0.05806810036301613,
-0.051216673105955124,
-0.08376302570104599,
-0.011372096836566925,
0.022658830508589745,
0.12448020279407501,
-0.06087949499487877,
-0.06464768201112747,
0.08071742206811905,
-0.06336674839258194,
0.0726860836148262,
0.11158103495836258,
-0.0033723560627549887,
0.08947547525167465,
-0.15136584639549255,
-0.020136872306466103,
0.03837635740637779,
0.018109142780303955,
-0.026558663696050644,
-0.021494993939995766,
-0.0733703002333641,
-0.06686865538358688,
0.019500169903039932,
0.056125253438949585,
0.023654242977499962,
0.018599938601255417,
0.15732167661190033,
-0.03313099220395088,
-0.08085320144891739,
-0.027194727212190628,
0.04946548864245415,
0.07403677701950073,
-0.013004578649997711,
0.021954303607344627,
-0.08872180432081223,
0.04596378654241562,
-0.10859571397304535,
0.029522854834794998,
0.0036232504062354565,
-0.09073837846517563,
0.012730208225548267,
-0.034900274127721786,
0.08679858595132828,
-0.0240815170109272,
0.06667061150074005,
-0.07957162708044052,
-0.12135081738233566,
0.014396972954273224,
-0.03353375941514969,
-0.07763993740081787,
0.04483365640044212,
0.018808657303452492,
0.052962254732847214,
-0.06260966509580612,
-0.09313981235027313,
0.03413401544094086,
-0.027125364169478416,
-0.03185851871967316,
0.10172892361879349,
0.10287643224000931,
0.15682417154312134,
0.04110949859023094,
-0.012061616405844688,
-0.10464885085821152,
0.030519677326083183,
0.04306397587060928,
-0.1267947256565094,
-0.03761861100792885,
-0.033327169716358185,
0.1723002791404724,
0.14371195435523987,
-0.12686452269554138,
0.025362113490700722,
-0.0958649069070816,
-0.06121267378330231,
-0.09734506160020828,
-0.11149827390909195,
-0.02032903954386711,
-0.04046092554926872,
0.034334179013967514,
-0.08462601155042648,
0.023711245507001877,
0.05536153167486191,
0.055003099143505096,
0.022635193541646004,
0.0749579444527626,
0.05964932218194008,
-0.029332127422094345,
0.02037837915122509,
0.015804825350642204,
-0.027766520157456398,
-0.12268771976232529,
0.06084098666906357,
0.012114623561501503,
-0.04785197228193283,
0.00784006342291832,
0.006400256883352995,
-0.1174456775188446,
-0.0028134838212281466,
-0.08843544125556946,
-0.10307108610868454,
0.0019086425891146064,
0.03586211055517197,
0.01882612146437168,
0.08446840196847916,
0.018749628216028214,
0.0027973330579698086,
0.03613686561584473,
0.05063589662313461,
-0.0379384309053421,
-0.05817336589097977,
-0.054089002311229706,
0.12242035567760468,
-0.05788540840148926,
0.024130774661898613,
-0.005720221437513828,
-0.010565275326371193,
0.045951247215270996,
0.1654592603445053,
0.22614310681819916,
-0.06411873549222946,
0.06531936675310135,
0.01827136240899563,
0.04413125663995743,
0.037272192537784576,
-0.05637273192405701,
0.09209465235471725,
0.14816640317440033,
-0.10727906227111816,
0.04184422641992569,
-0.0515945665538311,
-0.018564878031611443,
-0.042329996824264526,
-0.04453852027654648,
0.031944289803504944,
0.01670663431286812,
-0.04309053346514702,
0.11987631022930145,
-0.19023706018924713,
-0.09840022027492523,
0.019816596060991287,
-0.19020907580852509,
-0.026547834277153015,
-0.031278468668460846,
0.033922936767339706,
0.15247024595737457,
0.07217798382043839,
-0.018671637400984764,
-0.07651641219854355,
0.046709176152944565,
0.047176163643598557,
-0.09028580039739609,
0.06400150060653687,
0.05047034099698067,
-0.09749908745288849,
0.04046724736690521,
-0.07054471224546432,
0.03860846161842346,
0.10197609663009644,
0.001449240604415536,
0.02716143988072872,
0.01903568208217621,
0.07957754284143448,
-0.04702087119221687,
-0.09546404331922531,
0.08421297371387482,
-0.0016376468120142817,
0.04542030394077301,
0.1331513673067093,
-0.02716847136616707,
0.0395800918340683,
0.0850001648068428,
-0.05050958693027496,
0.03210289403796196,
0.11880776286125183,
-0.039885617792606354,
0.06432854384183884,
0.10259599983692169,
-0.009587311185896397,
-0.021405193954706192,
-0.011172259226441383,
-0.00789770670235157,
-0.033448997884988785,
0.05320707708597183,
-0.03739314153790474,
-0.1015140563249588,
-0.019261715933680534,
-0.01634345017373562,
0.08806314319372177,
0.000858892744872719,
-0.06017018482089043,
0.017442673444747925,
0.012204518541693687,
-0.07268474251031876,
0.09057633578777313,
0.07205245643854141,
0.019738230854272842,
-0.05116759613156319,
-0.1648966372013092,
0.0272023007273674,
0.13811902701854706,
-0.09347319602966309,
-0.050105031579732895
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53 finetuned on multi-lingual Common Voice
This checkpoint leverages the pretrained checkpoint [wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53)
and is fine-tuned on [CommonVoice](https://huggingface.co/datasets/common_voice) to recognize phonetic labels in multiple languages.
When using the model, make sure that your speech input is sampled at 16kHz.
Note that the model outputs a string of phonetic labels; a pronunciation dictionary mapping phoneme sequences to words
is required to turn this output into a word-level transcription (a minimal sketch is shown after the usage example below).
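If your audio is stored at a different sampling rate, it can be resampled on the fly. The snippet below is a minimal sketch (not part of the original model card) that uses the `datasets` `Audio` feature on the same dummy dataset as the usage example further down.
```python
from datasets import load_dataset, Audio

# load the dummy dataset and re-decode the audio column at 16 kHz
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))
print(ds[0]["audio"]["sampling_rate"])  # => 16000
```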
[Paper: Simple and Effective Zero-shot Cross-lingual Phoneme Recognition](https://arxiv.org/abs/2109.11680)
Authors: Qiantong Xu, Alexei Baevski, Michael Auli
**Abstract**
Recent progress in self-training, self-supervised pretraining and unsupervised learning enabled well performing speech recognition systems without any labeled data. However, in many cases there is labeled data available for related languages which is not utilized by these methods. This paper extends previous work on zero-shot cross-lingual transfer learning by fine-tuning a multilingually pretrained wav2vec 2.0 model to transcribe unseen languages. This is done by mapping phonemes of the training languages to the target language using articulatory features. Experiments show that this simple method significantly outperforms prior work which introduced task-specific architectures and used only part of a monolingually pretrained model.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
To transcribe audio files the model can be used as a standalone acoustic model as follows:
```python
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torch
# load model and processor
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-xlsr-53-espeak-cv-ft")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-xlsr-53-espeak-cv-ft")
# load dummy dataset and read soundfiles
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
# preprocess the audio (feature extraction)
input_values = processor(ds[0]["audio"]["array"], return_tensors="pt").input_values
# retrieve logits
with torch.no_grad():
logits = model(input_values).logits
# take argmax and decode
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
# => should give ['m ɪ s t ɚ k w ɪ l t ɚ ɪ z ð ɪ ɐ p ɑː s əl l ʌ v ð ə m ɪ d əl k l æ s ɪ z æ n d w iː aʊ ɡ l æ d t ə w ɛ l k ə m h ɪ z ɡ ɑː s p ə']
```
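The phonetic output above still has to be mapped to words. The following is a minimal sketch of such a mapping with a toy, hand-written lexicon; the lexicon entries and the greedy longest-match strategy are illustrative assumptions only, and a real setup would use a full pronunciation dictionary for the target language.
```python
# Toy lexicon: space-separated phoneme sequences mapped to words (illustrative only).
lexicon = {
    "m ɪ s t ɚ": "mister",
    "k w ɪ l t ɚ": "quilter",
}

def phonemes_to_words(phoneme_string, lexicon, max_len=8):
    """Greedy longest-match segmentation of a space-separated phoneme string."""
    tokens = phoneme_string.split()
    words, i = [], 0
    while i < len(tokens):
        match = None
        # try the longest candidate span first
        for j in range(min(len(tokens), i + max_len), i, -1):
            span = " ".join(tokens[i:j])
            if span in lexicon:
                match, i = lexicon[span], j
                break
        if match is None:
            # fall back to emitting the raw phoneme when nothing matches
            match = tokens[i]
            i += 1
        words.append(match)
    return " ".join(words)

print(phonemes_to_words("m ɪ s t ɚ k w ɪ l t ɚ", lexicon))
# => "mister quilter"
```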
|
{"language": "multi-lingual", "license": "apache-2.0", "tags": ["speech", "audio", "automatic-speech-recognition", "phoneme-recognition"], "datasets": ["common_voice"], "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
|
automatic-speech-recognition
|
facebook/wav2vec2-xlsr-53-espeak-cv-ft
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"speech",
"audio",
"phoneme-recognition",
"dataset:common_voice",
"arxiv:2109.11680",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.11680"
] |
[
"multi-lingual"
] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #speech #audio #phoneme-recognition #dataset-common_voice #arxiv-2109.11680 #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
# Wav2Vec2-Large-XLSR-53 finetuned on multi-lingual Common Voice
This checkpoint leverages the pretrained checkpoint wav2vec2-large-xlsr-53
and is fine-tuned on CommonVoice to recognize phonetic labels in multiple languages.
When using the model make sure that your speech input is sampled at 16kHz.
Note that the model outputs a string of phonetic labels. A dictionary mapping phonetic labels to words
has to be used to map the phonetic output labels to output words.
Paper: Simple and Effective Zero-shot Cross-lingual Phoneme Recognition
Authors: Qiantong Xu, Alexei Baevski, Michael Auli
Abstract
Recent progress in self-training, self-supervised pretraining and unsupervised learning enabled well performing speech recognition systems without any labeled data. However, in many cases there is labeled data available for related languages which is not utilized by these methods. This paper extends previous work on zero-shot cross-lingual transfer learning by fine-tuning a multilingually pretrained wav2vec 2.0 model to transcribe unseen languages. This is done by mapping phonemes of the training languages to the target language using articulatory features. Experiments show that this simple method significantly outperforms prior work which introduced task-specific architectures and used only part of a monolingually pretrained model.
The original model can be found under URL
# Usage
To transcribe audio files the model can be used as a standalone acoustic model as follows:
|
[
"# Wav2Vec2-Large-XLSR-53 finetuned on multi-lingual Common Voice\n\nThis checkpoint leverages the pretrained checkpoint wav2vec2-large-xlsr-53 \nand is fine-tuned on CommonVoice to recognize phonetic labels in multiple languages.\n\nWhen using the model make sure that your speech input is sampled at 16kHz. \nNote that the model outputs a string of phonetic labels. A dictionary mapping phonetic labels to words \nhas to be used to map the phonetic output labels to output words.\n\nPaper: Simple and Effective Zero-shot Cross-lingual Phoneme Recognition\n\nAuthors: Qiantong Xu, Alexei Baevski, Michael Auli\n\nAbstract\nRecent progress in self-training, self-supervised pretraining and unsupervised learning enabled well performing speech recognition systems without any labeled data. However, in many cases there is labeled data available for related languages which is not utilized by these methods. This paper extends previous work on zero-shot cross-lingual transfer learning by fine-tuning a multilingually pretrained wav2vec 2.0 model to transcribe unseen languages. This is done by mapping phonemes of the training languages to the target language using articulatory features. Experiments show that this simple method significantly outperforms prior work which introduced task-specific architectures and used only part of a monolingually pretrained model.\n\nThe original model can be found under URL",
"# Usage\n\nTo transcribe audio files the model can be used as a standalone acoustic model as follows:"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #speech #audio #phoneme-recognition #dataset-common_voice #arxiv-2109.11680 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"# Wav2Vec2-Large-XLSR-53 finetuned on multi-lingual Common Voice\n\nThis checkpoint leverages the pretrained checkpoint wav2vec2-large-xlsr-53 \nand is fine-tuned on CommonVoice to recognize phonetic labels in multiple languages.\n\nWhen using the model make sure that your speech input is sampled at 16kHz. \nNote that the model outputs a string of phonetic labels. A dictionary mapping phonetic labels to words \nhas to be used to map the phonetic output labels to output words.\n\nPaper: Simple and Effective Zero-shot Cross-lingual Phoneme Recognition\n\nAuthors: Qiantong Xu, Alexei Baevski, Michael Auli\n\nAbstract\nRecent progress in self-training, self-supervised pretraining and unsupervised learning enabled well performing speech recognition systems without any labeled data. However, in many cases there is labeled data available for related languages which is not utilized by these methods. This paper extends previous work on zero-shot cross-lingual transfer learning by fine-tuning a multilingually pretrained wav2vec 2.0 model to transcribe unseen languages. This is done by mapping phonemes of the training languages to the target language using articulatory features. Experiments show that this simple method significantly outperforms prior work which introduced task-specific architectures and used only part of a monolingually pretrained model.\n\nThe original model can be found under URL",
"# Usage\n\nTo transcribe audio files the model can be used as a standalone acoustic model as follows:"
] |
[
79,
329,
25
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #speech #audio #phoneme-recognition #dataset-common_voice #arxiv-2109.11680 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# Wav2Vec2-Large-XLSR-53 finetuned on multi-lingual Common Voice\n\nThis checkpoint leverages the pretrained checkpoint wav2vec2-large-xlsr-53 \nand is fine-tuned on CommonVoice to recognize phonetic labels in multiple languages.\n\nWhen using the model make sure that your speech input is sampled at 16kHz. \nNote that the model outputs a string of phonetic labels. A dictionary mapping phonetic labels to words \nhas to be used to map the phonetic output labels to output words.\n\nPaper: Simple and Effective Zero-shot Cross-lingual Phoneme Recognition\n\nAuthors: Qiantong Xu, Alexei Baevski, Michael Auli\n\nAbstract\nRecent progress in self-training, self-supervised pretraining and unsupervised learning enabled well performing speech recognition systems without any labeled data. However, in many cases there is labeled data available for related languages which is not utilized by these methods. This paper extends previous work on zero-shot cross-lingual transfer learning by fine-tuning a multilingually pretrained wav2vec 2.0 model to transcribe unseen languages. This is done by mapping phonemes of the training languages to the target language using articulatory features. Experiments show that this simple method significantly outperforms prior work which introduced task-specific architectures and used only part of a monolingually pretrained model.\n\nThe original model can be found under URL# Usage\n\nTo transcribe audio files the model can be used as a standalone acoustic model as follows:"
] |
[
-0.10009510815143585,
0.05788782238960266,
-0.0040947021916508675,
-0.008102689869701862,
0.11367283016443253,
-0.026569116860628128,
0.1794198453426361,
0.05898961424827576,
0.03372364491224289,
0.05688885226845741,
-0.11819995939731598,
0.04850875958800316,
0.060570526868104935,
0.014073656871914864,
0.023627737537026405,
-0.13612030446529388,
0.03243709355592728,
-0.0553986057639122,
0.2188163846731186,
0.05417991057038307,
0.037770889699459076,
-0.04785347357392311,
0.033005241304636,
0.004940831568092108,
-0.06082296371459961,
0.01951027661561966,
0.009797731414437294,
-0.08121592551469803,
0.09020920842885971,
-0.009968060068786144,
0.07076726108789444,
0.02848207950592041,
0.019682373851537704,
-0.13355465233325958,
0.00379763706587255,
0.06153976544737816,
0.00489389942958951,
-0.017822103574872017,
0.11650800704956055,
-0.06524357199668884,
0.022705335170030594,
-0.0038019451312720776,
-0.014382853172719479,
0.07060500234365463,
-0.04944336414337158,
-0.13998302817344666,
-0.09382030367851257,
0.06749208271503448,
0.019154515117406845,
0.12893423438072205,
-0.04446708783507347,
0.10822683572769165,
0.10224873572587967,
0.06350332498550415,
0.14168550074100494,
-0.24425646662712097,
0.036888301372528076,
0.043741241097450256,
0.08781187236309052,
0.09294097870588303,
-0.034994371235370636,
0.04515836760401726,
0.017374277114868164,
0.0007866147789172828,
-0.0643886998295784,
-0.04354604333639145,
0.011242127977311611,
-0.09735269099473953,
-0.07355149835348129,
-0.0035281414166092873,
0.11993079632520676,
-0.06104786694049835,
-0.11663559824228287,
-0.14415177702903748,
-0.019975021481513977,
0.007603371515870094,
0.003632169682532549,
-0.022490743547677994,
0.016266094520688057,
0.010993294417858124,
0.033538803458213806,
-0.04388092830777168,
-0.05014572665095329,
-0.0183878056704998,
-0.092498280107975,
0.10926118493080139,
0.04718902334570885,
0.0011814810568466783,
-0.04415610805153847,
0.0521986149251461,
-0.06546783447265625,
-0.03067745454609394,
-0.03422301262617111,
-0.022852109745144844,
-0.09208352863788605,
0.05307460203766823,
0.00970237236469984,
-0.17688341438770294,
0.003433847101405263,
0.01771986298263073,
0.012188058346509933,
0.05550845339894295,
-0.13573770225048065,
0.004088045097887516,
0.02906208112835884,
0.06660610437393188,
0.02813316136598587,
-0.05486763268709183,
0.002587607130408287,
-0.05077278986573219,
0.05119726061820984,
-0.041173893958330154,
-0.06709906458854675,
-0.04504197835922241,
0.05933878570795059,
0.09460318833589554,
0.017004942521452904,
-0.016949281096458435,
-0.015856539830565453,
-0.05976416543126106,
0.10952499508857727,
-0.11680690199136734,
0.042813871055841446,
0.013520998880267143,
0.06999440491199493,
0.15339283645153046,
0.053916241973638535,
0.013344736769795418,
-0.1393786519765854,
0.005305564031004906,
0.023434778675436974,
0.02675652503967285,
-0.05221528559923172,
-0.1398836076259613,
0.004263021983206272,
-0.09251070022583008,
-0.08823128789663315,
-0.15470288693904877,
-0.12734951078891754,
-0.06205480545759201,
-0.047398969531059265,
-0.026975229382514954,
0.03528410196304321,
-0.04512540623545647,
0.0002033938217209652,
-0.036346644163131714,
-0.0008525343728251755,
-0.06379245221614838,
-0.02842779830098152,
-0.005013017449527979,
-0.038374099880456924,
0.08924058824777603,
0.0002911966585088521,
0.053419630974531174,
-0.035698480904102325,
-0.038160696625709534,
-0.044524289667606354,
0.13967199623584747,
-0.05779413878917694,
-0.050052449107170105,
-0.04911249503493309,
-0.011017323471605778,
-0.03919332101941109,
0.042150430381298065,
0.009119399823248386,
0.05529024824500084,
-0.17802809178829193,
-0.10868999361991882,
0.16410160064697266,
-0.19858035445213318,
0.02497830241918564,
0.10518844425678253,
0.007606990169733763,
0.0919184535741806,
0.10058335214853287,
0.1583893746137619,
0.1380046308040619,
-0.14952120184898376,
-0.1000874936580658,
-0.1147724911570549,
-0.03839225694537163,
0.029450247064232826,
0.02125881239771843,
-0.08555068075656891,
0.06747598201036453,
0.020350247621536255,
0.10003290325403214,
-0.08406651020050049,
-0.012468949891626835,
-0.034069593995809555,
-0.0013064451050013304,
-0.01873406209051609,
0.004309504292905331,
-0.006757655646651983,
0.04623812064528465,
-0.0022171030286699533,
-0.04561268165707588,
0.03919205069541931,
0.040124714374542236,
-0.09814786911010742,
0.08928371220827103,
-0.07607384771108627,
0.03144851326942444,
-0.00914907455444336,
-0.008643819950520992,
-0.15630507469177246,
0.02114495262503624,
0.0706663578748703,
-0.09483632445335388,
0.10765402019023895,
-0.019196530804038048,
-0.020089950412511826,
0.06449144333600998,
-0.048871394246816635,
0.017081543803215027,
0.008885245770215988,
-0.003481716848909855,
-0.051804907619953156,
-0.04529722407460213,
0.0016145537374541163,
-0.05750860273838043,
-0.027537662535905838,
-0.025223053991794586,
-0.007365490775555372,
-0.03962930291891098,
0.06666003912687302,
0.03467902913689613,
-0.0678369328379631,
0.01914261281490326,
0.11008699238300323,
-0.016309931874275208,
-0.012304863892495632,
0.03736208751797676,
-0.012897951528429985,
0.03156276419758797,
0.12752732634544373,
-0.1621115803718567,
-0.13018947839736938,
0.0866205170750618,
-0.017774205654859543,
-0.054565079510211945,
0.045836251229047775,
0.04833491891622543,
-0.05584118142724037,
-0.08199987560510635,
-0.036139730364084244,
0.21801221370697021,
-0.003868701169267297,
0.12148696184158325,
-0.06753676384687424,
0.027747344225645065,
0.028874432668089867,
-0.03248487412929535,
-0.0008269888930954039,
0.0669059231877327,
-0.052402909845113754,
-0.014892240054905415,
0.03200525790452957,
-0.03802766650915146,
0.02459430880844593,
0.20327922701835632,
0.03216014802455902,
-0.044454701244831085,
-0.005319035612046719,
-0.016605820506811142,
0.07643917202949524,
0.02010921947658062,
-0.11726242303848267,
-0.038766730576753616,
0.052492786198854446,
0.09576154500246048,
0.07863578200340271,
-0.06385774165391922,
0.08002299070358276,
0.0013237370876595378,
-0.042211081832647324,
-0.054735343903303146,
0.04202958196401596,
0.02567397803068161,
0.0240169744938612,
-0.05783090367913246,
0.060215242207050323,
0.003432595171034336,
-0.044145796447992325,
-0.13297484815120697,
0.0741388127207756,
-0.12557056546211243,
-0.27549976110458374,
-0.16157928109169006,
-0.005145625211298466,
-0.0674501359462738,
-0.0035339214373379946,
0.08109065890312195,
-0.00851185992360115,
-0.04503141716122627,
-0.07796228677034378,
0.05000320076942444,
-0.0675840675830841,
-0.09649435430765152,
-0.046253934502601624,
0.017332913354039192,
-0.054361741989851,
-0.11472783237695694,
0.018805649131536484,
-0.012975291348993778,
-0.1187184527516365,
0.05512900650501251,
-0.054677803069353104,
0.020998379215598106,
0.1302318125963211,
0.003938006237149239,
-0.02726454846560955,
-0.03081466257572174,
-0.0034426420461386442,
-0.03521302342414856,
0.00803623627871275,
0.18032848834991455,
0.05052807182073593,
0.03249569609761238,
0.04649898782372475,
-0.03645457699894905,
-0.029243212193250656,
-0.004087518434971571,
0.021690111607313156,
-0.06568284332752228,
-0.217637836933136,
-0.11295939981937408,
-0.06439796835184097,
0.029634205624461174,
-0.035153426229953766,
0.010298142209649086,
0.06466371566057205,
-0.010026476345956326,
-0.0019287040922790766,
-0.04204774647951126,
0.04556416720151901,
0.10651706904172897,
0.13883861899375916,
0.014334486797451973,
0.10119915008544922,
-0.07422702759504318,
0.05225272476673126,
0.06961997598409653,
0.07009314000606537,
0.2345995157957077,
0.04706573858857155,
0.07462410628795624,
0.07268371433019638,
0.06956277787685394,
0.1372346132993698,
0.07098692655563354,
0.013757532462477684,
0.06290601938962936,
-0.012311032973229885,
-0.06553777307271957,
0.024820631369948387,
0.019651545211672783,
0.023489531129598618,
-0.09512022882699966,
-0.0218668095767498,
-0.02272133342921734,
0.014898699708282948,
0.10088293254375458,
0.07006267458200455,
-0.024361925199627876,
-0.04593520611524582,
-0.019112909212708473,
-0.08320184797048569,
-0.08671361207962036,
0.031152736395597458,
0.07825043797492981,
-0.1336139291524887,
0.08764459937810898,
0.01987861841917038,
0.07582834362983704,
-0.10117242485284805,
-0.010839678347110748,
-0.06976396590471268,
0.005345598328858614,
-0.007327473722398281,
0.02638348750770092,
-0.10290605574846268,
0.09893222898244858,
-0.0048607513308525085,
0.10143852978944778,
-0.04517597705125809,
0.008422850631177425,
0.0061029852367937565,
-0.012933610007166862,
0.09045607596635818,
0.01628963090479374,
0.011228904128074646,
0.0630282536149025,
-0.08571818470954895,
0.01857728511095047,
0.11532538384199142,
0.0027563225012272596,
0.03595590591430664,
0.027999529615044594,
-0.019622627645730972,
-0.05672517791390419,
0.015776127576828003,
-0.18187855184078217,
-0.13744787871837616,
0.08752623200416565,
0.043991681188344955,
0.09167473763227463,
-0.01675078645348549,
-0.07184381037950516,
-0.15482382476329803,
0.13196133077144623,
-0.13248838484287262,
-0.06455924361944199,
-0.07509178668260574,
-0.16253018379211426,
0.1046704575419426,
-0.010133795440196991,
0.07425077259540558,
0.0009434542153030634,
0.051409777253866196,
-0.10786572098731995,
-0.07319357991218567,
0.038095273077487946,
-0.06348276138305664,
-0.139889195561409,
-0.00005430719829746522,
0.14451800286769867,
0.038541555404663086,
-0.003360138041898608,
0.020473923534154892,
0.04029303789138794,
0.0032836010213941336,
-0.060928650200366974,
0.003849823260679841,
0.13798265159130096,
0.013561297208070755,
0.17853997647762299,
-0.08783704787492752,
-0.3707391321659088,
-0.08395999670028687,
-0.12060308456420898,
0.0829126313328743,
0.16672922670841217,
-0.03226121887564659,
0.19991986453533173,
0.15535297989845276,
-0.1572258472442627,
-0.18442243337631226,
-0.05703005567193031,
0.029686961323022842,
0.05286092311143875,
-0.001140426960773766,
-0.15118218958377838,
0.026356708258390427,
0.020883629098534584,
0.01562870480120182,
-0.012298635207116604,
-0.18455685675144196,
-0.1635325402021408,
0.025178520008921623,
-0.015998857095837593,
0.11984450370073318,
-0.07210313528776169,
-0.06403844803571701,
-0.05471307784318924,
-0.06660810858011246,
0.10702817887067795,
-0.02238657884299755,
0.1337902843952179,
0.015707997605204582,
0.037400610744953156,
0.004269321449100971,
-0.02759566344320774,
0.05950690433382988,
0.10301772505044937,
0.0383012555539608,
0.03815711662173271,
0.026613973081111908,
0.03181248530745506,
0.01770808733999729,
0.049168895930051804,
0.09341919422149658,
0.019572919234633446,
-0.025891713798046112,
-0.07891567796468735,
-0.05879537761211395,
0.07099394500255585,
0.00640428252518177,
-0.004117502365261316,
-0.046829983592033386,
0.03700084984302521,
0.02008918486535549,
-0.04111324995756149,
0.02650250867009163,
-0.15265129506587982,
0.029414357617497444,
0.19611501693725586,
0.15493974089622498,
-0.011224150657653809,
-0.0019163909601047635,
-0.07648378610610962,
-0.08716866374015808,
0.058429036289453506,
0.03420165926218033,
0.043003518134355545,
0.04933566227555275,
0.04572224244475365,
0.06505425274372101,
0.018366873264312744,
-0.10563802719116211,
0.08084039390087128,
0.01385405845940113,
-0.019183339551091194,
-0.08558464795351028,
0.014943644404411316,
0.014497141353785992,
0.07138901948928833,
0.1043347641825676,
0.17008116841316223,
-0.06265697628259659,
-0.03905202075839043,
-0.037534911185503006,
0.01073131337761879,
-0.02423807978630066,
0.0909794270992279,
-0.060919590294361115,
0.01827377639710903,
-0.07511195540428162,
0.14371976256370544,
0.03440654277801514,
-0.026350194588303566,
0.028541183099150658,
0.08992742747068405,
-0.087889164686203,
-0.053809430450201035,
-0.14392463862895966,
0.1439560502767563,
-0.08247081935405731,
-0.0923343077301979,
0.01798032410442829,
-0.058353714644908905,
-0.013144634664058685,
0.17027747631072998,
-0.02067432552576065,
0.04150301590561867,
-0.045149508863687515,
-0.032141800969839096,
-0.08125293254852295,
0.019040288403630257,
0.06873153895139694,
0.01194707490503788,
-0.06916234642267227,
0.14138337969779968,
0.04541345685720444,
-0.008124846033751965,
-0.03545502945780754,
-0.07359857112169266,
-0.0518224872648716,
0.010396548546850681,
-0.13392630219459534,
0.003489919239655137,
-0.07872916758060455,
-0.038674283772706985,
0.00416590878739953,
0.018434982746839523,
0.02038263902068138,
0.0569227896630764,
-0.016937730833888054,
-0.008848287165164948,
-0.048673950135707855,
0.0631563812494278,
-0.061442699283361435,
0.05778127908706665,
0.009290197864174843,
-0.04769137129187584,
0.10470249503850937,
0.06776789575815201,
-0.060655370354652405,
0.023770252242684364,
-0.11588995158672333,
-0.011876540258526802,
0.005114841274917126,
0.04239067807793617,
-0.04088025912642479,
-0.10458943247795105,
-0.014091173186898232,
0.058303721249103546,
0.012630664743483067,
0.027293451130390167,
0.134836345911026,
-0.07626020908355713,
0.04247389733791351,
-0.10888753831386566,
0.024189580231904984,
-0.04935687407851219,
0.043308887630701065,
0.08081244677305222,
0.0639965683221817,
0.054387353360652924,
-0.10718093067407608,
0.08453372120857239,
-0.024893924593925476,
-0.02102544531226158,
-0.014069701544940472,
-0.00228075310587883,
0.012956174090504646,
-0.04183744266629219,
0.059913843870162964,
0.04907459765672684,
0.13461124897003174,
-0.010561712086200714,
0.005181551445275545,
-0.016762997955083847,
-0.03205728530883789,
-0.17693988978862762,
-0.005923798307776451,
0.10346487164497375,
0.048032842576503754,
0.020073916763067245,
0.010454236529767513,
0.012907741591334343,
-0.05352360010147095,
0.09922510385513306,
0.06568149477243423,
0.16447991132736206,
0.003026064718142152,
0.05730859562754631,
0.022553838789463043,
-0.08409439772367477,
-0.12006114423274994,
0.005917930975556374,
-0.0694882869720459,
0.06798823922872543,
-0.07947788387537003,
-0.030089356005191803,
0.04503568634390831,
-0.07227626442909241,
0.1323818415403366,
0.06904993206262589,
-0.06229164078831673,
-0.06556852161884308,
-0.10994765907526016,
-0.03232307732105255,
-0.05202103406190872,
-0.010813447646796703,
-0.07076278328895569,
0.07211991399526596,
0.04432861506938934,
0.048145003616809845,
0.007345946971327066,
0.06658434122800827,
-0.2145790010690689,
-0.11065782606601715,
0.09124734252691269,
-0.03941572457551956,
0.05153544619679451,
-0.002478464972227812,
0.0108482101932168,
0.037316419184207916,
0.08679080754518509,
0.04071506857872009,
0.07980892807245255,
0.055691078305244446,
0.03430231288075447,
-0.0741673931479454,
-0.04351503774523735,
0.00723665114492178,
-0.0029999276157468557,
0.06724438816308975,
0.20918773114681244,
0.08574468642473221,
-0.09194047003984451,
0.023857522755861282,
0.14506444334983826,
-0.06673067063093185,
-0.1750587373971939,
-0.07706397026777267,
0.17072995007038116,
-0.02608010172843933,
0.01392942201346159,
-0.0151632996276021,
-0.07199369370937347,
0.00917277205735445,
0.20076267421245575,
0.12821927666664124,
0.04075123369693756,
0.0014657782157883048,
0.035689182579517365,
-0.008907247334718704,
0.019118987023830414,
0.011902090162038803,
0.026044480502605438,
0.34053835272789,
0.023080790415406227,
0.09408164024353027,
-0.0514373704791069,
-0.02799874357879162,
-0.12177180498838425,
0.08701961487531662,
-0.10971465706825256,
-0.07181157916784286,
0.005872547626495361,
0.0943932980298996,
-0.0869358628988266,
-0.17144905030727386,
0.0829409658908844,
-0.04580482468008995,
-0.0450005866587162,
0.013766712509095669,
0.05900842323899269,
0.01030051615089178,
0.004156032577157021,
0.026487620547413826,
0.013591155409812927,
0.12040717154741287,
0.004480987787246704,
-0.03719194978475571,
0.009869246743619442,
-0.033598434180021286,
-0.10754581540822983,
0.003876997157931328,
0.0026106054428964853,
0.10495315492153168,
0.02319897711277008,
0.08309505134820938,
-0.024784712120890617,
0.12769384682178497,
-0.0636984258890152,
-0.07325151562690735,
0.04663202166557312,
0.1415795087814331,
-0.020854545757174492,
0.11790960282087326,
0.015413239598274231,
-0.04328663647174835,
0.04427584633231163,
0.05540267378091812,
-0.04901895672082901,
-0.03506065905094147,
0.03122301772236824,
-0.10476842522621155,
0.11453873664140701,
0.026015354320406914,
-0.004775136709213257,
-0.029065292328596115,
-0.00694376090541482,
0.045621536672115326,
-0.029385101050138474,
0.023649634793400764,
0.0021193339489400387,
-0.18474943935871124,
-0.014973836950957775,
-0.04092767834663391,
0.060491252690553665,
-0.2414408028125763,
-0.015475046820938587,
-0.01490323431789875,
-0.018745267763733864,
-0.08407413214445114,
0.051354389637708664,
0.04844627529382706,
-0.024531986564397812,
-0.02456153929233551,
-0.13317036628723145,
0.0794360414147377,
0.12999080121517181,
-0.06460348516702652,
-0.06480122357606888
] |
null | null |
transformers
|
# FSMT
## Model description
This is a ported version of [fairseq wmt19 transformer](https://github.com/pytorch/fairseq/blob/master/examples/wmt19/README.md) for de-en.
For more details, please see [Facebook FAIR's WMT19 News Translation Task Submission](https://arxiv.org/abs/1907.06616).
The abbreviation FSMT stands for FairSeqMachineTranslation.
All four models are available:
* [wmt19-en-ru](https://huggingface.co/facebook/wmt19-en-ru)
* [wmt19-ru-en](https://huggingface.co/facebook/wmt19-ru-en)
* [wmt19-en-de](https://huggingface.co/facebook/wmt19-en-de)
* [wmt19-de-en](https://huggingface.co/facebook/wmt19-de-en)
## Intended uses & limitations
#### How to use
```python
from transformers import FSMTForConditionalGeneration, FSMTTokenizer
mname = "facebook/wmt19-de-en"
tokenizer = FSMTTokenizer.from_pretrained(mname)
model = FSMTForConditionalGeneration.from_pretrained(mname)
input = "Maschinelles Lernen ist großartig, oder?"
input_ids = tokenizer.encode(input, return_tensors="pt")
outputs = model.generate(input_ids)
decoded = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(decoded) # Machine learning is great, isn't it?
```
#### Limitations and bias
- The original (and this ported model) doesn't seem to handle well inputs with repeated sub-phrases, [content gets truncated](https://discuss.huggingface.co/t/issues-with-translating-inputs-containing-repeated-phrases/981)
## Training data
Pretrained weights were left identical to the original model released by fairseq. For more details, please see the [paper](https://arxiv.org/abs/1907.06616).
## Eval results
pair | fairseq | transformers
-------|---------|----------
de-en | [42.3](http://matrix.statmt.org/matrix/output/1902?run_id=6750) | 41.35
The score is slightly below the one reported by `fairseq`, since `transformers` currently doesn't support:
- model ensembling; therefore the best-performing checkpoint (`model4.pt`) was ported.
- re-ranking
The score was calculated using this code:
```bash
git clone https://github.com/huggingface/transformers
cd transformers
export PAIR=de-en
export DATA_DIR=data/$PAIR
export SAVE_DIR=data/$PAIR
export BS=8
export NUM_BEAMS=15
mkdir -p $DATA_DIR
sacrebleu -t wmt19 -l $PAIR --echo src > $DATA_DIR/val.source
sacrebleu -t wmt19 -l $PAIR --echo ref > $DATA_DIR/val.target
echo $PAIR
PYTHONPATH="src:examples/seq2seq" python examples/seq2seq/run_eval.py facebook/wmt19-$PAIR $DATA_DIR/val.source $SAVE_DIR/test_translations.txt --reference_path $DATA_DIR/val.target --score_path $SAVE_DIR/test_bleu.json --bs $BS --task translation --num_beams $NUM_BEAMS
```
Note: fairseq reports using a beam of 50, so you should get a slightly higher score if you re-run the evaluation with `--num_beams 50`.
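For single sentences, the same wider beam can also be tried directly from Python. The snippet below is a sketch only; the beam size mirrors the fairseq setting, while `early_stopping=True` is an illustrative choice rather than a setting from the original submission.
```python
from transformers import FSMTForConditionalGeneration, FSMTTokenizer

mname = "facebook/wmt19-de-en"
tokenizer = FSMTTokenizer.from_pretrained(mname)
model = FSMTForConditionalGeneration.from_pretrained(mname)

input_ids = tokenizer.encode("Maschinelles Lernen ist großartig, oder?", return_tensors="pt")
# a beam of 50 mirrors what fairseq reports; decoding is correspondingly slower
outputs = model.generate(input_ids, num_beams=50, early_stopping=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```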
## Data Sources
- [training, etc.](http://www.statmt.org/wmt19/)
- [test set](http://matrix.statmt.org/test_sets/newstest2019.tgz?1556572561)
### BibTeX entry and citation info
```bibtex
@inproceedings{...,
year={2020},
title={Facebook FAIR's WMT19 News Translation Task Submission},
author={Ng, Nathan and Yee, Kyra and Baevski, Alexei and Ott, Myle and Auli, Michael and Edunov, Sergey},
booktitle={Proc. of WMT},
}
```
## TODO
- port model ensemble (fairseq uses 4 model checkpoints)
|
{"language": ["de", "en"], "license": "apache-2.0", "tags": ["translation", "wmt19", "facebook"], "datasets": ["wmt19"], "metrics": ["bleu"], "thumbnail": "https://huggingface.co/front/thumbnails/facebook.png"}
|
translation
|
facebook/wmt19-de-en
|
[
"transformers",
"pytorch",
"safetensors",
"fsmt",
"text2text-generation",
"translation",
"wmt19",
"facebook",
"de",
"en",
"dataset:wmt19",
"arxiv:1907.06616",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1907.06616"
] |
[
"de",
"en"
] |
TAGS
#transformers #pytorch #safetensors #fsmt #text2text-generation #translation #wmt19 #facebook #de #en #dataset-wmt19 #arxiv-1907.06616 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us
|
FSMT
====
Model description
-----------------
This is a ported version of fairseq wmt19 transformer for de-en.
For more details, please see, Facebook FAIR's WMT19 News Translation Task Submission.
The abbreviation FSMT stands for FairSeqMachineTranslation
All four models are available:
* wmt19-en-ru
* wmt19-ru-en
* wmt19-en-de
* wmt19-de-en
Intended uses & limitations
---------------------------
#### How to use
#### Limitations and bias
* The original (and this ported model) doesn't seem to handle well inputs with repeated sub-phrases, content gets truncated
Training data
-------------
Pretrained weights were left identical to the original model released by fairseq. For more details, please, see the paper.
Eval results
------------
pair: de-en, fairseq: 42.3, transformers: 41.35
The score is slightly below the score reported by 'fairseq', since 'transformers'' currently doesn't support:
* model ensemble, therefore the best performing checkpoint was ported (''URL'').
* re-ranking
The score was calculated using this code:
note: fairseq reports using a beam of 50, so you should get a slightly higher score if re-run with '--num\_beams 50'.
Data Sources
------------
* training, etc.
* test set
### BibTeX entry and citation info
TODO
----
* port model ensemble (fairseq uses 4 model checkpoints)
|
[
"#### How to use",
"#### Limitations and bias\n\n\n* The original (and this ported model) doesn't seem to handle well inputs with repeated sub-phrases, content gets truncated\n\n\nTraining data\n-------------\n\n\nPretrained weights were left identical to the original model released by fairseq. For more details, please, see the paper.\n\n\nEval results\n------------\n\n\npair: de-en, fairseq: 42.3, transformers: 41.35\n\n\nThe score is slightly below the score reported by 'fairseq', since 'transformers'' currently doesn't support:\n\n\n* model ensemble, therefore the best performing checkpoint was ported (''URL'').\n* re-ranking\n\n\nThe score was calculated using this code:\n\n\nnote: fairseq reports using a beam of 50, so you should get a slightly higher score if re-run with '--num\\_beams 50'.\n\n\nData Sources\n------------\n\n\n* training, etc.\n* test set",
"### BibTeX entry and citation info\n\n\nTODO\n----\n\n\n* port model ensemble (fairseq uses 4 model checkpoints)"
] |
[
"TAGS\n#transformers #pytorch #safetensors #fsmt #text2text-generation #translation #wmt19 #facebook #de #en #dataset-wmt19 #arxiv-1907.06616 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"#### How to use",
"#### Limitations and bias\n\n\n* The original (and this ported model) doesn't seem to handle well inputs with repeated sub-phrases, content gets truncated\n\n\nTraining data\n-------------\n\n\nPretrained weights were left identical to the original model released by fairseq. For more details, please, see the paper.\n\n\nEval results\n------------\n\n\npair: de-en, fairseq: 42.3, transformers: 41.35\n\n\nThe score is slightly below the score reported by 'fairseq', since 'transformers'' currently doesn't support:\n\n\n* model ensemble, therefore the best performing checkpoint was ported (''URL'').\n* re-ranking\n\n\nThe score was calculated using this code:\n\n\nnote: fairseq reports using a beam of 50, so you should get a slightly higher score if re-run with '--num\\_beams 50'.\n\n\nData Sources\n------------\n\n\n* training, etc.\n* test set",
"### BibTeX entry and citation info\n\n\nTODO\n----\n\n\n* port model ensemble (fairseq uses 4 model checkpoints)"
] |
[
86,
5,
205,
30
] |
[
"passage: TAGS\n#transformers #pytorch #safetensors #fsmt #text2text-generation #translation #wmt19 #facebook #de #en #dataset-wmt19 #arxiv-1907.06616 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n#### How to use#### Limitations and bias\n\n\n* The original (and this ported model) doesn't seem to handle well inputs with repeated sub-phrases, content gets truncated\n\n\nTraining data\n-------------\n\n\nPretrained weights were left identical to the original model released by fairseq. For more details, please, see the paper.\n\n\nEval results\n------------\n\n\npair: de-en, fairseq: 42.3, transformers: 41.35\n\n\nThe score is slightly below the score reported by 'fairseq', since 'transformers'' currently doesn't support:\n\n\n* model ensemble, therefore the best performing checkpoint was ported (''URL'').\n* re-ranking\n\n\nThe score was calculated using this code:\n\n\nnote: fairseq reports using a beam of 50, so you should get a slightly higher score if re-run with '--num\\_beams 50'.\n\n\nData Sources\n------------\n\n\n* training, etc.\n* test set### BibTeX entry and citation info\n\n\nTODO\n----\n\n\n* port model ensemble (fairseq uses 4 model checkpoints)"
] |
[
-0.09834343194961548,
0.06747286021709442,
-0.004262413829565048,
0.06690538674592972,
0.10396237671375275,
0.019523361697793007,
0.07929617911577225,
0.12989918887615204,
-0.02820827253162861,
0.06302408874034882,
-0.029546871781349182,
0.07681608200073242,
0.09445782750844955,
0.1367112547159195,
-0.02670903690159321,
-0.14409691095352173,
0.04237806424498558,
0.0212907362729311,
-0.06292715668678284,
0.10245773941278458,
0.10064899176359177,
-0.07345359027385712,
0.027375925332307816,
-0.008848433382809162,
-0.01500968262553215,
0.08930063992738724,
0.038425832986831665,
-0.03122320957481861,
0.07758381962776184,
0.04246784746646881,
0.10733215510845184,
0.04716292768716812,
0.028149442747235298,
-0.24613620340824127,
0.027156636118888855,
0.13618074357509613,
0.0004044653323944658,
0.06835142523050308,
0.09118036925792694,
-0.07575548440217972,
0.10502023994922638,
-0.032454945147037506,
0.02586064487695694,
0.06462827324867249,
-0.12279664725065231,
-0.2654934823513031,
-0.09788315743207932,
0.07459772378206253,
0.09317842125892639,
0.02702125534415245,
-0.042444027960300446,
0.13639725744724274,
-0.06830771267414093,
0.046602677553892136,
0.14407752454280853,
-0.3204197883605957,
-0.006377970799803734,
0.1067018136382103,
0.01601337641477585,
0.019002793356776237,
-0.1353163719177246,
-0.02543235756456852,
0.036841023713350296,
0.0266055129468441,
0.07396043837070465,
-0.037312667816877365,
-0.0135994553565979,
0.04159815236926079,
-0.1614016741514206,
-0.04535835236310959,
0.09863010048866272,
0.06809486448764801,
-0.09650881588459015,
-0.09247686713933945,
-0.08136873692274094,
-0.06328827142715454,
0.02845975197851658,
-0.10836873948574066,
0.027719393372535706,
0.03474830836057663,
-0.012637972831726074,
0.044758621603250504,
-0.07778684794902802,
-0.07089594006538391,
-0.059370625764131546,
0.012915188446640968,
0.018968885764479637,
0.011712855659425259,
-0.044001415371894836,
0.16305388510227203,
-0.05607229471206665,
-0.07672285288572311,
-0.05956824868917465,
-0.051558252424001694,
-0.14284779131412506,
-0.039969637989997864,
0.03365866467356682,
-0.041745010763406754,
0.02542356587946415,
0.23448236286640167,
-0.02186821959912777,
0.03570855036377907,
-0.05368832126259804,
0.0068031661212444305,
-0.002605825662612915,
0.08312364667654037,
-0.09854725748300552,
-0.06936751306056976,
0.03358374908566475,
0.0577935166656971,
-0.026270791888237,
0.02674473449587822,
0.03465504199266434,
-0.050632327795028687,
0.0862552598118782,
0.07690419256687164,
-0.011664224788546562,
0.02540779486298561,
-0.07111950218677521,
-0.029665065929293633,
0.14305399358272552,
-0.15332156419754028,
0.0038501755334436893,
0.019521959125995636,
-0.06848221272230148,
0.021933462470769882,
0.09355082362890244,
-0.0007017204188741744,
-0.07470748573541641,
0.0721556544303894,
-0.04463514685630798,
-0.06418723613023758,
-0.0729692131280899,
-0.11433975398540497,
0.0020286517683416605,
-0.02938341349363327,
-0.04054859280586243,
-0.09307920932769775,
-0.158187597990036,
-0.0930122509598732,
-0.0014291106490418315,
-0.014216724783182144,
0.01100391149520874,
0.01731134206056595,
-0.025472037494182587,
-0.030148329213261604,
-0.016703542321920395,
0.03893597424030304,
-0.05750660225749016,
0.050220049917697906,
-0.014536372385919094,
0.041184745728969574,
0.11056489497423172,
0.0342649482190609,
-0.10534847527742386,
0.053834907710552216,
-0.21768823266029358,
0.0881274938583374,
-0.06880611181259155,
-0.04462979733943939,
-0.12868215143680573,
-0.04932749643921852,
-0.039199139922857285,
-0.008645649999380112,
0.05254824832081795,
0.12030794471502304,
-0.16491732001304626,
-0.03428946062922478,
0.1727934032678604,
-0.12108418345451355,
-0.08816541731357574,
0.09933990985155106,
-0.046615567058324814,
0.02536853775382042,
0.1103772297501564,
0.06441091746091843,
0.17459067702293396,
-0.1041744276881218,
-0.047827910631895065,
-0.053442586213350296,
-0.027490761131048203,
0.10323431342840195,
0.08420458436012268,
-0.026333309710025787,
-0.02029380016028881,
-0.011837414465844631,
-0.029771961271762848,
-0.008924470283091068,
-0.01419348269701004,
-0.024911928921937943,
-0.025614207610487938,
-0.03140588849782944,
-0.013650620356202126,
0.048741500824689865,
-0.01766148768365383,
-0.03522507846355438,
-0.09665893018245697,
-0.00817037746310234,
0.10262291878461838,
-0.056552302092313766,
0.008671930059790611,
-0.08094139397144318,
0.06940784305334091,
-0.06107659637928009,
-0.0038208344485610723,
-0.18607006967067719,
-0.0018651289865374565,
0.030232669785618782,
-0.023674724623560905,
-0.024663368239998817,
0.12763656675815582,
0.06756433844566345,
0.039581868797540665,
-0.08018340915441513,
-0.008231827057898045,
-0.008025799877941608,
-0.01988118141889572,
-0.052317459136247635,
-0.16517876088619232,
-0.09084858745336533,
-0.01874946430325508,
0.15849214792251587,
-0.24944043159484863,
0.012479886412620544,
0.06129145249724388,
0.18931283056735992,
0.05862412229180336,
-0.03877315670251846,
0.028694413602352142,
-0.028476376086473465,
-0.028556322678923607,
-0.06282089650630951,
0.011944928206503391,
0.026010576635599136,
-0.025031594559550285,
0.03835827857255936,
-0.15250858664512634,
-0.010584401898086071,
0.07845017313957214,
0.10457073152065277,
-0.1163007989525795,
-0.00764802610501647,
-0.0873340591788292,
-0.02184043452143669,
-0.14589770138263702,
-0.022485703229904175,
0.0574822835624218,
0.033443838357925415,
0.12222956866025925,
-0.08605607599020004,
-0.08700297772884369,
-0.018260443583130836,
0.03970254585146904,
0.037867866456508636,
0.12359679490327835,
-0.011967867612838745,
-0.1861829310655594,
0.051233626902103424,
-0.007372261956334114,
0.012797309085726738,
0.13724195957183838,
-0.023925183340907097,
-0.10276496410369873,
-0.026618817821145058,
0.03818358853459358,
0.04361027479171753,
0.08370459824800491,
0.045483920723199844,
0.06672941148281097,
0.06727965176105499,
0.022244613617658615,
0.026233069598674774,
-0.13858823478221893,
0.012691077776253223,
0.03952781483530998,
-0.10306563973426819,
-0.03808451443910599,
0.032065436244010925,
0.04087575152516365,
0.12004472315311432,
-0.04252789914608002,
0.003951697144657373,
-0.0035860249772667885,
-0.04264059290289879,
-0.1448320597410202,
0.17129801213741302,
-0.03574901074171066,
-0.09598427265882492,
-0.16504766047000885,
0.12659378349781036,
-0.028146639466285706,
0.017060263082385063,
0.030364960432052612,
-0.024082200601696968,
-0.09083046764135361,
-0.07581466436386108,
0.059788864105939865,
0.03726150095462799,
-0.0347883477807045,
-0.10463422536849976,
0.019485386088490486,
0.11130529642105103,
-0.14192938804626465,
0.019920101389288902,
0.018933093175292015,
-0.05420408770442009,
-0.010310267098248005,
-0.01492402795702219,
0.06298208981752396,
0.09501086920499802,
-0.023319058120250702,
0.006279021501541138,
-0.03988887369632721,
0.19876858592033386,
-0.08331581205129623,
0.0470360592007637,
0.10443975776433945,
-0.0012336447834968567,
0.04702751711010933,
0.11149989068508148,
-0.01879058964550495,
-0.05225749686360359,
0.04192814975976944,
0.09807569533586502,
-0.03681201487779617,
-0.2989037334918976,
0.03214716538786888,
-0.05599791184067726,
0.0033588202204555273,
0.06893705576658249,
0.05601157248020172,
-0.008871383965015411,
0.054755087941884995,
-0.1319996565580368,
0.03317096456885338,
0.058899495750665665,
0.014700793661177158,
0.032608650624752045,
0.09449972957372665,
0.09188617765903473,
-0.10537923872470856,
-0.05439015105366707,
0.10045506805181503,
0.013355870731174946,
0.1407938152551651,
-0.07095446437597275,
0.016575228422880173,
0.10321775078773499,
0.10975047200918198,
0.03151592239737511,
0.15059460699558258,
-0.022972391918301582,
0.02652931772172451,
0.006459815893322229,
-0.07347497344017029,
0.0004974230541847646,
0.028681010007858276,
0.011964415200054646,
0.04241716116666794,
-0.09260066598653793,
-0.011123071424663067,
0.03797880560159683,
0.21722903847694397,
0.10831361263990402,
-0.20903372764587402,
-0.06272382289171219,
0.03449664264917374,
-0.01954302377998829,
-0.054139308631420135,
-0.0029253405518829823,
0.10540297627449036,
-0.15498565137386322,
0.13062356412410736,
-0.0716092512011528,
0.09033320844173431,
-0.08357114344835281,
-0.016939224675297737,
0.029156925156712532,
0.06659819185733795,
-0.02504245936870575,
0.08208323270082474,
-0.17845705151557922,
0.11543194204568863,
0.007389189675450325,
0.09139426797628403,
-0.050845202058553696,
0.0053417496383190155,
0.057369500398635864,
0.03305165097117424,
0.10222966969013214,
-0.007627697195857763,
0.0036742263473570347,
-0.14498423039913177,
-0.0012765637366101146,
0.021563436836004257,
0.015471166931092739,
-0.10744588822126389,
0.14941880106925964,
-0.042871586978435516,
-0.006337764207273722,
0.0049689291045069695,
0.06809849292039871,
-0.17105069756507874,
-0.09610220044851303,
0.034115225076675415,
-0.0010672039352357388,
0.06728022545576096,
-0.05123797059059143,
-0.055934466421604156,
-0.1135036051273346,
0.15572871267795563,
-0.14491988718509674,
-0.06635169684886932,
-0.12098652124404907,
0.06138613075017929,
0.12912879884243011,
-0.07937545329332352,
-0.01888618804514408,
-0.00981045700609684,
0.12129461765289307,
0.012419062666594982,
-0.10987382382154465,
0.04617232456803322,
-0.0271111149340868,
-0.16055336594581604,
-0.010757198557257652,
0.0862874910235405,
0.04300078749656677,
0.04080560430884361,
0.050585824996232986,
0.03809260576963425,
0.0515931136906147,
-0.08902710676193237,
0.06179045885801315,
0.05063776299357414,
0.08732133358716965,
0.10257523506879807,
-0.03869808465242386,
0.028808176517486572,
-0.08955805003643036,
-0.02638182044029236,
0.06512758135795593,
0.18454651534557343,
-0.09030923247337341,
0.026753446087241173,
0.04801280423998833,
-0.08094741404056549,
-0.17321906983852386,
-0.022621935233473778,
0.005932140629738569,
0.07909601926803589,
0.052098311483860016,
-0.1131972074508667,
0.08041200041770935,
0.08497124910354614,
-0.009760338813066483,
0.04420018568634987,
-0.24236588180065155,
-0.12800025939941406,
0.07398982346057892,
0.03527270257472992,
-0.04452070966362953,
-0.1552528440952301,
-0.0595443919301033,
-0.04792363941669464,
-0.16113822162151337,
0.09966094046831131,
-0.23853793740272522,
0.09265314787626266,
-0.02169710211455822,
-0.05555340275168419,
0.02782549522817135,
-0.03254164382815361,
0.14559084177017212,
0.03704315423965454,
0.10650400817394257,
-0.029908033087849617,
0.13842105865478516,
0.04860394075512886,
-0.06872908771038055,
0.1158016249537468,
0.006159549113363028,
0.06470879167318344,
-0.17451032996177673,
-0.03292342275381088,
-0.052091557532548904,
0.030730942264199257,
-0.032239582389593124,
-0.04431142285466194,
-0.052751749753952026,
0.0572318434715271,
0.10745333135128021,
-0.03819030150771141,
0.05011237412691116,
-0.03163585811853409,
0.19525079429149628,
0.1247667670249939,
0.13444754481315613,
0.030452733859419823,
-0.12904657423496246,
-0.012894667685031891,
0.003914995584636927,
0.05429791286587715,
-0.09973862022161484,
0.10106287896633148,
0.15233537554740906,
0.012433977797627449,
0.17428183555603027,
0.0346231684088707,
-0.1322135180234909,
-0.011582823470234871,
0.09459736943244934,
-0.14378295838832855,
-0.1588556170463562,
0.008689901791512966,
-0.08055943995714188,
-0.11555758863687515,
0.0012805425794795156,
0.15421158075332642,
-0.037018876522779465,
-0.011771781370043755,
0.041255511343479156,
0.038262467831373215,
-0.037689756602048874,
0.19010360538959503,
0.07061760127544403,
0.08380077034235,
-0.09309913218021393,
0.11735571920871735,
0.11578018963336945,
-0.11111869663000107,
-0.012193690985441208,
0.01728857308626175,
-0.09585456550121307,
-0.018226953223347664,
0.019519004970788956,
0.18881253898143768,
-0.03977097570896149,
-0.06123020499944687,
-0.11453235149383545,
-0.11591246724128723,
0.055069223046302795,
0.10023930668830872,
0.03270302712917328,
0.10500282049179077,
-0.0003626321267802268,
-0.03652825206518173,
-0.08710423111915588,
0.07694271951913834,
0.15196344256401062,
0.043458037078380585,
-0.1407405287027359,
0.13849440217018127,
-0.020573407411575317,
-0.0025760356802493334,
-0.01876174658536911,
0.033242367208004,
-0.08561629056930542,
-0.003736833343282342,
-0.18896110355854034,
0.034863878041505814,
-0.026453686878085136,
-0.0038379926700145006,
-0.023923564702272415,
-0.03101806342601776,
-0.054383594542741776,
0.041795358061790466,
-0.08512568473815918,
-0.04478934034705162,
-0.012925226241350174,
0.042628213763237,
-0.17092210054397583,
-0.03362124040722847,
0.0603402778506279,
-0.1127355545759201,
0.07236781716346741,
0.003489978378638625,
-0.008068853057920933,
-0.018130263313651085,
-0.09774915874004364,
-0.03954703360795975,
-0.010304348543286324,
0.006390545517206192,
0.039100393652915955,
-0.14559894800186157,
0.07456061989068985,
0.000941780861467123,
-0.03393121436238289,
0.038738396018743515,
-0.01606045290827751,
-0.12187308818101883,
-0.0790872722864151,
-0.009733447805047035,
0.051739051938056946,
-0.06423092633485794,
0.027424415573477745,
0.060230087488889694,
0.10498575866222382,
0.16508427262306213,
-0.07841294258832932,
0.03636123239994049,
-0.25854331254959106,
-0.025417223572731018,
-0.0033441248815506697,
-0.0411950945854187,
0.018056832253932953,
0.012135620228946209,
0.07566048204898834,
-0.03624873608350754,
0.11404351890087128,
0.019343150779604912,
0.06742441654205322,
0.05102495104074478,
-0.08117131143808365,
0.03944941982626915,
0.008575784042477608,
0.09375345706939697,
0.04758157208561897,
-0.02848994731903076,
0.03304126486182213,
-0.018421867862343788,
0.012458342127501965,
0.07978923618793488,
0.13940207660198212,
0.13609212636947632,
-0.0017206503544002771,
0.021805433556437492,
-0.024333074688911438,
-0.07573685795068741,
0.05650962516665459,
0.031830184161663055,
-0.04395115748047829,
0.03778282552957535,
-0.012584991753101349,
0.021406039595603943,
0.1436203420162201,
-0.17951186001300812,
0.08747665584087372,
-0.049867670983076096,
-0.05348190665245056,
-0.15392807126045227,
-0.10550229251384735,
-0.05880706012248993,
-0.10987960547208786,
0.012671267613768578,
-0.11225052922964096,
0.0438743531703949,
0.12157690525054932,
0.024316778406500816,
0.018065670505166054,
0.04562734067440033,
-0.07020201534032822,
-0.06943084299564362,
0.07417966425418854,
0.003148031188175082,
-0.0019237904343754053,
0.06586325168609619,
0.025162745267152786,
0.1178017258644104,
0.0014010283630341291,
0.05504902824759483,
-0.014361097477376461,
0.07447054237127304,
0.05873376503586769,
-0.031791653484106064,
-0.056100666522979736,
-0.0024554443079978228,
0.01660841517150402,
0.026090018451213837,
0.1305352747440338,
0.08046795427799225,
0.0309448204934597,
-0.019971691071987152,
0.2008925825357437,
-0.025860639289021492,
-0.08918096125125885,
-0.18379880487918854,
0.20077621936798096,
0.0808362141251564,
0.029976531863212585,
0.05722963064908981,
-0.16164810955524445,
0.059910550713539124,
0.19286887347698212,
0.1101892814040184,
0.014757688157260418,
-0.006194668356329203,
-0.014300189912319183,
-0.002889749826863408,
0.06667070835828781,
0.036278143525123596,
0.0715307965874672,
0.09362518042325974,
-0.055411167442798615,
0.03879031166434288,
-0.04948335886001587,
-0.01676633022725582,
0.051776424050331116,
0.15711288154125214,
-0.021067045629024506,
0.016916586086153984,
-0.019342605024576187,
0.09116991609334946,
0.00023806294484529644,
-0.23061583936214447,
-0.017853092402219772,
-0.06166793778538704,
-0.11864379793405533,
-0.07222165912389755,
-0.020011162385344505,
0.008631603792309761,
0.0025081478524953127,
-0.023083331063389778,
0.00982372835278511,
0.13497918844223022,
0.015136156231164932,
-0.07153511047363281,
-0.07345874607563019,
0.0388149619102478,
-0.029323339462280273,
0.10181671380996704,
-0.0007514455355703831,
0.04775787517428398,
0.11478477716445923,
-0.019608354195952415,
-0.13460804522037506,
0.03267057240009308,
0.04361112415790558,
-0.03224005922675133,
0.006032648961991072,
0.13965760171413422,
0.011689568869769573,
0.006631003227084875,
0.01264915056526661,
-0.05429833009839058,
0.02932305820286274,
-0.029830701649188995,
-0.05425968021154404,
-0.11244375258684158,
0.03143589571118355,
-0.05551641806960106,
0.11448633670806885,
0.21459098160266876,
-0.06123781576752663,
0.08264227211475372,
-0.08803632855415344,
0.025157643482089043,
0.00905520934611559,
-0.0018319167429581285,
0.05162343010306358,
-0.14300870895385742,
0.049682583659887314,
0.01377181801944971,
0.016534820199012756,
-0.26991161704063416,
-0.05447360500693321,
0.06701420247554779,
-0.08360456675291061,
-0.06344336271286011,
0.16716840863227844,
0.026188325136899948,
0.09171880781650543,
-0.04927476495504379,
-0.20905500650405884,
-0.019415000453591347,
0.1128198429942131,
-0.09368766844272614,
-0.07421284914016724
] |
null | null |
transformers
|
# FSMT
## Model description
This is a ported version of [fairseq wmt19 transformer](https://github.com/pytorch/fairseq/blob/master/examples/wmt19/README.md) for en-de.
For more details, please see [Facebook FAIR's WMT19 News Translation Task Submission](https://arxiv.org/abs/1907.06616).
The abbreviation FSMT stands for FairSeqMachineTranslation.
All four models are available:
* [wmt19-en-ru](https://huggingface.co/facebook/wmt19-en-ru)
* [wmt19-ru-en](https://huggingface.co/facebook/wmt19-ru-en)
* [wmt19-en-de](https://huggingface.co/facebook/wmt19-en-de)
* [wmt19-de-en](https://huggingface.co/facebook/wmt19-de-en)
## Intended uses & limitations
#### How to use
```python
from transformers import FSMTForConditionalGeneration, FSMTTokenizer
mname = "facebook/wmt19-en-de"
tokenizer = FSMTTokenizer.from_pretrained(mname)
model = FSMTForConditionalGeneration.from_pretrained(mname)
input = "Machine learning is great, isn't it?"
input_ids = tokenizer.encode(input, return_tensors="pt")
outputs = model.generate(input_ids)
decoded = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(decoded) # Maschinelles Lernen ist großartig, oder?
```
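For translating several sentences at once, the tokenizer can pad a batch of inputs in a single call. The sketch below reloads the same checkpoint as the snippet above; the example sentences are made up and only illustrative.
```python
from transformers import FSMTForConditionalGeneration, FSMTTokenizer

mname = "facebook/wmt19-en-de"
tokenizer = FSMTTokenizer.from_pretrained(mname)
model = FSMTForConditionalGeneration.from_pretrained(mname)

# Illustrative inputs; padding=True aligns them to a common length for batching.
batch = ["Machine learning is great, isn't it?", "The weather forecast was wrong again."]
inputs = tokenizer(batch, return_tensors="pt", padding=True)
outputs = model.generate(**inputs)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```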
#### Limitations and bias
- The original model (and this ported version) doesn't seem to handle inputs with repeated sub-phrases well; [content gets truncated](https://discuss.huggingface.co/t/issues-with-translating-inputs-containing-repeated-phrases/981)
## Training data
Pretrained weights were left identical to the original model released by fairseq. For more details, please see the [paper](https://arxiv.org/abs/1907.06616).
## Eval results
pair | fairseq | transformers
-------|---------|----------
en-de | [43.1](http://matrix.statmt.org/matrix/output/1909?run_id=6862) | 42.83
The score is slightly below the score reported by `fairseq`, since `transformers` currently doesn't support:
- model ensemble, therefore the best-performing checkpoint was ported (`model4.pt`).
- re-ranking
The score was calculated using this code:
```bash
git clone https://github.com/huggingface/transformers
cd transformers
export PAIR=en-de
export DATA_DIR=data/$PAIR
export SAVE_DIR=data/$PAIR
export BS=8
export NUM_BEAMS=15
mkdir -p $DATA_DIR
sacrebleu -t wmt19 -l $PAIR --echo src > $DATA_DIR/val.source
sacrebleu -t wmt19 -l $PAIR --echo ref > $DATA_DIR/val.target
echo $PAIR
PYTHONPATH="src:examples/seq2seq" python examples/seq2seq/run_eval.py facebook/wmt19-$PAIR $DATA_DIR/val.source $SAVE_DIR/test_translations.txt --reference_path $DATA_DIR/val.target --score_path $SAVE_DIR/test_bleu.json --bs $BS --task translation --num_beams $NUM_BEAMS
```
Note: fairseq reports using a beam of 50, so you should get a slightly higher score if you re-run with `--num_beams 50`.
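Outside the eval script, a larger beam can also be passed straight to `generate()`. The following is a minimal sketch for spot-checking a single sentence rather than re-scoring the full test set; the sentence is only illustrative, and a beam of 50 is noticeably slower than the default.
```python
from transformers import FSMTForConditionalGeneration, FSMTTokenizer

mname = "facebook/wmt19-en-de"
tokenizer = FSMTTokenizer.from_pretrained(mname)
model = FSMTForConditionalGeneration.from_pretrained(mname)

# Mirror the fairseq setting of beam=50 on a single illustrative sentence.
input_ids = tokenizer.encode("Machine learning is great, isn't it?", return_tensors="pt")
outputs = model.generate(input_ids, num_beams=50, early_stopping=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```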
## Data Sources
- [training, etc.](http://www.statmt.org/wmt19/)
- [test set](http://matrix.statmt.org/test_sets/newstest2019.tgz?1556572561)
### BibTeX entry and citation info
```bibtex
@inproceedings{...,
year={2020},
title={Facebook FAIR's WMT19 News Translation Task Submission},
author={Ng, Nathan and Yee, Kyra and Baevski, Alexei and Ott, Myle and Auli, Michael and Edunov, Sergey},
booktitle={Proc. of WMT},
}
```
## TODO
- port model ensemble (fairseq uses 4 model checkpoints)
|
{"language": ["en", "de"], "license": "apache-2.0", "tags": ["translation", "wmt19", "facebook"], "datasets": ["wmt19"], "metrics": ["bleu"], "thumbnail": "https://huggingface.co/front/thumbnails/facebook.png"}
|
translation
|
facebook/wmt19-en-de
|
[
"transformers",
"pytorch",
"fsmt",
"text2text-generation",
"translation",
"wmt19",
"facebook",
"en",
"de",
"dataset:wmt19",
"arxiv:1907.06616",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1907.06616"
] |
[
"en",
"de"
] |
TAGS
#transformers #pytorch #fsmt #text2text-generation #translation #wmt19 #facebook #en #de #dataset-wmt19 #arxiv-1907.06616 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us
|
FSMT
====
Model description
-----------------
This is a ported version of fairseq wmt19 transformer for en-de.
For more details, please see, Facebook FAIR's WMT19 News Translation Task Submission.
The abbreviation FSMT stands for FairSeqMachineTranslation
All four models are available:
* wmt19-en-ru
* wmt19-ru-en
* wmt19-en-de
* wmt19-de-en
Intended uses & limitations
---------------------------
#### How to use
#### Limitations and bias
* The original (and this ported model) doesn't seem to handle well inputs with repeated sub-phrases, content gets truncated
Training data
-------------
Pretrained weights were left identical to the original model released by fairseq. For more details, please, see the paper.
Eval results
------------
pair: en-de, fairseq: 43.1, transformers: 42.83
The score is slightly below the score reported by 'fairseq', since 'transformers'' currently doesn't support:
* model ensemble, therefore the best performing checkpoint was ported (''URL'').
* re-ranking
The score was calculated using this code:
note: fairseq reports using a beam of 50, so you should get a slightly higher score if re-run with '--num\_beams 50'.
Data Sources
------------
* training, etc.
* test set
### BibTeX entry and citation info
TODO
----
* port model ensemble (fairseq uses 4 model checkpoints)
|
[
"#### How to use",
"#### Limitations and bias\n\n\n* The original (and this ported model) doesn't seem to handle well inputs with repeated sub-phrases, content gets truncated\n\n\nTraining data\n-------------\n\n\nPretrained weights were left identical to the original model released by fairseq. For more details, please, see the paper.\n\n\nEval results\n------------\n\n\npair: en-de, fairseq: 43.1, transformers: 42.83\n\n\nThe score is slightly below the score reported by 'fairseq', since 'transformers'' currently doesn't support:\n\n\n* model ensemble, therefore the best performing checkpoint was ported (''URL'').\n* re-ranking\n\n\nThe score was calculated using this code:\n\n\nnote: fairseq reports using a beam of 50, so you should get a slightly higher score if re-run with '--num\\_beams 50'.\n\n\nData Sources\n------------\n\n\n* training, etc.\n* test set",
"### BibTeX entry and citation info\n\n\nTODO\n----\n\n\n* port model ensemble (fairseq uses 4 model checkpoints)"
] |
[
"TAGS\n#transformers #pytorch #fsmt #text2text-generation #translation #wmt19 #facebook #en #de #dataset-wmt19 #arxiv-1907.06616 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"#### How to use",
"#### Limitations and bias\n\n\n* The original (and this ported model) doesn't seem to handle well inputs with repeated sub-phrases, content gets truncated\n\n\nTraining data\n-------------\n\n\nPretrained weights were left identical to the original model released by fairseq. For more details, please, see the paper.\n\n\nEval results\n------------\n\n\npair: en-de, fairseq: 43.1, transformers: 42.83\n\n\nThe score is slightly below the score reported by 'fairseq', since 'transformers'' currently doesn't support:\n\n\n* model ensemble, therefore the best performing checkpoint was ported (''URL'').\n* re-ranking\n\n\nThe score was calculated using this code:\n\n\nnote: fairseq reports using a beam of 50, so you should get a slightly higher score if re-run with '--num\\_beams 50'.\n\n\nData Sources\n------------\n\n\n* training, etc.\n* test set",
"### BibTeX entry and citation info\n\n\nTODO\n----\n\n\n* port model ensemble (fairseq uses 4 model checkpoints)"
] |
[
81,
5,
205,
30
] |
[
"passage: TAGS\n#transformers #pytorch #fsmt #text2text-generation #translation #wmt19 #facebook #en #de #dataset-wmt19 #arxiv-1907.06616 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n#### How to use#### Limitations and bias\n\n\n* The original (and this ported model) doesn't seem to handle well inputs with repeated sub-phrases, content gets truncated\n\n\nTraining data\n-------------\n\n\nPretrained weights were left identical to the original model released by fairseq. For more details, please, see the paper.\n\n\nEval results\n------------\n\n\npair: en-de, fairseq: 43.1, transformers: 42.83\n\n\nThe score is slightly below the score reported by 'fairseq', since 'transformers'' currently doesn't support:\n\n\n* model ensemble, therefore the best performing checkpoint was ported (''URL'').\n* re-ranking\n\n\nThe score was calculated using this code:\n\n\nnote: fairseq reports using a beam of 50, so you should get a slightly higher score if re-run with '--num\\_beams 50'.\n\n\nData Sources\n------------\n\n\n* training, etc.\n* test set### BibTeX entry and citation info\n\n\nTODO\n----\n\n\n* port model ensemble (fairseq uses 4 model checkpoints)"
] |
[
-0.10516039282083511,
0.0652281641960144,
-0.0045941779389977455,
0.06413496285676956,
0.08192984014749527,
0.0069724638015031815,
0.0946304053068161,
0.13249416649341583,
-0.008602321147918701,
0.058056656271219254,
-0.025802144780755043,
0.09506805986166,
0.12326735258102417,
0.15398485958576202,
-0.00935448333621025,
-0.12320607900619507,
0.030172400176525116,
0.03757316246628761,
-0.03647517040371895,
0.11605235934257507,
0.09997153282165527,
-0.08767979592084885,
0.024841785430908203,
0.008398984558880329,
-0.02202090248465538,
0.10085320472717285,
0.03158373758196831,
-0.042431026697158813,
0.0669657364487648,
0.0455281101167202,
0.08212672919034958,
0.03638264536857605,
0.015827598050236702,
-0.24470342695713043,
0.026417454704642296,
0.10836448520421982,
0.007000049576163292,
0.08406084030866623,
0.10160651057958603,
-0.04914337396621704,
0.10717915743589401,
-0.042621269822120667,
0.0012036740081384778,
0.0483839325606823,
-0.10938220471143723,
-0.24128426611423492,
-0.09809029847383499,
0.10713057219982147,
0.089288130402565,
0.04631141200661659,
-0.037179868668317795,
0.15757906436920166,
-0.040223557502031326,
0.05290009081363678,
0.17128711938858032,
-0.30581533908843994,
-0.01405942253768444,
0.0723179504275322,
0.04225267842411995,
0.012069808319211006,
-0.12659145891666412,
-0.038866713643074036,
0.030393585562705994,
0.028317198157310486,
0.04526340961456299,
-0.04567605257034302,
-0.0015803196001797915,
0.041972048580646515,
-0.15605562925338745,
-0.058492690324783325,
0.09852701425552368,
0.07470645010471344,
-0.10972422361373901,
-0.08577203005552292,
-0.055694885551929474,
-0.04795006290078163,
0.025979001075029373,
-0.10548226535320282,
0.025575101375579834,
0.028625020757317543,
-0.016438636928796768,
0.003089767647907138,
-0.09586161375045776,
-0.07006315886974335,
-0.059633418917655945,
0.007077611517161131,
0.010340340435504913,
0.01782320626080036,
-0.03614171966910362,
0.15229229629039764,
-0.031048297882080078,
-0.06409811973571777,
-0.06532365828752518,
-0.05982007086277008,
-0.10978057980537415,
-0.05104377493262291,
0.05852112919092178,
-0.04942796751856804,
0.028857572004199028,
0.19058628380298615,
-0.03382248431444168,
0.03277130052447319,
-0.026568220928311348,
0.01604039967060089,
0.02186746336519718,
0.0996093899011612,
-0.06572677195072174,
-0.06696508079767227,
0.03110911138355732,
0.043280377984046936,
-0.03212878480553627,
0.0379377156496048,
0.039695415645837784,
-0.04377402737736702,
0.08772150427103043,
0.06330502033233643,
0.019974637776613235,
-0.0070617846213281155,
-0.0781637504696846,
-0.039266426116228104,
0.18108806014060974,
-0.14082962274551392,
0.022494979202747345,
0.03404076397418976,
-0.07524246722459793,
0.029745232313871384,
0.06879289448261261,
-0.00873563438653946,
-0.09287512302398682,
0.030967434868216515,
-0.03511117026209831,
-0.022792477160692215,
-0.07167629897594452,
-0.11561538279056549,
0.014236736111342907,
-0.035168297588825226,
-0.04690663516521454,
-0.08329354226589203,
-0.17282864451408386,
-0.09472450613975525,
-0.0018120934255421162,
-0.019360825419425964,
0.024880854412913322,
0.027007073163986206,
-0.0033726354595273733,
-0.030309483408927917,
-0.029582291841506958,
0.04598284512758255,
-0.06339610368013382,
0.06178751215338707,
-0.013208502903580666,
0.02376706898212433,
0.08741562068462372,
0.017654743045568466,
-0.10475552082061768,
0.03102591633796692,
-0.1953497976064682,
0.08383185416460037,
-0.06578349322080612,
-0.051361918449401855,
-0.14222046732902527,
-0.06787017732858658,
-0.046753302216529846,
-0.007619245443493128,
0.07346892356872559,
0.13975055515766144,
-0.16528546810150146,
-0.02757827192544937,
0.15294887125492096,
-0.09720437973737717,
-0.09226532280445099,
0.08403726667165756,
-0.035280026495456696,
0.05036440119147301,
0.11190929263830185,
0.05921967700123787,
0.14607378840446472,
-0.11618934571743011,
-0.07665728032588959,
-0.015744628384709358,
-0.020576782524585724,
0.05689292401075363,
0.08904798328876495,
-0.039579879492521286,
0.003191671334207058,
-0.01790548861026764,
-0.0029184608720242977,
-0.021049296483397484,
-0.030132131651043892,
-0.01658003404736519,
-0.026553772389888763,
-0.025032470002770424,
-0.05101305618882179,
0.0600309856235981,
-0.005148855969309807,
-0.02998393587768078,
-0.07297658920288086,
0.03151527792215347,
0.08732752501964569,
-0.049102265387773514,
0.020207708701491356,
-0.08069059252738953,
0.09065019339323044,
-0.10103168338537216,
-0.015564030967652798,
-0.20283903181552887,
0.03748096525669098,
0.029714157804846764,
-0.01240081712603569,
-0.0124453604221344,
0.11610909551382065,
0.06662683188915253,
0.011653482913970947,
-0.0553751066327095,
-0.009016976691782475,
0.0009712449973449111,
-0.026861170306801796,
-0.07968156784772873,
-0.16889357566833496,
-0.08358090370893478,
-0.00830908678472042,
0.09113378822803497,
-0.24928303062915802,
-0.0034926070366054773,
0.07271168380975723,
0.17743219435214996,
0.043747711926698685,
-0.05060642585158348,
0.036295972764492035,
-0.027575356885790825,
-0.031928081065416336,
-0.054005350917577744,
0.014120304957032204,
0.02873160131275654,
-0.017228810116648674,
0.006687878165394068,
-0.10283328592777252,
-0.02407846972346306,
0.07287140190601349,
0.06866925209760666,
-0.10153066366910934,
0.00633761752396822,
-0.10306531190872192,
-0.0036975424736738205,
-0.14936070144176483,
-0.0009387654135935009,
0.04985848069190979,
0.03560224175453186,
0.11122174561023712,
-0.09419022500514984,
-0.10021783411502838,
-0.004708987660706043,
0.03061278723180294,
0.01971011608839035,
0.15089696645736694,
0.0026157477404922247,
-0.1947130560874939,
0.045461494475603104,
-0.008538200519979,
0.04370442032814026,
0.1383446753025055,
-0.02160109207034111,
-0.10925914347171783,
-0.025917140766978264,
0.04726213961839676,
0.02618025615811348,
0.11032329499721527,
0.042879316955804825,
0.06551673263311386,
0.06693956255912781,
0.029451588168740273,
0.03248753398656845,
-0.14372193813323975,
0.01906287856400013,
0.025047363713383675,
-0.10585673898458481,
-0.051658548414707184,
0.0266034547239542,
0.037540990859270096,
0.11078601330518723,
-0.02782188169658184,
0.03425445780158043,
-0.012927720323204994,
-0.03960231691598892,
-0.15881361067295074,
0.1739177405834198,
-0.04010994732379913,
-0.12120270729064941,
-0.17485302686691284,
0.1223616749048233,
-0.011676397174596786,
0.00626174733042717,
0.025970090180635452,
-0.015785232186317444,
-0.0708293542265892,
-0.08786240965127945,
0.049427445977926254,
0.034491345286369324,
-0.030691664665937424,
-0.09245941042900085,
0.014607632532715797,
0.11631150543689728,
-0.13559947907924652,
0.017849842086434364,
0.01288920734077692,
-0.018959568813443184,
-0.01630755327641964,
-0.00491733755916357,
0.06101624295115471,
0.06521563231945038,
-0.01103487703949213,
0.01464752946048975,
-0.042175617069005966,
0.2554662823677063,
-0.09791602194309235,
0.0415346622467041,
0.11440542340278625,
0.01339997909963131,
0.029334457591176033,
0.08434397727251053,
-0.014093254692852497,
-0.03993469104170799,
0.023674942553043365,
0.07980087399482727,
-0.023805776610970497,
-0.3061072528362274,
0.05636786296963692,
-0.06459541618824005,
-0.015210681594908237,
0.06259400397539139,
0.06319285929203033,
-0.03797321394085884,
0.05306246131658554,
-0.09961158782243729,
0.03920267894864082,
0.04360093176364899,
0.025251641869544983,
0.02648422122001648,
0.09931343048810959,
0.0747724249958992,
-0.11935422569513321,
-0.053758010268211365,
0.11771631985902786,
0.0066415718756616116,
0.12969999015331268,
-0.06182409077882767,
0.007844598963856697,
0.11177326738834381,
0.09973854571580887,
0.013269396498799324,
0.1458212286233902,
-0.03384656831622124,
0.029742849990725517,
0.013681276701390743,
-0.06725917756557465,
0.005078090820461512,
0.012919943779706955,
0.018612928688526154,
0.04712502285838127,
-0.07229921221733093,
-0.04806053638458252,
0.036701444536447525,
0.20827975869178772,
0.11129958182573318,
-0.16824905574321747,
-0.08337491005659103,
0.03224276378750801,
-0.04299621656537056,
-0.07674150168895721,
-0.028774747624993324,
0.08674892038106918,
-0.16631661355495453,
0.12339507043361664,
-0.04902990907430649,
0.10393862426280975,
-0.07558037340641022,
-0.02364322543144226,
0.009300686419010162,
0.04707825556397438,
-0.02109977975487709,
0.0849890261888504,
-0.1875973343849182,
0.11324485391378403,
0.020166750997304916,
0.11975651979446411,
-0.051883962005376816,
0.0043119993060827255,
0.035442106425762177,
0.00613303342834115,
0.09611426293849945,
-0.003750243689864874,
0.04348355904221535,
-0.1567012220621109,
-0.03025748021900654,
0.02036789245903492,
0.011090816929936409,
-0.07981111854314804,
0.13720817863941193,
-0.01762678660452366,
0.001592225511558354,
-0.015047164633870125,
0.043577760457992554,
-0.17684097588062286,
-0.0838654637336731,
0.044940035790205,
-0.024386024102568626,
0.052291139960289,
-0.020386075600981712,
-0.05127380043268204,
-0.1220182478427887,
0.17411215603351593,
-0.18464572727680206,
-0.06490161269903183,
-0.1250794529914856,
0.08409661054611206,
0.1407518833875656,
-0.07860340923070908,
-0.02358887530863285,
-0.011590746231377125,
0.1165100485086441,
0.026105374097824097,
-0.10322114080190659,
0.03307120129466057,
-0.031093956902623177,
-0.15793642401695251,
0.0046193720772862434,
0.08085597306489944,
0.0509490892291069,
0.04239838942885399,
0.052083570510149,
0.028277631849050522,
0.016684094443917274,
-0.09248258918523788,
0.06598420441150665,
0.03564585745334625,
0.08360284566879272,
0.08216232806444168,
-0.018591100350022316,
0.02215045504271984,
-0.08477602899074554,
-0.04385880380868912,
0.061754390597343445,
0.18162432312965393,
-0.07137995958328247,
0.024429801851511,
0.03336149826645851,
-0.10153813660144806,
-0.13917535543441772,
0.008523032069206238,
0.015063783153891563,
0.07332935184240341,
0.027973424643278122,
-0.11355379223823547,
0.0853329598903656,
0.06488244980573654,
-0.01623864471912384,
0.06613472104072571,
-0.1970815360546112,
-0.12436020374298096,
0.06411975622177124,
0.03936218470335007,
-0.0156980250030756,
-0.16041965782642365,
-0.07238051295280457,
-0.030796095728874207,
-0.1825561821460724,
0.11899665743112564,
-0.1885983645915985,
0.08580758422613144,
-0.020305760204792023,
-0.06364059448242188,
0.014509446918964386,
-0.03370054438710213,
0.13295656442642212,
0.016793303191661835,
0.09183218330144882,
-0.011341444216668606,
0.13494127988815308,
0.016200847923755646,
-0.05051669478416443,
0.12146730720996857,
0.040778834372758865,
0.05623443052172661,
-0.21481676399707794,
-0.03735191747546196,
-0.059171099215745926,
0.03901316225528717,
-0.03815031051635742,
-0.02422293648123741,
-0.05800517275929451,
0.0555596686899662,
0.08964243531227112,
-0.023126384243369102,
0.07683448493480682,
-0.04028879106044769,
0.17544874548912048,
0.09038740396499634,
0.13698714971542358,
0.038189563900232315,
-0.12159446626901627,
-0.011219954118132591,
-0.013469092547893524,
0.04226745292544365,
-0.09388669580221176,
0.0857602134346962,
0.15284523367881775,
0.002557633677497506,
0.1771954447031021,
0.01827104389667511,
-0.13760218024253845,
-0.00940043106675148,
0.08971413224935532,
-0.14444959163665771,
-0.14023226499557495,
0.01805308274924755,
-0.06830083578824997,
-0.10548709332942963,
0.009679307229816914,
0.15376423299312592,
-0.019822528585791588,
-0.02998383343219757,
0.03355643153190613,
0.035971906036138535,
-0.03274312615394592,
0.19395451247692108,
0.07614811509847641,
0.08002597838640213,
-0.09078957885503769,
0.09006612002849579,
0.09417752921581268,
-0.08546263724565506,
-0.0012149304384365678,
0.021170346066355705,
-0.08210603147745132,
-0.01708124950528145,
0.04555688798427582,
0.17838120460510254,
-0.04437089338898659,
-0.07097133249044418,
-0.13358080387115479,
-0.10758951306343079,
0.0686853751540184,
0.1189361959695816,
0.03406848385930061,
0.07573170959949493,
0.020089467987418175,
-0.03231954574584961,
-0.07927720993757248,
0.060068901628255844,
0.1462107002735138,
0.026910295709967613,
-0.11219493299722672,
0.13980647921562195,
-0.022482018917798996,
0.00957133062183857,
-0.008394778706133366,
0.03463944420218468,
-0.06798295676708221,
-0.004118942189961672,
-0.16639955341815948,
0.039176519960165024,
-0.041147563606500626,
-0.010557434521615505,
-0.028243383392691612,
-0.038843050599098206,
-0.058933988213539124,
0.0347176194190979,
-0.09249301254749298,
-0.029999295249581337,
-0.00892410334199667,
0.034069206565618515,
-0.17940349876880646,
-0.019747819751501083,
0.04630706459283829,
-0.10634347051382065,
0.06778376549482346,
0.014825044199824333,
-0.001422148197889328,
-0.020277511328458786,
-0.10413805395364761,
-0.0077799041755497456,
-0.003964412957429886,
0.0213830154389143,
0.03379787132143974,
-0.12007851898670197,
0.0866902619600296,
0.02861926332116127,
-0.007903792895376682,
0.04586254060268402,
-0.022733887657523155,
-0.10600187629461288,
-0.08533350378274918,
-0.034566059708595276,
0.06289457529783249,
-0.05159949138760567,
0.05277245119214058,
0.05814452841877937,
0.09721646457910538,
0.16223721206188202,
-0.07184465229511261,
0.033114198595285416,
-0.24995702505111694,
-0.023395752534270287,
-0.00605610478669405,
-0.042350172996520996,
0.0014001373201608658,
-0.0007074789609760046,
0.08681821078062057,
-0.036075230687856674,
0.1263466477394104,
0.02535555325448513,
0.08105196058750153,
0.048715561628341675,
-0.09311621636152267,
0.02857702225446701,
-0.002842303831130266,
0.07686223089694977,
0.037647999823093414,
-0.020236294716596603,
0.04364525154232979,
-0.016240127384662628,
0.014829285442829132,
0.09121287614107132,
0.15177586674690247,
0.1567230224609375,
-0.007997545413672924,
0.03970639035105705,
-0.04227833449840546,
-0.06545314192771912,
0.0826084315776825,
0.003316687885671854,
-0.06588700413703918,
0.05547504127025604,
-0.010171467438340187,
0.04526659473776817,
0.10150235891342163,
-0.1861346960067749,
0.06723177433013916,
-0.04666899889707565,
-0.05515904352068901,
-0.15761232376098633,
-0.08716680109500885,
-0.05230388417840004,
-0.08557175099849701,
0.012718170881271362,
-0.11460024118423462,
0.03742005676031113,
0.08351787179708481,
0.03764978051185608,
0.00714060990139842,
0.025645967572927475,
-0.029820548370480537,
-0.07102145999670029,
0.09703586995601654,
-0.001890911953523755,
0.0007846778607927263,
0.016287455335259438,
0.04513378441333771,
0.11184023320674896,
-0.010624046437442303,
0.07049679756164551,
-0.0025320968125015497,
0.06066809594631195,
0.03578471019864082,
-0.02441267855465412,
-0.06526735424995422,
-0.005552307236939669,
0.01976386271417141,
0.03914069011807442,
0.15093189477920532,
0.08087234944105148,
0.03987861052155495,
-0.020100891590118408,
0.18082357943058014,
-0.015579520724713802,
-0.09336446225643158,
-0.16466449201107025,
0.18302381038665771,
0.05966983735561371,
0.010205560363829136,
0.04842043295502663,
-0.16470865905284882,
0.05309073626995087,
0.23357360064983368,
0.07312346994876862,
0.007427297066897154,
-0.014584213495254517,
0.0131358802318573,
-0.004427922889590263,
0.07905199378728867,
0.024795640259981155,
0.05658441036939621,
0.1249692440032959,
-0.05963745713233948,
0.023462843149900436,
-0.0710558146238327,
-0.011176078580319881,
0.06458189338445663,
0.15257489681243896,
-0.015435406938195229,
0.00745205394923687,
-0.025270745158195496,
0.08852327615022659,
-0.03382391855120659,
-0.25208359956741333,
-0.0010272054933011532,
-0.06532638520002365,
-0.11633098125457764,
-0.05092582106590271,
-0.026230599731206894,
0.01098901592195034,
-0.011087716557085514,
-0.020724812522530556,
0.02307893894612789,
0.11775720864534378,
0.016874607652425766,
-0.06990237534046173,
-0.07599201053380966,
0.050839535892009735,
-0.06089484691619873,
0.09152290225028992,
0.0005011954926885664,
0.09688106924295425,
0.1144329234957695,
-0.027120569720864296,
-0.1333247423171997,
0.02149677462875843,
0.04487702623009682,
-0.03054458647966385,
0.016151556745171547,
0.11681804060935974,
0.032751258462667465,
-0.017032908275723457,
0.025129033252596855,
-0.0037197230849415064,
0.01412134524434805,
-0.0401555597782135,
-0.05135420709848404,
-0.11416593939065933,
0.02556857280433178,
-0.07583924382925034,
0.10327556729316711,
0.18139435350894928,
-0.06930194795131683,
0.08159726113080978,
-0.08379114419221878,
0.05432690680027008,
-0.0010330952936783433,
0.004000550601631403,
0.03752734884619713,
-0.16920974850654602,
0.0503879114985466,
0.004406018648296595,
0.026272302493453026,
-0.2685440182685852,
-0.050460461527109146,
0.06801062077283859,
-0.0907241553068161,
-0.07343287765979767,
0.1589672714471817,
0.0019370749359950423,
0.08737332373857498,
-0.04510083049535751,
-0.12768957018852234,
-0.02806350402534008,
0.10077153891324997,
-0.06850620359182358,
-0.055470243096351624
] |
null | null |
transformers
|
# FSMT
## Model description
This is a ported version of [fairseq wmt19 transformer](https://github.com/pytorch/fairseq/blob/master/examples/wmt19/README.md) for en-ru.
For more details, please see [Facebook FAIR's WMT19 News Translation Task Submission](https://arxiv.org/abs/1907.06616).
The abbreviation FSMT stands for FairSeqMachineTranslation.
All four models are available:
* [wmt19-en-ru](https://huggingface.co/facebook/wmt19-en-ru)
* [wmt19-ru-en](https://huggingface.co/facebook/wmt19-ru-en)
* [wmt19-en-de](https://huggingface.co/facebook/wmt19-en-de)
* [wmt19-de-en](https://huggingface.co/facebook/wmt19-de-en)
## Intended uses & limitations
#### How to use
```python
from transformers import FSMTForConditionalGeneration, FSMTTokenizer
mname = "facebook/wmt19-en-ru"
tokenizer = FSMTTokenizer.from_pretrained(mname)
model = FSMTForConditionalGeneration.from_pretrained(mname)
input = "Machine learning is great, isn't it?"
input_ids = tokenizer.encode(input, return_tensors="pt")
outputs = model.generate(input_ids)
decoded = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(decoded) # Машинное обучение - это здорово, не так ли?
```
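If you prefer not to handle tokenization and decoding yourself, the checkpoint should also work through the high-level `pipeline` API; this is a sketch under that assumption, and the input sentence is only an example.
```python
from transformers import pipeline

# The translation pipeline wraps tokenization, generation, and decoding.
translator = pipeline("translation", model="facebook/wmt19-en-ru")
print(translator("Machine learning is great, isn't it?"))
# Expected shape of the result: [{"translation_text": "..."}]
```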
#### Limitations and bias
- The original model (and this ported version) doesn't seem to handle inputs with repeated sub-phrases well; [content gets truncated](https://discuss.huggingface.co/t/issues-with-translating-inputs-containing-repeated-phrases/981)
## Training data
Pretrained weights were left identical to the original model released by fairseq. For more details, please see the [paper](https://arxiv.org/abs/1907.06616).
## Eval results
pair | fairseq | transformers
-------|---------|----------
en-ru | [36.4](http://matrix.statmt.org/matrix/output/1914?run_id=6724) | 33.47
The score is slightly below the score reported by `fairseq`, since `transformers` currently doesn't support:
- model ensemble, therefore the best-performing checkpoint was ported (`model4.pt`).
- re-ranking
The score was calculated using this code:
```bash
git clone https://github.com/huggingface/transformers
cd transformers
export PAIR=en-ru
export DATA_DIR=data/$PAIR
export SAVE_DIR=data/$PAIR
export BS=8
export NUM_BEAMS=15
mkdir -p $DATA_DIR
sacrebleu -t wmt19 -l $PAIR --echo src > $DATA_DIR/val.source
sacrebleu -t wmt19 -l $PAIR --echo ref > $DATA_DIR/val.target
echo $PAIR
PYTHONPATH="src:examples/seq2seq" python examples/seq2seq/run_eval.py facebook/wmt19-$PAIR $DATA_DIR/val.source $SAVE_DIR/test_translations.txt --reference_path $DATA_DIR/val.target --score_path $SAVE_DIR/test_bleu.json --bs $BS --task translation --num_beams $NUM_BEAMS
```
Note: fairseq reports using a beam of 50, so you should get a slightly higher score if you re-run with `--num_beams 50`.
## Data Sources
- [training, etc.](http://www.statmt.org/wmt19/)
- [test set](http://matrix.statmt.org/test_sets/newstest2019.tgz?1556572561)
### BibTeX entry and citation info
```bibtex
@inproceedings{...,
year={2020},
title={Facebook FAIR's WMT19 News Translation Task Submission},
author={Ng, Nathan and Yee, Kyra and Baevski, Alexei and Ott, Myle and Auli, Michael and Edunov, Sergey},
booktitle={Proc. of WMT},
}
```
## TODO
- port model ensemble (fairseq uses 4 model checkpoints)
|
{"language": ["en", "ru"], "license": "apache-2.0", "tags": ["translation", "wmt19", "facebook"], "datasets": ["wmt19"], "metrics": ["bleu"], "thumbnail": "https://huggingface.co/front/thumbnails/facebook.png"}
|
translation
|
facebook/wmt19-en-ru
|
[
"transformers",
"pytorch",
"fsmt",
"text2text-generation",
"translation",
"wmt19",
"facebook",
"en",
"ru",
"dataset:wmt19",
"arxiv:1907.06616",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1907.06616"
] |
[
"en",
"ru"
] |
TAGS
#transformers #pytorch #fsmt #text2text-generation #translation #wmt19 #facebook #en #ru #dataset-wmt19 #arxiv-1907.06616 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us
|
FSMT
====
Model description
-----------------
This is a ported version of fairseq wmt19 transformer for en-ru.
For more details, please see, Facebook FAIR's WMT19 News Translation Task Submission.
The abbreviation FSMT stands for FairSeqMachineTranslation
All four models are available:
* wmt19-en-ru
* wmt19-ru-en
* wmt19-en-de
* wmt19-de-en
Intended uses & limitations
---------------------------
#### How to use
#### Limitations and bias
* The original (and this ported model) doesn't seem to handle well inputs with repeated sub-phrases, content gets truncated
Training data
-------------
Pretrained weights were left identical to the original model released by fairseq. For more details, please, see the paper.
Eval results
------------
pair: en-ru, fairseq: 36.4, transformers: 33.47
The score is slightly below the score reported by 'fairseq', since 'transformers'' currently doesn't support:
* model ensemble, therefore the best performing checkpoint was ported (''URL'').
* re-ranking
The score was calculated using this code:
note: fairseq reports using a beam of 50, so you should get a slightly higher score if re-run with '--num\_beams 50'.
Data Sources
------------
* training, etc.
* test set
### BibTeX entry and citation info
TODO
----
* port model ensemble (fairseq uses 4 model checkpoints)
|
[
"#### How to use",
"#### Limitations and bias\n\n\n* The original (and this ported model) doesn't seem to handle well inputs with repeated sub-phrases, content gets truncated\n\n\nTraining data\n-------------\n\n\nPretrained weights were left identical to the original model released by fairseq. For more details, please, see the paper.\n\n\nEval results\n------------\n\n\npair: en-ru, fairseq: 36.4, transformers: 33.47\n\n\nThe score is slightly below the score reported by 'fairseq', since 'transformers'' currently doesn't support:\n\n\n* model ensemble, therefore the best performing checkpoint was ported (''URL'').\n* re-ranking\n\n\nThe score was calculated using this code:\n\n\nnote: fairseq reports using a beam of 50, so you should get a slightly higher score if re-run with '--num\\_beams 50'.\n\n\nData Sources\n------------\n\n\n* training, etc.\n* test set",
"### BibTeX entry and citation info\n\n\nTODO\n----\n\n\n* port model ensemble (fairseq uses 4 model checkpoints)"
] |
[
"TAGS\n#transformers #pytorch #fsmt #text2text-generation #translation #wmt19 #facebook #en #ru #dataset-wmt19 #arxiv-1907.06616 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"#### How to use",
"#### Limitations and bias\n\n\n* The original (and this ported model) doesn't seem to handle well inputs with repeated sub-phrases, content gets truncated\n\n\nTraining data\n-------------\n\n\nPretrained weights were left identical to the original model released by fairseq. For more details, please, see the paper.\n\n\nEval results\n------------\n\n\npair: en-ru, fairseq: 36.4, transformers: 33.47\n\n\nThe score is slightly below the score reported by 'fairseq', since 'transformers'' currently doesn't support:\n\n\n* model ensemble, therefore the best performing checkpoint was ported (''URL'').\n* re-ranking\n\n\nThe score was calculated using this code:\n\n\nnote: fairseq reports using a beam of 50, so you should get a slightly higher score if re-run with '--num\\_beams 50'.\n\n\nData Sources\n------------\n\n\n* training, etc.\n* test set",
"### BibTeX entry and citation info\n\n\nTODO\n----\n\n\n* port model ensemble (fairseq uses 4 model checkpoints)"
] |
[
81,
5,
204,
30
] |
[
"passage: TAGS\n#transformers #pytorch #fsmt #text2text-generation #translation #wmt19 #facebook #en #ru #dataset-wmt19 #arxiv-1907.06616 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n#### How to use#### Limitations and bias\n\n\n* The original (and this ported model) doesn't seem to handle well inputs with repeated sub-phrases, content gets truncated\n\n\nTraining data\n-------------\n\n\nPretrained weights were left identical to the original model released by fairseq. For more details, please, see the paper.\n\n\nEval results\n------------\n\n\npair: en-ru, fairseq: 36.4, transformers: 33.47\n\n\nThe score is slightly below the score reported by 'fairseq', since 'transformers'' currently doesn't support:\n\n\n* model ensemble, therefore the best performing checkpoint was ported (''URL'').\n* re-ranking\n\n\nThe score was calculated using this code:\n\n\nnote: fairseq reports using a beam of 50, so you should get a slightly higher score if re-run with '--num\\_beams 50'.\n\n\nData Sources\n------------\n\n\n* training, etc.\n* test set### BibTeX entry and citation info\n\n\nTODO\n----\n\n\n* port model ensemble (fairseq uses 4 model checkpoints)"
] |
[
-0.09625723958015442,
0.06669428944587708,
-0.004331756383180618,
0.05941084399819374,
0.08462470024824142,
0.005468032322824001,
0.10183728486299515,
0.1313902735710144,
-0.013683722354471684,
0.060591861605644226,
-0.018233006820082664,
0.09253878891468048,
0.12984412908554077,
0.16163688898086548,
-0.016305768862366676,
-0.1261809915304184,
0.03441765159368515,
0.030601294711232185,
-0.02980784885585308,
0.11872310936450958,
0.10067074745893478,
-0.08498363941907883,
0.026057275012135506,
0.0027440960984677076,
-0.041771192103624344,
0.09909225255250931,
0.03385809063911438,
-0.036920178681612015,
0.06129489839076996,
0.05465365946292877,
0.08197847008705139,
0.041889797896146774,
0.021914098411798477,
-0.24770747125148773,
0.027346031740307808,
0.10508846491575241,
0.0025530022103339434,
0.08344373106956482,
0.09457514435052872,
-0.04485655948519707,
0.10886842757463455,
-0.04553057998418808,
-0.0022070438135415316,
0.05142040550708771,
-0.1120135635137558,
-0.22460708022117615,
-0.10440376400947571,
0.11666837334632874,
0.08494675159454346,
0.04212566465139389,
-0.03905213624238968,
0.15440353751182556,
-0.034592267125844955,
0.053231529891490936,
0.18232756853103638,
-0.30560532212257385,
-0.016283109784126282,
0.07622741162776947,
0.03496502712368965,
0.018666060641407967,
-0.12812739610671997,
-0.03375909850001335,
0.026307670399546623,
0.02853643335402012,
0.04098794236779213,
-0.04467679560184479,
-0.010788711719214916,
0.035809267312288284,
-0.15505243837833405,
-0.05962589755654335,
0.09804821014404297,
0.07599043101072311,
-0.11071224510669708,
-0.08733253180980682,
-0.05683353170752525,
-0.03687864914536476,
0.023625275120139122,
-0.10105551034212112,
0.017607733607292175,
0.027486125007271767,
-0.017489423975348473,
-0.0024351286701858044,
-0.09310398250818253,
-0.07282204180955887,
-0.05077816918492317,
0.01321316510438919,
0.007582117337733507,
0.025979777798056602,
-0.036215201020240784,
0.14712023735046387,
-0.02022917941212654,
-0.06951349228620529,
-0.05970490723848343,
-0.05911097675561905,
-0.11747473478317261,
-0.04420478641986847,
0.04823809117078781,
-0.05768643692135811,
0.03118664026260376,
0.1922498494386673,
-0.04079757630825043,
0.032559119164943695,
-0.031384553760290146,
0.018633615225553513,
0.021520555019378662,
0.09594209492206573,
-0.07756689190864563,
-0.05878012627363205,
0.02734909951686859,
0.04140827804803848,
-0.036887455731630325,
0.0302397720515728,
0.034649066627025604,
-0.04906727373600006,
0.08802378922700882,
0.05799190327525139,
0.016802344471216202,
-0.005045122466981411,
-0.07197289913892746,
-0.04123935475945473,
0.18182320892810822,
-0.13957764208316803,
0.018228555098176003,
0.03572356328368187,
-0.069283626973629,
0.03877248242497444,
0.06817764788866043,
-0.013326183892786503,
-0.09074647724628448,
0.02395504340529442,
-0.034971121698617935,
-0.02468777634203434,
-0.07275734096765518,
-0.11277671903371811,
0.010938063263893127,
-0.0458197183907032,
-0.0489204116165638,
-0.07891833037137985,
-0.17397727072238922,
-0.0993264764547348,
-0.00019855894788634032,
-0.022126510739326477,
0.030825074762105942,
0.023568980395793915,
-0.011202665977180004,
-0.024405587464571,
-0.02835182473063469,
0.03850596398115158,
-0.057622749358415604,
0.0657930076122284,
-0.024976558983325958,
0.026197655126452446,
0.07511036098003387,
0.018887575715780258,
-0.09929049760103226,
0.028372369706630707,
-0.18264983594417572,
0.09051868319511414,
-0.062344182282686234,
-0.05946620553731918,
-0.14217723906040192,
-0.06361722201108932,
-0.04155608266592026,
-0.012898806482553482,
0.07780018448829651,
0.14381812512874603,
-0.1762959212064743,
-0.027742860838770866,
0.16287212073802948,
-0.10349459201097488,
-0.08470766991376877,
0.08857414126396179,
-0.038140445947647095,
0.046161532402038574,
0.10592732578516006,
0.05824629217386246,
0.14756140112876892,
-0.10465199500322342,
-0.08178982138633728,
-0.02173483371734619,
-0.023327652364969254,
0.05590321868658066,
0.084730364382267,
-0.029089972376823425,
0.0021226590033620596,
-0.01764281652867794,
0.0020811473950743675,
-0.02703564241528511,
-0.0355469286441803,
-0.016665782779455185,
-0.02661943808197975,
-0.023544276133179665,
-0.04716472327709198,
0.06070828437805176,
-0.011973096057772636,
-0.02311955951154232,
-0.07146298140287399,
0.0298297218978405,
0.08519598096609116,
-0.051876358687877655,
0.01926589198410511,
-0.07552772015333176,
0.09017064422369003,
-0.09648220241069794,
-0.012429961934685707,
-0.2071409374475479,
0.03863006830215454,
0.02685428410768509,
0.0044740550220012665,
0.0019825883209705353,
0.12205777317285538,
0.06182154640555382,
0.006662165280431509,
-0.05253065377473831,
-0.009801922366023064,
-0.008233228698372841,
-0.025914307683706284,
-0.07327758520841599,
-0.1648569256067276,
-0.09045414626598358,
-0.010119636543095112,
0.09386249631643295,
-0.25460535287857056,
-0.0011587238404899836,
0.06830743700265884,
0.1768886297941208,
0.03995928540825844,
-0.050873152911663055,
0.034205686300992966,
-0.030127357691526413,
-0.027429217472672462,
-0.04922792315483093,
0.012087246403098106,
0.030185304582118988,
-0.019849035888910294,
0.00365994474850595,
-0.0948643907904625,
-0.039578620344400406,
0.06838586181402206,
0.06999880820512772,
-0.11101066321134567,
0.0049600438214838505,
-0.1021093800663948,
-0.0048629241064190865,
-0.1486118882894516,
0.0006333973142318428,
0.04354134947061539,
0.03872350975871086,
0.10409370809793472,
-0.09405834227800369,
-0.0958721861243248,
-0.0011516250669956207,
0.016655011102557182,
0.019194701686501503,
0.14237459003925323,
-0.0018688203999772668,
-0.2173759788274765,
0.048466891050338745,
-0.007950248196721077,
0.03712167590856552,
0.1407507359981537,
-0.018552428111433983,
-0.10964944213628769,
-0.028396615758538246,
0.04326965659856796,
0.02612270787358284,
0.10997653007507324,
0.07052872329950333,
0.0655900165438652,
0.06658822298049927,
0.026782488450407982,
0.02960912138223648,
-0.13877372443675995,
0.019190380349755287,
0.01735723949968815,
-0.10938825458288193,
-0.05032633990049362,
0.03767592832446098,
0.039330486208200455,
0.11482363939285278,
-0.02601909078657627,
0.033171918243169785,
-0.016248080879449844,
-0.04564798250794411,
-0.15979211032390594,
0.17653337121009827,
-0.039126887917518616,
-0.12967367470264435,
-0.18159860372543335,
0.12252088636159897,
-0.015752648934721947,
0.004382593557238579,
0.018995629623532295,
-0.017826998606324196,
-0.062306296080350876,
-0.08496488630771637,
0.045338716357946396,
0.03618118166923523,
-0.031065937131643295,
-0.0980077013373375,
0.013763150200247765,
0.116177998483181,
-0.12522418797016144,
0.01884104311466217,
0.007990055717527866,
-0.023575471714138985,
-0.012806249782443047,
-0.011414031498134136,
0.05915993079543114,
0.06286070495843887,
-0.021513957530260086,
0.0193193182349205,
-0.038178663700819016,
0.250343382358551,
-0.09160323441028595,
0.047585126012563705,
0.11669658869504929,
0.003832242451608181,
0.03211342543363571,
0.09279190003871918,
-0.013292556628584862,
-0.04063240811228752,
0.0213673897087574,
0.07535357773303986,
-0.026333699002861977,
-0.3023402988910675,
0.056743260473012924,
-0.06357157975435257,
-0.015939680859446526,
0.0529615618288517,
0.0583454929292202,
-0.043741192668676376,
0.057366061955690384,
-0.10115049034357071,
0.0322832353413105,
0.04190795123577118,
0.02432670071721077,
0.011235611513257027,
0.09877793490886688,
0.07589447498321533,
-0.11966057866811752,
-0.0475638285279274,
0.11396658420562744,
0.0007393776904791594,
0.12983591854572296,
-0.058104608207941055,
0.0036575247067958117,
0.11778624355792999,
0.0985197201371193,
0.015507955104112625,
0.14034052193164825,
-0.03842965140938759,
0.02931085228919983,
0.010071389377117157,
-0.06585188955068588,
-0.006879027932882309,
0.01038365438580513,
0.019773058593273163,
0.04854052886366844,
-0.07534032315015793,
-0.05466219782829285,
0.03630444407463074,
0.20997390151023865,
0.1174263060092926,
-0.1699979156255722,
-0.08324874192476273,
0.030686277896165848,
-0.0393272303044796,
-0.07913286238908768,
-0.03010210022330284,
0.08984868973493576,
-0.16903281211853027,
0.12224021553993225,
-0.05308559909462929,
0.10539311915636063,
-0.07246927917003632,
-0.025375861674547195,
0.0006254612235352397,
0.052756961435079575,
-0.02063477225601673,
0.08378104120492935,
-0.17973454296588898,
0.1182699054479599,
0.020359255373477936,
0.10501863807439804,
-0.0487554632127285,
0.0006199743947945535,
0.03169552981853485,
0.006072016432881355,
0.09714893251657486,
-0.0031280498951673508,
0.03259402886033058,
-0.15293215215206146,
-0.023588895797729492,
0.019343312829732895,
0.02206515520811081,
-0.0798458456993103,
0.1405189484357834,
-0.019598253071308136,
0.007261367980390787,
-0.015552339144051075,
0.05036161094903946,
-0.1738414168357849,
-0.08983131498098373,
0.04202289879322052,
-0.021124204620718956,
0.04103618115186691,
-0.01710321381688118,
-0.04706139862537384,
-0.11084330081939697,
0.17697283625602722,
-0.1658226102590561,
-0.05919457599520683,
-0.12871308624744415,
0.09864027053117752,
0.14242249727249146,
-0.08021429926156998,
-0.022192759439349174,
-0.014841100201010704,
0.11670292913913727,
0.023482749238610268,
-0.10640165954828262,
0.02630751021206379,
-0.02988198772072792,
-0.16083453595638275,
0.004789451137185097,
0.07993172109127045,
0.06287173926830292,
0.03927091881632805,
0.05491999164223671,
0.02987763285636902,
0.01838075928390026,
-0.09469324350357056,
0.06323105841875076,
0.033183395862579346,
0.08382610976696014,
0.09667350351810455,
-0.017704227939248085,
0.009012673981487751,
-0.08490800857543945,
-0.039522554725408554,
0.0603322759270668,
0.17748673260211945,
-0.07551168650388718,
0.02806943655014038,
0.0440891832113266,
-0.09902691096067429,
-0.1484312266111374,
0.004360616207122803,
0.018456825986504555,
0.07654998451471329,
0.029942985624074936,
-0.11239486187696457,
0.07276532799005508,
0.052583251148462296,
-0.017047487199306488,
0.048811834305524826,
-0.19456979632377625,
-0.11852288991212845,
0.07424304634332657,
0.038654010742902756,
-0.01140260137617588,
-0.1579618602991104,
-0.07402285188436508,
-0.035442546010017395,
-0.17718929052352905,
0.12481734156608582,
-0.19443082809448242,
0.08741042017936707,
-0.023582633584737778,
-0.05661068484187126,
0.01748923771083355,
-0.03200904279947281,
0.12586380541324615,
0.021554548293352127,
0.09148432314395905,
-0.015119032934308052,
0.12811416387557983,
0.012051088735461235,
-0.050641972571611404,
0.1239750012755394,
0.03570596128702164,
0.063824363052845,
-0.20480625331401825,
-0.03116511180996895,
-0.06422492116689682,
0.038354694843292236,
-0.03791461139917374,
-0.0217551551759243,
-0.06544484198093414,
0.06345771253108978,
0.08766689151525497,
-0.023081419989466667,
0.07121875882148743,
-0.04510948061943054,
0.17673060297966003,
0.1112806499004364,
0.12614189088344574,
0.04844808951020241,
-0.1028808057308197,
-0.011511939577758312,
-0.014040619134902954,
0.043926503509283066,
-0.08438359200954437,
0.08228011429309845,
0.14643500745296478,
0.0032946420833468437,
0.1797570139169693,
0.016945255920290947,
-0.14174683392047882,
-0.014056402258574963,
0.08702060580253601,
-0.15128591656684875,
-0.14233316481113434,
0.021552158519625664,
-0.057591162621974945,
-0.11442101746797562,
0.011206535622477531,
0.1563420593738556,
-0.022327348589897156,
-0.028019290417432785,
0.03622499480843544,
0.03948560729622841,
-0.034634217619895935,
0.18784089386463165,
0.08371393382549286,
0.07797069847583771,
-0.08982305973768234,
0.09070099890232086,
0.09453907608985901,
-0.06632951647043228,
0.002828268799930811,
0.01861797831952572,
-0.08867063373327255,
-0.01348090823739767,
0.05579344183206558,
0.16743868589401245,
-0.05081220343708992,
-0.06832435727119446,
-0.12689955532550812,
-0.10745978355407715,
0.06621461361646652,
0.13147231936454773,
0.0356471873819828,
0.0725383311510086,
0.01781262271106243,
-0.03470509126782417,
-0.07555608451366425,
0.05995427444577217,
0.14342477917671204,
0.024569794535636902,
-0.1087508276104927,
0.13113319873809814,
-0.02452322654426098,
0.02009161189198494,
-0.010614505037665367,
0.03652456775307655,
-0.07582900673151016,
-0.0033052354119718075,
-0.1471235752105713,
0.0398639477789402,
-0.039927657693624496,
-0.015606286935508251,
-0.030754685401916504,
-0.04191068559885025,
-0.05254455655813217,
0.028583526611328125,
-0.08878891170024872,
-0.0313667356967926,
-0.012048683129251003,
0.040202148258686066,
-0.18529069423675537,
-0.021101294085383415,
0.051966119557619095,
-0.10497342050075531,
0.06894208490848541,
0.01850864477455616,
-0.002173504326492548,
-0.010448052547872066,
-0.11085629463195801,
-0.008282099850475788,
-0.00788721814751625,
0.02505079284310341,
0.033612195402383804,
-0.12175340950489044,
0.08502968400716782,
0.023310182616114616,
-0.009114116430282593,
0.04452928155660629,
-0.013976536691188812,
-0.09842728823423386,
-0.07732673734426498,
-0.049150027334690094,
0.06678928434848785,
-0.05479166656732559,
0.05996248871088028,
0.05835188925266266,
0.10521477460861206,
0.1585218608379364,
-0.07312677055597305,
0.03862528130412102,
-0.2505578100681305,
-0.02187846601009369,
-0.000011379689567547757,
-0.04362434521317482,
0.0021217111498117447,
-0.004196514841169119,
0.08659908175468445,
-0.037657931447029114,
0.11899081617593765,
0.02182937227189541,
0.07771912217140198,
0.0448615700006485,
-0.09337154030799866,
0.025526707991957664,
-0.0008486026781611145,
0.07077804952859879,
0.038050588220357895,
-0.024493062868714333,
0.0421362929046154,
-0.022935589775443077,
0.0014439396327361465,
0.08529875427484512,
0.1622086465358734,
0.15132805705070496,
-0.011856411583721638,
0.03514978289604187,
-0.031189359724521637,
-0.05652106925845146,
0.10074082016944885,
0.007229257375001907,
-0.06555606424808502,
0.04450627788901329,
-0.010188126005232334,
0.056220829486846924,
0.11575357615947723,
-0.18803608417510986,
0.0746816098690033,
-0.03652024641633034,
-0.058882273733615875,
-0.15729615092277527,
-0.08925439417362213,
-0.05273539200425148,
-0.07989785075187683,
0.01518801786005497,
-0.11523169279098511,
0.04583168774843216,
0.08690986782312393,
0.03266657888889313,
0.009904621168971062,
0.024084148928523064,
-0.024167897179722786,
-0.06967662274837494,
0.09032529592514038,
-0.002427211729809642,
-0.0019210119498893619,
0.01545677799731493,
0.04716922342777252,
0.12134521454572678,
-0.0035513315815478563,
0.07387675344944,
0.0003076489083468914,
0.0703197792172432,
0.03565099090337753,
-0.026977302506566048,
-0.06806657463312149,
-0.0033061655703932047,
0.016574373468756676,
0.044786155223846436,
0.1432390660047531,
0.0843387022614479,
0.03995705395936966,
-0.023639924824237823,
0.16553859412670135,
-0.019356349483132362,
-0.09298516064882278,
-0.16889698803424835,
0.17119567096233368,
0.054635416716337204,
0.011123460717499256,
0.04531877860426903,
-0.16435246169567108,
0.047920335084199905,
0.22939035296440125,
0.07930205017328262,
-0.004425578750669956,
-0.01057558972388506,
0.014356683939695358,
-0.0023351889103651047,
0.07854112982749939,
0.03064119815826416,
0.04841991141438484,
0.1220460757613182,
-0.05636931583285332,
0.021411556750535965,
-0.07656969875097275,
-0.016481690108776093,
0.07152749598026276,
0.15196493268013,
-0.015893975272774696,
0.005297765135765076,
-0.027993958443403244,
0.09230456501245499,
-0.04342135787010193,
-0.26724886894226074,
-0.0055703409016132355,
-0.0625624880194664,
-0.11070822924375534,
-0.05163520947098732,
-0.04225732386112213,
0.006363884080201387,
-0.01029176078736782,
-0.02570469304919243,
0.02696552500128746,
0.12263564020395279,
0.016113121062517166,
-0.059595342725515366,
-0.0700482726097107,
0.04944305866956711,
-0.0678425282239914,
0.08207432925701141,
0.0019118398195132613,
0.0957542359828949,
0.11403615772724152,
-0.024706296622753143,
-0.12839841842651367,
0.02333441935479641,
0.039932359009981155,
-0.01676150970160961,
0.01376576628535986,
0.12675297260284424,
0.03240213543176651,
-0.015702323988080025,
0.025323079898953438,
-0.005066682118922472,
0.018463481217622757,
-0.048678670078516006,
-0.050238050520420074,
-0.11058586835861206,
0.03241643309593201,
-0.07311829924583435,
0.10745025426149368,
0.1833968311548233,
-0.06850792467594147,
0.08374611288309097,
-0.0847906544804573,
0.05011754482984543,
-0.001755312317982316,
0.006238959729671478,
0.03835863247513771,
-0.16075095534324646,
0.046379439532756805,
0.00612655421718955,
0.02798234485089779,
-0.2679539620876312,
-0.04766705632209778,
0.0678228959441185,
-0.08706004917621613,
-0.06882078945636749,
0.16385824978351593,
-0.0013923273654654622,
0.0860530287027359,
-0.04418596997857094,
-0.12444879114627838,
-0.02841981314122677,
0.10250719636678696,
-0.07093983888626099,
-0.05906933546066284
] |
null | null |
transformers
|
# FSMT
## Model description
This is a ported version of [fairseq wmt19 transformer](https://github.com/pytorch/fairseq/blob/master/examples/wmt19/README.md) for ru-en.
For more details, please see [Facebook FAIR's WMT19 News Translation Task Submission](https://arxiv.org/abs/1907.06616).
The abbreviation FSMT stands for FairSeqMachineTranslation.
All four models are available:
* [wmt19-en-ru](https://huggingface.co/facebook/wmt19-en-ru)
* [wmt19-ru-en](https://huggingface.co/facebook/wmt19-ru-en)
* [wmt19-en-de](https://huggingface.co/facebook/wmt19-en-de)
* [wmt19-de-en](https://huggingface.co/facebook/wmt19-de-en)
## Intended uses & limitations
#### How to use
```python
from transformers import FSMTForConditionalGeneration, FSMTTokenizer
mname = "facebook/wmt19-ru-en"
tokenizer = FSMTTokenizer.from_pretrained(mname)
model = FSMTForConditionalGeneration.from_pretrained(mname)
input = "Машинное обучение - это здорово, не так ли?"
input_ids = tokenizer.encode(input, return_tensors="pt")
outputs = model.generate(input_ids)
decoded = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(decoded) # Machine learning is great, isn't it?
```
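To look at more than one candidate translation, `generate()` can return several beam hypotheses; below is a small sketch with illustrative beam settings, reloading the same checkpoint as above.
```python
from transformers import FSMTForConditionalGeneration, FSMTTokenizer

mname = "facebook/wmt19-ru-en"
tokenizer = FSMTTokenizer.from_pretrained(mname)
model = FSMTForConditionalGeneration.from_pretrained(mname)

input_ids = tokenizer.encode("Машинное обучение - это здорово, не так ли?", return_tensors="pt")
# num_return_sequences must not exceed num_beams; each result is a different beam hypothesis.
outputs = model.generate(input_ids, num_beams=5, num_return_sequences=3)
for candidate in tokenizer.batch_decode(outputs, skip_special_tokens=True):
    print(candidate)
```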
#### Limitations and bias
- The original model (and this ported version) doesn't seem to handle inputs with repeated sub-phrases well; [content gets truncated](https://discuss.huggingface.co/t/issues-with-translating-inputs-containing-repeated-phrases/981)
## Training data
Pretrained weights were left identical to the original model released by fairseq. For more details, please see the [paper](https://arxiv.org/abs/1907.06616).
## Eval results
pair | fairseq | transformers
-------|---------|----------
ru-en | [41.3](http://matrix.statmt.org/matrix/output/1907?run_id=6937) | 39.20
The score is slightly below the score reported by `fairseq`, since `transformers` currently doesn't support:
- model ensemble, therefore the best-performing checkpoint was ported (`model4.pt`).
- re-ranking
The score was calculated using this code:
```bash
git clone https://github.com/huggingface/transformers
cd transformers
export PAIR=ru-en
export DATA_DIR=data/$PAIR
export SAVE_DIR=data/$PAIR
export BS=8
export NUM_BEAMS=15
mkdir -p $DATA_DIR
sacrebleu -t wmt19 -l $PAIR --echo src > $DATA_DIR/val.source
sacrebleu -t wmt19 -l $PAIR --echo ref > $DATA_DIR/val.target
echo $PAIR
PYTHONPATH="src:examples/seq2seq" python examples/seq2seq/run_eval.py facebook/wmt19-$PAIR $DATA_DIR/val.source $SAVE_DIR/test_translations.txt --reference_path $DATA_DIR/val.target --score_path $SAVE_DIR/test_bleu.json --bs $BS --task translation --num_beams $NUM_BEAMS
```
Note: fairseq reports using a beam of 50, so you should get a slightly higher score if you re-run with `--num_beams 50`.
## Data Sources
- [training, etc.](http://www.statmt.org/wmt19/)
- [test set](http://matrix.statmt.org/test_sets/newstest2019.tgz?1556572561)
### BibTeX entry and citation info
```bibtex
@inproceedings{...,
year={2020},
title={Facebook FAIR's WMT19 News Translation Task Submission},
author={Ng, Nathan and Yee, Kyra and Baevski, Alexei and Ott, Myle and Auli, Michael and Edunov, Sergey},
booktitle={Proc. of WMT},
}
```
## TODO
- port model ensemble (fairseq uses 4 model checkpoints)
|
{"language": ["ru", "en"], "license": "apache-2.0", "tags": ["translation", "wmt19", "facebook"], "datasets": ["wmt19"], "metrics": ["bleu"], "thumbnail": "https://huggingface.co/front/thumbnails/facebook.png"}
|
translation
|
facebook/wmt19-ru-en
|
[
"transformers",
"pytorch",
"safetensors",
"fsmt",
"text2text-generation",
"translation",
"wmt19",
"facebook",
"ru",
"en",
"dataset:wmt19",
"arxiv:1907.06616",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1907.06616"
] |
[
"ru",
"en"
] |
TAGS
#transformers #pytorch #safetensors #fsmt #text2text-generation #translation #wmt19 #facebook #ru #en #dataset-wmt19 #arxiv-1907.06616 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us
|
FSMT
====
Model description
-----------------
This is a ported version of fairseq wmt19 transformer for ru-en.
For more details, please see, Facebook FAIR's WMT19 News Translation Task Submission.
The abbreviation FSMT stands for FairSeqMachineTranslation
All four models are available:
* wmt19-en-ru
* wmt19-ru-en
* wmt19-en-de
* wmt19-de-en
Intended uses & limitations
---------------------------
#### How to use
#### Limitations and bias
* The original (and this ported model) doesn't seem to handle well inputs with repeated sub-phrases, content gets truncated
Training data
-------------
Pretrained weights were left identical to the original model released by fairseq. For more details, please, see the paper.
Eval results
------------
pair: ru-en, fairseq: 41.3, transformers: 39.20
The score is slightly below the score reported by 'fairseq', since 'transformers'' currently doesn't support:
* model ensemble, therefore the best performing checkpoint was ported (''URL'').
* re-ranking
The score was calculated using this code:
note: fairseq reports using a beam of 50, so you should get a slightly higher score if re-run with '--num\_beams 50'.
Data Sources
------------
* training, etc.
* test set
### BibTeX entry and citation info
TODO
----
* port model ensemble (fairseq uses 4 model checkpoints)
|
[
"#### How to use",
"#### Limitations and bias\n\n\n* The original (and this ported model) doesn't seem to handle well inputs with repeated sub-phrases, content gets truncated\n\n\nTraining data\n-------------\n\n\nPretrained weights were left identical to the original model released by fairseq. For more details, please, see the paper.\n\n\nEval results\n------------\n\n\npair: ru-en, fairseq: 41.3, transformers: 39.20\n\n\nThe score is slightly below the score reported by 'fairseq', since 'transformers'' currently doesn't support:\n\n\n* model ensemble, therefore the best performing checkpoint was ported (''URL'').\n* re-ranking\n\n\nThe score was calculated using this code:\n\n\nnote: fairseq reports using a beam of 50, so you should get a slightly higher score if re-run with '--num\\_beams 50'.\n\n\nData Sources\n------------\n\n\n* training, etc.\n* test set",
"### BibTeX entry and citation info\n\n\nTODO\n----\n\n\n* port model ensemble (fairseq uses 4 model checkpoints)"
] |
[
"TAGS\n#transformers #pytorch #safetensors #fsmt #text2text-generation #translation #wmt19 #facebook #ru #en #dataset-wmt19 #arxiv-1907.06616 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"#### How to use",
"#### Limitations and bias\n\n\n* The original (and this ported model) doesn't seem to handle well inputs with repeated sub-phrases, content gets truncated\n\n\nTraining data\n-------------\n\n\nPretrained weights were left identical to the original model released by fairseq. For more details, please, see the paper.\n\n\nEval results\n------------\n\n\npair: ru-en, fairseq: 41.3, transformers: 39.20\n\n\nThe score is slightly below the score reported by 'fairseq', since 'transformers'' currently doesn't support:\n\n\n* model ensemble, therefore the best performing checkpoint was ported (''URL'').\n* re-ranking\n\n\nThe score was calculated using this code:\n\n\nnote: fairseq reports using a beam of 50, so you should get a slightly higher score if re-run with '--num\\_beams 50'.\n\n\nData Sources\n------------\n\n\n* training, etc.\n* test set",
"### BibTeX entry and citation info\n\n\nTODO\n----\n\n\n* port model ensemble (fairseq uses 4 model checkpoints)"
] |
[
86,
5,
205,
30
] |
[
"passage: TAGS\n#transformers #pytorch #safetensors #fsmt #text2text-generation #translation #wmt19 #facebook #ru #en #dataset-wmt19 #arxiv-1907.06616 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n#### How to use#### Limitations and bias\n\n\n* The original (and this ported model) doesn't seem to handle well inputs with repeated sub-phrases, content gets truncated\n\n\nTraining data\n-------------\n\n\nPretrained weights were left identical to the original model released by fairseq. For more details, please, see the paper.\n\n\nEval results\n------------\n\n\npair: ru-en, fairseq: 41.3, transformers: 39.20\n\n\nThe score is slightly below the score reported by 'fairseq', since 'transformers'' currently doesn't support:\n\n\n* model ensemble, therefore the best performing checkpoint was ported (''URL'').\n* re-ranking\n\n\nThe score was calculated using this code:\n\n\nnote: fairseq reports using a beam of 50, so you should get a slightly higher score if re-run with '--num\\_beams 50'.\n\n\nData Sources\n------------\n\n\n* training, etc.\n* test set### BibTeX entry and citation info\n\n\nTODO\n----\n\n\n* port model ensemble (fairseq uses 4 model checkpoints)"
] |
[
-0.09554524719715118,
0.0655684843659401,
-0.004202446900308132,
0.06553530693054199,
0.10435760766267776,
0.019605513662099838,
0.07621949911117554,
0.13024753332138062,
-0.030725639313459396,
0.06561913341283798,
-0.029770836234092712,
0.06869972497224808,
0.09686067700386047,
0.13443030416965485,
-0.02503197453916073,
-0.1444750279188156,
0.04297570884227753,
0.015285450965166092,
-0.06016101688146591,
0.10393190383911133,
0.10050860047340393,
-0.07280010730028152,
0.02763126976788044,
-0.008405298925936222,
-0.018323855474591255,
0.08969289809465408,
0.03912380710244179,
-0.029998889192938805,
0.0766763761639595,
0.04254717379808426,
0.1113378033041954,
0.046959150582551956,
0.03151795640587807,
-0.24577586352825165,
0.0279220100492239,
0.13602851331233978,
-0.0009962093317881227,
0.06845404952764511,
0.09376306086778641,
-0.0773521214723587,
0.11093804240226746,
-0.02881433255970478,
0.023583825677633286,
0.06645679473876953,
-0.12287192046642303,
-0.25635936856269836,
-0.10171685367822647,
0.08105885982513428,
0.08998216688632965,
0.029030628502368927,
-0.04449249058961868,
0.13439787924289703,
-0.06951212137937546,
0.043458469212055206,
0.1503390073776245,
-0.322265088558197,
-0.00883246585726738,
0.10839758068323135,
0.013843596912920475,
0.023530254140496254,
-0.13727906346321106,
-0.025716938078403473,
0.0358424037694931,
0.027759557589888573,
0.06801696121692657,
-0.03692014887928963,
-0.015361219644546509,
0.04078869894146919,
-0.162591353058815,
-0.04459890350699425,
0.09767971932888031,
0.0673215463757515,
-0.09459549188613892,
-0.08924517035484314,
-0.08185364305973053,
-0.06863190233707428,
0.026613855734467506,
-0.10524892061948776,
0.025218958035111427,
0.03613235056400299,
-0.01241322886198759,
0.047314755618572235,
-0.07838160544633865,
-0.06820995360612869,
-0.056230057030916214,
0.010821402072906494,
0.020885497331619263,
0.012368817813694477,
-0.04017437994480133,
0.16204142570495605,
-0.05615291744470596,
-0.07636002451181412,
-0.059643782675266266,
-0.05359869822859764,
-0.14631466567516327,
-0.03930487856268883,
0.030514713376760483,
-0.04414430633187294,
0.025829311460256577,
0.23344534635543823,
-0.02299816533923149,
0.03754573315382004,
-0.051728326827287674,
0.008882788009941578,
-0.005352029111236334,
0.08428832143545151,
-0.10150156915187836,
-0.06425657868385315,
0.03074539825320244,
0.055043477565050125,
-0.028184296563267708,
0.025940172374248505,
0.035949986428022385,
-0.04764194041490555,
0.08265656977891922,
0.07498332858085632,
-0.013791559264063835,
0.02451160177588463,
-0.07244087755680084,
-0.028666537255048752,
0.13575570285320282,
-0.15251979231834412,
0.0024379317183047533,
0.019718695431947708,
-0.06700219959020615,
0.02317899279296398,
0.09206172078847885,
0.00038812425918877125,
-0.07261621952056885,
0.07274989783763885,
-0.04494233801960945,
-0.06343656033277512,
-0.07246730476617813,
-0.11151085793972015,
0.0010635589715093374,
-0.03524449095129967,
-0.04275940731167793,
-0.09431105107069016,
-0.15083354711532593,
-0.0948370173573494,
0.000295578851364553,
-0.015992455184459686,
0.01339440606534481,
0.020571449771523476,
-0.02591218426823616,
-0.029698725789785385,
-0.013810047879815102,
0.03742711991071701,
-0.05577899143099785,
0.0489385761320591,
-0.018003815785050392,
0.04005114361643791,
0.11396874487400055,
0.03533763810992241,
-0.10575824975967407,
0.05374157056212425,
-0.21525044739246368,
0.08693130314350128,
-0.07188503444194794,
-0.04425878822803497,
-0.12963198125362396,
-0.048842255026102066,
-0.04084896296262741,
-0.011046910658478737,
0.0514787957072258,
0.11789747327566147,
-0.16725558042526245,
-0.033688027411699295,
0.17472083866596222,
-0.12320002168416977,
-0.08502928167581558,
0.10034476220607758,
-0.04539746046066284,
0.02342761494219303,
0.10643970966339111,
0.0664290189743042,
0.17616219818592072,
-0.10117998719215393,
-0.05426965653896332,
-0.058347396552562714,
-0.026538483798503876,
0.10502893477678299,
0.0848822072148323,
-0.02389768324792385,
-0.01776115968823433,
-0.012121559120714664,
-0.02899325080215931,
-0.010005072690546513,
-0.015118635259568691,
-0.02611869014799595,
-0.028065742924809456,
-0.03211864084005356,
-0.013945089653134346,
0.05073430761694908,
-0.017254073172807693,
-0.035105761140584946,
-0.09935800731182098,
-0.01708953082561493,
0.10402649641036987,
-0.054738402366638184,
0.009017151780426502,
-0.08327457308769226,
0.07325363904237747,
-0.056393932551145554,
-0.0038691535592079163,
-0.18856370449066162,
-0.0018061932642012835,
0.02967587485909462,
-0.017326921224594116,
-0.02181190438568592,
0.12859562039375305,
0.06829371303319931,
0.038793861865997314,
-0.08080204576253891,
-0.009840516373515129,
-0.010823836550116539,
-0.020102523267269135,
-0.05008574575185776,
-0.16367864608764648,
-0.0933864489197731,
-0.019853973761200905,
0.15797173976898193,
-0.24995455145835876,
0.011299628764390945,
0.06155025586485863,
0.1932762861251831,
0.05766357108950615,
-0.03910726308822632,
0.02748643048107624,
-0.027315521612763405,
-0.02947121486067772,
-0.06285107135772705,
0.011648733168840408,
0.024146417155861855,
-0.022571813315153122,
0.03650228679180145,
-0.14548823237419128,
-0.017056895419955254,
0.07636623829603195,
0.110320545732975,
-0.11922924220561981,
-0.0007170306635089219,
-0.0895254984498024,
-0.021084340289235115,
-0.14804252982139587,
-0.018259309232234955,
0.06078973039984703,
0.03490397706627846,
0.11791543662548065,
-0.0825854241847992,
-0.08371119201183319,
-0.016764303669333458,
0.036916688084602356,
0.03751465305685997,
0.12528663873672485,
-0.011387303471565247,
-0.18962018191814423,
0.05050819367170334,
-0.007450785022228956,
0.011605900712311268,
0.13180513679981232,
-0.02472980134189129,
-0.10435765236616135,
-0.024722494184970856,
0.03895802050828934,
0.04575100168585777,
0.07921120524406433,
0.05520498752593994,
0.06606922298669815,
0.06700829416513443,
0.02038082666695118,
0.025111908093094826,
-0.13786229491233826,
0.010530316270887852,
0.036076370626688004,
-0.10088969022035599,
-0.03681419417262077,
0.03450788930058479,
0.04000529274344444,
0.1200101301074028,
-0.04325568303465843,
0.0021561484318226576,
-0.006429341156035662,
-0.042443789541721344,
-0.1431044042110443,
0.1694980412721634,
-0.03540430963039398,
-0.09848283231258392,
-0.1640033721923828,
0.12155194580554962,
-0.02750326879322529,
0.01627703569829464,
0.029428081586956978,
-0.03067438304424286,
-0.09056061506271362,
-0.07563167065382004,
0.0573367103934288,
0.038081489503383636,
-0.035792477428913116,
-0.10617990791797638,
0.021647902205586433,
0.11159498989582062,
-0.13693076372146606,
0.02064593881368637,
0.018028805032372475,
-0.054411884397268295,
-0.006735525093972683,
-0.014642921276390553,
0.06389104574918747,
0.09491223096847534,
-0.024200815707445145,
0.006632725242525339,
-0.037412162870168686,
0.1935531497001648,
-0.08140638470649719,
0.04712013155221939,
0.10452253371477127,
-0.002833023201674223,
0.048158977180719376,
0.11405543982982635,
-0.01895136572420597,
-0.05021997168660164,
0.0422392264008522,
0.09621486067771912,
-0.03750605881214142,
-0.3003571033477783,
0.03053535893559456,
-0.05677185580134392,
0.0017516290536150336,
0.06900370866060257,
0.05763204023241997,
-0.014749133959412575,
0.05543859675526619,
-0.13119451701641083,
0.028579477220773697,
0.05569259077310562,
0.01330513320863247,
0.024037886410951614,
0.09279017150402069,
0.09190484881401062,
-0.1066812202334404,
-0.05335522070527077,
0.10120932012796402,
0.00851280614733696,
0.14766497910022736,
-0.06932774186134338,
0.016640586778521538,
0.10456397384405136,
0.11952503770589828,
0.0354531891644001,
0.1468774378299713,
-0.020180707797408104,
0.02443515881896019,
0.00679118512198329,
-0.07078089565038681,
-0.001197895733639598,
0.03037705458700657,
0.015134092420339584,
0.03981200233101845,
-0.09022548049688339,
-0.00903537031263113,
0.03693981096148491,
0.22144334018230438,
0.10850328952074051,
-0.20743213593959808,
-0.06362650543451309,
0.03393250331282616,
-0.020094428211450577,
-0.052293192595243454,
-0.0007202739361673594,
0.10618720948696136,
-0.15505415201187134,
0.12762439250946045,
-0.07403673231601715,
0.09026772528886795,
-0.07869305461645126,
-0.019191443920135498,
0.0299769788980484,
0.06838428229093552,
-0.023480892181396484,
0.08370356261730194,
-0.1731015145778656,
0.11919558793306351,
0.004201194271445274,
0.08795047551393509,
-0.04722786322236061,
0.003876370145007968,
0.05823912099003792,
0.033320102840662,
0.10279537737369537,
-0.0067960163578391075,
-0.008212736807763577,
-0.1467093527317047,
-0.0008320282795466483,
0.021568700671195984,
0.019429802894592285,
-0.10863490402698517,
0.15055963397026062,
-0.046055957674980164,
-0.005028230603784323,
0.0044683488085865974,
0.07123914361000061,
-0.17082048952579498,
-0.09736038744449615,
0.03465826064348221,
-0.00038218090776354074,
0.06757141649723053,
-0.053931981325149536,
-0.05584675073623657,
-0.11259183287620544,
0.15772166848182678,
-0.13426800072193146,
-0.061914775520563126,
-0.12246663868427277,
0.06234537065029144,
0.13208891451358795,
-0.08136367052793503,
-0.020396986976265907,
-0.009959409944713116,
0.12270928174257278,
0.012170071713626385,
-0.10882426798343658,
0.043167855590581894,
-0.02784069813787937,
-0.16249071061611176,
-0.010043003596365452,
0.08649403601884842,
0.04567520692944527,
0.04256406053900719,
0.048909012228250504,
0.04371882230043411,
0.04978553205728531,
-0.08761592954397202,
0.0630299523472786,
0.04805584251880646,
0.08454645425081253,
0.10409975051879883,
-0.03879188373684883,
0.02882247231900692,
-0.09167565405368805,
-0.028621818870306015,
0.06841704994440079,
0.18065764009952545,
-0.09234097599983215,
0.03151274099946022,
0.04649367928504944,
-0.0786023959517479,
-0.1753910630941391,
-0.027442969381809235,
0.00791424885392189,
0.07941381633281708,
0.0551665760576725,
-0.11213210970163345,
0.08020190894603729,
0.08212348073720932,
-0.010282186791300774,
0.03674915432929993,
-0.2435019612312317,
-0.12503713369369507,
0.07579845190048218,
0.034701354801654816,
-0.05039437860250473,
-0.15370742976665497,
-0.05866118147969246,
-0.04695958271622658,
-0.1544247716665268,
0.09412423521280289,
-0.23867706954479218,
0.09509667754173279,
-0.022087344899773598,
-0.05423261225223541,
0.02730954810976982,
-0.0321730338037014,
0.14277175068855286,
0.03636544570326805,
0.10102302581071854,
-0.030528990551829338,
0.13902296125888824,
0.04836884140968323,
-0.06729470193386078,
0.1119527816772461,
0.007354711648076773,
0.06473113596439362,
-0.17138849198818207,
-0.03085220977663994,
-0.05118265002965927,
0.030607150867581367,
-0.02979935333132744,
-0.0422891266644001,
-0.05575214698910713,
0.061422158032655716,
0.10750874131917953,
-0.03716787323355675,
0.05211411044001579,
-0.0305051077157259,
0.20068366825580597,
0.1274264007806778,
0.12882788479328156,
0.035264018923044205,
-0.13061127066612244,
-0.016079725697636604,
0.005041667725890875,
0.05069020017981529,
-0.09793756157159805,
0.09853214770555496,
0.1524345725774765,
0.011767683550715446,
0.17377693951129913,
0.03359514847397804,
-0.1339162290096283,
-0.00903529766947031,
0.09712821990251541,
-0.14372968673706055,
-0.16295522451400757,
0.00851717870682478,
-0.07751498371362686,
-0.1178918406367302,
-0.0021252308506518602,
0.15366780757904053,
-0.04095783457159996,
-0.0123579828068614,
0.04120811074972153,
0.04124170541763306,
-0.04016251116991043,
0.18967750668525696,
0.07125767320394516,
0.08373891562223434,
-0.09291422367095947,
0.11806753277778625,
0.11772888898849487,
-0.10759976506233215,
-0.013061901554465294,
0.02027873881161213,
-0.09822248667478561,
-0.018487311899662018,
0.023733481764793396,
0.18097543716430664,
-0.03952110558748245,
-0.05981052294373512,
-0.11327783018350601,
-0.11522180587053299,
0.05521006882190704,
0.09877610206604004,
0.0329768992960453,
0.1017625480890274,
-0.003148851450532675,
-0.039733801037073135,
-0.08477797359228134,
0.07569803297519684,
0.15464946627616882,
0.043591778725385666,
-0.1425851285457611,
0.13316674530506134,
-0.018815331161022186,
0.0000699959637131542,
-0.018767105415463448,
0.03337160870432854,
-0.0864567831158638,
-0.0028807634953409433,
-0.19014538824558258,
0.03486653417348862,
-0.02392536960542202,
-0.005578033626079559,
-0.024850815534591675,
-0.03312292695045471,
-0.05654924735426903,
0.0407446064054966,
-0.08630920946598053,
-0.045839402824640274,
-0.014765400439500809,
0.045714929699897766,
-0.16951918601989746,
-0.030447788536548615,
0.06446535885334015,
-0.11022976785898209,
0.06982626020908356,
0.005370330065488815,
-0.009036578238010406,
-0.015040726400911808,
-0.09833575040102005,
-0.0422796793282032,
-0.011255149729549885,
0.00660503376275301,
0.03954467177391052,
-0.1491822749376297,
0.07522089779376984,
-0.0007432499551214278,
-0.03641703724861145,
0.03880111128091812,
-0.015525273978710175,
-0.12018220126628876,
-0.07706336677074432,
-0.015456614084541798,
0.050484154373407364,
-0.06519737094640732,
0.026448838412761688,
0.06110437586903572,
0.1095522865653038,
0.164738729596138,
-0.07944788038730621,
0.040473051369190216,
-0.25664153695106506,
-0.024954169988632202,
-0.0033884388394653797,
-0.040562424808740616,
0.015488814562559128,
0.010769203305244446,
0.07385241240262985,
-0.03649376705288887,
0.11089283227920532,
0.018163971602916718,
0.06506505608558655,
0.051720038056373596,
-0.07906439155340195,
0.03403802216053009,
0.008258281275629997,
0.09026437997817993,
0.05133766680955887,
-0.029766077175736427,
0.03407059609889984,
-0.022764161229133606,
0.008586149662733078,
0.07455960661172867,
0.14132462441921234,
0.1372019648551941,
-0.00011747029202524573,
0.018360549584031105,
-0.025249406695365906,
-0.07703510671854019,
0.061449769884347916,
0.03758613020181656,
-0.04814089089632034,
0.03353205695748329,
-0.013601352460682392,
0.029300523921847343,
0.1475902944803238,
-0.17812597751617432,
0.08668213337659836,
-0.053217217326164246,
-0.053641077131032944,
-0.15055106580257416,
-0.10603949427604675,
-0.06042762100696564,
-0.11195056885480881,
0.015569440089166164,
-0.11100103706121445,
0.04462369158864021,
0.1269376128911972,
0.02318403869867325,
0.020418861880898476,
0.046262819319963455,
-0.07366055250167847,
-0.07079710811376572,
0.0719163566827774,
0.002387130167335272,
-0.0028551039285957813,
0.06608277559280396,
0.024729296565055847,
0.11792933940887451,
0.0002087678003590554,
0.05585700646042824,
-0.014149916358292103,
0.07456093281507492,
0.05949131026864052,
-0.03269362077116966,
-0.05699186772108078,
-0.0008654964040033519,
0.015881143510341644,
0.02783072739839554,
0.12673722207546234,
0.08118195831775665,
0.03350149467587471,
-0.020891064777970314,
0.20318689942359924,
-0.026825379580259323,
-0.08630844950675964,
-0.18548916280269623,
0.19836516678333282,
0.07840238511562347,
0.026952316984534264,
0.05736883357167244,
-0.16135822236537933,
0.05803125724196434,
0.19593223929405212,
0.1110944077372551,
0.01541503518819809,
-0.00574477156624198,
-0.014877994544804096,
-0.0017544809961691499,
0.06538408249616623,
0.036591026932001114,
0.07209483534097672,
0.09029088914394379,
-0.059388935565948486,
0.038589511066675186,
-0.05129871517419815,
-0.018944036215543747,
0.050062213093042374,
0.15833567082881927,
-0.020675381645560265,
0.017275966703891754,
-0.021791545674204826,
0.09401874244213104,
-0.004202604293823242,
-0.23524215817451477,
-0.016935233026742935,
-0.06164015829563141,
-0.11902875453233719,
-0.07190587371587753,
-0.017134761437773705,
0.010334857739508152,
0.0041991956532001495,
-0.023429835215210915,
0.0070237633772194386,
0.13362224400043488,
0.01763647049665451,
-0.06930731236934662,
-0.07230299711227417,
0.040205683559179306,
-0.03264268860220909,
0.09896478056907654,
-0.0033904758747667074,
0.04477880522608757,
0.11512693762779236,
-0.018444692716002464,
-0.13507436215877533,
0.031240953132510185,
0.043846968561410904,
-0.03487279266119003,
0.0036993580870330334,
0.14395476877689362,
0.011421539820730686,
0.00736380647867918,
0.014788152649998665,
-0.05305280536413193,
0.03161340579390526,
-0.03355235606431961,
-0.047309573739767075,
-0.11005864292383194,
0.03452495485544205,
-0.05576785281300545,
0.11690802872180939,
0.2165408879518509,
-0.06151727959513664,
0.08199484646320343,
-0.08942354470491409,
0.024657579138875008,
0.008290784433484077,
0.0002505206794012338,
0.054991934448480606,
-0.14424003660678864,
0.04674224928021431,
0.01651657000184059,
0.01584770530462265,
-0.2683885097503662,
-0.054304517805576324,
0.06678421050310135,
-0.08207718282938004,
-0.06269869208335876,
0.1700807511806488,
0.028549373149871826,
0.08977223187685013,
-0.04712799936532974,
-0.20721472799777985,
-0.018540404736995697,
0.11075086146593094,
-0.0936267152428627,
-0.07278773933649063
] |
null | null |
transformers
|
# WMT 21 En-X
WMT 21 En-X is a 4.7B multilingual encoder-decoder (seq-to-seq) model trained for one-to-many multilingual translation.
It was introduced in this [paper](https://arxiv.org/abs/2108.03265) and first released in [this](https://github.com/pytorch/fairseq/tree/main/examples/wmt21) repository.
The model can directly translate English text into 7 other languages: Hausa (ha), Icelandic (is), Japanese (ja), Czech (cs), Russian (ru), Chinese (zh), German (de).
To translate into a target language, the target language id is forced as the first generated token.
To force the target language id as the first generated token, pass the `forced_bos_token_id` parameter to the `generate` method.
*Note: `M2M100Tokenizer` depends on `sentencepiece`, so make sure to install it before running the example.*
To install `sentencepiece` run `pip install sentencepiece`
Since the model was trained with domain tags, you should prepend them to the input as well.
* "wmtdata newsdomain": Use for sentences in the news domain
* "wmtdata otherdomain": Use for sentences in all other domain
```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
model = AutoModelForSeq2SeqLM.from_pretrained("facebook/wmt21-dense-24-wide-en-x")
tokenizer = AutoTokenizer.from_pretrained("facebook/wmt21-dense-24-wide-en-x")
inputs = tokenizer("wmtdata newsdomain One model for many languages.", return_tensors="pt")
# translate English to German
generated_tokens = model.generate(**inputs, forced_bos_token_id=tokenizer.get_lang_id("de"))
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => "Ein Modell für viele Sprachen."
# translate English to Icelandic
generated_tokens = model.generate(**inputs, forced_bos_token_id=tokenizer.get_lang_id("is"))
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => "Ein fyrirmynd fyrir mörg tungumál."
```
See the [model hub](https://huggingface.co/models?filter=wmt21) to look for more fine-tuned versions.
## Languages covered
English (en), Hausa (ha), Icelandic (is), Japanese (ja), Czech (cs), Russian (ru), Chinese (zh), German (de)
## BibTeX entry and citation info
```
@inproceedings{tran2021facebook,
title={Facebook AI’s WMT21 News Translation Task Submission},
author={Chau Tran and Shruti Bhosale and James Cross and Philipp Koehn and Sergey Edunov and Angela Fan},
booktitle={Proc. of WMT},
year={2021},
}
```
|
{"language": ["multilingual", "ha", "is", "ja", "cs", "ru", "zh", "de", "en"], "license": "mit", "tags": ["translation", "wmt21"]}
|
translation
|
facebook/wmt21-dense-24-wide-en-x
|
[
"transformers",
"pytorch",
"m2m_100",
"text2text-generation",
"translation",
"wmt21",
"multilingual",
"ha",
"is",
"ja",
"cs",
"ru",
"zh",
"de",
"en",
"arxiv:2108.03265",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2108.03265"
] |
[
"multilingual",
"ha",
"is",
"ja",
"cs",
"ru",
"zh",
"de",
"en"
] |
TAGS
#transformers #pytorch #m2m_100 #text2text-generation #translation #wmt21 #multilingual #ha #is #ja #cs #ru #zh #de #en #arxiv-2108.03265 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
|
# WMT 21 En-X
WMT 21 En-X is a 4.7B multilingual encoder-decoder (seq-to-seq) model trained for one-to-many multilingual translation.
It was introduced in this paper and first released in this repository.
The model can directly translate English text into 7 other languages: Hausa (ha), Icelandic (is), Japanese (ja), Czech (cs), Russian (ru), Chinese (zh), German (de).
To translate into a target language, the target language id is forced as the first generated token.
To force the target language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method.
*Note: 'M2M100Tokenizer' depends on 'sentencepiece', so make sure to install it before running the example.*
To install 'sentencepiece' run 'pip install sentencepiece'
Since the model was trained with domain tags, you should prepend them to the input as well.
* "wmtdata newsdomain": Use for sentences in the news domain
* "wmtdata otherdomain": Use for sentences in all other domain
See the model hub to look for more fine-tuned versions.
## Languages covered
English (en), Hausa (ha), Icelandic (is), Japanese (ja), Czech (cs), Russian (ru), Chinese (zh), German (de)
## BibTeX entry and citation info
|
[
"# WMT 21 En-X\nWMT 21 En-X is a 4.7B multilingual encoder-decoder (seq-to-seq) model trained for one-to-many multilingual translation.\nIt was introduced in this paper and first released in this repository.\n\nThe model can directly translate English text into 7 other languages: Hausa (ha), Icelandic (is), Japanese (ja), Czech (cs), Russian (ru), Chinese (zh), German (de).\n\nTo translate into a target language, the target language id is forced as the first generated token.\nTo force the target language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method.\n\n*Note: 'M2M100Tokenizer' depends on 'sentencepiece', so make sure to install it before running the example.*\n\nTo install 'sentencepiece' run 'pip install sentencepiece'\n\nSince the model was trained with domain tags, you should prepend them to the input as well.\n* \"wmtdata newsdomain\": Use for sentences in the news domain\n* \"wmtdata otherdomain\": Use for sentences in all other domain\n\n\n\nSee the model hub to look for more fine-tuned versions.",
"## Languages covered\nEnglish (en), Hausa (ha), Icelandic (is), Japanese (ja), Czech (cs), Russian (ru), Chinese (zh), German (de)",
"## BibTeX entry and citation info"
] |
[
"TAGS\n#transformers #pytorch #m2m_100 #text2text-generation #translation #wmt21 #multilingual #ha #is #ja #cs #ru #zh #de #en #arxiv-2108.03265 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"# WMT 21 En-X\nWMT 21 En-X is a 4.7B multilingual encoder-decoder (seq-to-seq) model trained for one-to-many multilingual translation.\nIt was introduced in this paper and first released in this repository.\n\nThe model can directly translate English text into 7 other languages: Hausa (ha), Icelandic (is), Japanese (ja), Czech (cs), Russian (ru), Chinese (zh), German (de).\n\nTo translate into a target language, the target language id is forced as the first generated token.\nTo force the target language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method.\n\n*Note: 'M2M100Tokenizer' depends on 'sentencepiece', so make sure to install it before running the example.*\n\nTo install 'sentencepiece' run 'pip install sentencepiece'\n\nSince the model was trained with domain tags, you should prepend them to the input as well.\n* \"wmtdata newsdomain\": Use for sentences in the news domain\n* \"wmtdata otherdomain\": Use for sentences in all other domain\n\n\n\nSee the model hub to look for more fine-tuned versions.",
"## Languages covered\nEnglish (en), Hausa (ha), Icelandic (is), Japanese (ja), Czech (cs), Russian (ru), Chinese (zh), German (de)",
"## BibTeX entry and citation info"
] |
[
86,
288,
37,
10
] |
[
"passage: TAGS\n#transformers #pytorch #m2m_100 #text2text-generation #translation #wmt21 #multilingual #ha #is #ja #cs #ru #zh #de #en #arxiv-2108.03265 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n# WMT 21 En-X\nWMT 21 En-X is a 4.7B multilingual encoder-decoder (seq-to-seq) model trained for one-to-many multilingual translation.\nIt was introduced in this paper and first released in this repository.\n\nThe model can directly translate English text into 7 other languages: Hausa (ha), Icelandic (is), Japanese (ja), Czech (cs), Russian (ru), Chinese (zh), German (de).\n\nTo translate into a target language, the target language id is forced as the first generated token.\nTo force the target language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method.\n\n*Note: 'M2M100Tokenizer' depends on 'sentencepiece', so make sure to install it before running the example.*\n\nTo install 'sentencepiece' run 'pip install sentencepiece'\n\nSince the model was trained with domain tags, you should prepend them to the input as well.\n* \"wmtdata newsdomain\": Use for sentences in the news domain\n* \"wmtdata otherdomain\": Use for sentences in all other domain\n\n\n\nSee the model hub to look for more fine-tuned versions.## Languages covered\nEnglish (en), Hausa (ha), Icelandic (is), Japanese (ja), Czech (cs), Russian (ru), Chinese (zh), German (de)## BibTeX entry and citation info"
] |
[
-0.07915531098842621,
-0.06320583820343018,
-0.005829841364175081,
-0.043385669589042664,
0.0811150074005127,
-0.023999430239200592,
0.1550889015197754,
0.06152935326099396,
0.09063110500574112,
0.04788454622030258,
0.03862077370285988,
0.026933230459690094,
0.06078450009226799,
0.07933681458234787,
0.11736704409122467,
-0.17796269059181213,
0.0976724624633789,
-0.06221649795770645,
0.06919917464256287,
0.07100360095500946,
0.09513281285762787,
-0.04088994115591049,
0.05889333784580231,
0.024277841672301292,
-0.08912399411201477,
0.056638848036527634,
-0.021503714844584465,
-0.03904357925057411,
0.06558671593666077,
0.02134665660560131,
0.04290814325213432,
0.034558068960905075,
0.019279425963759422,
-0.15433894097805023,
-0.008244947530329227,
0.02784704975783825,
-0.030380642041563988,
-0.043670590966939926,
0.12218988686800003,
0.018589649349451065,
0.10740251839160919,
-0.21691910922527313,
-0.06858345866203308,
0.03583537042140961,
-0.04148063808679581,
-0.16077110171318054,
-0.042818475514650345,
0.15380921959877014,
0.06982535868883133,
0.05103782191872597,
-0.06679395586252213,
0.05229998007416725,
-0.007240671664476395,
0.06974652409553528,
0.10372915863990784,
-0.2642485499382019,
-0.01353203784674406,
0.06091780588030815,
0.02244747057557106,
0.10834086686372757,
-0.012521416880190372,
0.012580864131450653,
-0.0044819144532084465,
0.017083583399653435,
-0.047365982085466385,
-0.03861173614859581,
0.13633491098880768,
-0.036681920289993286,
-0.1685592383146286,
-0.057943448424339294,
0.125650092959404,
-0.04453044757246971,
-0.06858883053064346,
-0.14359153807163239,
-0.04284482076764107,
-0.004889791831374168,
0.033295560628175735,
0.023707561194896698,
0.007911408320069313,
0.03395475447177887,
0.11855486035346985,
-0.0606229193508625,
-0.09801220148801804,
0.0011432046303525567,
-0.03999814763665199,
0.2514191269874573,
0.008651476353406906,
0.02208760194480419,
0.010143219493329525,
0.08722245693206787,
0.03150226175785065,
-0.10890819877386093,
-0.10998940467834473,
-0.09130818396806717,
-0.11581968516111374,
0.02733394131064415,
-0.03865650296211243,
-0.14002227783203125,
0.04748325049877167,
0.015424862504005432,
-0.05714360252022743,
0.06184948608279228,
-0.004976110532879829,
0.05840490758419037,
0.054502375423908234,
0.16756679117679596,
-0.048794426023960114,
-0.043540969491004944,
-0.006442111451178789,
-0.07803721725940704,
0.03822513297200203,
0.025034740567207336,
-0.07898475974798203,
-0.1052057072520256,
0.05951446294784546,
0.0682549700140953,
0.003555397968739271,
0.10257351398468018,
0.02731645666062832,
-0.03209327906370163,
0.10044946521520615,
-0.14712733030319214,
-0.004701770376414061,
0.022170212119817734,
-0.040198516100645065,
0.08773941546678543,
0.00678228959441185,
-0.037376586347818375,
-0.12713702023029327,
0.04809756949543953,
0.02120678685605526,
0.08234900236129761,
-0.0874478667974472,
-0.11856325715780258,
0.032573871314525604,
-0.09650745987892151,
-0.04268261790275574,
-0.13102614879608154,
-0.11916658282279968,
-0.05954219028353691,
0.024520931765437126,
0.03943316265940666,
0.05851983278989792,
-0.04141894727945328,
-0.03989183157682419,
0.01420577336102724,
-0.03633532300591469,
-0.06449109315872192,
-0.05541886389255524,
0.0018396826926618814,
-0.10225503146648407,
0.04680776596069336,
0.01195033174008131,
-0.029378002509474754,
-0.09126351773738861,
0.014643735252320766,
-0.1633123904466629,
0.17806954681873322,
-0.09151490777730942,
0.04597419500350952,
-0.025777293369174004,
0.0002504863659851253,
0.04873231053352356,
0.05480238422751427,
0.011871786788105965,
0.17312543094158173,
-0.14719092845916748,
-0.011206095106899738,
0.24250809848308563,
-0.17201143503189087,
0.010983827523887157,
0.08044736087322235,
0.018553059548139572,
0.10095103085041046,
0.09312548488378525,
0.13010133802890778,
0.0009441683068871498,
-0.18646018207073212,
-0.04956889525055885,
0.02564953826367855,
-0.14655475318431854,
0.05933768302202225,
0.03575238212943077,
-0.0023981330450624228,
0.01571694016456604,
0.04323474317789078,
-0.04866333305835724,
0.04523719847202301,
0.03212874382734299,
-0.016137348487973213,
0.011076127178966999,
-0.02663268893957138,
-0.01110505685210228,
0.0018416570965200663,
0.0038488581776618958,
-0.007762071210891008,
-0.03868523985147476,
0.15241402387619019,
0.06297516822814941,
-0.03693193569779396,
0.04645843431353569,
-0.06176714971661568,
-0.047420140355825424,
-0.047621335834264755,
0.04885612055659294,
-0.1284642070531845,
-0.022518495097756386,
0.048275332897901535,
0.009479045867919922,
0.0890105590224266,
0.06829524785280228,
0.05982454493641853,
0.06139896437525749,
0.028570128604769707,
-0.0727626383304596,
0.06333457678556442,
-0.011318760924041271,
0.01573045179247856,
-0.12837177515029907,
-0.04016896337270737,
-0.03608587011694908,
0.12238656729459763,
-0.03266960754990578,
0.028398290276527405,
0.05418090894818306,
0.023356063291430473,
-0.0464807003736496,
0.015320591628551483,
0.007063627243041992,
0.01963227614760399,
-0.0251969862729311,
-0.008392495103180408,
0.02805660106241703,
0.047581057995557785,
-0.057608529925346375,
0.17463667690753937,
-0.09629005938768387,
-0.1648746132850647,
0.09957172721624374,
-0.03193005546927452,
-0.0344306156039238,
-0.0376206636428833,
-0.0507391095161438,
-0.03664446994662285,
-0.004547415766865015,
-0.0783831849694252,
0.1218872219324112,
0.032187432050704956,
0.0400204174220562,
-0.0934590995311737,
-0.04026123508810997,
0.0029891422018408775,
-0.07908344268798828,
-0.07496899366378784,
0.05393952503800392,
-0.06329706311225891,
-0.18471676111221313,
0.06875591725111008,
-0.031582724303007126,
-0.02806478552520275,
0.28448233008384705,
0.01203129906207323,
-0.06022884324193001,
-0.015913210809230804,
0.09663359075784683,
0.003090819576755166,
0.05954185128211975,
-0.02866402082145214,
-0.06990206986665726,
0.01986902765929699,
0.005926019977778196,
0.016932174563407898,
-0.043998874723911285,
0.05455927550792694,
-0.030710862949490547,
-0.08368077874183655,
-0.002984805731102824,
0.027127068489789963,
-0.001024981844238937,
0.053421538323163986,
-0.03434520959854126,
0.0482388399541378,
-0.0019568810239434242,
-0.06032261252403259,
-0.1359514743089676,
0.08143563568592072,
-0.15343034267425537,
-0.23880766332149506,
-0.1960407793521881,
-0.0719062089920044,
-0.12941338121891022,
0.03204129636287689,
0.07903788238763809,
-0.09246454387903214,
-0.019833585247397423,
-0.027694333344697952,
0.09678336977958679,
0.006174242589622736,
-0.09191492199897766,
-0.07883650809526443,
0.032326579093933105,
0.012466015294194221,
-0.0735752061009407,
-0.03389697149395943,
-0.02973438985645771,
-0.01622939109802246,
0.04421694949269295,
-0.04462265223264694,
0.07206612825393677,
0.06608328968286514,
0.009752223268151283,
0.016924003139138222,
-0.029243312776088715,
0.20134858787059784,
-0.07108759135007858,
0.1119152158498764,
0.10293899476528168,
-0.03019108809530735,
0.0360223688185215,
0.16304536163806915,
0.005966578610241413,
-0.001458747312426567,
-0.006114510353654623,
-0.06584437191486359,
-0.04799012094736099,
-0.1991318017244339,
-0.0936988815665245,
-0.06866980344057083,
-0.048008885234594345,
0.060278963297605515,
0.050111524760723114,
0.05460748076438904,
0.028346896171569824,
-0.08595921844244003,
-0.0015209302073344588,
0.09753558784723282,
0.12787549197673798,
0.0675698071718216,
0.04536799341440201,
0.04297097772359848,
-0.03815203532576561,
-0.0055507090874016285,
0.046496786177158356,
-0.008303574286401272,
0.1292482614517212,
-0.042139165103435516,
0.12145718932151794,
0.08754172921180725,
0.04657818749547005,
0.06515326350927353,
0.09513522684574127,
0.0066042616963386536,
0.036494798958301544,
0.024119267240166664,
-0.11390968412160873,
-0.05191977322101593,
0.11137280613183975,
0.036264222115278244,
-0.04071268439292908,
0.04482965171337128,
0.004285264760255814,
0.08291321247816086,
0.28429070115089417,
-0.07307430356740952,
-0.07610178738832474,
-0.0484132245182991,
0.004461338277906179,
-0.021709173917770386,
-0.06852909177541733,
-0.040926940739154816,
0.044434309005737305,
-0.10168622434139252,
0.14808759093284607,
0.02561121992766857,
0.0431649424135685,
-0.024364661425352097,
0.011541230604052544,
-0.02153649926185608,
0.07335744798183441,
0.03037305735051632,
0.08524002134799957,
-0.25367289781570435,
0.0893610417842865,
0.01664048060774803,
0.0938044860959053,
-0.08536422252655029,
0.037875548005104065,
0.004310329910367727,
0.046221982687711716,
0.11402057111263275,
0.05944668501615524,
-0.12210999429225922,
-0.01815047301352024,
-0.03223415091633797,
-0.033529285341501236,
0.07292923331260681,
0.025419896468520164,
0.04187539219856262,
0.03184088319540024,
-0.03439291566610336,
-0.10386616736650467,
0.0233335979282856,
-0.09558160603046417,
-0.11018852144479752,
0.04206343740224838,
-0.014018327929079533,
0.09193327277898788,
-0.007958711124956608,
-0.007944418117403984,
-0.11475119739770889,
0.171422079205513,
-0.1503627747297287,
-0.13495561480522156,
-0.07291378080844879,
-0.06726367026567459,
0.067528635263443,
-0.06962127983570099,
-0.005736794788390398,
-0.01032478827983141,
0.12637321650981903,
-0.101939857006073,
-0.06527509540319443,
0.018155807629227638,
-0.09525267034769058,
-0.06576608121395111,
0.013744453899562359,
0.11159629374742508,
0.10454097390174866,
0.007550258655101061,
0.04185473918914795,
0.016134625300765038,
0.013077232986688614,
-0.09828530997037888,
-0.0378502756357193,
0.1200752854347229,
-0.04121305048465729,
0.054731037467718124,
-0.13121840357780457,
-0.16638436913490295,
-0.10280235856771469,
-0.05832003802061081,
0.08117324113845825,
0.21170078217983246,
-0.07589684426784515,
0.15495848655700684,
0.17875021696090698,
-0.09307869523763657,
-0.17969058454036713,
-0.08267879486083984,
0.08801838010549545,
0.05014369636774063,
-0.1342162787914276,
-0.14431647956371307,
0.07341509312391281,
0.021458959206938744,
0.01905413530766964,
0.0059619867242872715,
-0.11201319843530655,
-0.0991472527384758,
0.025311151519417763,
0.0005992731894366443,
0.08838486671447754,
-0.05570773035287857,
-0.04515963792800903,
-0.05323711037635803,
-0.025828247889876366,
0.05725964158773422,
-0.06569771468639374,
0.05794008448719978,
0.07516629248857498,
-0.01596873439848423,
0.05599374324083328,
-0.06494099646806717,
0.11141590774059296,
0.0762035921216011,
-0.02554325759410858,
-0.07886423170566559,
0.07790487259626389,
0.07325302064418793,
-0.05012785270810127,
0.17204023897647858,
0.01581084541976452,
-0.06746628880500793,
-0.05837860703468323,
-0.053075384348630905,
-0.10172255337238312,
0.10868731886148453,
-0.04720928519964218,
-0.04181341081857681,
-0.00483943335711956,
0.0529869943857193,
0.09642382711172104,
0.021429486572742462,
0.01795867830514908,
-0.12831516563892365,
-0.0467396080493927,
0.16573815047740936,
0.12552590668201447,
-0.06688757240772247,
0.026640018448233604,
-0.0074446420185267925,
-0.016888929530978203,
0.049620069563388824,
0.05741623416543007,
0.03345426917076111,
0.054996490478515625,
0.003837412456050515,
0.08481316268444061,
-0.0009362097480334342,
-0.13375549018383026,
-0.04337451234459877,
0.06570467352867126,
-0.13271568715572357,
-0.011112398467957973,
-0.02993403747677803,
0.013616611249744892,
0.11603906005620956,
-0.05494610592722893,
0.17363733053207397,
-0.033056698739528656,
-0.004276597872376442,
0.010975510813295841,
0.0032989594619721174,
-0.04319584369659424,
0.06698349118232727,
-0.018476182594895363,
0.045461248606443405,
-0.08257400244474411,
0.0719163790345192,
0.02886783704161644,
-0.03606986626982689,
0.017036229372024536,
0.17002464830875397,
-0.15203039348125458,
-0.06048274785280228,
-0.08065081387758255,
0.058962784707546234,
-0.0790315568447113,
-0.11290263384580612,
-0.02095159888267517,
-0.07869049906730652,
0.005609709769487381,
0.08884593099355698,
0.05854014679789543,
-0.012868845835328102,
0.0286242738366127,
-0.021936900913715363,
-0.013279431499540806,
0.03869644179940224,
0.02755810134112835,
0.009736515581607819,
-0.04668669402599335,
0.13217094540596008,
0.04279784858226776,
0.12598861753940582,
-0.038446053862571716,
-0.027581267058849335,
-0.04812650755047798,
0.014309115707874298,
-0.0875147208571434,
0.06752359122037888,
-0.1052790954709053,
-0.01020985096693039,
-0.005844858940690756,
-0.0008245570934377611,
0.01256532222032547,
0.0402761846780777,
-0.08406680822372437,
0.01533646509051323,
-0.06180291995406151,
0.04587024822831154,
-0.06274282932281494,
-0.0536189042031765,
-0.022882234305143356,
-0.08729790896177292,
0.07000654190778732,
0.10373028367757797,
-0.09635087102651596,
0.04841448366641998,
0.02250072918832302,
0.02666454203426838,
0.011203888803720474,
0.023211564868688583,
0.015259726904332638,
-0.05487514287233353,
0.017069580033421516,
0.06384138762950897,
-0.02274613454937935,
-0.011193858459591866,
-0.03462157025933266,
-0.025663767009973526,
0.1169988289475441,
0.03861736133694649,
-0.029170498251914978,
-0.08984468877315521,
0.07740569859743118,
0.06426677107810974,
0.05161130055785179,
0.1093720942735672,
-0.10105318576097488,
0.04355928301811218,
-0.029495391994714737,
-0.015199728310108185,
0.024248091503977776,
-0.0379798449575901,
-0.0018482677405700088,
-0.08361417800188065,
0.027125326916575432,
0.02019314467906952,
0.16398309171199799,
0.0020835695322602987,
0.028508536517620087,
0.024130215868353844,
-0.05612291768193245,
-0.10675874352455139,
0.034142352640628815,
0.09872086346149445,
-0.036057423800230026,
0.05863610655069351,
-0.003583586076274514,
-0.038872409611940384,
-0.034758247435092926,
0.0515277236700058,
0.07027968764305115,
0.09740722924470901,
0.19560499489307404,
0.09499577432870865,
0.027767503634095192,
-0.0022122219670563936,
-0.08415734767913818,
-0.04448842629790306,
-0.110248863697052,
0.01699022948741913,
-0.06400936841964722,
0.16627109050750732,
0.11857277899980545,
-0.13528992235660553,
0.04711521044373512,
0.06530030071735382,
-0.05295266956090927,
-0.05520673841238022,
-0.1531808078289032,
-0.0014592993538826704,
-0.05865560844540596,
0.009191147051751614,
-0.06856008619070053,
0.037458937615156174,
-0.0394502617418766,
0.07072474807500839,
-0.008422217331826687,
0.07824524492025375,
-0.010104745626449585,
-0.13327230513095856,
0.0705735981464386,
-0.04187776520848274,
0.066081203520298,
0.02037643827497959,
-0.010219848714768887,
0.022547388449311256,
0.0021119238808751106,
0.017922960221767426,
0.10248807072639465,
0.07688857614994049,
-0.027650989592075348,
-0.0448731854557991,
-0.03313373774290085,
0.025586936622858047,
0.029229503124952316,
0.033374540507793427,
0.1610860824584961,
0.05796423181891441,
-0.08770446479320526,
-0.018814774230122566,
0.012520487420260906,
0.012079734355211258,
-0.26806604862213135,
-0.10926295071840286,
0.16034621000289917,
0.01598617620766163,
0.049842458218336105,
-0.031031964346766472,
-0.05734963342547417,
-0.04340602457523346,
0.17394183576107025,
0.13671384751796722,
0.03597169369459152,
0.010580826550722122,
-0.038660384714603424,
0.013666109181940556,
0.011229844763875008,
0.08656418323516846,
-0.025463730096817017,
0.272778183221817,
-0.041086312383413315,
0.14096282422542572,
-0.13042637705802917,
-0.0015330755850300193,
-0.12269292026758194,
0.07690772414207458,
-0.08258473128080368,
-0.09217739850282669,
-0.053994789719581604,
0.08640532195568085,
-0.17282111942768097,
-0.10025914013385773,
-0.025203678756952286,
0.03871947154402733,
-0.03534679114818573,
0.05230318009853363,
0.03742612525820732,
0.0595187209546566,
0.043698519468307495,
-0.006304572802037001,
-0.006864137947559357,
0.11611403524875641,
-0.010952715761959553,
-0.04384233057498932,
-0.013132825493812561,
0.06849141418933868,
-0.08761855959892273,
0.10208852589130402,
-0.03375323861837387,
0.09072598814964294,
0.06927718222141266,
0.04195524752140045,
-0.11871296912431717,
0.024461884051561356,
-0.004591467324644327,
-0.15404629707336426,
0.035339754074811935,
0.1078629121184349,
-0.01074256468564272,
0.03229128569364548,
0.022947240620851517,
-0.07368091493844986,
0.05649776756763458,
0.14222794771194458,
0.05701776593923569,
-0.06323819607496262,
0.09455401450395584,
-0.06964977085590363,
0.113397516310215,
0.11884647607803345,
0.029568111523985863,
-0.010489046573638916,
-0.09389549493789673,
0.03249659016728401,
-0.03352472558617592,
-0.005286990664899349,
-0.018263807520270348,
-0.18468661606311798,
0.0094120679423213,
0.032010678201913834,
0.07041624933481216,
-0.094200998544693,
-0.07016071677207947,
-0.00935534480959177,
-0.01976349949836731,
-0.05696609243750572,
0.10415171831846237,
-0.002488035475835204,
0.035085078328847885,
-0.0006760170217603445,
-0.11043880134820938,
0.047519050538539886,
0.08707401901483536,
-0.10318450629711151,
-0.09042710065841675
] |
null | null |
transformers
|
# WMT 21 X-En
WMT 21 X-En is a 4.7B multilingual encoder-decoder (seq-to-seq) model trained for many-to-one multilingual translation.
It was introduced in this [paper](https://arxiv.org/abs/2108.03265) and first released in [this](https://github.com/pytorch/fairseq/tree/main/examples/wmt21) repository.
The model can directly translate text from 7 languages: Hausa (ha), Icelandic (is), Japanese (ja), Czech (cs), Russian (ru), Chinese (zh), German (de) to English.
To translate into a target language, the target language id is forced as the first generated token.
To force the target language id as the first generated token, pass the `forced_bos_token_id` parameter to the `generate` method.
*Note: `M2M100Tokenizer` depends on `sentencepiece`, so make sure to install it before running the example.*
To install `sentencepiece` run `pip install sentencepiece`
Since the model was trained with domain tags, you should prepend them to the input as well.
* "wmtdata newsdomain": Use for sentences in the news domain
* "wmtdata otherdomain": Use for sentences in all other domain
```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
model = AutoModelForSeq2SeqLM.from_pretrained("facebook/wmt21-dense-24-wide-x-en")
tokenizer = AutoTokenizer.from_pretrained("facebook/wmt21-dense-24-wide-x-en")
# translate German to English
tokenizer.src_lang = "de"
inputs = tokenizer("wmtdata newsdomain Ein Modell für viele Sprachen", return_tensors="pt")
generated_tokens = model.generate(**inputs)
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => "A model for many languages"
# translate Icelandic to English
tokenizer.src_lang = "is"
inputs = tokenizer("wmtdata newsdomain Ein fyrirmynd fyrir mörg tungumál", return_tensors="pt")
generated_tokens = model.generate(**inputs)
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => "One model for many languages"
```
See the [model hub](https://huggingface.co/models?filter=wmt21) to look for more fine-tuned versions.
## Languages covered
English (en), Hausa (ha), Icelandic (is), Japanese (ja), Czech (cs), Russian (ru), Chinese (zh), German (de)
## BibTeX entry and citation info
```
@inproceedings{tran2021facebook,
title={Facebook AI’s WMT21 News Translation Task Submission},
author={Chau Tran and Shruti Bhosale and James Cross and Philipp Koehn and Sergey Edunov and Angela Fan},
booktitle={Proc. of WMT},
year={2021},
}
```
|
{"language": ["multilingual", "ha", "is", "ja", "cs", "ru", "zh", "de", "en"], "license": "mit", "tags": ["translation", "wmt21"]}
|
translation
|
facebook/wmt21-dense-24-wide-x-en
|
[
"transformers",
"pytorch",
"m2m_100",
"text2text-generation",
"translation",
"wmt21",
"multilingual",
"ha",
"is",
"ja",
"cs",
"ru",
"zh",
"de",
"en",
"arxiv:2108.03265",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2108.03265"
] |
[
"multilingual",
"ha",
"is",
"ja",
"cs",
"ru",
"zh",
"de",
"en"
] |
TAGS
#transformers #pytorch #m2m_100 #text2text-generation #translation #wmt21 #multilingual #ha #is #ja #cs #ru #zh #de #en #arxiv-2108.03265 #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
# WMT 21 X-En
WMT 21 X-En is a 4.7B multilingual encoder-decoder (seq-to-seq) model trained for many-to-one multilingual translation.
It was introduced in this paper and first released in this repository.
The model can directly translate text from 7 languages: Hausa (ha), Icelandic (is), Japanese (ja), Czech (cs), Russian (ru), Chinese (zh), German (de) to English.
To translate into a target language, the target language id is forced as the first generated token.
To force the target language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method.
*Note: 'M2M100Tokenizer' depends on 'sentencepiece', so make sure to install it before running the example.*
To install 'sentencepiece' run 'pip install sentencepiece'
Since the model was trained with domain tags, you should prepend them to the input as well.
* "wmtdata newsdomain": Use for sentences in the news domain
* "wmtdata otherdomain": Use for sentences in all other domain
See the model hub to look for more fine-tuned versions.
## Languages covered
English (en), Hausa (ha), Icelandic (is), Japanese (ja), Czech (cs), Russian (ru), Chinese (zh), German (de)
## BibTeX entry and citation info
|
[
"# WMT 21 X-En\nWMT 21 X-En is a 4.7B multilingual encoder-decoder (seq-to-seq) model trained for one-to-many multilingual translation.\nIt was introduced in this paper and first released in this repository.\n\nThe model can directly translate text from 7 languages: Hausa (ha), Icelandic (is), Japanese (ja), Czech (cs), Russian (ru), Chinese (zh), German (de) to English. \n\nTo translate into a target language, the target language id is forced as the first generated token.\nTo force the target language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method.\n\n*Note: 'M2M100Tokenizer' depends on 'sentencepiece', so make sure to install it before running the example.*\nTo install 'sentencepiece' run 'pip install sentencepiece'\n\nSince the model was trained with domain tags, you should prepend them to the input as well.\n* \"wmtdata newsdomain\": Use for sentences in the news domain\n* \"wmtdata otherdomain\": Use for sentences in all other domain\n\n\n\nSee the model hub to look for more fine-tuned versions.",
"## Languages covered\nEnglish (en), Hausa (ha), Icelandic (is), Japanese (ja), Czech (cs), Russian (ru), Chinese (zh), German (de)",
"## BibTeX entry and citation info"
] |
[
"TAGS\n#transformers #pytorch #m2m_100 #text2text-generation #translation #wmt21 #multilingual #ha #is #ja #cs #ru #zh #de #en #arxiv-2108.03265 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"# WMT 21 X-En\nWMT 21 X-En is a 4.7B multilingual encoder-decoder (seq-to-seq) model trained for one-to-many multilingual translation.\nIt was introduced in this paper and first released in this repository.\n\nThe model can directly translate text from 7 languages: Hausa (ha), Icelandic (is), Japanese (ja), Czech (cs), Russian (ru), Chinese (zh), German (de) to English. \n\nTo translate into a target language, the target language id is forced as the first generated token.\nTo force the target language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method.\n\n*Note: 'M2M100Tokenizer' depends on 'sentencepiece', so make sure to install it before running the example.*\nTo install 'sentencepiece' run 'pip install sentencepiece'\n\nSince the model was trained with domain tags, you should prepend them to the input as well.\n* \"wmtdata newsdomain\": Use for sentences in the news domain\n* \"wmtdata otherdomain\": Use for sentences in all other domain\n\n\n\nSee the model hub to look for more fine-tuned versions.",
"## Languages covered\nEnglish (en), Hausa (ha), Icelandic (is), Japanese (ja), Czech (cs), Russian (ru), Chinese (zh), German (de)",
"## BibTeX entry and citation info"
] |
[
82,
289,
37,
10
] |
[
"passage: TAGS\n#transformers #pytorch #m2m_100 #text2text-generation #translation #wmt21 #multilingual #ha #is #ja #cs #ru #zh #de #en #arxiv-2108.03265 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# WMT 21 X-En\nWMT 21 X-En is a 4.7B multilingual encoder-decoder (seq-to-seq) model trained for one-to-many multilingual translation.\nIt was introduced in this paper and first released in this repository.\n\nThe model can directly translate text from 7 languages: Hausa (ha), Icelandic (is), Japanese (ja), Czech (cs), Russian (ru), Chinese (zh), German (de) to English. \n\nTo translate into a target language, the target language id is forced as the first generated token.\nTo force the target language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method.\n\n*Note: 'M2M100Tokenizer' depends on 'sentencepiece', so make sure to install it before running the example.*\nTo install 'sentencepiece' run 'pip install sentencepiece'\n\nSince the model was trained with domain tags, you should prepend them to the input as well.\n* \"wmtdata newsdomain\": Use for sentences in the news domain\n* \"wmtdata otherdomain\": Use for sentences in all other domain\n\n\n\nSee the model hub to look for more fine-tuned versions.## Languages covered\nEnglish (en), Hausa (ha), Icelandic (is), Japanese (ja), Czech (cs), Russian (ru), Chinese (zh), German (de)## BibTeX entry and citation info"
] |
[
-0.08754370361566544,
-0.05082326382398605,
-0.006185873877257109,
-0.029694359749555588,
0.09952100366353989,
-0.02613140270113945,
0.15998685359954834,
0.06880825012922287,
0.08371201157569885,
0.04618105664849281,
0.04832720011472702,
0.04794983193278313,
0.049416523426771164,
0.09218142181634903,
0.10834793746471405,
-0.18845678865909576,
0.09873326867818832,
-0.064597487449646,
0.10780864208936691,
0.07700689882040024,
0.09328733384609222,
-0.04247744753956795,
0.05662040784955025,
0.008427973836660385,
-0.06922998279333115,
0.05595766007900238,
-0.018857410177588463,
-0.04427015408873558,
0.06235206872224808,
0.0073559521697461605,
0.02867596037685871,
0.0391356460750103,
0.01341524813324213,
-0.15211017429828644,
-0.0011419469956308603,
0.027753718197345734,
-0.01448841392993927,
-0.04300856590270996,
0.1258128583431244,
0.01521785743534565,
0.09844809025526047,
-0.2075214385986328,
-0.05727629363536835,
0.03751557320356369,
-0.04963560774922371,
-0.1466352641582489,
-0.039161618798971176,
0.14569254219532013,
0.09165602922439575,
0.05705224350094795,
-0.07048758119344711,
0.0508715882897377,
0.0036506298929452896,
0.08031389862298965,
0.07770516723394394,
-0.24935121834278107,
-0.017056213691830635,
0.07377917319536209,
0.012046108953654766,
0.10699180513620377,
-0.004428498446941376,
0.015727903693914413,
-0.014787943102419376,
0.023024162277579308,
-0.057858943939208984,
-0.041468095034360886,
0.11827044188976288,
-0.04048430547118187,
-0.16159185767173767,
-0.05146200582385063,
0.11309690028429031,
-0.04563704505562782,
-0.07195397466421127,
-0.15721580386161804,
-0.05526462569832802,
-0.013337780721485615,
0.024176476523280144,
0.011609232984483242,
0.020136885344982147,
0.03697442635893822,
0.1105392575263977,
-0.06825760751962662,
-0.09595073759555817,
0.004373150411993265,
-0.02670378051698208,
0.2161150574684143,
-0.0012078831205144525,
0.013996986672282219,
0.012411915697157383,
0.09424920380115509,
0.028526432812213898,
-0.10875806212425232,
-0.11282485723495483,
-0.08155497163534164,
-0.11059370636940002,
0.010076885111629963,
-0.035676755011081696,
-0.14414261281490326,
0.05296066030859947,
-0.0018106084316968918,
-0.07432053238153458,
0.06073647737503052,
0.0021352937910705805,
0.053975217044353485,
0.05827721208333969,
0.1451454907655716,
-0.04704911634325981,
-0.040214501321315765,
-0.0013240542029961944,
-0.07394400984048843,
0.05138219892978668,
0.03079589642584324,
-0.057229310274124146,
-0.11267562210559845,
0.053141944110393524,
0.08087272197008133,
0.007463339250534773,
0.10188716650009155,
0.02974775992333889,
-0.039117179811000824,
0.11322147399187088,
-0.1594313532114029,
-0.008289273828268051,
0.020947471261024475,
-0.04175500571727753,
0.0757913887500763,
0.0033206141088157892,
-0.024273386225104332,
-0.128346249461174,
0.042501021176576614,
0.02975572645664215,
0.08756496012210846,
-0.08396054059267044,
-0.10429220646619797,
0.036013565957546234,
-0.10069587826728821,
-0.04490390792489052,
-0.1244668960571289,
-0.1537749469280243,
-0.06960062682628632,
0.02839600294828415,
0.029568204656243324,
0.054236799478530884,
-0.05391320958733559,
-0.03260675072669983,
0.0011883206898346543,
-0.03971104696393013,
-0.05675123259425163,
-0.0625322088599205,
-0.0018653273582458496,
-0.07710720598697662,
0.05288391560316086,
0.03984437510371208,
-0.026710744947195053,
-0.10348667949438095,
0.0139812296256423,
-0.17416074872016907,
0.1886003613471985,
-0.07041796296834946,
0.04445895552635193,
-0.02354728803038597,
0.0062905908562242985,
0.05355418473482132,
0.05257153883576393,
0.01603405922651291,
0.19075652956962585,
-0.14516599476337433,
-0.0043178643099963665,
0.24466194212436676,
-0.16493776440620422,
0.00312471273355186,
0.0871405228972435,
0.012324366718530655,
0.10580811649560928,
0.10139049589633942,
0.11617396771907806,
0.0027279246132820845,
-0.16331490874290466,
-0.03250958397984505,
0.0071357181295752525,
-0.15587440133094788,
0.0441674180328846,
0.03910420835018158,
-0.006440598051995039,
0.023588230833411217,
0.046888675540685654,
-0.03516172990202904,
0.029591267928481102,
0.036305323243141174,
-0.02265004999935627,
0.009969839826226234,
-0.024255620315670967,
-0.03466322645545006,
0.008754952810704708,
-0.007744519971311092,
-0.02248014323413372,
-0.03237377852201462,
0.15131133794784546,
0.0658203512430191,
-0.04140074923634529,
0.04338161274790764,
-0.056232426315546036,
-0.04585270211100578,
-0.07712192088365555,
0.048022761940956116,
-0.12877339124679565,
-0.025041067972779274,
0.036083608865737915,
0.02789430320262909,
0.09241718798875809,
0.062260303646326065,
0.06262824684381485,
0.06682630628347397,
0.023123791441321373,
-0.07398539036512375,
0.08345624059438705,
-0.01265852153301239,
-0.00027777632931247354,
-0.1220870316028595,
-0.020364027470350266,
-0.04252905398607254,
0.12288341671228409,
-0.05559350922703743,
0.03685220703482628,
0.02875380776822567,
0.014653552323579788,
-0.04112596437335014,
0.009711788967251778,
0.03571317344903946,
0.010411731898784637,
-0.023501437157392502,
-0.018521428108215332,
0.038726307451725006,
0.0479985736310482,
-0.04427061975002289,
0.16803167760372162,
-0.10285404324531555,
-0.1742323338985443,
0.10650300979614258,
-0.03490440174937248,
-0.050716012716293335,
-0.03740767762064934,
-0.04998001828789711,
-0.03439118340611458,
-0.02800896018743515,
-0.04440142214298248,
0.11473984271287918,
0.041119202971458435,
0.05416117236018181,
-0.08741629123687744,
-0.04763815179467201,
0.003881093580275774,
-0.08010204136371613,
-0.07601333409547806,
0.061059970408678055,
-0.057451408356428146,
-0.19816863536834717,
0.06454593688249588,
-0.03973523527383804,
-0.025678526610136032,
0.2750317454338074,
0.029037712141871452,
-0.06404253095388412,
-0.00588799174875021,
0.10402917116880417,
0.0006596355815418065,
0.04144512116909027,
-0.03968041017651558,
-0.062356121838092804,
0.014052923768758774,
0.010316583327949047,
0.020878858864307404,
-0.04220011085271835,
0.049155715852975845,
-0.0195582527667284,
-0.08567080646753311,
-0.01096169650554657,
0.016306936740875244,
-0.00008481572149321437,
0.051363591104745865,
-0.03694191575050354,
0.039391934871673584,
0.005705453921109438,
-0.06044791638851166,
-0.13821306824684143,
0.08499445766210556,
-0.12866051495075226,
-0.2281433641910553,
-0.1811414361000061,
-0.06937023997306824,
-0.14040230214595795,
0.03756982460618019,
0.08294837921857834,
-0.0986223965883255,
-0.017707550898194313,
-0.04669978469610214,
0.08378002047538757,
0.01345834881067276,
-0.09910072386264801,
-0.08302360773086548,
0.04149484634399414,
0.011236617341637611,
-0.07348418980836868,
-0.031732749193906784,
-0.027293194085359573,
-0.009657403454184532,
0.05076144263148308,
-0.035774264484643936,
0.06559593975543976,
0.07318665087223053,
0.00047382444608956575,
0.01830212213099003,
-0.02362312376499176,
0.1989840418100357,
-0.06716066598892212,
0.10319184511899948,
0.12244244664907455,
-0.01522299274802208,
0.03012973442673683,
0.16175511479377747,
0.0051379879005253315,
-0.015823950991034508,
0.005949107930064201,
-0.05729936063289642,
-0.046081677079200745,
-0.19295117259025574,
-0.07931898534297943,
-0.057429514825344086,
-0.02564600296318531,
0.05667153373360634,
0.05153917893767357,
0.038952864706516266,
0.047715116292238235,
-0.0876789391040802,
-0.017731981351971626,
0.10325154662132263,
0.13180053234100342,
0.07137244194746017,
0.04079308733344078,
0.045296721160411835,
-0.03375401720404625,
-0.005509533453732729,
0.04660008102655411,
-0.0016493973089382052,
0.09997441619634628,
-0.041037846356630325,
0.1153365969657898,
0.0806821957230568,
0.02639852836728096,
0.05674513429403305,
0.1060396209359169,
0.0022126503754407167,
0.04259843751788139,
0.02657250128686428,
-0.10949023813009262,
-0.05332675203680992,
0.09872718155384064,
0.023044077679514885,
-0.035332899540662766,
0.03899375721812248,
0.022637322545051575,
0.07454140484333038,
0.29569411277770996,
-0.06415615230798721,
-0.1105116531252861,
-0.05573756620287895,
0.0071494304575026035,
-0.0228965412825346,
-0.07494278997182846,
-0.04127854108810425,
0.03411548584699631,
-0.10085079073905945,
0.14441704750061035,
0.022620024159550667,
0.04609513655304909,
-0.03288993239402771,
0.01498530525714159,
-0.006271069869399071,
0.06745842099189758,
0.027289237827062607,
0.08598603308200836,
-0.23152117431163788,
0.07596471160650253,
0.012458162382245064,
0.10580568760633469,
-0.07631989568471909,
0.04159514233469963,
0.006153722293674946,
0.044476933777332306,
0.12426310032606125,
0.04658244550228119,
-0.1054900735616684,
-0.030979709699749947,
-0.04471326619386673,
-0.03126601129770279,
0.05930137634277344,
0.02012708969414234,
0.05250248685479164,
0.03572898358106613,
-0.04345592483878136,
-0.10466771572828293,
0.007194158621132374,
-0.1120620146393776,
-0.10879325121641159,
0.03114914521574974,
-0.0006402652361430228,
0.10077515244483948,
0.00878559984266758,
-0.003965812269598246,
-0.10865587741136551,
0.20562927424907684,
-0.13363854587078094,
-0.14388394355773926,
-0.07828472554683685,
-0.06323431432247162,
0.06199142336845398,
-0.06649059057235718,
-0.022770414128899574,
-0.019124552607536316,
0.13092640042304993,
-0.09636951237916946,
-0.05629483237862587,
0.030626853927969933,
-0.0970836877822876,
-0.07567664235830307,
0.01607983186841011,
0.10767031461000443,
0.08080621063709259,
-0.00028601710801012814,
0.048043426126241684,
0.03225915879011154,
0.012212321162223816,
-0.09170573949813843,
-0.028695330023765564,
0.11069457978010178,
-0.03189203143119812,
0.039903223514556885,
-0.13120810687541962,
-0.16102991998195648,
-0.11108332127332687,
-0.06668619811534882,
0.09585662931203842,
0.22388944029808044,
-0.06380081921815872,
0.14516282081604004,
0.18687871098518372,
-0.08972331881523132,
-0.16736115515232086,
-0.07716028392314911,
0.0737297534942627,
0.038043007254600525,
-0.1341983675956726,
-0.15098854899406433,
0.09064503759145737,
0.032992564141750336,
0.024279160425066948,
0.020127980038523674,
-0.12261335551738739,
-0.10727319121360779,
0.03154720738530159,
0.019890613853931427,
0.09159358590841293,
-0.06481574475765228,
-0.040799904614686966,
-0.055971819907426834,
-0.013533647172152996,
0.05658348649740219,
-0.072749562561512,
0.0693015307188034,
0.07914260774850845,
-0.009634586982429028,
0.054533347487449646,
-0.06784497201442719,
0.10161013156175613,
0.07183302938938141,
-0.01813570037484169,
-0.07856893539428711,
0.08517547696828842,
0.04794176295399666,
-0.05855564773082733,
0.19736656546592712,
0.03668183833360672,
-0.0676998421549797,
-0.07061634212732315,
-0.0736747682094574,
-0.08854760974645615,
0.1010737344622612,
-0.03832493722438812,
-0.05004802346229553,
0.017661120742559433,
0.047672659158706665,
0.0971602126955986,
0.012979117222130299,
0.006787866819649935,
-0.14131377637386322,
-0.054969921708106995,
0.16381040215492249,
0.13508829474449158,
-0.08362845331430435,
0.03710745647549629,
-0.009776867926120758,
-0.018962537869811058,
0.039313532412052155,
0.06562796980142593,
0.027368685230612755,
0.04889656603336334,
0.002179085975512862,
0.07348426431417465,
0.004317477345466614,
-0.10847820341587067,
-0.033303454518318176,
0.05552259460091591,
-0.1404564082622528,
-0.022806577384471893,
-0.024036938324570656,
-0.01713409647345543,
0.10322491824626923,
-0.05459724739193916,
0.17840512096881866,
-0.04067572206258774,
-0.0009239722276106477,
0.00896506104618311,
0.017428502440452576,
-0.03570267930626869,
0.06901766359806061,
-0.025578340515494347,
0.03750171139836311,
-0.08622516691684723,
0.07666794955730438,
0.036764465272426605,
-0.07025681436061859,
0.01306704618036747,
0.18581174314022064,
-0.14713631570339203,
-0.0677202045917511,
-0.09527094662189484,
0.06729905307292938,
-0.0861663967370987,
-0.11205080896615982,
-0.030834805220365524,
-0.06443951278924942,
0.012770207598805428,
0.0681750699877739,
0.057057011872529984,
-0.01343248225748539,
0.025727245956659317,
-0.026373837143182755,
-0.037558093667030334,
0.05176806449890137,
0.04492165148258209,
0.010792055167257786,
-0.05155980959534645,
0.13244670629501343,
0.037224214524030685,
0.11785851418972015,
-0.036358967423439026,
-0.028825601562857628,
-0.029433496296405792,
0.01933601126074791,
-0.07929831743240356,
0.06621424853801727,
-0.11753034591674805,
-0.00909436959773302,
-0.013593305833637714,
0.015608994290232658,
0.013632504269480705,
0.03780549019575119,
-0.07496882975101471,
0.01138937659561634,
-0.06487498432397842,
0.040129732340574265,
-0.08809229731559753,
-0.04597945883870125,
-0.03294864296913147,
-0.08640145510435104,
0.07173309475183487,
0.10260432958602905,
-0.10148471593856812,
0.023012125864624977,
0.0071634757332503796,
0.022405028343200684,
0.019016582518815994,
0.013653747737407684,
0.016593316569924355,
-0.07488419115543365,
0.023286113515496254,
0.07180295884609222,
-0.023910220712423325,
-0.007933895103633404,
-0.012237569317221642,
-0.04117978364229202,
0.10546833276748657,
0.046106211841106415,
-0.03148879110813141,
-0.09889911860227585,
0.06711986660957336,
0.059481844305992126,
0.049461230635643005,
0.13288931548595428,
-0.11669372767210007,
0.05459674075245857,
-0.04004932940006256,
-0.015871252864599228,
0.025803115218877792,
-0.029652586206793785,
-0.02730436623096466,
-0.05893219634890556,
0.036964353173971176,
0.009395532310009003,
0.15469101071357727,
0.006353529170155525,
0.04177514463663101,
0.03017401322722435,
-0.08210974186658859,
-0.09566379338502884,
0.034888412803411484,
0.09505363553762436,
-0.023050004616379738,
0.04821295291185379,
0.0032667217310518026,
-0.01906677521765232,
-0.01910294219851494,
0.04020065814256668,
0.06742902100086212,
0.12109997868537903,
0.18558581173419952,
0.09829632937908173,
0.017497757449746132,
0.004220107104629278,
-0.073450468480587,
-0.039125390350818634,
-0.11518397182226181,
0.023511314764618874,
-0.056593939661979675,
0.15883532166481018,
0.12782937288284302,
-0.12778140604496002,
0.04052753373980522,
0.05068175122141838,
-0.05405963584780693,
-0.06858058273792267,
-0.1464184671640396,
-0.010726998560130596,
-0.0402001217007637,
0.010948182083666325,
-0.0749533399939537,
0.03578729182481766,
-0.04120960831642151,
0.07882244139909744,
-0.015897411853075027,
0.08077941834926605,
-0.005401428788900375,
-0.11878979951143265,
0.06390265375375748,
-0.04356332868337631,
0.07139895856380463,
0.021732263267040253,
-0.006976940669119358,
0.023583758622407913,
-0.003906794358044863,
0.015244588255882263,
0.10383672267198563,
0.07306316494941711,
-0.025303486734628677,
-0.0430634506046772,
-0.03634381666779518,
0.02298707701265812,
0.03453310206532478,
0.043007366359233856,
0.1516994833946228,
0.06568187475204468,
-0.08430961519479752,
-0.00977343413978815,
0.023646023124456406,
0.0025815549306571484,
-0.2738719582557678,
-0.10756445676088333,
0.17976857721805573,
0.026527058333158493,
0.053191523998975754,
-0.035612646490335464,
-0.07123217731714249,
-0.03930230811238289,
0.1664724498987198,
0.15234217047691345,
0.02635820023715496,
0.004026579204946756,
-0.04344407469034195,
0.013199974782764912,
0.010634898208081722,
0.07280787825584412,
-0.015946457162499428,
0.2633032202720642,
-0.05363669991493225,
0.14918914437294006,
-0.1209152340888977,
-0.006461745128035545,
-0.12953945994377136,
0.07796904444694519,
-0.0730571374297142,
-0.06600335240364075,
-0.04314543306827545,
0.10256415605545044,
-0.17470210790634155,
-0.087519571185112,
-0.028236573562026024,
0.048601191490888596,
-0.031105775386095047,
0.044874705374240875,
0.01338265836238861,
0.053412824869155884,
0.048281844705343246,
0.002966711763292551,
0.00469199800863862,
0.11209682375192642,
-0.004712373949587345,
-0.05427054688334465,
-0.002958687487989664,
0.054745908826589584,
-0.06754716485738754,
0.10233805328607559,
-0.03302944451570511,
0.08314723521471024,
0.06905091553926468,
0.025708140805363655,
-0.14040982723236084,
0.04350443556904793,
0.002653537318110466,
-0.13274100422859192,
0.056816376745700836,
0.11138539761304855,
-0.011246256530284882,
0.02708546258509159,
0.02356121689081192,
-0.07709578424692154,
0.0533151850104332,
0.1369265913963318,
0.0563356839120388,
-0.0763656347990036,
0.09749286621809006,
-0.06126164644956589,
0.10577541589736938,
0.12078497558832169,
0.02780771069228649,
-0.0001726485788822174,
-0.09613912552595139,
0.04377545043826103,
-0.03262384980916977,
-0.03451739251613617,
-0.0160616934299469,
-0.18083831667900085,
0.012236226350069046,
0.039891522377729416,
0.07944301515817642,
-0.1342376470565796,
-0.07186301052570343,
-0.008685516193509102,
-0.014517543837428093,
-0.05696305260062218,
0.09233905375003815,
0.00569520378485322,
0.038838859647512436,
-0.004522885195910931,
-0.09815167635679245,
0.05269310623407364,
0.10163377225399017,
-0.10601409524679184,
-0.09823206812143326
] |
null | null |
transformers
|
# XGLM-1.7B
XGLM-1.7B is a multilingual autoregressive language model (with 1.7 billion parameters) trained on a balanced corpus of a diverse set of languages totaling 500 billion sub-tokens. It was introduced in the paper [Few-shot Learning with Multilingual Language Models](https://arxiv.org/abs/2112.10668) by Xi Victoria Lin\*, Todor Mihaylov, Mikel Artetxe, Tianlu Wang, Shuohui Chen, Daniel Simig, Myle Ott, Naman Goyal, Shruti Bhosale, Jingfei Du, Ramakanth Pasunuru, Sam Shleifer, Punit Singh Koura, Vishrav Chaudhary, Brian O'Horo, Jeff Wang, Luke Zettlemoyer, Zornitsa Kozareva, Mona Diab, Veselin Stoyanov, Xian Li\* (\*Equal Contribution). The original implementation was released in [this repository](https://github.com/pytorch/fairseq/tree/main/examples/xglm).
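The model can also be used for plain autoregressive text generation through the standard `transformers` API. The snippet below is a minimal sketch; the prompt text and decoding settings are illustrative and not part of the original release.
```python
import torch
from transformers import XGLMTokenizer, XGLMForCausalLM

tokenizer = XGLMTokenizer.from_pretrained("facebook/xglm-1.7B")
model = XGLMForCausalLM.from_pretrained("facebook/xglm-1.7B")
model.eval()

# Illustrative prompt; any of the 30 training languages listed below can be used.
prompt = "La capitale de la France est"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    generated = model.generate(**inputs, max_new_tokens=20, do_sample=False)

print(tokenizer.decode(generated[0], skip_special_tokens=True))
```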
## Training Data Statistics
The training data statistics of XGLM-1.7B are shown in the table below.
| ISO-639-1 | family | name | # tokens | ratio | ratio w/ low-resource upsampling |
|:--------|:-----------------|:------------------------|-------------:|------------:|-------------:|
| en | Indo-European | English | 803526736124 | 0.489906 | 0.3259 |
| ru | Indo-European | Russian | 147791898098 | 0.0901079 | 0.0602 |
| zh | Sino-Tibetan | Chinese | 132770494630 | 0.0809494 | 0.0483 |
| de | Indo-European | German | 89223707856 | 0.0543992 | 0.0363 |
| es | Indo-European | Spanish | 87303083105 | 0.0532282 | 0.0353 |
| fr | Indo-European | French | 77419639775 | 0.0472023 | 0.0313 |
| ja | Japonic | Japanese | 66054364513 | 0.040273 | 0.0269 |
| it | Indo-European | Italian | 41930465338 | 0.0255648 | 0.0171 |
| pt | Indo-European | Portuguese | 36586032444 | 0.0223063 | 0.0297 |
| el | Indo-European | Greek (modern) | 28762166159 | 0.0175361 | 0.0233 |
| ko | Koreanic | Korean | 20002244535 | 0.0121953 | 0.0811 |
| fi | Uralic | Finnish | 16804309722 | 0.0102455 | 0.0681 |
| id | Austronesian | Indonesian | 15423541953 | 0.00940365 | 0.0125 |
| tr | Turkic | Turkish | 12413166065 | 0.00756824 | 0.0101 |
| ar | Afro-Asiatic | Arabic | 12248607345 | 0.00746791 | 0.0099 |
| vi | Austroasiatic | Vietnamese | 11199121869 | 0.00682804 | 0.0091 |
| th | Tai–Kadai | Thai | 10842172807 | 0.00661041 | 0.044 |
| bg | Indo-European | Bulgarian | 9703797869 | 0.00591635 | 0.0393 |
| ca | Indo-European | Catalan | 7075834775 | 0.0043141 | 0.0287 |
| hi | Indo-European | Hindi | 3448390110 | 0.00210246 | 0.014 |
| et | Uralic | Estonian | 3286873851 | 0.00200399 | 0.0133 |
| bn | Indo-European | Bengali, Bangla | 1627447450 | 0.000992245 | 0.0066 |
| ta | Dravidian | Tamil | 1476973397 | 0.000900502 | 0.006 |
| ur | Indo-European | Urdu | 1351891969 | 0.000824241 | 0.0055 |
| sw | Niger–Congo | Swahili | 907516139 | 0.000553307 | 0.0037 |
| te | Dravidian | Telugu | 689316485 | 0.000420272 | 0.0028 |
| eu | Language isolate | Basque | 105304423 | 6.42035e-05 | 0.0043 |
| my | Sino-Tibetan | Burmese | 101358331 | 6.17976e-05 | 0.003 |
| ht | Creole | Haitian, Haitian Creole | 86584697 | 5.27902e-05 | 0.0035 |
| qu | Quechuan | Quechua | 3236108 | 1.97304e-06 | 0.0001 |
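The last column reflects the up-sampling of low-resource languages used to balance the training distribution. The sketch below illustrates the general mechanism with temperature-based resampling; the exponent `alpha` is an assumption for illustration only, and the exact scheme is described in the paper.
```python
# Minimal sketch of temperature-based language resampling (illustrative, not the
# exact recipe used for XGLM). Probabilities proportional to (n_l / N) ** alpha
# boost low-resource languages relative to their raw token share.
raw_tokens = {
    "en": 803_526_736_124,
    "ru": 147_791_898_098,
    "sw": 907_516_139,
    "qu": 3_236_108,
    # ... remaining languages from the table above
}

alpha = 0.3  # hypothetical smoothing exponent
total = sum(raw_tokens.values())  # total over the subset included here, so shares differ from the full table
weights = {lang: (n / total) ** alpha for lang, n in raw_tokens.items()}
norm = sum(weights.values())

for lang, w in weights.items():
    print(f"{lang}: raw={raw_tokens[lang] / total:.4%}  resampled={w / norm:.4%}")
```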
## Model card
For intended usage of the model, please refer to the [model card](https://github.com/pytorch/fairseq/blob/main/examples/xglm/model_card.md) released by the XGLM-1.7B development team.
## Example (COPA)
The following snippet shows how to evaluate our models (GPT-3 style, zero-shot) on the Choice of Plausible Alternatives (COPA) task, using examples in English, Chinese, and Haitian Creole.
```python
import torch
import torch.nn.functional as F
from transformers import XGLMTokenizer, XGLMForCausalLM

tokenizer = XGLMTokenizer.from_pretrained("facebook/xglm-1.7B")
model = XGLMForCausalLM.from_pretrained("facebook/xglm-1.7B")

# COPA examples in English, Chinese and Haitian Creole.
data_samples = {
    'en': [
        {
            "premise": "I wanted to conserve energy.",
            "choice1": "I swept the floor in the unoccupied room.",
            "choice2": "I shut off the light in the unoccupied room.",
            "question": "effect",
            "label": "1"
        },
        {
            "premise": "The flame on the candle went out.",
            "choice1": "I blew on the wick.",
            "choice2": "I put a match to the wick.",
            "question": "cause",
            "label": "0"
        }
    ],
    'zh': [
        {
            "premise": "我想节约能源。",
            "choice1": "我在空着的房间里扫了地板。",
            "choice2": "我把空房间里的灯关了。",
            "question": "effect",
            "label": "1"
        },
        {
            "premise": "蜡烛上的火焰熄灭了。",
            "choice1": "我吹灭了灯芯。",
            "choice2": "我把一根火柴放在灯芯上。",
            "question": "cause",
            "label": "0"
        }
    ],
    'ht': [
        {
            "premise": "M te vle konsève enèji.",
            "choice1": "Mwen te fin baleye chanm lib la.",
            "choice2": "Mwen te femen limyè nan chanm lib la.",
            "question": "effect",
            "label": "1"
        },
        {
            "premise": "Flam bouji a te etenn.",
            "choice1": "Mwen te soufle bouji a.",
            "choice2": "Mwen te limen mèch bouji a.",
            "question": "cause",
            "label": "0"
        }
    ]
}

def get_logprobs(prompt):
    inputs = tokenizer(prompt, return_tensors="pt")
    # Shift the targets by one position: the logits at position i score the token at position i + 1.
    input_ids, output_ids = inputs["input_ids"], inputs["input_ids"][:, 1:]
    outputs = model(**inputs, labels=input_ids)
    logits = outputs.logits
    logprobs = torch.gather(F.log_softmax(logits, dim=2), 2, output_ids.unsqueeze(2))
    return logprobs

# Zero-shot evaluation for the Choice of Plausible Alternatives (COPA) task.
# A return value of 0 indicates that the first alternative is more plausible,
# while 1 indicates that the second alternative is more plausible.
def COPA_eval(prompt, alternative1, alternative2):
    lprob1 = get_logprobs(prompt + "\n" + alternative1).sum()
    lprob2 = get_logprobs(prompt + "\n" + alternative2).sum()
    return 0 if lprob1 > lprob2 else 1

for lang in data_samples:
    for idx, example in enumerate(data_samples[lang]):
        predict = COPA_eval(example["premise"], example["choice1"], example["choice2"])
        print(f'{lang}-{idx}', predict, example['label'])

# en-0 1 1
# en-1 0 0
# zh-0 1 1
# zh-1 0 0
# ht-0 1 1
# ht-1 0 0
```
|
{"language": ["multilingual", "en", "ru", "zh", "de", "es", "fr", "ja", "it", "pt", "el", "ko", "fi", "id", "tr", "ar", "vi", "th", "bg", "ca", "hi", "et", "bn", "ta", "ur", "sw", "te", "eu", "my", "ht", "qu"], "license": "mit", "thumbnail": "https://huggingface.co/front/thumbnails/facebook.png", "inference": false}
|
text-generation
|
facebook/xglm-1.7B
|
[
"transformers",
"pytorch",
"tf",
"xglm",
"text-generation",
"multilingual",
"en",
"ru",
"zh",
"de",
"es",
"fr",
"ja",
"it",
"pt",
"el",
"ko",
"fi",
"id",
"tr",
"ar",
"vi",
"th",
"bg",
"ca",
"hi",
"et",
"bn",
"ta",
"ur",
"sw",
"te",
"eu",
"my",
"ht",
"qu",
"arxiv:2112.10668",
"license:mit",
"autotrain_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2112.10668"
] |
[
"multilingual",
"en",
"ru",
"zh",
"de",
"es",
"fr",
"ja",
"it",
"pt",
"el",
"ko",
"fi",
"id",
"tr",
"ar",
"vi",
"th",
"bg",
"ca",
"hi",
"et",
"bn",
"ta",
"ur",
"sw",
"te",
"eu",
"my",
"ht",
"qu"
] |
TAGS
#transformers #pytorch #tf #xglm #text-generation #multilingual #en #ru #zh #de #es #fr #ja #it #pt #el #ko #fi #id #tr #ar #vi #th #bg #ca #hi #et #bn #ta #ur #sw #te #eu #my #ht #qu #arxiv-2112.10668 #license-mit #autotrain_compatible #has_space #region-us
|
XGLM-1.7B
=========
XGLM-1.7B is a multilingual autoregressive language model (with 1.7 billion parameters) trained on a balanced corpus of a diverse set of languages totaling 500 billion sub-tokens. It was introduced in the paper Few-shot Learning with Multilingual Language Models by Xi Victoria Lin\*, Todor Mihaylov, Mikel Artetxe, Tianlu Wang, Shuohui Chen, Daniel Simig, Myle Ott, Naman Goyal, Shruti Bhosale, Jingfei Du, Ramakanth Pasunuru, Sam Shleifer, Punit Singh Koura, Vishrav Chaudhary, Brian O'Horo, Jeff Wang, Luke Zettlemoyer, Zornitsa Kozareva, Mona Diab, Veselin Stoyanov, Xian Li\* (\*Equal Contribution). The original implementation was released in this repository.
Training Data Statistics
------------------------
The training data statistics of XGLM-1.7B are shown in the table below.
Model card
----------
For intended usage of the model, please refer to the model card released by the XGLM-1.7B development team.
Example (COPA)
--------------
The following snippet shows how to evaluate our models (GPT-3 style, zero-shot) on the Choice of Plausible Alternatives (COPA) task, using examples in English, Chinese, and Haitian Creole.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #xglm #text-generation #multilingual #en #ru #zh #de #es #fr #ja #it #pt #el #ko #fi #id #tr #ar #vi #th #bg #ca #hi #et #bn #ta #ur #sw #te #eu #my #ht #qu #arxiv-2112.10668 #license-mit #autotrain_compatible #has_space #region-us \n"
] |
[
116
] |
[
"passage: TAGS\n#transformers #pytorch #tf #xglm #text-generation #multilingual #en #ru #zh #de #es #fr #ja #it #pt #el #ko #fi #id #tr #ar #vi #th #bg #ca #hi #et #bn #ta #ur #sw #te #eu #my #ht #qu #arxiv-2112.10668 #license-mit #autotrain_compatible #has_space #region-us \n"
] |
[
-0.021077703684568405,
-0.08647020161151886,
-0.008638527244329453,
0.054381608963012695,
0.0761827901005745,
0.02111450582742691,
0.10445526242256165,
0.07581356912851334,
0.10637114942073822,
0.02193765714764595,
0.11966050416231155,
0.07518605142831802,
0.011234351433813572,
0.07361283898353577,
0.009747390635311604,
-0.27382388710975647,
0.0028755031526088715,
-0.0053015523590147495,
-0.05157127231359482,
0.0961705893278122,
0.09242325276136398,
-0.026941265910863876,
0.1058797687292099,
-0.05131691321730614,
-0.02264511212706566,
0.054014548659324646,
-0.02065850980579853,
-0.021178990602493286,
0.11899805814027786,
0.10804332047700882,
0.045004237443208694,
0.06496842205524445,
-0.011539810337126255,
-0.16245564818382263,
0.038447096943855286,
-0.03393638879060745,
-0.13309180736541748,
-0.00007198999082902446,
0.03879633918404579,
-0.13985468447208405,
0.19244085252285004,
0.00906700361520052,
-0.11990104615688324,
0.05255252495408058,
-0.16523994505405426,
-0.15145614743232727,
-0.06356882303953171,
0.1364603191614151,
-0.0484461709856987,
0.06226811930537224,
-0.033015504479408264,
0.10257496684789658,
-0.10441100597381592,
0.07813199609518051,
0.2049301415681839,
-0.29914048314094543,
-0.040857668966054916,
0.06571070849895477,
0.11218615621328354,
0.1547595113515854,
-0.10621711611747742,
0.09964276105165482,
0.06055803224444389,
0.001230039750225842,
-0.040297962725162506,
-0.0990334004163742,
0.006035502068698406,
0.05879858508706093,
-0.1150192990899086,
-0.006224879529327154,
0.22819986939430237,
-0.0109507841989398,
0.045010264962911606,
0.1340445727109909,
-0.062314510345458984,
-0.1125190407037735,
0.007288266438990831,
-0.01036106888204813,
-0.013694335706532001,
0.03076213225722313,
0.05414558947086334,
-0.05321487411856651,
-0.12743641436100006,
0.0501336008310318,
-0.17794664204120636,
0.21771952509880066,
0.015817783772945404,
-0.012167809531092644,
-0.04651546850800514,
0.046265795826911926,
-0.04824801906943321,
-0.112579844892025,
0.012481356039643288,
-0.05075937882065773,
0.06189218908548355,
0.06072552129626274,
0.01840187795460224,
0.017544198781251907,
0.06970476359128952,
0.09966696798801422,
-0.09684046357870102,
0.044751204550266266,
-0.0015302769606932998,
0.14729976654052734,
0.017108913511037827,
0.04321252554655075,
-0.03583919256925583,
-0.09774701297283173,
-0.06446003168821335,
-0.05654001235961914,
0.00949152372777462,
-0.02972385846078396,
-0.18975169956684113,
-0.05511046573519707,
-0.014924482442438602,
0.037096403539180756,
-0.004376878030598164,
0.07187948375940323,
-0.014117015525698662,
0.04626649618148804,
-0.008489856496453285,
-0.02191147580742836,
0.015946675091981888,
0.019643014296889305,
0.028827209025621414,
0.09459855407476425,
-0.0405716635286808,
-0.018786825239658356,
-0.03834063187241554,
0.046430427581071854,
-0.06371225416660309,
0.06998296827077866,
-0.007828457280993462,
-0.10014212876558304,
0.03380680829286575,
-0.08243468403816223,
0.02729901112616062,
-0.1847134530544281,
0.0177287720143795,
-0.007706641219556332,
-0.013369172811508179,
-0.04867152124643326,
0.029118111357092857,
-0.03734942153096199,
-0.08442867547273636,
0.060735203325748444,
-0.03287503495812416,
-0.027872582897543907,
-0.09482777863740921,
0.10722023993730545,
-0.04939354956150055,
0.08700690418481827,
-0.1587800532579422,
0.037759795784950256,
-0.013114403001964092,
0.027682363986968994,
-0.05654476583003998,
-0.008725597523152828,
-0.06331968307495117,
0.021602528169751167,
-0.006863632705062628,
-0.0638057217001915,
-0.0695672556757927,
0.06594692170619965,
-0.027434324845671654,
0.15526413917541504,
-0.1761987954378128,
-0.10025916248559952,
0.19942976534366608,
-0.07192587107419968,
-0.0975676104426384,
0.14466221630573273,
0.04427091404795647,
0.022928042337298393,
0.005675165448337793,
0.2356800138950348,
0.01027057133615017,
-0.10776127874851227,
-0.0775005966424942,
0.11368532478809357,
-0.02660747803747654,
0.043692003935575485,
0.10239075124263763,
0.05126161500811577,
0.02204788476228714,
0.008491274900734425,
0.030492331832647324,
0.09360811859369278,
-0.04942415654659271,
-0.051849640905857086,
0.032792892307043076,
-0.0016699369298294187,
0.11963781714439392,
0.0294233039021492,
0.03967200592160225,
-0.09223772585391998,
-0.05302176624536514,
-0.04224703460931778,
0.09060050547122955,
0.04583895578980446,
0.04989887401461601,
-0.05249457433819771,
0.13389292359352112,
0.06165534630417824,
-0.00369828287512064,
-0.07938694208860397,
0.08503350615501404,
-0.0702170580625534,
0.1381450593471527,
0.10525738447904587,
0.21911808848381042,
0.08288167417049408,
-0.014699090272188187,
-0.09719429910182953,
-0.04040510952472687,
0.005758093670010567,
-0.008982871659100056,
-0.028763821348547935,
-0.1797376126050949,
0.09188305586576462,
-0.03713732585310936,
0.05168351158499718,
-0.15078970789909363,
0.025893334299325943,
0.15827642381191254,
0.10690037161111832,
-0.03459100425243378,
0.0845966562628746,
-0.08839157223701477,
0.05557738244533539,
-0.08418058604001999,
0.025556301698088646,
0.053706470876932144,
-0.03418565168976784,
-0.10620207339525223,
0.16881389915943146,
-0.11698736250400543,
0.25259163975715637,
0.18880027532577515,
-0.21559233963489532,
-0.03354882448911667,
0.02862909622490406,
-0.041688088327646255,
0.00435509392991662,
0.1377493441104889,
-0.04898730292916298,
0.029705438762903214,
-0.026628023013472557,
0.11110832542181015,
-0.06407739967107773,
-0.02833123691380024,
0.02258007600903511,
-0.07517137378454208,
-0.08935090899467468,
0.17828314006328583,
0.06360192596912384,
-0.16799135506153107,
0.2449401170015335,
0.3285822868347168,
0.02819323167204857,
0.27165380120277405,
0.0241137333214283,
0.010979965329170227,
-0.04309499263763428,
0.00615650974214077,
-0.03513344004750252,
0.1224934309720993,
-0.1534469872713089,
-0.021490486338734627,
-0.0044746981002390385,
-0.00281604309566319,
0.03013765625655651,
-0.0967055931687355,
-0.08544313907623291,
-0.0503423810005188,
-0.024983447045087814,
-0.0089530348777771,
0.10118358582258224,
-0.04884044826030731,
0.12056326121091843,
0.011852939613163471,
-0.09830757975578308,
0.001588578918017447,
0.01762642338871956,
-0.05261397734284401,
0.15696580708026886,
-0.15714971721172333,
-0.21523253619670868,
-0.027711082249879837,
-0.13661442697048187,
-0.04682005196809769,
-0.0017227366333827376,
0.07710511237382889,
-0.11283420026302338,
0.0057023195549845695,
0.013099039904773235,
0.11527473479509354,
-0.16144853830337524,
-0.015764502808451653,
-0.13091351091861725,
0.02019658125936985,
-0.07234479486942291,
-0.060151781886816025,
-0.06397876143455505,
-0.01507828664034605,
-0.07729760557413101,
0.1484110951423645,
-0.11270034313201904,
0.11348895728588104,
0.10749711096286774,
0.012152198702096939,
0.04484795406460762,
-0.05609704181551933,
0.17406174540519714,
-0.14729094505310059,
0.02365477941930294,
0.05618636682629585,
0.017566340044140816,
0.0946057066321373,
0.13878260552883148,
0.03700815513730049,
-0.022574763745069504,
-0.01679288223385811,
0.014678454957902431,
-0.04950615018606186,
-0.15795420110225677,
-0.11701034754514694,
-0.09824314713478088,
0.1107136458158493,
-0.06536988168954849,
0.09049021452665329,
0.08339903503656387,
0.020282555371522903,
-0.03753485158085823,
-0.0431564562022686,
-0.057702165096998215,
0.0356319397687912,
0.16204522550106049,
-0.07398489862680435,
0.08834762126207352,
-0.06245781481266022,
-0.05443356931209564,
0.1323276311159134,
0.10335037112236023,
-0.005407777614891529,
0.06043992191553116,
0.004803138319402933,
0.10422942042350769,
0.12064774334430695,
0.05979812517762184,
-0.01635965146124363,
0.034399863332509995,
-0.0605163611471653,
-0.02064690925180912,
-0.06394694000482559,
0.037445589900016785,
0.04220283776521683,
0.0927807167172432,
-0.06341931968927383,
-0.01164315640926361,
-0.11917989701032639,
0.12864483892917633,
-0.07163043320178986,
0.051150817424058914,
-0.03747592866420746,
0.002892519114539027,
0.07395553588867188,
0.03206474334001541,
-0.06458157300949097,
0.02585681900382042,
0.11099405586719513,
-0.11250700801610947,
0.08671583235263824,
0.05059026926755905,
0.06382491439580917,
0.010248745791614056,
0.09881927073001862,
-0.0946061983704567,
-0.0723404809832573,
-0.03669430688023567,
0.0807824358344078,
-0.34289419651031494,
0.28194549679756165,
0.02378745935857296,
-0.07884951680898666,
0.004750718362629414,
-0.08691509068012238,
0.05012359097599983,
0.222601518034935,
0.12073442339897156,
0.0862947404384613,
0.002050214447081089,
-0.14083147048950195,
0.06551217287778854,
-0.021207578480243683,
0.14670871198177338,
-0.03812144324183464,
0.004291863646358252,
0.0038633381482213736,
0.00318690063431859,
-0.0377960242331028,
0.13289476931095123,
-0.06515619903802872,
-0.11384263634681702,
0.1075083464384079,
-0.013714605942368507,
0.024667911231517792,
-0.014280795119702816,
-0.08172573149204254,
-0.18418389558792114,
0.03584052622318268,
-0.12622329592704773,
-0.0037434687837958336,
-0.07318267226219177,
-0.08304516226053238,
0.013198711909353733,
-0.11801806837320328,
-0.036298368126153946,
-0.028118785470724106,
-0.08584772050380707,
-0.12845273315906525,
-0.027228079736232758,
0.09792478382587433,
-0.06265962868928909,
-0.08575263619422913,
-0.008722609840333462,
0.13691195845603943,
0.022021563723683357,
0.09355100244283676,
-0.05445722118020058,
-0.007662053685635328,
-0.07979816943407059,
-0.12699463963508606,
0.06480962783098221,
-0.08939787745475769,
-0.02471604384481907,
0.014862372539937496,
-0.06913700699806213,
-0.0322769396007061,
-0.04631129279732704,
-0.11491545289754868,
0.13846679031848907,
0.3368052840232849,
-0.04625765606760979,
0.10520168393850327,
0.13126932084560394,
-0.05566029250621796,
-0.31564828753471375,
-0.12038056552410126,
-0.134260356426239,
0.007149347569793463,
-0.0007996213971637189,
-0.13338252902030945,
-0.09708082675933838,
-0.009962118230760098,
-0.011662471108138561,
0.10344768315553665,
-0.26459038257598877,
-0.07477672398090363,
0.09739813953638077,
-0.0031654597260057926,
0.28994306921958923,
-0.1289563626050949,
-0.023600971326231956,
-0.03722766414284706,
-0.06233576685190201,
0.04290361329913139,
-0.08369139581918716,
0.10695452988147736,
-0.04911781847476959,
0.023682748898863792,
-0.011142532341182232,
0.01694054715335369,
0.15858608484268188,
-0.012226587161421776,
-0.007382683921605349,
-0.13310255110263824,
-0.16355301439762115,
0.10056838393211365,
0.022703280672430992,
-0.06959296762943268,
-0.17610417306423187,
-0.06876739859580994,
-0.12082839012145996,
0.022275805473327637,
-0.12334666401147842,
0.0902608186006546,
-0.053485628217458725,
-0.09815836697816849,
-0.10433748364448547,
0.0644790306687355,
-0.03678034245967865,
-0.009077380411326885,
0.18379242718219757,
-0.06553009152412415,
0.14308618009090424,
0.07316110283136368,
0.08957726508378983,
-0.08185525983572006,
0.04707583039999008,
-0.0439862497150898,
-0.0378684401512146,
0.03409425914287567,
-0.10230546444654465,
-0.020422454923391342,
0.12418004870414734,
-0.031530097126960754,
0.07733824849128723,
0.06071780249476433,
-0.10505055636167526,
0.021587444469332695,
0.13244563341140747,
-0.15006451308727264,
-0.19778546690940857,
-0.0734427347779274,
-0.06900697201490402,
0.11968561261892319,
0.032671116292476654,
0.15546128153800964,
-0.03117428347468376,
0.006718778051435947,
0.011613566428422928,
-0.032091446220874786,
-0.03221549093723297,
0.02988799475133419,
0.08970902115106583,
0.0030281913932412863,
-0.06200448051095009,
0.02037264220416546,
0.012974554672837257,
-0.11357967555522919,
-0.014454714022576809,
0.22126436233520508,
-0.07403596490621567,
-0.1504201591014862,
-0.061737753450870514,
0.09569339454174042,
-0.13432927429676056,
-0.034869614988565445,
-0.027323607355356216,
-0.11007630079984665,
0.052844371646642685,
0.25135472416877747,
0.07133063673973083,
0.040243230760097504,
0.016689429059624672,
-0.0027764111291617155,
0.11078377813100815,
0.05923434719443321,
-0.012665614485740662,
0.0015681447694078088,
-0.074038065969944,
0.09499238431453705,
0.0036793078761547804,
0.19931937754154205,
-0.049601610749959946,
-0.051564980298280716,
-0.15585508942604065,
0.009195808321237564,
-0.0786099061369896,
-0.06140999495983124,
-0.07441195100545883,
-0.04807396978139877,
-0.031213779002428055,
-0.12408003211021423,
-0.06221996992826462,
-0.061421655118465424,
-0.10932879894971848,
0.016782140359282494,
0.03667069226503372,
0.11280064284801483,
-0.052310217171907425,
-0.006640233099460602,
0.10115451365709305,
-0.004676357842981815,
0.11106844991445541,
0.10067523270845413,
-0.006200761068612337,
0.11165101826190948,
-0.15625494718551636,
0.0038919509388506413,
0.03500503674149513,
0.033328328281641006,
0.029989134520292282,
0.1151876375079155,
-0.05200899392366409,
-0.04370766505599022,
0.07198967039585114,
0.07665148377418518,
0.04044419527053833,
-0.03149016574025154,
0.13913090527057648,
-0.04195011779665947,
-0.1286436766386032,
-0.02804434299468994,
0.08796510845422745,
0.03520507737994194,
0.017239265143871307,
0.04579295963048935,
-0.06945693492889404,
0.013695722445845604,
-0.05324910953640938,
0.04297230392694473,
0.01489725336432457,
-0.17088045179843903,
-0.010211769491434097,
-0.10300387442111969,
0.009234107099473476,
-0.02558850310742855,
0.08372821658849716,
0.023191753774881363,
-0.05478506535291672,
0.02575424313545227,
0.028258947655558586,
-0.09263580292463303,
-0.014676587656140327,
0.06689358502626419,
0.025683501735329628,
-0.05242329090833664,
-0.16761359572410583,
0.03458936884999275,
-0.011449415236711502,
0.012657771818339825,
0.09496763348579407,
0.08723828196525574,
0.10888747870922089,
0.057096920907497406,
-0.008622417226433754,
-0.0659041777253151,
-0.0748385637998581,
-0.11782589554786682,
0.01741006411612034,
-0.028848381713032722,
-0.08833015710115433,
0.16941769421100616,
0.2129417061805725,
-0.05672015622258186,
0.039297036826610565,
-0.02235519140958786,
-0.011892586946487427,
-0.1125214546918869,
-0.09420233219861984,
0.001030655694194138,
-0.09244435280561447,
-0.020708119496703148,
-0.045243509113788605,
0.06750643253326416,
0.012964545749127865,
0.07824751734733582,
-0.033376242965459824,
0.06634699553251266,
0.019536593928933144,
-0.126076340675354,
0.02515471912920475,
-0.01039188727736473,
0.0754055306315422,
-0.12083887308835983,
0.023069050163030624,
-0.07051311433315277,
-0.07743334025144577,
-0.035981371998786926,
0.0529375858604908,
-0.03818738833069801,
-0.0628536120057106,
-0.12415584921836853,
-0.06635390222072601,
-0.02486995980143547,
0.07578074187040329,
0.027141807600855827,
0.1602202206850052,
0.012614842504262924,
-0.010889437980949879,
0.0013945179525762796,
0.19030451774597168,
-0.02065979316830635,
-0.06566990166902542,
0.04469359666109085,
0.13507530093193054,
0.013377618975937366,
0.10391362011432648,
-0.06707999110221863,
0.0013922142097726464,
-0.04729008302092552,
0.22877314686775208,
0.32063907384872437,
-0.1063365712761879,
0.07615022361278534,
0.044211890548467636,
0.06667669862508774,
0.08094911277294159,
-0.000001248220655725163,
0.08536088466644287,
0.21668578684329987,
-0.10287478566169739,
0.030564529821276665,
-0.10342889279127121,
0.039264287799596786,
-0.033496029675006866,
0.08775350451469421,
0.027614589780569077,
-0.044471848756074905,
-0.04340978339314461,
0.06283750385046005,
-0.15833574533462524,
0.026517873629927635,
-0.04084085673093796,
-0.21025019884109497,
-0.043505556881427765,
0.006965681444853544,
0.11847934871912003,
0.09164498001337051,
0.06487344205379486,
-0.01695699617266655,
-0.06480567902326584,
-0.021457232534885406,
0.04461009427905083,
-0.16530579328536987,
-0.0029749281238764524,
0.07626184076070786,
-0.07891006767749786,
0.06273482739925385,
-0.05927244946360588,
-0.03393690288066864,
0.12501607835292816,
0.0312943160533905,
0.040397610515356064,
0.0659472718834877,
0.08458288758993149,
0.022421926259994507,
-0.08695397526025772,
0.019220920279622078,
0.05259772762656212,
-0.048757728189229965,
0.10853663831949234,
-0.03648676723241806,
0.09121563285589218,
0.07960829138755798,
-0.06600513309240341,
0.017749706283211708,
0.13915055990219116,
-0.08391523361206055,
0.04997771978378296,
0.08738276362419128,
0.031309835612773895,
-0.05471610277891159,
-0.055861830711364746,
-0.08210048079490662,
0.02424595318734646,
-0.011296415701508522,
-0.03334980830550194,
-0.004065347835421562,
-0.048491138964891434,
0.01221686415374279,
0.04858465492725372,
-0.11009740829467773,
-0.03894849494099617,
-0.046949874609708786,
0.07369235157966614,
-0.09855179488658905,
0.07470574229955673,
0.07246504724025726,
-0.008604598231613636,
0.008048875257372856,
-0.13763289153575897,
0.03214281052350998,
0.06248623877763748,
-0.06335343420505524,
-0.02446667291224003
] |
null | null |
transformers
|
# XGLM-2.9B
XGLM-2.9B is a multilingual autoregressive language model (with 2.9 billion parameters) trained on a balanced corpus of a diverse set of languages totaling 500 billion sub-tokens. It was introduced in the paper [Few-shot Learning with Multilingual Language Models](https://arxiv.org/abs/2112.10668) by Xi Victoria Lin\*, Todor Mihaylov, Mikel Artetxe, Tianlu Wang, Shuohui Chen, Daniel Simig, Myle Ott, Naman Goyal, Shruti Bhosale, Jingfei Du, Ramakanth Pasunuru, Sam Shleifer, Punit Singh Koura, Vishrav Chaudhary, Brian O'Horo, Jeff Wang, Luke Zettlemoyer, Zornitsa Kozareva, Mona Diab, Veselin Stoyanov, Xian Li\* (\*Equal Contribution). The original implementation was released in [this repository](https://github.com/pytorch/fairseq/tree/main/examples/xglm).
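At 2.9 billion parameters this checkpoint is substantially heavier than the smaller XGLM variants. One common way to reduce memory usage is to load the weights in half precision; the snippet below is a minimal sketch (the `torch_dtype` argument is standard in `from_pretrained`, and moving the model to GPU is optional).
```python
import torch
from transformers import XGLMTokenizer, XGLMForCausalLM

tokenizer = XGLMTokenizer.from_pretrained("facebook/xglm-2.9B")

# Load the weights in fp16 to roughly halve memory use; keep the default fp32 on CPU-only setups.
model = XGLMForCausalLM.from_pretrained("facebook/xglm-2.9B", torch_dtype=torch.float16)
if torch.cuda.is_available():
    model = model.to("cuda")
model.eval()
```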
## Training Data Statistics
The training data statistics of XGLM-2.9B are shown in the table below.
| ISO-639-1 | family | name | # tokens | ratio | ratio w/ low-resource upsampling |
|:--------|:-----------------|:------------------------|-------------:|------------:|-------------:|
| en | Indo-European | English | 803526736124 | 0.489906 | 0.3259 |
| ru | Indo-European | Russian | 147791898098 | 0.0901079 | 0.0602 |
| zh | Sino-Tibetan | Chinese | 132770494630 | 0.0809494 | 0.0483 |
| de | Indo-European | German | 89223707856 | 0.0543992 | 0.0363 |
| es | Indo-European | Spanish | 87303083105 | 0.0532282 | 0.0353 |
| fr | Indo-European | French | 77419639775 | 0.0472023 | 0.0313 |
| ja | Japonic | Japanese | 66054364513 | 0.040273 | 0.0269 |
| it | Indo-European | Italian | 41930465338 | 0.0255648 | 0.0171 |
| pt | Indo-European | Portuguese | 36586032444 | 0.0223063 | 0.0297 |
| el | Indo-European | Greek (modern) | 28762166159 | 0.0175361 | 0.0233 |
| ko | Koreanic | Korean | 20002244535 | 0.0121953 | 0.0811 |
| fi | Uralic | Finnish | 16804309722 | 0.0102455 | 0.0681 |
| id | Austronesian | Indonesian | 15423541953 | 0.00940365 | 0.0125 |
| tr | Turkic | Turkish | 12413166065 | 0.00756824 | 0.0101 |
| ar | Afro-Asiatic | Arabic | 12248607345 | 0.00746791 | 0.0099 |
| vi | Austroasiatic | Vietnamese | 11199121869 | 0.00682804 | 0.0091 |
| th | Tai–Kadai | Thai | 10842172807 | 0.00661041 | 0.044 |
| bg | Indo-European | Bulgarian | 9703797869 | 0.00591635 | 0.0393 |
| ca | Indo-European | Catalan | 7075834775 | 0.0043141 | 0.0287 |
| hi | Indo-European | Hindi | 3448390110 | 0.00210246 | 0.014 |
| et | Uralic | Estonian | 3286873851 | 0.00200399 | 0.0133 |
| bn | Indo-European | Bengali, Bangla | 1627447450 | 0.000992245 | 0.0066 |
| ta | Dravidian | Tamil | 1476973397 | 0.000900502 | 0.006 |
| ur | Indo-European | Urdu | 1351891969 | 0.000824241 | 0.0055 |
| sw | Niger–Congo | Swahili | 907516139 | 0.000553307 | 0.0037 |
| te | Dravidian | Telugu | 689316485 | 0.000420272 | 0.0028 |
| eu | Language isolate | Basque | 105304423 | 6.42035e-05 | 0.0043 |
| my | Sino-Tibetan | Burmese | 101358331 | 6.17976e-05 | 0.003 |
| ht | Creole | Haitian, Haitian Creole | 86584697 | 5.27902e-05 | 0.0035 |
| qu | Quechuan | Quechua | 3236108 | 1.97304e-06 | 0.0001 |
## Model card
For intended usage of the model, please refer to the [model card](https://github.com/pytorch/fairseq/blob/main/examples/xglm/model_card.md) released by the XGLM-2.9B development team.
## Example (COPA)
The following snippet shows how to evaluate our models (GPT-3 style, zero-shot) on the Choice of Plausible Alternatives (COPA) task, using examples in English, Chinese, and Haitian Creole.
```python
import torch
import torch.nn.functional as F
from transformers import XGLMTokenizer, XGLMForCausalLM

tokenizer = XGLMTokenizer.from_pretrained("facebook/xglm-2.9B")
model = XGLMForCausalLM.from_pretrained("facebook/xglm-2.9B")

# COPA examples in English, Chinese and Haitian Creole.
data_samples = {
    'en': [
        {
            "premise": "I wanted to conserve energy.",
            "choice1": "I swept the floor in the unoccupied room.",
            "choice2": "I shut off the light in the unoccupied room.",
            "question": "effect",
            "label": "1"
        },
        {
            "premise": "The flame on the candle went out.",
            "choice1": "I blew on the wick.",
            "choice2": "I put a match to the wick.",
            "question": "cause",
            "label": "0"
        }
    ],
    'zh': [
        {
            "premise": "我想节约能源。",
            "choice1": "我在空着的房间里扫了地板。",
            "choice2": "我把空房间里的灯关了。",
            "question": "effect",
            "label": "1"
        },
        {
            "premise": "蜡烛上的火焰熄灭了。",
            "choice1": "我吹灭了灯芯。",
            "choice2": "我把一根火柴放在灯芯上。",
            "question": "cause",
            "label": "0"
        }
    ],
    'ht': [
        {
            "premise": "M te vle konsève enèji.",
            "choice1": "Mwen te fin baleye chanm lib la.",
            "choice2": "Mwen te femen limyè nan chanm lib la.",
            "question": "effect",
            "label": "1"
        },
        {
            "premise": "Flam bouji a te etenn.",
            "choice1": "Mwen te soufle bouji a.",
            "choice2": "Mwen te limen mèch bouji a.",
            "question": "cause",
            "label": "0"
        }
    ]
}

def get_logprobs(prompt):
    inputs = tokenizer(prompt, return_tensors="pt")
    # Shift the targets by one position: the logits at position i score the token at position i + 1.
    input_ids, output_ids = inputs["input_ids"], inputs["input_ids"][:, 1:]
    outputs = model(**inputs, labels=input_ids)
    logits = outputs.logits
    logprobs = torch.gather(F.log_softmax(logits, dim=2), 2, output_ids.unsqueeze(2))
    return logprobs

# Zero-shot evaluation for the Choice of Plausible Alternatives (COPA) task.
# A return value of 0 indicates that the first alternative is more plausible,
# while 1 indicates that the second alternative is more plausible.
def COPA_eval(prompt, alternative1, alternative2):
    lprob1 = get_logprobs(prompt + "\n" + alternative1).sum()
    lprob2 = get_logprobs(prompt + "\n" + alternative2).sum()
    return 0 if lprob1 > lprob2 else 1

for lang in data_samples:
    for idx, example in enumerate(data_samples[lang]):
        predict = COPA_eval(example["premise"], example["choice1"], example["choice2"])
        print(f'{lang}-{idx}', predict, example['label'])

# en-0 1 1
# en-1 0 0
# zh-0 1 1
# zh-1 0 0
# ht-0 1 1
# ht-1 0 0
```
|
{"language": ["multilingual", "en", "ru", "zh", "de", "es", "fr", "ja", "it", "pt", "el", "ko", "fi", "id", "tr", "ar", "vi", "th", "bg", "ca", "hi", "et", "bn", "ta", "ur", "sw", "te", "eu", "my", "ht", "qu"], "license": "mit", "thumbnail": "https://huggingface.co/front/thumbnails/facebook.png", "inference": false}
|
text-generation
|
facebook/xglm-2.9B
|
[
"transformers",
"pytorch",
"xglm",
"text-generation",
"multilingual",
"en",
"ru",
"zh",
"de",
"es",
"fr",
"ja",
"it",
"pt",
"el",
"ko",
"fi",
"id",
"tr",
"ar",
"vi",
"th",
"bg",
"ca",
"hi",
"et",
"bn",
"ta",
"ur",
"sw",
"te",
"eu",
"my",
"ht",
"qu",
"arxiv:2112.10668",
"license:mit",
"autotrain_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2112.10668"
] |
[
"multilingual",
"en",
"ru",
"zh",
"de",
"es",
"fr",
"ja",
"it",
"pt",
"el",
"ko",
"fi",
"id",
"tr",
"ar",
"vi",
"th",
"bg",
"ca",
"hi",
"et",
"bn",
"ta",
"ur",
"sw",
"te",
"eu",
"my",
"ht",
"qu"
] |
TAGS
#transformers #pytorch #xglm #text-generation #multilingual #en #ru #zh #de #es #fr #ja #it #pt #el #ko #fi #id #tr #ar #vi #th #bg #ca #hi #et #bn #ta #ur #sw #te #eu #my #ht #qu #arxiv-2112.10668 #license-mit #autotrain_compatible #has_space #region-us
|
XGLM-2.9B
=========
XGLM-2.9B is a multilingual autoregressive language model (with 2.9 billion parameters) trained on a balanced corpus of a diverse set of languages totaling 500 billion sub-tokens. It was introduced in the paper Few-shot Learning with Multilingual Language Models by Xi Victoria Lin\*, Todor Mihaylov, Mikel Artetxe, Tianlu Wang, Shuohui Chen, Daniel Simig, Myle Ott, Naman Goyal, Shruti Bhosale, Jingfei Du, Ramakanth Pasunuru, Sam Shleifer, Punit Singh Koura, Vishrav Chaudhary, Brian O'Horo, Jeff Wang, Luke Zettlemoyer, Zornitsa Kozareva, Mona Diab, Veselin Stoyanov, Xian Li\* (\*Equal Contribution). The original implementation was released in this repository.
Training Data Statistics
------------------------
The training data statistics of XGLM-2.9B are shown in the table below.
Model card
----------
For intended usage of the model, please refer to the model card released by the XGLM-2.9B development team.
Example (COPA)
--------------
The following snippet shows how to evaluate our models (GPT-3 style, zero-shot) on the Choice of Plausible Alternatives (COPA) task, using examples in English, Chinese, and Haitian Creole.
|
[] |
[
"TAGS\n#transformers #pytorch #xglm #text-generation #multilingual #en #ru #zh #de #es #fr #ja #it #pt #el #ko #fi #id #tr #ar #vi #th #bg #ca #hi #et #bn #ta #ur #sw #te #eu #my #ht #qu #arxiv-2112.10668 #license-mit #autotrain_compatible #has_space #region-us \n"
] |
[
113
] |
[
"passage: TAGS\n#transformers #pytorch #xglm #text-generation #multilingual #en #ru #zh #de #es #fr #ja #it #pt #el #ko #fi #id #tr #ar #vi #th #bg #ca #hi #et #bn #ta #ur #sw #te #eu #my #ht #qu #arxiv-2112.10668 #license-mit #autotrain_compatible #has_space #region-us \n"
] |
[
-0.02294096350669861,
-0.0840897336602211,
-0.008790376596152782,
0.040797892957925797,
0.07706532627344131,
0.01802053488790989,
0.10317088663578033,
0.07336292415857315,
0.12203777581453323,
0.02395820803940296,
0.12009905278682709,
0.07689080387353897,
0.01508802454918623,
0.06599585711956024,
0.005764448549598455,
-0.2739368677139282,
0.0018361873226240277,
-0.003720506327226758,
-0.039812419563531876,
0.09642938524484634,
0.08662501722574234,
-0.03614256531000137,
0.10534713417291641,
-0.05073833093047142,
-0.01584417186677456,
0.049842413514852524,
-0.027058955281972885,
-0.020393982529640198,
0.11259093135595322,
0.10257763415575027,
0.04289514571428299,
0.06095734238624573,
-0.013574734330177307,
-0.1794949173927307,
0.03866412490606308,
-0.04587922617793083,
-0.12482477724552155,
-0.00042297824984416366,
0.03772956505417824,
-0.14076979458332062,
0.18372049927711487,
-0.001333900261670351,
-0.12696582078933716,
0.05213996767997742,
-0.17007485032081604,
-0.15708547830581665,
-0.059187039732933044,
0.126675546169281,
-0.03957037627696991,
0.06081775575876236,
-0.036933235824108124,
0.09272806346416473,
-0.10728012770414352,
0.0680338516831398,
0.2077222764492035,
-0.293096661567688,
-0.035472579300403595,
0.06611253321170807,
0.12982794642448425,
0.15147769451141357,
-0.10886004567146301,
0.10371233522891998,
0.054302360862493515,
-0.0016503820661455393,
-0.051190201193094254,
-0.09803400188684464,
0.02858475223183632,
0.06315919756889343,
-0.10969587415456772,
-0.012916702777147293,
0.22052663564682007,
-0.015291346237063408,
0.04441985487937927,
0.12412160634994507,
-0.064156174659729,
-0.11400743573904037,
0.006683855317533016,
-0.0019229642348363996,
-0.01141070295125246,
0.03388737514615059,
0.06227177008986473,
-0.056559327989816666,
-0.1272362470626831,
0.0463264063000679,
-0.1777745932340622,
0.222642719745636,
0.01555652916431427,
-0.013182935304939747,
-0.0536297932267189,
0.04366675391793251,
-0.038515355437994,
-0.11287185549736023,
0.00813204050064087,
-0.05282212048768997,
0.07067105174064636,
0.07109955698251724,
0.017388364300131798,
0.03306879848241806,
0.07525486499071121,
0.10570009052753448,
-0.09673065692186356,
0.04594619572162628,
0.007494255900382996,
0.14874055981636047,
0.027125364169478416,
0.05056144297122955,
-0.013105308637022972,
-0.1019451692700386,
-0.06592395901679993,
-0.05657639726996422,
0.014750945381820202,
-0.03352637588977814,
-0.1877364069223404,
-0.049680445343256,
-0.014782903715968132,
0.04354396462440491,
0.0025647380389273167,
0.06325861811637878,
-0.021058714017271996,
0.04477008432149887,
-0.004427502863109112,
-0.01958216167986393,
0.021236328408122063,
0.02493387646973133,
0.026684045791625977,
0.0918896272778511,
-0.05098790302872658,
-0.012433811090886593,
-0.03513867408037186,
0.03890986740589142,
-0.06166455149650574,
0.07290057092905045,
-0.012116262689232826,
-0.09089849144220352,
0.029829829931259155,
-0.08473201841115952,
0.034564197063446045,
-0.1784765124320984,
0.010842202231287956,
-0.013262835331261158,
-0.016812894493341446,
-0.04326298460364342,
0.02733437716960907,
-0.039563003927469254,
-0.06975965946912766,
0.05756964161992073,
-0.03644159063696861,
-0.026042252779006958,
-0.0936349481344223,
0.108599454164505,
-0.0324164554476738,
0.08602101355791092,
-0.15825241804122925,
0.040658872574567795,
-0.014024204574525356,
0.03524819761514664,
-0.04161188751459122,
-0.007048462983220816,
-0.05948479473590851,
0.01807050220668316,
-0.004534014966338873,
-0.05604939162731171,
-0.06659501045942307,
0.068659707903862,
-0.038121454417705536,
0.16125619411468506,
-0.16866660118103027,
-0.10290312767028809,
0.20019257068634033,
-0.06270524859428406,
-0.09503878653049469,
0.13413488864898682,
0.04403495788574219,
0.014800568111240864,
0.012132389470934868,
0.24399451911449432,
-0.007781651336699724,
-0.09958912432193756,
-0.08086027950048447,
0.11475935578346252,
-0.014875946566462517,
0.034675631672143936,
0.10762525349855423,
0.0436992309987545,
0.028715964406728745,
0.013012348674237728,
0.030971651896834373,
0.08946637064218521,
-0.04807684198021889,
-0.053273193538188934,
0.03245904669165611,
0.00024360035604331642,
0.1152733713388443,
0.026318613439798355,
0.039697229862213135,
-0.08722762018442154,
-0.051772404462099075,
-0.047151390463113785,
0.08759621530771255,
0.05102605000138283,
0.04930264502763748,
-0.05114861950278282,
0.1402992457151413,
0.056018128991127014,
0.0016811543609946966,
-0.08333244174718857,
0.08916598558425903,
-0.07133176922798157,
0.12845908105373383,
0.09684617817401886,
0.20747412741184235,
0.07998373359441757,
-0.01784064993262291,
-0.08550406247377396,
-0.04216129332780838,
0.010868514887988567,
-0.013408194296061993,
-0.03240665793418884,
-0.16649441421031952,
0.08362001180648804,
-0.03995472937822342,
0.054929111152887344,
-0.14784495532512665,
0.026385053992271423,
0.14776581525802612,
0.0964324027299881,
-0.03621501848101616,
0.08070027828216553,
-0.08533388376235962,
0.053281232714653015,
-0.08163384348154068,
0.033779576420784,
0.058103401213884354,
-0.037215836346149445,
-0.11598384380340576,
0.17028096318244934,
-0.12543925642967224,
0.23815825581550598,
0.1886778026819229,
-0.22368909418582916,
-0.009765755385160446,
0.014911788515746593,
-0.03847067430615425,
0.004793065600097179,
0.13621310889720917,
-0.04425708204507828,
0.043830595910549164,
-0.02098808065056801,
0.10670199245214462,
-0.059622496366500854,
-0.022540383040905,
0.014378657564520836,
-0.0872781053185463,
-0.08534865826368332,
0.1752040535211563,
0.06648798286914825,
-0.15549509227275848,
0.24577514827251434,
0.33590105175971985,
0.03943885490298271,
0.2619292736053467,
0.02238626219332218,
0.01438372116535902,
-0.04297265410423279,
0.002157537965103984,
-0.04271605983376503,
0.11673080921173096,
-0.15978863835334778,
-0.024611175060272217,
-0.002101513557136059,
-0.0021383813582360744,
0.03773748129606247,
-0.10005859285593033,
-0.0866435170173645,
-0.050220321863889694,
-0.0204097181558609,
-0.012040792964398861,
0.09818156063556671,
-0.04373789578676224,
0.1093427985906601,
0.013274559751152992,
-0.08516468107700348,
0.0039291661232709885,
0.018796684220433235,
-0.04586261510848999,
0.15796297788619995,
-0.15460807085037231,
-0.21846649050712585,
-0.0312722846865654,
-0.15006159245967865,
-0.04710331931710243,
  … (remaining values of this 768-dimensional embedding vector omitted) …
] |
null | null |
transformers
|
# XGLM-4.5B
XGLM-4.5B is a multilingual autoregressive language model (with 4.5 billion parameters) trained on a balanced corpus of a diverse set of 134 languages. It was introduced in the paper [Few-shot Learning with Multilingual Language Models](https://arxiv.org/abs/2112.10668) by Xi Victoria Lin\*, Todor Mihaylov, Mikel Artetxe, Tianlu Wang, Shuohui Chen, Daniel Simig, Myle Ott, Naman Goyal, Shruti Bhosale, Jingfei Du, Ramakanth Pasunuru, Sam Shleifer, Punit Singh Koura, Vishrav Chaudhary, Brian O'Horo, Jeff Wang, Luke Zettlemoyer, Zornitsa Kozareva, Mona Diab, Veselin Stoyanov, Xian Li\* (\*Equal Contribution). The original implementation was released in [this repository](https://github.com/pytorch/fairseq/tree/main/examples/xglm).
## Model card
For intended usage of the model, please refer to the [model card](https://github.com/pytorch/fairseq/blob/main/examples/xglm/model_card.md) released by the XGLM-4.5B development team.
## Example (COPA)
The following snippet shows how to evaluate our models (GPT-3 style, zero-shot) on the Choice of Plausible Alternatives (COPA) task, using examples in English, Chinese and Hindi.
```python
import torch
import torch.nn.functional as F
from transformers import XGLMTokenizer, XGLMForCausalLM
tokenizer = XGLMTokenizer.from_pretrained("facebook/xglm-4.5B")
model = XGLMForCausalLM.from_pretrained("facebook/xglm-4.5B")
data_samples = {
'en': [
{
"premise": "I wanted to conserve energy.",
"choice1": "I swept the floor in the unoccupied room.",
"choice2": "I shut off the light in the unoccupied room.",
"question": "effect",
"label": "1"
},
{
"premise": "The flame on the candle went out.",
"choice1": "I blew on the wick.",
"choice2": "I put a match to the wick.",
"question": "cause",
"label": "0"
}
],
'zh': [
{
"premise": "我想节约能源。",
"choice1": "我在空着的房间里扫了地板。",
"choice2": "我把空房间里的灯关了。",
"question": "effect",
"label": "1"
},
{
"premise": "蜡烛上的火焰熄灭了。",
"choice1": "我吹灭了灯芯。",
"choice2": "我把一根火柴放在灯芯上。",
"question": "cause",
"label": "0"
}
],
'hi': [
{
"premise": "M te vle konsève enèji.",
"choice1": "Mwen te fin baleye chanm lib la.",
"choice2": "Mwen te femen limyè nan chanm lib la.",
"question": "effect",
"label": "1"
},
{
"premise": "Flam bouji a te etenn.",
"choice1": "Mwen te soufle bouji a.",
"choice2": "Mwen te limen mèch bouji a.",
"question": "cause",
"label": "0"
}
]
}
def get_logprobs(prompt):
inputs = tokenizer(prompt, return_tensors="pt")
input_ids, output_ids = inputs["input_ids"], inputs["input_ids"][:, 1:]
outputs = model(**inputs, labels=input_ids)
logits = outputs.logits
logprobs = torch.gather(F.log_softmax(logits, dim=2), 2, output_ids.unsqueeze(2))
return logprobs
# Zero-shot evaluation for the Choice of Plausible Alternatives (COPA) task.
# A return value of 0 indicates that the first alternative is more plausible,
# while 1 indicates that the second alternative is more plausible.
def COPA_eval(prompt, alternative1, alternative2):
lprob1 = get_logprobs(prompt + "\n" + alternative1).sum()
lprob2 = get_logprobs(prompt + "\n" + alternative2).sum()
return 0 if lprob1 > lprob2 else 1
for lang in data_samples:
    for idx, example in enumerate(data_samples[lang]):
predict = COPA_eval(example["premise"], example["choice1"], example["choice2"])
print(f'{lang}-{idx}', predict, example['label'])
# en-0 1 1
# en-1 0 0
# zh-0 1 1
# zh-1 0 0
# hi-0 1 1
# hi-1 0 0
```
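Beyond scoring a fixed set of alternatives, the same checkpoint can be used for free-form generation. The snippet below is a minimal sketch rather than part of the original card: the prompt text and the decoding settings (`max_new_tokens`, greedy decoding) are illustrative assumptions.

```python
import torch
from transformers import XGLMTokenizer, XGLMForCausalLM

# Load the same checkpoint that the COPA snippet above uses.
tokenizer = XGLMTokenizer.from_pretrained("facebook/xglm-4.5B")
model = XGLMForCausalLM.from_pretrained("facebook/xglm-4.5B")
model.eval()

# Illustrative prompt; any of the model's supported languages can be used.
prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    generated = model.generate(
        **inputs,
        max_new_tokens=20,  # illustrative length limit
        do_sample=False,    # greedy decoding keeps the sketch deterministic
    )

print(tokenizer.decode(generated[0], skip_special_tokens=True))
```

Greedy decoding is used only to keep the example deterministic; sampling parameters such as `top_p` or `temperature` can be passed to `generate` instead.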
|
{"language": ["multilingual", "en", "ru", "zh", "de", "es", "fr", "ja", "it", "pt", "el", "ko", "fi", "id", "tr", "ar", "vi", "th", "bg", "ca", "hi", "et", "bn", "ta", "ur", "sw", "te", "eu", "my", "ht", "qu"], "license": "mit", "thumbnail": "https://huggingface.co/front/thumbnails/facebook.png", "inference": false}
|
text-generation
|
facebook/xglm-4.5B
|
[
"transformers",
"pytorch",
"safetensors",
"xglm",
"text-generation",
"multilingual",
"en",
"ru",
"zh",
"de",
"es",
"fr",
"ja",
"it",
"pt",
"el",
"ko",
"fi",
"id",
"tr",
"ar",
"vi",
"th",
"bg",
"ca",
"hi",
"et",
"bn",
"ta",
"ur",
"sw",
"te",
"eu",
"my",
"ht",
"qu",
"arxiv:2112.10668",
"license:mit",
"autotrain_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2112.10668"
] |
[
"multilingual",
"en",
"ru",
"zh",
"de",
"es",
"fr",
"ja",
"it",
"pt",
"el",
"ko",
"fi",
"id",
"tr",
"ar",
"vi",
"th",
"bg",
"ca",
"hi",
"et",
"bn",
"ta",
"ur",
"sw",
"te",
"eu",
"my",
"ht",
"qu"
] |
TAGS
#transformers #pytorch #safetensors #xglm #text-generation #multilingual #en #ru #zh #de #es #fr #ja #it #pt #el #ko #fi #id #tr #ar #vi #th #bg #ca #hi #et #bn #ta #ur #sw #te #eu #my #ht #qu #arxiv-2112.10668 #license-mit #autotrain_compatible #has_space #region-us
|
# XGLM-4.5B
XGLM-4.5B is a multilingual autoregressive language model (with 4.5 billion parameters) trained on a balanced corpus of a diverse set of 134 languages. It was introduced in the paper Few-shot Learning with Multilingual Language Models by Xi Victoria Lin\*, Todor Mihaylov, Mikel Artetxe, Tianlu Wang, Shuohui Chen, Daniel Simig, Myle Ott, Naman Goyal, Shruti Bhosale, Jingfei Du, Ramakanth Pasunuru, Sam Shleifer, Punit Singh Koura, Vishrav Chaudhary, Brian O'Horo, Jeff Wang, Luke Zettlemoyer, Zornitsa Kozareva, Mona Diab, Veselin Stoyanov, Xian Li\* (\*Equal Contribution). The original implementation was released in this repository.
## Model card
For intended usage of the model, please refer to the model card released by the XGLM-4.5B development team.
## Example (COPA)
The following snippet shows how to evaluate our models (GPT-3 style, zero-shot) on the Choice of Plausible Alternatives (COPA) task, using examples in English, Chinese and Hindi.
|
[
"# XGLM-4.5B\n\nXGLM-4.5B is a multilingual autoregressive language model (with 4.5 billion parameters) trained on a balanced corpus of a diverse set of 134 languages. It was introduced in the paper Few-shot Learning with Multilingual Language Models by Xi Victoria Lin\\*, Todor Mihaylov, Mikel Artetxe, Tianlu Wang, Shuohui Chen, Daniel Simig, Myle Ott, Naman Goyal, Shruti Bhosale, Jingfei Du, Ramakanth Pasunuru, Sam Shleifer, Punit Singh Koura, Vishrav Chaudhary, Brian O'Horo, Jeff Wang, Luke Zettlemoyer, Zornitsa Kozareva, Mona Diab, Veselin Stoyanov, Xian Li\\* (\\*Equal Contribution). The original implementation was released in this repository.",
"## Model card\n\nFor intended usage of the model, please refer to the model card released by the XGLM-4.5B development team.",
"## Example (COPA)\nThe following snippet shows how to evaluate our models (GPT-3 style, zero-shot) on the Choice of Plausible Alternatives (COPA) task, using examples in English, Chinese and Hindi."
] |
[
"TAGS\n#transformers #pytorch #safetensors #xglm #text-generation #multilingual #en #ru #zh #de #es #fr #ja #it #pt #el #ko #fi #id #tr #ar #vi #th #bg #ca #hi #et #bn #ta #ur #sw #te #eu #my #ht #qu #arxiv-2112.10668 #license-mit #autotrain_compatible #has_space #region-us \n",
"# XGLM-4.5B\n\nXGLM-4.5B is a multilingual autoregressive language model (with 4.5 billion parameters) trained on a balanced corpus of a diverse set of 134 languages. It was introduced in the paper Few-shot Learning with Multilingual Language Models by Xi Victoria Lin\\*, Todor Mihaylov, Mikel Artetxe, Tianlu Wang, Shuohui Chen, Daniel Simig, Myle Ott, Naman Goyal, Shruti Bhosale, Jingfei Du, Ramakanth Pasunuru, Sam Shleifer, Punit Singh Koura, Vishrav Chaudhary, Brian O'Horo, Jeff Wang, Luke Zettlemoyer, Zornitsa Kozareva, Mona Diab, Veselin Stoyanov, Xian Li\\* (\\*Equal Contribution). The original implementation was released in this repository.",
"## Model card\n\nFor intended usage of the model, please refer to the model card released by the XGLM-4.5B development team.",
"## Example (COPA)\nThe following snippet shows how to evaluate our models (GPT-3 style, zero-shot) on the Choice of Plausible Alternatives (COPA) task, using examples in English, Chinese and Hindi."
] |
[
118,
196,
28,
54
] |
[
"passage: TAGS\n#transformers #pytorch #safetensors #xglm #text-generation #multilingual #en #ru #zh #de #es #fr #ja #it #pt #el #ko #fi #id #tr #ar #vi #th #bg #ca #hi #et #bn #ta #ur #sw #te #eu #my #ht #qu #arxiv-2112.10668 #license-mit #autotrain_compatible #has_space #region-us \n# XGLM-4.5B\n\nXGLM-4.5B is a multilingual autoregressive language model (with 4.5 billion parameters) trained on a balanced corpus of a diverse set of 134 languages. It was introduced in the paper Few-shot Learning with Multilingual Language Models by Xi Victoria Lin\\*, Todor Mihaylov, Mikel Artetxe, Tianlu Wang, Shuohui Chen, Daniel Simig, Myle Ott, Naman Goyal, Shruti Bhosale, Jingfei Du, Ramakanth Pasunuru, Sam Shleifer, Punit Singh Koura, Vishrav Chaudhary, Brian O'Horo, Jeff Wang, Luke Zettlemoyer, Zornitsa Kozareva, Mona Diab, Veselin Stoyanov, Xian Li\\* (\\*Equal Contribution). The original implementation was released in this repository.## Model card\n\nFor intended usage of the model, please refer to the model card released by the XGLM-4.5B development team.## Example (COPA)\nThe following snippet shows how to evaluate our models (GPT-3 style, zero-shot) on the Choice of Plausible Alternatives (COPA) task, using examples in English, Chinese and Hindi."
] |
[
  … (768-dimensional embedding vector omitted) …
] |
null | null |
transformers
|
# XGLM-564M
XGLM-564M is a multilingual autoregressive language model (with 564 million parameters) trained on a balanced corpus of a diverse set of 30 languages totaling 500 billion sub-tokens. It was introduced in the paper [Few-shot Learning with Multilingual Language Models](https://arxiv.org/abs/2112.10668) by Xi Victoria Lin\*, Todor Mihaylov, Mikel Artetxe, Tianlu Wang, Shuohui Chen, Daniel Simig, Myle Ott, Naman Goyal, Shruti Bhosale, Jingfei Du, Ramakanth Pasunuru, Sam Shleifer, Punit Singh Koura, Vishrav Chaudhary, Brian O'Horo, Jeff Wang, Luke Zettlemoyer, Zornitsa Kozareva, Mona Diab, Veselin Stoyanov, Xian Li\* (\*Equal Contribution). The original implementation was released in [this repository](https://github.com/pytorch/fairseq/tree/main/examples/xglm).
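As a quick sanity check on the size stated above, the parameter count can be read directly off the loaded checkpoint. This is a small illustrative sketch, not part of the original card.

```python
from transformers import XGLMForCausalLM

model = XGLMForCausalLM.from_pretrained("facebook/xglm-564M")

# Total number of parameters; this should come out on the order of 564 million.
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.0f}M parameters")
```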
## Training Data Statistics
The training data statistics of XGLM-564M are shown in the table below; a short sketch after the table illustrates how the `ratio` column follows from the raw token counts.
| ISO-639-1| family | name | # tokens | ratio | ratio w/ lowRes upsampling |
|:--------|:-----------------|:------------------------|-------------:|------------:|-------------:|
| en | Indo-European | English | 803526736124 | 0.489906 | 0.3259 |
| ru | Indo-European | Russian | 147791898098 | 0.0901079 | 0.0602 |
| zh | Sino-Tibetan | Chinese | 132770494630 | 0.0809494 | 0.0483 |
| de | Indo-European | German | 89223707856 | 0.0543992 | 0.0363 |
| es | Indo-European | Spanish | 87303083105 | 0.0532282 | 0.0353 |
| fr | Indo-European | French | 77419639775 | 0.0472023 | 0.0313 |
| ja | Japonic | Japanese | 66054364513 | 0.040273 | 0.0269 |
| it | Indo-European | Italian | 41930465338 | 0.0255648 | 0.0171 |
| pt | Indo-European | Portuguese | 36586032444 | 0.0223063 | 0.0297 |
| el | Indo-European | Greek (modern) | 28762166159 | 0.0175361 | 0.0233 |
| ko | Koreanic | Korean | 20002244535 | 0.0121953 | 0.0811 |
| fi | Uralic | Finnish | 16804309722 | 0.0102455 | 0.0681 |
| id | Austronesian | Indonesian | 15423541953 | 0.00940365 | 0.0125 |
| tr | Turkic | Turkish | 12413166065 | 0.00756824 | 0.0101 |
| ar | Afro-Asiatic | Arabic | 12248607345 | 0.00746791 | 0.0099 |
| vi | Austroasiatic | Vietnamese | 11199121869 | 0.00682804 | 0.0091 |
| th | Tai–Kadai | Thai | 10842172807 | 0.00661041 | 0.044 |
| bg | Indo-European | Bulgarian | 9703797869 | 0.00591635 | 0.0393 |
| ca | Indo-European | Catalan | 7075834775 | 0.0043141 | 0.0287 |
| hi | Indo-European | Hindi | 3448390110 | 0.00210246 | 0.014 |
| et | Uralic | Estonian | 3286873851 | 0.00200399 | 0.0133 |
| bn | Indo-European | Bengali, Bangla | 1627447450 | 0.000992245 | 0.0066 |
| ta | Dravidian | Tamil | 1476973397 | 0.000900502 | 0.006 |
| ur | Indo-European | Urdu | 1351891969 | 0.000824241 | 0.0055 |
| sw | Niger–Congo | Swahili | 907516139 | 0.000553307 | 0.0037 |
| te | Dravidian | Telugu | 689316485 | 0.000420272 | 0.0028 |
| eu | Language isolate | Basque | 105304423 | 6.42035e-05 | 0.0043 |
| my | Sino-Tibetan | Burmese | 101358331 | 6.17976e-05 | 0.003 |
| ht | Creole | Haitian, Haitian Creole | 86584697 | 5.27902e-05 | 0.0035 |
| qu | Quechuan | Quechua | 3236108 | 1.97304e-06 | 0.0001 |
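In the table above, `ratio` is simply each language's share of the total token count, while `ratio w/ lowRes upsampling` reflects the upsampled sampling distribution used during training to boost low-resource languages. The sketch below is illustrative, not part of the original card, and recomputes a few of the raw ratios from the token counts listed above.

```python
# Token counts copied from a few rows of the table above.
token_counts = {
    "en": 803_526_736_124,
    "ru": 147_791_898_098,
    "zh": 132_770_494_630,
    "qu": 3_236_108,
}

# The published ratios are taken over all 30 languages; recover the implied
# total from the English row, since ratio = tokens / total.
total_tokens = token_counts["en"] / 0.489906

for lang, tokens in token_counts.items():
    # e.g. ru should come out close to the table's 0.0901079
    print(f"{lang}: ratio ≈ {tokens / total_tokens:.6f}")
```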
## Model card
For intended usage of the model, please refer to the [model card](https://github.com/pytorch/fairseq/blob/main/examples/xglm/model_card.md) released by the XGLM-564M development team.
## Example (COPA)
The following snippet shows how to evaluate our models (GPT-3 style, zero-shot) on the Choice of Plausible Alternatives (COPA) task, using examples in English, Chinese and Hindi.
```python
import torch
import torch.nn.functional as F
from transformers import XGLMTokenizer, XGLMForCausalLM
tokenizer = XGLMTokenizer.from_pretrained("facebook/xglm-564M")
model = XGLMForCausalLM.from_pretrained("facebook/xglm-564M")
data_samples = {
'en': [
{
"premise": "I wanted to conserve energy.",
"choice1": "I swept the floor in the unoccupied room.",
"choice2": "I shut off the light in the unoccupied room.",
"question": "effect",
"label": "1"
},
{
"premise": "The flame on the candle went out.",
"choice1": "I blew on the wick.",
"choice2": "I put a match to the wick.",
"question": "cause",
"label": "0"
}
],
'zh': [
{
"premise": "我想节约能源。",
"choice1": "我在空着的房间里扫了地板。",
"choice2": "我把空房间里的灯关了。",
"question": "effect",
"label": "1"
},
{
"premise": "蜡烛上的火焰熄灭了。",
"choice1": "我吹灭了灯芯。",
"choice2": "我把一根火柴放在灯芯上。",
"question": "cause",
"label": "0"
}
],
'hi': [
{
"premise": "M te vle konsève enèji.",
"choice1": "Mwen te fin baleye chanm lib la.",
"choice2": "Mwen te femen limyè nan chanm lib la.",
"question": "effect",
"label": "1"
},
{
"premise": "Flam bouji a te etenn.",
"choice1": "Mwen te soufle bouji a.",
"choice2": "Mwen te limen mèch bouji a.",
"question": "cause",
"label": "0"
}
]
}
def get_logprobs(prompt):
inputs = tokenizer(prompt, return_tensors="pt")
input_ids, output_ids = inputs["input_ids"], inputs["input_ids"][:, 1:]
outputs = model(**inputs, labels=input_ids)
logits = outputs.logits
logprobs = torch.gather(F.log_softmax(logits, dim=2), 2, output_ids.unsqueeze(2))
return logprobs
# Zero-shot evaluation for the Choice of Plausible Alternatives (COPA) task.
# A return value of 0 indicates that the first alternative is more plausible,
# while 1 indicates that the second alternative is more plausible.
def COPA_eval(prompt, alternative1, alternative2):
lprob1 = get_logprobs(prompt + "\n" + alternative1).sum()
lprob2 = get_logprobs(prompt + "\n" + alternative2).sum()
return 0 if lprob1 > lprob2 else 1
for lang in data_samples:
    for idx, example in enumerate(data_samples[lang]):
predict = COPA_eval(example["premise"], example["choice1"], example["choice2"])
print(f'{lang}-{idx}', predict, example['label'])
# en-0 1 1
# en-1 0 0
# zh-0 1 1
# zh-1 0 0
# hi-0 1 1
# hi-1 0 0
```
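Because the accompanying paper focuses on few-shot, in-context learning, a prompting-style example may also be helpful. The snippet below is a minimal sketch that is not part of the original card: the translation task, the demonstration pairs, and the decoding settings are illustrative assumptions.

```python
import torch
from transformers import XGLMTokenizer, XGLMForCausalLM

tokenizer = XGLMTokenizer.from_pretrained("facebook/xglm-564M")
model = XGLMForCausalLM.from_pretrained("facebook/xglm-564M")
model.eval()

# A tiny English -> French prompt with two in-context demonstrations.
prompt = (
    "English: Good morning. French: Bonjour.\n"
    "English: Thank you very much. French: Merci beaucoup.\n"
    "English: How are you? French:"
)

inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=10, do_sample=False)

# Decode only the continuation generated after the prompt.
prompt_length = inputs["input_ids"].shape[1]
print(tokenizer.decode(output[0][prompt_length:], skip_special_tokens=True))
```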
|
{"language": ["multilingual", "en", "ru", "zh", "de", "es", "fr", "ja", "it", "pt", "el", "ko", "fi", "id", "tr", "ar", "vi", "th", "bg", "ca", "hi", "et", "bn", "ta", "ur", "sw", "te", "eu", "my", "ht", "qu"], "license": "mit", "thumbnail": "https://huggingface.co/front/thumbnails/facebook.png", "inference": false}
|
text-generation
|
facebook/xglm-564M
|
[
"transformers",
"pytorch",
"tf",
"jax",
"xglm",
"text-generation",
"multilingual",
"en",
"ru",
"zh",
"de",
"es",
"fr",
"ja",
"it",
"pt",
"el",
"ko",
"fi",
"id",
"tr",
"ar",
"vi",
"th",
"bg",
"ca",
"hi",
"et",
"bn",
"ta",
"ur",
"sw",
"te",
"eu",
"my",
"ht",
"qu",
"arxiv:2112.10668",
"license:mit",
"autotrain_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2112.10668"
] |
[
"multilingual",
"en",
"ru",
"zh",
"de",
"es",
"fr",
"ja",
"it",
"pt",
"el",
"ko",
"fi",
"id",
"tr",
"ar",
"vi",
"th",
"bg",
"ca",
"hi",
"et",
"bn",
"ta",
"ur",
"sw",
"te",
"eu",
"my",
"ht",
"qu"
] |
TAGS
#transformers #pytorch #tf #jax #xglm #text-generation #multilingual #en #ru #zh #de #es #fr #ja #it #pt #el #ko #fi #id #tr #ar #vi #th #bg #ca #hi #et #bn #ta #ur #sw #te #eu #my #ht #qu #arxiv-2112.10668 #license-mit #autotrain_compatible #has_space #region-us
|
XGLM-564M
=========
XGLM-564M is a multilingual autoregressive language model (with 564 million parameters) trained on a balanced corpus of a diverse set of 30 languages totaling 500 billion sub-tokens. It was introduced in the paper Few-shot Learning with Multilingual Language Models by Xi Victoria Lin\*, Todor Mihaylov, Mikel Artetxe, Tianlu Wang, Shuohui Chen, Daniel Simig, Myle Ott, Naman Goyal, Shruti Bhosale, Jingfei Du, Ramakanth Pasunuru, Sam Shleifer, Punit Singh Koura, Vishrav Chaudhary, Brian O'Horo, Jeff Wang, Luke Zettlemoyer, Zornitsa Kozareva, Mona Diab, Veselin Stoyanov, Xian Li\* (\*Equal Contribution). The original implementation was released in this repository.
Training Data Statistics
------------------------
The training data statistics of XGLM-564M are shown in the table below.
Model card
----------
For intended usage of the model, please refer to the model card released by the XGLM-564M development team.
Example (COPA)
--------------
The following snippet shows how to evaluate our models (GPT-3 style, zero-shot) on the Choice of Plausible Alternatives (COPA) task, using examples in English, Chinese and Hindi.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #xglm #text-generation #multilingual #en #ru #zh #de #es #fr #ja #it #pt #el #ko #fi #id #tr #ar #vi #th #bg #ca #hi #et #bn #ta #ur #sw #te #eu #my #ht #qu #arxiv-2112.10668 #license-mit #autotrain_compatible #has_space #region-us \n"
] |
[
119
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #xglm #text-generation #multilingual #en #ru #zh #de #es #fr #ja #it #pt #el #ko #fi #id #tr #ar #vi #th #bg #ca #hi #et #bn #ta #ur #sw #te #eu #my #ht #qu #arxiv-2112.10668 #license-mit #autotrain_compatible #has_space #region-us \n"
] |
[
  … (768-dimensional embedding vector omitted) …
] |
null | null |
transformers
|
# XGLM-7.5B
XGLM-7.5B is a multilingual autoregressive language model (with 7.5 billion parameters) trained on a balanced corpus of a diverse set of languages totaling 500 billion sub-tokens. It was introduced in the paper [Few-shot Learning with Multilingual Language Models](https://arxiv.org/abs/2112.10668) by Xi Victoria Lin\*, Todor Mihaylov, Mikel Artetxe, Tianlu Wang, Shuohui Chen, Daniel Simig, Myle Ott, Naman Goyal, Shruti Bhosale, Jingfei Du, Ramakanth Pasunuru, Sam Shleifer, Punit Singh Koura, Vishrav Chaudhary, Brian O'Horo, Jeff Wang, Luke Zettlemoyer, Zornitsa Kozareva, Mona Diab, Veselin Stoyanov, Xian Li\* (\*Equal Contribution). The original implementation was released in [this repository](https://github.com/pytorch/fairseq/tree/main/examples/xglm).
## Training Data Statistics
The training data statistics of XGLM-7.5B are shown in the table below. The last column reflects upsampling of low-resource languages; a brief illustration of this idea follows the table.
| ISO-639-1 | family | name | # tokens | ratio | ratio w/ low-resource upsampling |
|:--------|:-----------------|:------------------------|-------------:|------------:|-------------:|
| en | Indo-European | English | 803526736124 | 0.489906 | 0.3259 |
| ru | Indo-European | Russian | 147791898098 | 0.0901079 | 0.0602 |
| zh | Sino-Tibetan | Chinese | 132770494630 | 0.0809494 | 0.0483 |
| de | Indo-European | German | 89223707856 | 0.0543992 | 0.0363 |
| es | Indo-European | Spanish | 87303083105 | 0.0532282 | 0.0353 |
| fr | Indo-European | French | 77419639775 | 0.0472023 | 0.0313 |
| ja | Japonic | Japanese | 66054364513 | 0.040273 | 0.0269 |
| it | Indo-European | Italian | 41930465338 | 0.0255648 | 0.0171 |
| pt | Indo-European | Portuguese | 36586032444 | 0.0223063 | 0.0297 |
| el | Indo-European | Greek (modern) | 28762166159 | 0.0175361 | 0.0233 |
| ko | Koreanic | Korean | 20002244535 | 0.0121953 | 0.0811 |
| fi | Uralic | Finnish | 16804309722 | 0.0102455 | 0.0681 |
| id | Austronesian | Indonesian | 15423541953 | 0.00940365 | 0.0125 |
| tr | Turkic | Turkish | 12413166065 | 0.00756824 | 0.0101 |
| ar | Afro-Asiatic | Arabic | 12248607345 | 0.00746791 | 0.0099 |
| vi | Austroasiatic | Vietnamese | 11199121869 | 0.00682804 | 0.0091 |
| th | Tai–Kadai | Thai | 10842172807 | 0.00661041 | 0.044 |
| bg | Indo-European | Bulgarian | 9703797869 | 0.00591635 | 0.0393 |
| ca | Indo-European | Catalan | 7075834775 | 0.0043141 | 0.0287 |
| hi | Indo-European | Hindi | 3448390110 | 0.00210246 | 0.014 |
| et | Uralic | Estonian | 3286873851 | 0.00200399 | 0.0133 |
| bn | Indo-European | Bengali, Bangla | 1627447450 | 0.000992245 | 0.0066 |
| ta | Dravidian | Tamil | 1476973397 | 0.000900502 | 0.006 |
| ur | Indo-European | Urdu | 1351891969 | 0.000824241 | 0.0055 |
| sw | Niger–Congo | Swahili | 907516139 | 0.000553307 | 0.0037 |
| te | Dravidian | Telugu | 689316485 | 0.000420272 | 0.0028 |
| eu | Language isolate | Basque | 105304423 | 6.42035e-05 | 0.0043 |
| my | Sino-Tibetan | Burmese | 101358331 | 6.17976e-05 | 0.003 |
| ht | Creole | Haitian, Haitian Creole | 86584697 | 5.27902e-05 | 0.0035 |
| qu | Quechuan | Quechua | 3236108 | 1.97304e-06 | 0.0001 |
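The upsampled ratios in the last column boost the share of low-resource languages relative to their raw token counts. The snippet below is an illustrative sketch of one common upsampling scheme (rescaling raw ratios with a smoothing exponent); the exponent `alpha` and the exact procedure are assumptions made for demonstration, not the values used to produce the table above.
```python
# Illustrative only: temperature-style upsampling of low-resource languages.
# The smoothing exponent `alpha` is a hypothetical choice for demonstration;
# it is NOT the value used to produce the upsampled-ratio column above.
raw_tokens = {"en": 803526736124, "ru": 147791898098, "qu": 3236108}  # subset of the table

alpha = 0.3  # hypothetical smoothing exponent: lower alpha = stronger upsampling
total = sum(raw_tokens.values())
raw_ratio = {lang: n / total for lang, n in raw_tokens.items()}

unnormalized = {lang: p ** alpha for lang, p in raw_ratio.items()}
z = sum(unnormalized.values())
upsampled_ratio = {lang: q / z for lang, q in unnormalized.items()}

for lang in raw_tokens:
    print(f"{lang}: raw={raw_ratio[lang]:.6f} upsampled={upsampled_ratio[lang]:.6f}")
```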
## Model card
For intended usage of the model, please refer to the [model card](https://github.com/pytorch/fairseq/blob/main/examples/xglm/model_card.md) released by the XGLM-7.5B development team.
## Example (COPA)
The following snippet shows how to evaluate our models (GPT-3 style, zero-shot) on the Choice of Plausible Alternatives (COPA) task, using examples in English, Chinese and Haitian Creole.
```python
import torch
import torch.nn.functional as F
from transformers import XGLMTokenizer, XGLMForCausalLM
tokenizer = XGLMTokenizer.from_pretrained("facebook/xglm-7.5B")
model = XGLMForCausalLM.from_pretrained("facebook/xglm-7.5B")
data_samples = {
'en': [
{
"premise": "I wanted to conserve energy.",
"choice1": "I swept the floor in the unoccupied room.",
"choice2": "I shut off the light in the unoccupied room.",
"question": "effect",
"label": "1"
},
{
"premise": "The flame on the candle went out.",
"choice1": "I blew on the wick.",
"choice2": "I put a match to the wick.",
"question": "cause",
"label": "0"
}
],
'zh': [
{
"premise": "我想节约能源。",
"choice1": "我在空着的房间里扫了地板。",
"choice2": "我把空房间里的灯关了。",
"question": "effect",
"label": "1"
},
{
"premise": "蜡烛上的火焰熄灭了。",
"choice1": "我吹灭了灯芯。",
"choice2": "我把一根火柴放在灯芯上。",
"question": "cause",
"label": "0"
}
],
    'ht': [
{
"premise": "M te vle konsève enèji.",
"choice1": "Mwen te fin baleye chanm lib la.",
"choice2": "Mwen te femen limyè nan chanm lib la.",
"question": "effect",
"label": "1"
},
{
"premise": "Flam bouji a te etenn.",
"choice1": "Mwen te soufle bouji a.",
"choice2": "Mwen te limen mèch bouji a.",
"question": "cause",
"label": "0"
}
]
}
def get_logprobs(prompt):
inputs = tokenizer(prompt, return_tensors="pt")
input_ids, output_ids = inputs["input_ids"], inputs["input_ids"][:, 1:]
outputs = model(**inputs, labels=input_ids)
logits = outputs.logits
logprobs = torch.gather(F.log_softmax(logits, dim=2), 2, output_ids.unsqueeze(2))
return logprobs
# Zero-shot evaluation for the Choice of Plausible Alternatives (COPA) task.
# A return value of 0 indicates that the first alternative is more plausible,
# while 1 indicates that the second alternative is more plausible.
def COPA_eval(prompt, alternative1, alternative2):
lprob1 = get_logprobs(prompt + "\n" + alternative1).sum()
lprob2 = get_logprobs(prompt + "\n" + alternative2).sum()
return 0 if lprob1 > lprob2 else 1
for lang in data_samples:
    for idx, example in enumerate(data_samples[lang]):
predict = COPA_eval(example["premise"], example["choice1"], example["choice2"])
print(f'{lang}-{idx}', predict, example['label'])
# en-0 1 1
# en-1 0 0
# zh-0 1 1
# zh-1 0 0
# ht-0 1 1
# ht-1 0 0
```
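Beyond the scoring-based COPA evaluation above, the same checkpoint can also be used for free-form generation. The snippet below is a minimal sketch assuming a recent version of transformers; the prompt and the decoding settings (greedy decoding, `max_new_tokens=20`) are illustrative choices rather than recommended settings.
```python
import torch
from transformers import XGLMTokenizer, XGLMForCausalLM

tokenizer = XGLMTokenizer.from_pretrained("facebook/xglm-7.5B")
model = XGLMForCausalLM.from_pretrained("facebook/xglm-7.5B")
model.eval()

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    generated = model.generate(
        **inputs,
        max_new_tokens=20,  # illustrative length limit
        do_sample=False,    # greedy decoding keeps the sketch deterministic
    )

print(tokenizer.decode(generated[0], skip_special_tokens=True))
```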
|
{"language": ["multilingual", "en", "ru", "zh", "de", "es", "fr", "ja", "it", "pt", "el", "ko", "fi", "id", "tr", "ar", "vi", "th", "bg", "ca", "hi", "et", "bn", "ta", "ur", "sw", "te", "eu", "my", "ht", "qu"], "license": "mit", "thumbnail": "https://huggingface.co/front/thumbnails/facebook.png", "inference": false}
|
text-generation
|
facebook/xglm-7.5B
|
[
"transformers",
"pytorch",
"xglm",
"text-generation",
"multilingual",
"en",
"ru",
"zh",
"de",
"es",
"fr",
"ja",
"it",
"pt",
"el",
"ko",
"fi",
"id",
"tr",
"ar",
"vi",
"th",
"bg",
"ca",
"hi",
"et",
"bn",
"ta",
"ur",
"sw",
"te",
"eu",
"my",
"ht",
"qu",
"arxiv:2112.10668",
"license:mit",
"autotrain_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2112.10668"
] |
[
"multilingual",
"en",
"ru",
"zh",
"de",
"es",
"fr",
"ja",
"it",
"pt",
"el",
"ko",
"fi",
"id",
"tr",
"ar",
"vi",
"th",
"bg",
"ca",
"hi",
"et",
"bn",
"ta",
"ur",
"sw",
"te",
"eu",
"my",
"ht",
"qu"
] |
TAGS
#transformers #pytorch #xglm #text-generation #multilingual #en #ru #zh #de #es #fr #ja #it #pt #el #ko #fi #id #tr #ar #vi #th #bg #ca #hi #et #bn #ta #ur #sw #te #eu #my #ht #qu #arxiv-2112.10668 #license-mit #autotrain_compatible #has_space #region-us
|
XGLM-7.5B
=========
XGLM-7.5B is a multilingual autoregressive language model (with 7.5 billion parameters) trained on a balanced corpus of a diverse set of languages totaling 500 billion sub-tokens. It was introduced in the paper Few-shot Learning with Multilingual Language Models by Xi Victoria Lin\*, Todor Mihaylov, Mikel Artetxe, Tianlu Wang, Shuohui Chen, Daniel Simig, Myle Ott, Naman Goyal, Shruti Bhosale, Jingfei Du, Ramakanth Pasunuru, Sam Shleifer, Punit Singh Koura, Vishrav Chaudhary, Brian O'Horo, Jeff Wang, Luke Zettlemoyer, Zornitsa Kozareva, Mona Diab, Veselin Stoyanov, Xian Li\* (\*Equal Contribution). The original implementation was released in this repository.
Training Data Statistics
------------------------
The training data statistics of XGLM-7.5B are shown in the table below.
Model card
----------
For intended usage of the model, please refer to the model card released by the XGLM-7.5B development team.
Example (COPA)
--------------
The following snippet shows how to evaluate our models (GPT-3 style, zero-shot) on the Choice of Plausible Alternatives (COPA) task, using examples in English, Chinese and Haitian Creole.
|
[] |
[
"TAGS\n#transformers #pytorch #xglm #text-generation #multilingual #en #ru #zh #de #es #fr #ja #it #pt #el #ko #fi #id #tr #ar #vi #th #bg #ca #hi #et #bn #ta #ur #sw #te #eu #my #ht #qu #arxiv-2112.10668 #license-mit #autotrain_compatible #has_space #region-us \n"
] |
[
113
] |
[
"passage: TAGS\n#transformers #pytorch #xglm #text-generation #multilingual #en #ru #zh #de #es #fr #ja #it #pt #el #ko #fi #id #tr #ar #vi #th #bg #ca #hi #et #bn #ta #ur #sw #te #eu #my #ht #qu #arxiv-2112.10668 #license-mit #autotrain_compatible #has_space #region-us \n"
] |
[
-0.02294096350669861,
-0.0840897336602211,
-0.008790376596152782,
0.040797892957925797,
0.07706532627344131,
0.01802053488790989,
0.10317088663578033,
0.07336292415857315,
0.12203777581453323,
0.02395820803940296,
0.12009905278682709,
0.07689080387353897,
0.01508802454918623,
0.06599585711956024,
0.005764448549598455,
-0.2739368677139282,
0.0018361873226240277,
-0.003720506327226758,
-0.039812419563531876,
0.09642938524484634,
0.08662501722574234,
-0.03614256531000137,
0.10534713417291641,
-0.05073833093047142,
-0.01584417186677456,
0.049842413514852524,
-0.027058955281972885,
-0.020393982529640198,
0.11259093135595322,
0.10257763415575027,
0.04289514571428299,
0.06095734238624573,
-0.013574734330177307,
-0.1794949173927307,
0.03866412490606308,
-0.04587922617793083,
-0.12482477724552155,
-0.00042297824984416366,
0.03772956505417824,
-0.14076979458332062,
0.18372049927711487,
-0.001333900261670351,
-0.12696582078933716,
0.05213996767997742,
-0.17007485032081604,
-0.15708547830581665,
-0.059187039732933044,
0.126675546169281,
-0.03957037627696991,
0.06081775575876236,
-0.036933235824108124,
0.09272806346416473,
-0.10728012770414352,
0.0680338516831398,
0.2077222764492035,
-0.293096661567688,
-0.035472579300403595,
0.06611253321170807,
0.12982794642448425,
0.15147769451141357,
-0.10886004567146301,
0.10371233522891998,
0.054302360862493515,
-0.0016503820661455393,
-0.051190201193094254,
-0.09803400188684464,
0.02858475223183632,
0.06315919756889343,
-0.10969587415456772,
-0.012916702777147293,
0.22052663564682007,
-0.015291346237063408,
0.04441985487937927,
0.12412160634994507,
-0.064156174659729,
-0.11400743573904037,
0.006683855317533016,
-0.0019229642348363996,
-0.01141070295125246,
0.03388737514615059,
0.06227177008986473,
-0.056559327989816666,
-0.1272362470626831,
0.0463264063000679,
-0.1777745932340622,
0.222642719745636,
0.01555652916431427,
-0.013182935304939747,
-0.0536297932267189,
0.04366675391793251,
-0.038515355437994,
-0.11287185549736023,
0.00813204050064087,
-0.05282212048768997,
0.07067105174064636,
0.07109955698251724,
0.017388364300131798,
0.03306879848241806,
0.07525486499071121,
0.10570009052753448,
-0.09673065692186356,
0.04594619572162628,
0.007494255900382996,
0.14874055981636047,
0.027125364169478416,
0.05056144297122955,
-0.013105308637022972,
-0.1019451692700386,
-0.06592395901679993,
-0.05657639726996422,
0.014750945381820202,
-0.03352637588977814,
-0.1877364069223404,
-0.049680445343256,
-0.014782903715968132,
0.04354396462440491,
0.0025647380389273167,
0.06325861811637878,
-0.021058714017271996,
0.04477008432149887,
-0.004427502863109112,
-0.01958216167986393,
0.021236328408122063,
0.02493387646973133,
0.026684045791625977,
0.0918896272778511,
-0.05098790302872658,
-0.012433811090886593,
-0.03513867408037186,
0.03890986740589142,
-0.06166455149650574,
0.07290057092905045,
-0.012116262689232826,
-0.09089849144220352,
0.029829829931259155,
-0.08473201841115952,
0.034564197063446045,
-0.1784765124320984,
0.010842202231287956,
-0.013262835331261158,
-0.016812894493341446,
-0.04326298460364342,
0.02733437716960907,
-0.039563003927469254,
-0.06975965946912766,
0.05756964161992073,
-0.03644159063696861,
-0.026042252779006958,
-0.0936349481344223,
0.108599454164505,
-0.0324164554476738,
0.08602101355791092,
-0.15825241804122925,
0.040658872574567795,
-0.014024204574525356,
0.03524819761514664,
-0.04161188751459122,
-0.007048462983220816,
-0.05948479473590851,
0.01807050220668316,
-0.004534014966338873,
-0.05604939162731171,
-0.06659501045942307,
0.068659707903862,
-0.038121454417705536,
0.16125619411468506,
-0.16866660118103027,
-0.10290312767028809,
0.20019257068634033,
-0.06270524859428406,
-0.09503878653049469,
0.13413488864898682,
0.04403495788574219,
0.014800568111240864,
0.012132389470934868,
0.24399451911449432,
-0.007781651336699724,
-0.09958912432193756,
-0.08086027950048447,
0.11475935578346252,
-0.014875946566462517,
0.034675631672143936,
0.10762525349855423,
0.0436992309987545,
0.028715964406728745,
0.013012348674237728,
0.030971651896834373,
0.08946637064218521,
-0.04807684198021889,
-0.053273193538188934,
0.03245904669165611,
0.00024360035604331642,
0.1152733713388443,
0.026318613439798355,
0.039697229862213135,
-0.08722762018442154,
-0.051772404462099075,
-0.047151390463113785,
0.08759621530771255,
0.05102605000138283,
0.04930264502763748,
-0.05114861950278282,
0.1402992457151413,
0.056018128991127014,
0.0016811543609946966,
-0.08333244174718857,
0.08916598558425903,
-0.07133176922798157,
0.12845908105373383,
0.09684617817401886,
0.20747412741184235,
0.07998373359441757,
-0.01784064993262291,
-0.08550406247377396,
-0.04216129332780838,
0.010868514887988567,
-0.013408194296061993,
-0.03240665793418884,
-0.16649441421031952,
0.08362001180648804,
-0.03995472937822342,
0.054929111152887344,
-0.14784495532512665,
0.026385053992271423,
0.14776581525802612,
0.0964324027299881,
-0.03621501848101616,
0.08070027828216553,
-0.08533388376235962,
0.053281232714653015,
-0.08163384348154068,
0.033779576420784,
0.058103401213884354,
-0.037215836346149445,
-0.11598384380340576,
0.17028096318244934,
-0.12543925642967224,
0.23815825581550598,
0.1886778026819229,
-0.22368909418582916,
-0.009765755385160446,
0.014911788515746593,
-0.03847067430615425,
0.004793065600097179,
0.13621310889720917,
-0.04425708204507828,
0.043830595910549164,
-0.02098808065056801,
0.10670199245214462,
-0.059622496366500854,
-0.022540383040905,
0.014378657564520836,
-0.0872781053185463,
-0.08534865826368332,
0.1752040535211563,
0.06648798286914825,
-0.15549509227275848,
0.24577514827251434,
0.33590105175971985,
0.03943885490298271,
0.2619292736053467,
0.02238626219332218,
0.01438372116535902,
-0.04297265410423279,
0.002157537965103984,
-0.04271605983376503,
0.11673080921173096,
-0.15978863835334778,
-0.024611175060272217,
-0.002101513557136059,
-0.0021383813582360744,
0.03773748129606247,
-0.10005859285593033,
-0.0866435170173645,
-0.050220321863889694,
-0.0204097181558609,
-0.012040792964398861,
0.09818156063556671,
-0.04373789578676224,
0.1093427985906601,
0.013274559751152992,
-0.08516468107700348,
0.0039291661232709885,
0.018796684220433235,
-0.04586261510848999,
0.15796297788619995,
-0.15460807085037231,
-0.21846649050712585,
-0.0312722846865654,
-0.15006159245967865,
-0.04710331931710243,
0.0007985268603079021,
0.08062424510717392,
-0.11680024862289429,
0.008213517256081104,
0.019004132598638535,
0.12410539388656616,
-0.16156937181949615,
-0.022347578778862953,
-0.12771007418632507,
0.01858498714864254,
-0.07922249287366867,
-0.05517590045928955,
-0.06990037858486176,
-0.01806354708969593,
-0.07621584832668304,
0.15124446153640747,
-0.10649682581424713,
0.10971671342849731,
0.10771945863962173,
0.025463717058300972,
0.04064463824033737,
-0.04614318534731865,
0.1783732771873474,
-0.14617948234081268,
0.014035318978130817,
0.06321181356906891,
0.013874887488782406,
0.09763452410697937,
0.1361120492219925,
0.03534647077322006,
-0.028772998601198196,
-0.022745465859770775,
0.01736723817884922,
-0.047793831676244736,
-0.15801191329956055,
-0.11783154308795929,
-0.0979384258389473,
0.1003713384270668,
-0.06924252957105637,
0.09273553639650345,
0.07760421186685562,
0.021090500056743622,
-0.03623080998659134,
-0.04546547308564186,
-0.06382899731397629,
0.03618235141038895,
0.16822683811187744,
-0.07888562977313995,
0.08940166980028152,
-0.05476089194417,
-0.054483283311128616,
0.12345029413700104,
0.11453749239444733,
0.006663334555923939,
0.08096349984407425,
0.015566159971058369,
0.10771776735782623,
0.11919183284044266,
0.07071197032928467,
-0.025339022278785706,
0.03624750301241875,
-0.06000909209251404,
-0.023656101897358894,
-0.06101708114147186,
0.026264384388923645,
0.04032643139362335,
0.10105569660663605,
-0.06510414928197861,
-0.011474519968032837,
-0.11970417946577072,
0.12310444563627243,
-0.08071883767843246,
0.048189327120780945,
-0.0300357174128294,
0.009804662317037582,
0.0903388112783432,
0.031242936849594116,
-0.06574882566928864,
0.025848155841231346,
0.09544028341770172,
-0.11493287980556488,
0.07775561511516571,
0.04850032925605774,
0.069256491959095,
-0.002082277787849307,
0.10479683429002762,
-0.11088511347770691,
-0.07983589917421341,
-0.023394986987113953,
0.07982928305864334,
-0.3311297297477722,
0.28123921155929565,
0.026481563225388527,
-0.08092144131660461,
0.00022479586186818779,
-0.08229584991931915,
0.0468611977994442,
0.21082155406475067,
0.11715720593929291,
0.08609235286712646,
-0.014760177582502365,
-0.14425300061702728,
0.05762530118227005,
-0.012922546826303005,
0.14756199717521667,
-0.04387863725423813,
0.005037189461290836,
0.0038609644398093224,
0.009889046661555767,
-0.04637126252055168,
0.13232137262821198,
-0.06355935335159302,
-0.12361205369234085,
0.10362088680267334,
-0.015029077418148518,
0.035213809460401535,
-0.018225735053420067,
-0.07217516750097275,
-0.17611242830753326,
0.03597951680421829,
-0.12628227472305298,
-0.009389854967594147,
-0.06653837114572525,
-0.08505524694919586,
0.01296123955398798,
-0.11910154670476913,
-0.04228660836815834,
-0.03607754781842232,
-0.0893062874674797,
-0.13239751756191254,
-0.02395695447921753,
0.09645751863718033,
-0.059619151055812836,
-0.08394881337881088,
-0.007398153189569712,
0.13942648470401764,
0.024037254974246025,
0.09394387155771255,
-0.051983870565891266,
-0.002862086985260248,
-0.0949925035238266,
-0.1286463439464569,
0.06774229556322098,
-0.07473044097423553,
-0.024787429720163345,
0.016305655241012573,
-0.0658894032239914,
-0.03156816586852074,
-0.04543724283576012,
-0.11940400302410126,
0.14549927413463593,
0.3454761207103729,
-0.037812795490026474,
0.11192550510168076,
0.14061091840267181,
-0.06504598259925842,
-0.310202419757843,
-0.13903746008872986,
-0.14378289878368378,
0.012719032354652882,
0.0029083448462188244,
-0.1460660994052887,
-0.09465983510017395,
-0.00975304190069437,
-0.01614423841238022,
0.10811427235603333,
-0.27363529801368713,
-0.06462154537439346,
0.10419250279664993,
-0.0112113943323493,
0.3064541518688202,
-0.13352070748806,
-0.0347733199596405,
-0.039120327681303024,
-0.05675800144672394,
0.037244267761707306,
-0.06713645160198212,
0.11532879620790482,
-0.05515044927597046,
0.03444352000951767,
-0.009005719795823097,
0.015228514559566975,
0.17229688167572021,
-0.017728567123413086,
-0.008952876552939415,
-0.12989215552806854,
-0.17730821669101715,
0.08692365884780884,
0.01966228149831295,
-0.07950735837221146,
-0.19361300766468048,
-0.07385345548391342,
-0.13395529985427856,
0.016869790852069855,
-0.11988642811775208,
0.08914313465356827,
-0.05134717747569084,
-0.0936189517378807,
-0.09860058128833771,
0.05681392550468445,
-0.04492238909006119,
0.0017982599092647433,
0.19653378427028656,
-0.06874193996191025,
0.13701412081718445,
0.06755245476961136,
0.07950988411903381,
-0.07874859869480133,
0.04213119298219681,
-0.05379772186279297,
-0.03234681859612465,
0.03599962592124939,
-0.10244202613830566,
-0.02142791450023651,
0.12833340466022491,
-0.03422391414642334,
0.077253058552742,
0.05760922655463219,
-0.10322903841733932,
0.031125910580158234,
0.1289598047733307,
-0.14122794568538666,
-0.2032240778207779,
-0.06997325271368027,
-0.05455528572201729,
0.11663806438446045,
0.02931256778538227,
0.14897732436656952,
-0.018466081470251083,
-0.002023848472163081,
0.006303946487605572,
-0.022812023758888245,
-0.030053934082388878,
0.02781246230006218,
0.08473300188779831,
0.007172154262661934,
-0.06481961160898209,
0.022891338914632797,
0.015549338422715664,
-0.11121359467506409,
-0.015522508881986141,
0.22005322575569153,
-0.0678650438785553,
-0.15019644796848297,
-0.0710986852645874,
0.10318738222122192,
-0.12870755791664124,
-0.03453217074275017,
-0.025575291365385056,
-0.11447843164205551,
0.051949020475149155,
0.22961291670799255,
0.07320072501897812,
0.03941493481397629,
0.017798328772187233,
-0.0029384854715317488,
0.11878244578838348,
0.05105516314506531,
-0.013667344115674496,
-0.0037846555933356285,
-0.064037024974823,
0.07082504034042358,
0.009539357386529446,
0.1982763558626175,
-0.046576496213674545,
-0.06081189587712288,
-0.16002018749713898,
0.01174408569931984,
-0.0997338593006134,
-0.06004466861486435,
-0.0799938514828682,
-0.05220019072294235,
-0.03145388141274452,
-0.11331654340028763,
-0.06102839857339859,
-0.06362365186214447,
-0.11121636629104614,
0.013804384507238865,
0.029895441606640816,
0.11307426542043686,
-0.054702065885066986,
-0.008702273480594158,
0.10458724200725555,
-0.00274676620028913,
0.10720079392194748,
0.10579271614551544,
-0.0008025504066608846,
0.10837171971797943,
-0.156077578663826,
0.0011700987815856934,
0.04366722330451012,
0.03554334491491318,
0.02511286549270153,
0.11150961369276047,
-0.05397804453969002,
-0.039646215736866,
0.07263051718473434,
0.0730462297797203,
0.036420393735170364,
-0.037187665700912476,
0.13746502995491028,
-0.04222838953137398,
-0.13169699907302856,
-0.0257820263504982,
0.07714912295341492,
0.03407567739486694,
0.022260991856455803,
0.045378848910331726,
-0.06408196687698364,
0.023539062589406967,
-0.04935844615101814,
0.047815658152103424,
0.00916222669184208,
-0.1670481413602829,
-0.011634587310254574,
-0.10803371667861938,
0.009458196349442005,
-0.01657247357070446,
0.10336419939994812,
0.01915554143488407,
-0.05761474743485451,
0.02108202502131462,
0.04061652719974518,
-0.08687503635883331,
-0.015802042558789253,
0.07727106660604477,
0.03993947431445122,
-0.05036790668964386,
-0.15073753893375397,
0.04043283313512802,
-0.012365148402750492,
0.012900935485959053,
0.08855101466178894,
0.09492862969636917,
0.12545916438102722,
0.06577549874782562,
-0.011929357424378395,
-0.06595491617918015,
-0.0763486996293068,
-0.12048141658306122,
0.02034173719584942,
-0.03204232454299927,
-0.08856990933418274,
0.17140772938728333,
0.20295360684394836,
-0.051987528800964355,
0.0372222363948822,
-0.02170143835246563,
-0.01033688522875309,
-0.11743791401386261,
-0.09206581115722656,
0.002729605883359909,
-0.08538643270730972,
-0.022291310131549835,
-0.04356653615832329,
0.06401991099119186,
0.016589928418397903,
0.06521348655223846,
-0.03430306911468506,
0.06329986453056335,
0.02844730019569397,
-0.1270252913236618,
0.01346118189394474,
-0.0159861221909523,
0.0746004730463028,
-0.1172826737165451,
0.027760399505496025,
-0.07098238915205002,
-0.08039394021034241,
-0.0293279979377985,
0.055786773562431335,
-0.04394679516553879,
-0.05820678919553757,
-0.1291230022907257,
-0.07263119518756866,
-0.03334438055753708,
0.07483275234699249,
0.025521783158183098,
0.17137157917022705,
0.009027399122714996,
-0.009413990192115307,
0.0037210064474493265,
0.1917780637741089,
-0.018550025299191475,
-0.06129809841513634,
0.059795256704092026,
0.1276845782995224,
0.011627979576587677,
0.10016095638275146,
-0.07131656259298325,
0.005038734991103411,
-0.046699926257133484,
0.22317607700824738,
0.3228292465209961,
-0.10354028642177582,
0.07509936392307281,
0.04303448274731636,
0.06761054694652557,
0.07907655835151672,
-0.006174579728394747,
0.08538781851530075,
0.224729984998703,
-0.10598460584878922,
0.032810527831315994,
-0.10262739658355713,
0.03251698985695839,
-0.04085691273212433,
0.08467122912406921,
0.028402527794241905,
-0.037388984113931656,
-0.043320104479789734,
0.0602838508784771,
-0.17322766780853271,
0.036808259785175323,
-0.03372122347354889,
-0.21406817436218262,
-0.04125659167766571,
0.010391508229076862,
0.1334380805492401,
0.09429479390382767,
0.06841708719730377,
-0.013639396987855434,
-0.06926210969686508,
-0.017665795981884003,
0.04275286942720413,
-0.17387425899505615,
0.0021445921156555414,
0.08031835407018661,
-0.07216359674930573,
0.059708110988140106,
-0.056098487228155136,
-0.034750327467918396,
0.12295584380626678,
0.03068564273416996,
0.03770766407251358,
0.07144840061664581,
0.0814020186662674,
0.014821978285908699,
-0.08352191001176834,
0.016450613737106323,
0.05675453692674637,
-0.054470494389534,
0.1102517694234848,
-0.027265852317214012,
0.09288714826107025,
0.08014126867055893,
-0.06548468768596649,
0.018233921378850937,
0.13392896950244904,
-0.08782373368740082,
0.043213482946157455,
0.07138966768980026,
0.03163139894604683,
-0.058815620839595795,
-0.04945933073759079,
-0.07623022049665451,
0.020549682900309563,
-0.006620882079005241,
-0.040207453072071075,
-0.0048984321765601635,
-0.048977307975292206,
0.031954605132341385,
0.046806760132312775,
-0.11315256357192993,
-0.039495740085840225,
-0.044832032173871994,
0.07718908786773682,
-0.09562487155199051,
0.06908905506134033,
0.07356790453195572,
-0.007321787066757679,
0.006782867480069399,
-0.15468986332416534,
0.033609144389629364,
0.060267891734838486,
-0.06247176229953766,
-0.02353094331920147
] |
null | null |
transformers
|
# XLM-RoBERTa-XL (xlarge-sized model)
XLM-RoBERTa-XL model pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. It was introduced in the paper [Larger-Scale Transformers for Multilingual Masked Language Modeling](https://arxiv.org/abs/2105.00572) by Naman Goyal, Jingfei Du, Myle Ott, Giri Anantharaman, Alexis Conneau and first released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/xlmr).
Disclaimer: The team releasing XLM-RoBERTa-XL did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
XLM-RoBERTa-XL is an extra large multilingual version of RoBERTa. It is pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages.
RoBERTa is a transformers model pretrained on a large corpus in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labeling them in any way (which is why it can use lots of publicly available data), using an automatic process to generate inputs and labels from those texts.
More precisely, it was pretrained with the Masked language modeling (MLM) objective. Taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one after the other, and from autoregressive models like GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.
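As an illustration of the MLM objective described above, the generic masking collator from transformers can be used to mask roughly 15% of the tokens in a sentence. This is only a sketch of the objective, not the pre-training pipeline actually used for XLM-RoBERTa-XL.
```python
# Sketch of the MLM objective: mask ~15% of tokens, predict them from context.
from transformers import AutoTokenizer, DataCollatorForLanguageModeling

tokenizer = AutoTokenizer.from_pretrained("facebook/xlm-roberta-xl")
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15)

encoding = tokenizer("Europe is a beautiful continent.", return_tensors="pt")
batch = collator([{"input_ids": encoding["input_ids"][0]}])

print(batch["input_ids"])  # some positions replaced by the <mask> token id
print(batch["labels"])     # -100 everywhere except the masked positions
```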
This way, the model learns an inner representation of 100 languages that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the XLM-RoBERTa-XL model as inputs.
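The snippet below sketches that feature-extraction workflow: sentences are encoded with the pre-trained model and mean-pooled into fixed-size vectors that a downstream classifier can consume. The pooling strategy and the tiny linear head are illustrative choices (the head is untrained here), not part of the released model, and the example assumes a transformers version that includes XLM-RoBERTa-XL support.
```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("facebook/xlm-roberta-xl")
encoder = AutoModel.from_pretrained("facebook/xlm-roberta-xl")
encoder.eval()

sentences = ["I loved this movie.", "Ce film était terrible."]
batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    hidden = encoder(**batch).last_hidden_state             # (batch, seq_len, hidden_size)
    mask = batch["attention_mask"].unsqueeze(-1)             # ignore padding when pooling
    features = (hidden * mask).sum(dim=1) / mask.sum(dim=1)  # mean-pooled sentence vectors

# Hypothetical downstream head: a linear classifier trained on these features.
classifier = torch.nn.Linear(encoder.config.hidden_size, 2)
logits = classifier(features)
print(logits.shape)  # torch.Size([2, 2]): one score per class for each sentence
```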
## Intended uses & limitations
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?search=xlm-roberta-xl) to look for fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2.
## Usage
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='facebook/xlm-roberta-xl')
>>> unmasker("Europe is a <mask> continent.")
[{'score': 0.08562745153903961,
'token': 38043,
'token_str': 'living',
'sequence': 'Europe is a living continent.'},
{'score': 0.0799778401851654,
'token': 103494,
'token_str': 'dead',
'sequence': 'Europe is a dead continent.'},
{'score': 0.046154674142599106,
'token': 72856,
'token_str': 'lost',
'sequence': 'Europe is a lost continent.'},
{'score': 0.04358183592557907,
'token': 19336,
'token_str': 'small',
'sequence': 'Europe is a small continent.'},
{'score': 0.040570393204689026,
'token': 34923,
'token_str': 'beautiful',
'sequence': 'Europe is a beautiful continent.'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import AutoTokenizer, AutoModelForMaskedLM
tokenizer = AutoTokenizer.from_pretrained('facebook/xlm-roberta-xl')
model = AutoModelForMaskedLM.from_pretrained("facebook/xlm-roberta-xl")
# prepare input
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
# forward pass
output = model(**encoded_input)
```
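Note that the block above loads the model with its masked-LM head, whose output exposes vocabulary logits rather than token embeddings. If what you want are the hidden-state features themselves, one option, sketched below under the same assumptions as the example above, is to request the hidden states explicitly:
```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("facebook/xlm-roberta-xl")
model = AutoModelForMaskedLM.from_pretrained("facebook/xlm-roberta-xl")

encoded_input = tokenizer("Replace me by any text you'd like.", return_tensors="pt")
with torch.no_grad():
    output = model(**encoded_input, output_hidden_states=True)

last_hidden_state = output.hidden_states[-1]  # (batch, seq_len, hidden_size)
print(last_hidden_state.shape)
```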
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2105-00572,
author = {Naman Goyal and
Jingfei Du and
Myle Ott and
Giri Anantharaman and
Alexis Conneau},
title = {Larger-Scale Transformers for Multilingual Masked Language Modeling},
journal = {CoRR},
volume = {abs/2105.00572},
year = {2021},
url = {https://arxiv.org/abs/2105.00572},
eprinttype = {arXiv},
eprint = {2105.00572},
timestamp = {Wed, 12 May 2021 15:54:31 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2105-00572.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
|
{"language": ["multilingual", "af", "am", "ar", "as", "az", "be", "bg", "bn", "br", "bs", "ca", "cs", "cy", "da", "de", "el", "en", "eo", "es", "et", "eu", "fa", "fi", "fr", "fy", "ga", "gd", "gl", "gu", "ha", "he", "hi", "hr", "hu", "hy", "id", "is", "it", "ja", "jv", "ka", "kk", "km", "kn", "ko", "ku", "ky", "la", "lo", "lt", "lv", "mg", "mk", "ml", "mn", "mr", "ms", "my", "ne", "nl", false, "om", "or", "pa", "pl", "ps", "pt", "ro", "ru", "sa", "sd", "si", "sk", "sl", "so", "sq", "sr", "su", "sv", "sw", "ta", "te", "th", "tl", "tr", "ug", "uk", "ur", "uz", "vi", "xh", "yi", "zh"], "license": "mit"}
|
fill-mask
|
facebook/xlm-roberta-xl
|
[
"transformers",
"pytorch",
"xlm-roberta-xl",
"fill-mask",
"multilingual",
"af",
"am",
"ar",
"as",
"az",
"be",
"bg",
"bn",
"br",
"bs",
"ca",
"cs",
"cy",
"da",
"de",
"el",
"en",
"eo",
"es",
"et",
"eu",
"fa",
"fi",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"he",
"hi",
"hr",
"hu",
"hy",
"id",
"is",
"it",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"ku",
"ky",
"la",
"lo",
"lt",
"lv",
"mg",
"mk",
"ml",
"mn",
"mr",
"ms",
"my",
"ne",
"nl",
"no",
"om",
"or",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"sa",
"sd",
"si",
"sk",
"sl",
"so",
"sq",
"sr",
"su",
"sv",
"sw",
"ta",
"te",
"th",
"tl",
"tr",
"ug",
"uk",
"ur",
"uz",
"vi",
"xh",
"yi",
"zh",
"arxiv:2105.00572",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2105.00572"
] |
[
"multilingual",
"af",
"am",
"ar",
"as",
"az",
"be",
"bg",
"bn",
"br",
"bs",
"ca",
"cs",
"cy",
"da",
"de",
"el",
"en",
"eo",
"es",
"et",
"eu",
"fa",
"fi",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"he",
"hi",
"hr",
"hu",
"hy",
"id",
"is",
"it",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"ku",
"ky",
"la",
"lo",
"lt",
"lv",
"mg",
"mk",
"ml",
"mn",
"mr",
"ms",
"my",
"ne",
"nl",
"no",
"om",
"or",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"sa",
"sd",
"si",
"sk",
"sl",
"so",
"sq",
"sr",
"su",
"sv",
"sw",
"ta",
"te",
"th",
"tl",
"tr",
"ug",
"uk",
"ur",
"uz",
"vi",
"xh",
"yi",
"zh"
] |
TAGS
#transformers #pytorch #xlm-roberta-xl #fill-mask #multilingual #af #am #ar #as #az #be #bg #bn #br #bs #ca #cs #cy #da #de #el #en #eo #es #et #eu #fa #fi #fr #fy #ga #gd #gl #gu #ha #he #hi #hr #hu #hy #id #is #it #ja #jv #ka #kk #km #kn #ko #ku #ky #la #lo #lt #lv #mg #mk #ml #mn #mr #ms #my #ne #nl #no #om #or #pa #pl #ps #pt #ro #ru #sa #sd #si #sk #sl #so #sq #sr #su #sv #sw #ta #te #th #tl #tr #ug #uk #ur #uz #vi #xh #yi #zh #arxiv-2105.00572 #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
# XLM-RoBERTa-XL (xlarge-sized model)
XLM-RoBERTa-XL model pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. It was introduced in the paper Larger-Scale Transformers for Multilingual Masked Language Modeling by Naman Goyal, Jingfei Du, Myle Ott, Giri Anantharaman, Alexis Conneau and first released in this repository.
Disclaimer: The team releasing XLM-RoBERTa-XL did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
XLM-RoBERTa-XL is an extra large multilingual version of RoBERTa. It is pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages.
RoBERTa is a transformers model pretrained on a large corpus in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labeling them in any way (which is why it can use lots of publicly available data), using an automatic process to generate inputs and labels from those texts.
More precisely, it was pretrained with the Masked language modeling (MLM) objective. Taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one after the other, and from autoregressive models like GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.
This way, the model learns an inner representation of 100 languages that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the XLM-RoBERTa-XL model as inputs.
## Intended uses & limitations
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2.
## Usage
You can use this model directly with a pipeline for masked language modeling:
Here is how to use this model to get the features of a given text in PyTorch:
### BibTeX entry and citation info
|
[
"# XLM-RoBERTa-XL (xlarge-sized model) \n\nXLM-RoBERTa-XL model pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. It was introduced in the paper Larger-Scale Transformers for Multilingual Masked Language Modeling by Naman Goyal, Jingfei Du, Myle Ott, Giri Anantharaman, Alexis Conneau and first released in this repository. \n\nDisclaimer: The team releasing XLM-RoBERTa-XL did not write a model card for this model so this model card has been written by the Hugging Face team.",
"## Model description\n\nXLM-RoBERTa-XL is a extra large multilingual version of RoBERTa. It is pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. \n\nRoBERTa is a transformers model pretrained on a large corpus in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labeling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts.\n\nMore precisely, it was pretrained with the Masked language modeling (MLM) objective. Taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.\n\nThis way, the model learns an inner representation of 100 languages that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the XLM-RoBERTa-XL model as inputs.",
"## Intended uses & limitations\n\nYou can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you.\n\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2.",
"## Usage\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\n\nHere is how to use this model to get the features of a given text in PyTorch:",
"### BibTeX entry and citation info"
] |
[
"TAGS\n#transformers #pytorch #xlm-roberta-xl #fill-mask #multilingual #af #am #ar #as #az #be #bg #bn #br #bs #ca #cs #cy #da #de #el #en #eo #es #et #eu #fa #fi #fr #fy #ga #gd #gl #gu #ha #he #hi #hr #hu #hy #id #is #it #ja #jv #ka #kk #km #kn #ko #ku #ky #la #lo #lt #lv #mg #mk #ml #mn #mr #ms #my #ne #nl #no #om #or #pa #pl #ps #pt #ro #ru #sa #sd #si #sk #sl #so #sq #sr #su #sv #sw #ta #te #th #tl #tr #ug #uk #ur #uz #vi #xh #yi #zh #arxiv-2105.00572 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"# XLM-RoBERTa-XL (xlarge-sized model) \n\nXLM-RoBERTa-XL model pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. It was introduced in the paper Larger-Scale Transformers for Multilingual Masked Language Modeling by Naman Goyal, Jingfei Du, Myle Ott, Giri Anantharaman, Alexis Conneau and first released in this repository. \n\nDisclaimer: The team releasing XLM-RoBERTa-XL did not write a model card for this model so this model card has been written by the Hugging Face team.",
"## Model description\n\nXLM-RoBERTa-XL is a extra large multilingual version of RoBERTa. It is pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. \n\nRoBERTa is a transformers model pretrained on a large corpus in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labeling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts.\n\nMore precisely, it was pretrained with the Masked language modeling (MLM) objective. Taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.\n\nThis way, the model learns an inner representation of 100 languages that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the XLM-RoBERTa-XL model as inputs.",
"## Intended uses & limitations\n\nYou can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you.\n\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2.",
"## Usage\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\n\nHere is how to use this model to get the features of a given text in PyTorch:",
"### BibTeX entry and citation info"
] |
[
253,
145,
313,
129,
40,
11
] |
[
"passage: TAGS\n#transformers #pytorch #xlm-roberta-xl #fill-mask #multilingual #af #am #ar #as #az #be #bg #bn #br #bs #ca #cs #cy #da #de #el #en #eo #es #et #eu #fa #fi #fr #fy #ga #gd #gl #gu #ha #he #hi #hr #hu #hy #id #is #it #ja #jv #ka #kk #km #kn #ko #ku #ky #la #lo #lt #lv #mg #mk #ml #mn #mr #ms #my #ne #nl #no #om #or #pa #pl #ps #pt #ro #ru #sa #sd #si #sk #sl #so #sq #sr #su #sv #sw #ta #te #th #tl #tr #ug #uk #ur #uz #vi #xh #yi #zh #arxiv-2105.00572 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# XLM-RoBERTa-XL (xlarge-sized model) \n\nXLM-RoBERTa-XL model pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. It was introduced in the paper Larger-Scale Transformers for Multilingual Masked Language Modeling by Naman Goyal, Jingfei Du, Myle Ott, Giri Anantharaman, Alexis Conneau and first released in this repository. \n\nDisclaimer: The team releasing XLM-RoBERTa-XL did not write a model card for this model so this model card has been written by the Hugging Face team."
] |
[
-0.062494099140167236,
-0.0038251082878559828,
-0.006299053784459829,
0.03416461870074272,
0.06636992841959,
0.05696665868163109,
0.1301594227552414,
0.1164410337805748,
0.058299727737903595,
0.08418241888284683,
0.10177101939916611,
0.07592888921499252,
0.11451225727796555,
0.14999082684516907,
0.00632118945941329,
-0.2565872371196747,
0.022373545914888382,
-0.030876193195581436,
-0.01940525509417057,
0.10134967416524887,
0.09455230087041855,
-0.020032642409205437,
0.1288054883480072,
0.0019617092330008745,
0.0019186610588803887,
-0.018064050003886223,
-0.01044241338968277,
-0.04957197979092598,
0.03927458077669144,
0.09960562735795975,
0.021508578211069107,
0.07323254644870758,
0.06253741681575775,
-0.209028422832489,
0.027498213574290276,
0.02184625342488289,
-0.07087818533182144,
0.029988139867782593,
0.0486619770526886,
-0.07312130928039551,
0.19593526422977448,
-0.0803399533033371,
-0.02380966767668724,
0.02754760906100273,
-0.1633739024400711,
-0.21147073805332184,
-0.10059225559234619,
0.11731649935245514,
0.04728607460856438,
0.041561998426914215,
-0.04622643440961838,
0.0841120034456253,
-0.03895140066742897,
0.052673742175102234,
0.1717076599597931,
-0.24943208694458008,
-0.05992211028933525,
0.07219153642654419,
0.08119920641183853,
0.0022563966922461987,
-0.07123278826475143,
0.06148035451769829,
0.07349608093500137,
0.012264646589756012,
-0.03370879963040352,
-0.039590902626514435,
0.2000400871038437,
-0.020589729771018028,
-0.10789548605680466,
0.005351935978978872,
0.12274196743965149,
0.030863383784890175,
-0.014287113212049007,
-0.04011308029294014,
-0.02103699930012226,
-0.04420257732272148,
-0.06190486624836922,
-0.018714534118771553,
0.01678578369319439,
-0.0014273476554080844,
0.037252027541399,
0.01580386608839035,
-0.07467076927423477,
0.01124521717429161,
-0.03666142374277115,
0.14679522812366486,
0.011951815336942673,
0.02163039706647396,
-0.06073626130819321,
-0.01619267277419567,
-0.1030573844909668,
-0.12402933835983276,
-0.020227057859301567,
-0.023542601615190506,
-0.010385839268565178,
-0.0036910143680870533,
0.04112301394343376,
0.025416506454348564,
0.06894956529140472,
0.11760881543159485,
-0.07672950625419617,
0.08685097843408585,
0.08248325437307358,
0.04641102999448776,
0.054049745202064514,
0.07288331538438797,
-0.14289042353630066,
-0.04593624174594879,
-0.07895466685295105,
0.02301895059645176,
0.03163956478238106,
-0.035240497440099716,
-0.05231831967830658,
0.03916659578680992,
-0.03300473839044571,
0.016598330810666084,
0.019159434363245964,
0.0625925213098526,
-0.04564689099788666,
0.002423426602035761,
0.13584518432617188,
-0.09081418067216873,
0.0204501673579216,
0.03136508911848068,
-0.0007980200462043285,
0.08031373471021652,
-0.02897348441183567,
-0.00027415965450927615,
0.00235982914455235,
0.1098211482167244,
-0.06080854684114456,
0.010701235383749008,
-0.01692984625697136,
-0.09210781753063202,
0.03576607629656792,
-0.08817661553621292,
-0.014847959391772747,
-0.0965229794383049,
-0.07938709110021591,
-0.06615571677684784,
0.03274563327431679,
-0.06797954440116882,
0.0028900483157485723,
-0.0393710695207119,
-0.0925958901643753,
0.07704904675483704,
0.0321783572435379,
0.019463129341602325,
-0.08332357555627823,
0.05058818683028221,
-0.03868896886706352,
0.10630594938993454,
-0.0004997015930712223,
0.02110297605395317,
-0.06307670474052429,
0.08573749661445618,
-0.08382035046815872,
0.08121678233146667,
-0.07864230871200562,
-0.0065132929012179375,
-0.12456539273262024,
-0.10179170221090317,
-0.050978660583496094,
0.04022429510951042,
0.022975394502282143,
0.18337470293045044,
-0.23428773880004883,
-0.11182475835084915,
0.23232406377792358,
-0.10072336345911026,
-0.027205873280763626,
0.12928925454616547,
0.02988590858876705,
0.025489674881100655,
0.06489376723766327,
0.11085496842861176,
0.0640689879655838,
-0.08203423023223877,
-0.06748336553573608,
0.02981441468000412,
0.01762395165860653,
0.1680532991886139,
0.08891500532627106,
0.0044713737443089485,
0.004313644487410784,
0.02263595722615719,
-0.07204821705818176,
-0.002869725227355957,
-0.07213692367076874,
-0.07499665766954422,
0.06887935847043991,
-0.030923597514629364,
0.10544705390930176,
0.054777540266513824,
-0.011272947303950787,
-0.030852532014250755,
-0.07991395890712738,
-0.10628307610750198,
0.11542127281427383,
0.020058657974004745,
-0.012244822457432747,
-0.10810084640979767,
0.10547082126140594,
-0.009281889535486698,
0.027529405429959297,
-0.10846345871686935,
0.0798831582069397,
-0.01496859174221754,
-0.04341665282845497,
0.05597150698304176,
0.05604430288076401,
0.09182978421449661,
0.018378889188170433,
-0.060607437044382095,
-0.029213929548859596,
0.0557214692234993,
-0.025187203660607338,
-0.0403110533952713,
-0.22271735966205597,
0.00023898218933027238,
-0.055908117443323135,
0.1628248542547226,
-0.1922646015882492,
0.007102642673999071,
0.03615245223045349,
0.10506809502840042,
0.014301776885986328,
-0.00018749263836070895,
-0.01474738959223032,
0.04951201006770134,
-0.023861052468419075,
-0.012656520120799541,
0.045812420547008514,
-0.02432969957590103,
-0.0948258563876152,
0.05567638948559761,
-0.10256616026163101,
0.023898400366306305,
0.09312425553798676,
-0.023547925055027008,
-0.11709395796060562,
0.09910742938518524,
0.005512219853699207,
-0.018971124663949013,
0.03765974938869476,
0.021456170827150345,
0.10093653202056885,
0.010661591775715351,
0.06496905535459518,
-0.0863957405090332,
-0.015683790668845177,
0.06467501074075699,
-0.07676982134580612,
-0.09153709560632706,
0.17828771471977234,
0.05946417152881622,
-0.2465742528438568,
0.14432963728904724,
0.1364348828792572,
0.05408993363380432,
0.2435937523841858,
0.02929018996655941,
-0.040607038885354996,
-0.08768590539693832,
-0.0061775255016982555,
0.027668217197060585,
0.06534766405820847,
-0.06210345774888992,
-0.012552318163216114,
0.003493169555440545,
-0.03174758329987526,
0.004117319360375404,
-0.07836294174194336,
-0.03080267272889614,
-0.026381079107522964,
-0.060352079570293427,
-0.02004684880375862,
0.07411885261535645,
-0.05299335718154907,
0.1177123561501503,
0.03839608654379845,
-0.0366319939494133,
-0.040449656546115875,
-0.02537987381219864,
-0.08971752226352692,
0.17863373458385468,
-0.11329920589923859,
-0.21104571223258972,
-0.05194613337516785,
-0.060028668493032455,
-0.02755272015929222,
0.00714656850323081,
-0.0014938990352675319,
-0.14258459210395813,
-0.022074168547987938,
-0.02201024256646633,
0.04330702871084213,
-0.05345204100012779,
-0.04728563874959946,
0.011572929099202156,
-0.01364734023809433,
0.006173641420900822,
-0.06763320416212082,
-0.012729127891361713,
-0.030500220134854317,
-0.05587341636419296,
0.08013859391212463,
-0.07754071056842804,
0.081380195915699,
0.12041553109884262,
0.025588830932974815,
0.008693232201039791,
0.013926527462899685,
0.21182918548583984,
-0.14090730249881744,
0.03476168215274811,
0.0366566926240921,
0.00976085290312767,
0.0655006468296051,
0.14468473196029663,
0.04466839134693146,
-0.05772219970822334,
-0.050746168941259384,
0.008282095193862915,
-0.054460614919662476,
-0.1788630187511444,
-0.08154258877038956,
-0.04349314793944359,
0.02657310478389263,
0.022024571895599365,
0.06316030770540237,
-0.026359908282756805,
0.037410613149404526,
-0.08590462803840637,
-0.024801082909107208,
0.04079986736178398,
0.0027918857522308826,
0.0070197610184550285,
-0.011082179844379425,
0.049106910824775696,
-0.04058948531746864,
-0.03776882588863373,
0.09735281020402908,
-0.005191294010728598,
0.05926962569355965,
0.07472958415746689,
0.1353045403957367,
0.0759337916970253,
0.059275638312101364,
0.05279464274644852,
0.024809863418340683,
0.023016011342406273,
-0.00908448826521635,
-0.002722029108554125,
-0.06571643799543381,
-0.004377863835543394,
0.04780374467372894,
0.0819612368941307,
-0.014167084358632565,
-0.005962229333817959,
0.05530140921473503,
0.06236828863620758,
0.12955905497074127,
0.09107454866170883,
-0.1704234927892685,
-0.032540399581193924,
0.05921773239970207,
0.04607786983251572,
-0.06685822457075119,
-0.006363990716636181,
0.02177007868885994,
-0.0950809046626091,
0.1923172026872635,
0.026838738471269608,
0.05585535615682602,
-0.11237508058547974,
0.010529172606766224,
0.02953500673174858,
0.04383612051606178,
-0.01093632634729147,
0.06955881416797638,
-0.21541357040405273,
0.2554686367511749,
0.02890622988343239,
0.022792814299464226,
-0.037219349294900894,
-0.01699504628777504,
0.04410627856850624,
0.04608243331313133,
0.17729489505290985,
0.061918072402477264,
-0.0982118472456932,
-0.14782162010669708,
-0.05357882380485535,
-0.015748705714941025,
0.12307858467102051,
-0.06394213438034058,
0.0729108601808548,
0.0188305601477623,
-0.03627642244100571,
-0.04662488028407097,
0.06323505938053131,
-0.11607915163040161,
-0.0885646864771843,
0.12215527892112732,
-0.06902789324522018,
-0.08186326175928116,
-0.015082609839737415,
-0.08901262283325195,
-0.171999990940094,
0.08649759739637375,
-0.09842848777770996,
-0.06720190495252609,
-0.08083605766296387,
0.008677934296429157,
0.09346635639667511,
-0.1385149508714676,
-0.0413907952606678,
-0.011494182050228119,
0.04237287491559982,
-0.035590264946222305,
-0.059996481984853745,
0.055587753653526306,
-0.06606338173151016,
-0.19307903945446014,
-0.0011095603695139289,
0.10008495301008224,
0.08956278860569,
0.07941152155399323,
0.0009959484450519085,
0.04698944091796875,
0.022550037130713463,
-0.11802267283201218,
0.13693338632583618,
0.07598751783370972,
-0.08998169749975204,
0.069193035364151,
-0.02835562452673912,
-0.1398714929819107,
-0.09283730387687683,
-0.057904958724975586,
0.06110997870564461,
0.31990817189216614,
-0.05783596262335777,
0.10568848252296448,
0.17541973292827606,
-0.10852853208780289,
-0.22820988297462463,
-0.11065642535686493,
0.003063111798837781,
0.05032797530293465,
-0.03839590400457382,
-0.18078556656837463,
-0.03368889540433884,
0.05361710116267204,
-0.006775050424039364,
0.07581652700901031,
-0.24600572884082794,
-0.08887169510126114,
0.10302062332630157,
-0.018503818660974503,
0.07811420410871506,
-0.17050841450691223,
-0.08276593685150146,
-0.07929015159606934,
-0.05741233378648758,
-0.00942149292677641,
-0.006704387255012989,
0.07408565282821655,
-0.0237873662263155,
0.0023099060636013746,
0.002068890258669853,
0.005531714763492346,
0.18930894136428833,
0.005847578402608633,
0.028618084266781807,
-0.11199808865785599,
-0.12647075951099396,
0.03151135891675949,
0.008972864598035812,
0.03436387702822685,
-0.06483320146799088,
-0.0264410562813282,
-0.042226631194353104,
-0.006409467197954655,
-0.14065006375312805,
0.030111070722341537,
-0.06406357139348984,
-0.01686016283929348,
-0.05842901021242142,
0.0930706337094307,
0.0382891446352005,
-0.002538869855925441,
0.07889261841773987,
-0.07346273213624954,
0.15503382682800293,
0.04914204031229019,
0.17443786561489105,
0.028862295672297478,
-0.0028750665951520205,
-0.05162693187594414,
-0.02245546691119671,
0.03673659265041351,
-0.10623311251401901,
0.0038962820544838905,
0.09813372045755386,
0.05093109980225563,
0.12105856090784073,
0.021597890183329582,
-0.13458415865898132,
0.0065917205065488815,
0.11581559479236603,
-0.09544973075389862,
-0.21407024562358856,
-0.014078846201300621,
-0.054945845156908035,
-0.027034256607294083,
-0.024097608402371407,
0.12629514932632446,
-0.049865808337926865,
-0.033153027296066284,
-0.015956683084368706,
0.10015901178121567,
-0.04299561306834221,
0.13911637663841248,
0.0557425357401371,
0.018893787637352943,
-0.11168204993009567,
0.04885232821106911,
0.0669289380311966,
0.010236788541078568,
0.014731268398463726,
0.11809783428907394,
-0.07523738592863083,
-0.08215706795454025,
-0.053880009800195694,
0.1832069456577301,
-0.023060165345668793,
-0.036027755588293076,
0.006555831991136074,
-0.11953108757734299,
0.044424958527088165,
0.1517438292503357,
0.023176947608590126,
0.014978494495153427,
0.04982519894838333,
-0.024594413116574287,
0.016781942918896675,
0.06628355383872986,
0.08990798145532608,
-0.008448177017271519,
-0.07897194474935532,
0.052265945822000504,
-0.005800903774797916,
0.10912897437810898,
-0.02690013311803341,
-0.014728429727256298,
-0.13008242845535278,
-0.0029889766592532396,
-0.07610146701335907,
0.04920016601681709,
-0.13524220883846283,
0.00020625341858249158,
-0.003831433365121484,
-0.034640535712242126,
-0.058823104947805405,
-0.06121295690536499,
-0.09645507484674454,
-0.020396534353494644,
-0.020027335733175278,
0.09690167754888535,
-0.06785746663808823,
-0.033447228372097015,
0.0466480478644371,
-0.054517991840839386,
0.060507502406835556,
0.0478585846722126,
0.020487817004323006,
0.05921626836061478,
-0.16582897305488586,
-0.01415135245770216,
0.0013743223389610648,
0.02008754387497902,
0.006935999728739262,
-0.13129891455173492,
-0.023921888321638107,
-0.04582209140062332,
-0.02459082007408142,
0.03523515909910202,
-0.013277947902679443,
-0.045156948268413544,
0.10063540190458298,
-0.022079389542341232,
-0.056928690522909164,
-0.0350620299577713,
0.07781611382961273,
0.0859067365527153,
0.023885183036327362,
0.06286393851041794,
-0.07520203292369843,
0.06596412509679794,
-0.13780927658081055,
0.01994020864367485,
-0.03519390523433685,
-0.09672345221042633,
0.019993003457784653,
-0.02334820106625557,
0.07446318864822388,
-0.029836799949407578,
0.09926789253950119,
0.03143152594566345,
-0.038125261664390564,
-0.009768541902303696,
-0.021978896111249924,
-0.09266339242458344,
0.028480974957346916,
0.03968369960784912,
0.0470578595995903,
-0.030840836465358734,
-0.06512323021888733,
0.03310113027691841,
-0.052994124591350555,
0.058889176696538925,
0.10601745545864105,
0.07771546393632889,
0.15388983488082886,
0.057755086570978165,
0.014925681985914707,
-0.10836130380630493,
0.01408084575086832,
0.014500427059829235,
-0.10694164782762527,
0.04935803636908531,
-0.0003326562000438571,
0.053098611533641815,
0.20061181485652924,
-0.1660783737897873,
0.07622285932302475,
-0.08436250686645508,
-0.0695284828543663,
-0.1231294497847557,
-0.227254718542099,
-0.0848255455493927,
-0.012163649313151836,
0.024776212871074677,
-0.09748143702745438,
0.04474278911948204,
0.06002670153975487,
0.04990702122449875,
-0.0013842870248481631,
0.024324549362063408,
-0.06758413463830948,
-0.07551030069589615,
0.10667066276073456,
0.013020632788538933,
-0.009083311073482037,
-0.08819861710071564,
0.018168991431593895,
0.020220911130309105,
-0.028498748317360878,
-0.00893402099609375,
0.02053804323077202,
-0.02155814878642559,
0.05387669801712036,
-0.056114062666893005,
-0.12224555760622025,
0.001954448176547885,
0.019050709903240204,
0.07868566364049911,
0.07115118950605392,
0.021444477140903473,
0.0016670082695782185,
0.01717834174633026,
0.045537013560533524,
-0.011082176119089127,
-0.03830553963780403,
-0.11230119317770004,
0.12235134840011597,
-0.07697612047195435,
0.01830812729895115,
-0.015255320817232132,
-0.05455584079027176,
0.024817684665322304,
0.21876715123653412,
0.28007417917251587,
-0.028270861133933067,
0.04643788933753967,
-0.011924796737730503,
0.029703661799430847,
0.040315333753824234,
-0.011066320352256298,
0.03797401115298271,
0.2286624014377594,
-0.10516472905874252,
0.055616021156311035,
-0.07257446646690369,
-0.004988181870430708,
-0.016875246539711952,
0.05361349135637283,
0.040436141192913055,
0.00385407661087811,
-0.04601507633924484,
0.10760686546564102,
-0.1529710292816162,
-0.1334383338689804,
0.013138627633452415,
-0.15522615611553192,
-0.09165322780609131,
-0.03600698709487915,
-0.03608673810958862,
0.13924233615398407,
0.09309016913175583,
-0.01275898888707161,
-0.0482083298265934,
0.014500413089990616,
0.04460331052541733,
-0.10652196407318115,
-0.03517324849963188,
0.05437618866562843,
-0.0329815112054348,
0.10015743970870972,
-0.01790512166917324,
0.059063248336315155,
0.10648669302463531,
0.0022814241237938404,
-0.035999882966279984,
0.03479825705289841,
0.048102833330631256,
-0.0008621381712146103,
-0.001542576472274959,
0.11505889892578125,
-0.01292412355542183,
-0.02430308237671852,
0.10974796116352081,
-0.03833521157503128,
0.06970617920160294,
0.09366972744464874,
-0.038309935480356216,
-0.016656920313835144,
0.1639266312122345,
-0.09925523400306702,
0.13824239373207092,
0.1974625438451767,
-0.017290731891989708,
-0.04944015294313431,
-0.01891198754310608,
0.031685277819633484,
-0.030308611690998077,
0.005065766628831625,
-0.04321431741118431,
-0.1473640650510788,
-0.008345672860741615,
-0.06737768650054932,
0.08126212656497955,
-0.10405061393976212,
-0.0276009663939476,
-0.019881904125213623,
-0.017727868631482124,
-0.03854215145111084,
0.09965263307094574,
0.08701872080564499,
0.023816578090190887,
-0.03946123644709587,
-0.08644832670688629,
0.022816715762019157,
0.12343748658895493,
-0.09123372286558151,
-0.052597057074308395
] |
null | null |
transformers
|
# XLM-RoBERTa-XL (xxlarge-sized model)
XLM-RoBERTa-XL model pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. It was introduced in the paper [Larger-Scale Transformers for Multilingual Masked Language Modeling](https://arxiv.org/abs/2105.00572) by Naman Goyal, Jingfei Du, Myle Ott, Giri Anantharaman, Alexis Conneau and first released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/xlmr).
Disclaimer: The team releasing XLM-RoBERTa-XL did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
XLM-RoBERTa-XL is an extra-large multilingual version of RoBERTa. It is pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages.
RoBERTa is a transformers model pretrained on a large corpus in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labeling them in any way (which is why it can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts.
More precisely, it was pretrained with the Masked language modeling (MLM) objective. Taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.
This way, the model learns an inner representation of 100 languages that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the XLM-RoBERTa-XL model as inputs.
## Intended uses & limitations
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?search=xlm-roberta-xl) to look for fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2.
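As a rough illustration (not part of the original card), a minimal fine-tuning sketch for sequence classification could look like the following; the two-label setup and the example sentence are placeholders, and the xxl checkpoint is large, so substantial GPU memory is assumed:
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("facebook/xlm-roberta-xxl")
model = AutoModelForSequenceClassification.from_pretrained(
    "facebook/xlm-roberta-xxl", num_labels=2  # hypothetical binary task
)

# one labeled example; in practice this step runs inside your training loop
batch = tokenizer("Ceci est une phrase d'exemple.", return_tensors="pt")
outputs = model(**batch, labels=torch.tensor([0]))
outputs.loss.backward()
```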
## Usage
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='facebook/xlm-roberta-xxl')
>>> unmasker("Europe is a <mask> continent.")
[{'score': 0.22996895015239716,
'token': 28811,
'token_str': 'European',
'sequence': 'Europe is a European continent.'},
{'score': 0.14307449758052826,
'token': 21334,
'token_str': 'large',
'sequence': 'Europe is a large continent.'},
{'score': 0.12239163368940353,
'token': 19336,
'token_str': 'small',
'sequence': 'Europe is a small continent.'},
{'score': 0.07025063782930374,
'token': 18410,
'token_str': 'vast',
'sequence': 'Europe is a vast continent.'},
{'score': 0.032869212329387665,
'token': 6957,
'token_str': 'big',
'sequence': 'Europe is a big continent.'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import AutoTokenizer, AutoModelForMaskedLM
tokenizer = AutoTokenizer.from_pretrained('facebook/xlm-roberta-xxl')
model = AutoModelForMaskedLM.from_pretrained("facebook/xlm-roberta-xxl")
# prepare input
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
# forward pass
output = model(**encoded_input)
```
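If you want sentence-level features rather than the masked-language-modeling logits, one hedged variant is to request the hidden states (`output_hidden_states` is a standard `transformers` argument; the mean pooling below is just one simple choice, not something prescribed by this card):
```python
output = model(**encoded_input, output_hidden_states=True)
last_hidden_state = output.hidden_states[-1]        # (batch, seq_len, hidden_size)
sentence_embedding = last_hidden_state.mean(dim=1)  # simple mean pooling
```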
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2105-00572,
author = {Naman Goyal and
Jingfei Du and
Myle Ott and
Giri Anantharaman and
Alexis Conneau},
title = {Larger-Scale Transformers for Multilingual Masked Language Modeling},
journal = {CoRR},
volume = {abs/2105.00572},
year = {2021},
url = {https://arxiv.org/abs/2105.00572},
eprinttype = {arXiv},
eprint = {2105.00572},
timestamp = {Wed, 12 May 2021 15:54:31 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2105-00572.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
|
{"language": ["multilingual", "af", "am", "ar", "as", "az", "be", "bg", "bn", "br", "bs", "ca", "cs", "cy", "da", "de", "el", "en", "eo", "es", "et", "eu", "fa", "fi", "fr", "fy", "ga", "gd", "gl", "gu", "ha", "he", "hi", "hr", "hu", "hy", "id", "is", "it", "ja", "jv", "ka", "kk", "km", "kn", "ko", "ku", "ky", "la", "lo", "lt", "lv", "mg", "mk", "ml", "mn", "mr", "ms", "my", "ne", "nl", false, "om", "or", "pa", "pl", "ps", "pt", "ro", "ru", "sa", "sd", "si", "sk", "sl", "so", "sq", "sr", "su", "sv", "sw", "ta", "te", "th", "tl", "tr", "ug", "uk", "ur", "uz", "vi", "xh", "yi", "zh"], "license": "mit"}
|
fill-mask
|
facebook/xlm-roberta-xxl
|
[
"transformers",
"pytorch",
"xlm-roberta-xl",
"fill-mask",
"multilingual",
"af",
"am",
"ar",
"as",
"az",
"be",
"bg",
"bn",
"br",
"bs",
"ca",
"cs",
"cy",
"da",
"de",
"el",
"en",
"eo",
"es",
"et",
"eu",
"fa",
"fi",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"he",
"hi",
"hr",
"hu",
"hy",
"id",
"is",
"it",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"ku",
"ky",
"la",
"lo",
"lt",
"lv",
"mg",
"mk",
"ml",
"mn",
"mr",
"ms",
"my",
"ne",
"nl",
"no",
"om",
"or",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"sa",
"sd",
"si",
"sk",
"sl",
"so",
"sq",
"sr",
"su",
"sv",
"sw",
"ta",
"te",
"th",
"tl",
"tr",
"ug",
"uk",
"ur",
"uz",
"vi",
"xh",
"yi",
"zh",
"arxiv:2105.00572",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2105.00572"
] |
[
"multilingual",
"af",
"am",
"ar",
"as",
"az",
"be",
"bg",
"bn",
"br",
"bs",
"ca",
"cs",
"cy",
"da",
"de",
"el",
"en",
"eo",
"es",
"et",
"eu",
"fa",
"fi",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"he",
"hi",
"hr",
"hu",
"hy",
"id",
"is",
"it",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"ku",
"ky",
"la",
"lo",
"lt",
"lv",
"mg",
"mk",
"ml",
"mn",
"mr",
"ms",
"my",
"ne",
"nl",
"no",
"om",
"or",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"sa",
"sd",
"si",
"sk",
"sl",
"so",
"sq",
"sr",
"su",
"sv",
"sw",
"ta",
"te",
"th",
"tl",
"tr",
"ug",
"uk",
"ur",
"uz",
"vi",
"xh",
"yi",
"zh"
] |
TAGS
#transformers #pytorch #xlm-roberta-xl #fill-mask #multilingual #af #am #ar #as #az #be #bg #bn #br #bs #ca #cs #cy #da #de #el #en #eo #es #et #eu #fa #fi #fr #fy #ga #gd #gl #gu #ha #he #hi #hr #hu #hy #id #is #it #ja #jv #ka #kk #km #kn #ko #ku #ky #la #lo #lt #lv #mg #mk #ml #mn #mr #ms #my #ne #nl #no #om #or #pa #pl #ps #pt #ro #ru #sa #sd #si #sk #sl #so #sq #sr #su #sv #sw #ta #te #th #tl #tr #ug #uk #ur #uz #vi #xh #yi #zh #arxiv-2105.00572 #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
# XLM-RoBERTa-XL (xxlarge-sized model)
XLM-RoBERTa-XL model pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. It was introduced in the paper Larger-Scale Transformers for Multilingual Masked Language Modeling by Naman Goyal, Jingfei Du, Myle Ott, Giri Anantharaman, Alexis Conneau and first released in this repository.
Disclaimer: The team releasing XLM-RoBERTa-XL did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
XLM-RoBERTa-XL is an extra-large multilingual version of RoBERTa. It is pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages.
RoBERTa is a transformers model pretrained on a large corpus in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labeling them in any way (which is why it can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts.
More precisely, it was pretrained with the Masked language modeling (MLM) objective. Taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.
This way, the model learns an inner representation of 100 languages that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the XLM-RoBERTa-XL model as inputs.
## Intended uses & limitations
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2.
## Usage
You can use this model directly with a pipeline for masked language modeling:
Here is how to use this model to get the features of a given text in PyTorch:
### BibTeX entry and citation info
|
[
"# XLM-RoBERTa-XL (xxlarge-sized model) \n\nXLM-RoBERTa-XL model pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. It was introduced in the paper Larger-Scale Transformers for Multilingual Masked Language Modeling by Naman Goyal, Jingfei Du, Myle Ott, Giri Anantharaman, Alexis Conneau and first released in this repository. \n\nDisclaimer: The team releasing XLM-RoBERTa-XL did not write a model card for this model so this model card has been written by the Hugging Face team.",
"## Model description\n\nXLM-RoBERTa-XL is a extra large multilingual version of RoBERTa. It is pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. \n\nRoBERTa is a transformers model pretrained on a large corpus in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labeling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts.\n\nMore precisely, it was pretrained with the Masked language modeling (MLM) objective. Taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.\n\nThis way, the model learns an inner representation of 100 languages that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the XLM-RoBERTa-XL model as inputs.",
"## Intended uses & limitations\n\nYou can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you.\n\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2.",
"## Usage\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\n\nHere is how to use this model to get the features of a given text in PyTorch:",
"### BibTeX entry and citation info"
] |
[
"TAGS\n#transformers #pytorch #xlm-roberta-xl #fill-mask #multilingual #af #am #ar #as #az #be #bg #bn #br #bs #ca #cs #cy #da #de #el #en #eo #es #et #eu #fa #fi #fr #fy #ga #gd #gl #gu #ha #he #hi #hr #hu #hy #id #is #it #ja #jv #ka #kk #km #kn #ko #ku #ky #la #lo #lt #lv #mg #mk #ml #mn #mr #ms #my #ne #nl #no #om #or #pa #pl #ps #pt #ro #ru #sa #sd #si #sk #sl #so #sq #sr #su #sv #sw #ta #te #th #tl #tr #ug #uk #ur #uz #vi #xh #yi #zh #arxiv-2105.00572 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"# XLM-RoBERTa-XL (xxlarge-sized model) \n\nXLM-RoBERTa-XL model pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. It was introduced in the paper Larger-Scale Transformers for Multilingual Masked Language Modeling by Naman Goyal, Jingfei Du, Myle Ott, Giri Anantharaman, Alexis Conneau and first released in this repository. \n\nDisclaimer: The team releasing XLM-RoBERTa-XL did not write a model card for this model so this model card has been written by the Hugging Face team.",
"## Model description\n\nXLM-RoBERTa-XL is a extra large multilingual version of RoBERTa. It is pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. \n\nRoBERTa is a transformers model pretrained on a large corpus in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labeling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts.\n\nMore precisely, it was pretrained with the Masked language modeling (MLM) objective. Taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.\n\nThis way, the model learns an inner representation of 100 languages that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the XLM-RoBERTa-XL model as inputs.",
"## Intended uses & limitations\n\nYou can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you.\n\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2.",
"## Usage\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\n\nHere is how to use this model to get the features of a given text in PyTorch:",
"### BibTeX entry and citation info"
] |
[
253,
145,
313,
129,
40,
11
] |
[
"passage: TAGS\n#transformers #pytorch #xlm-roberta-xl #fill-mask #multilingual #af #am #ar #as #az #be #bg #bn #br #bs #ca #cs #cy #da #de #el #en #eo #es #et #eu #fa #fi #fr #fy #ga #gd #gl #gu #ha #he #hi #hr #hu #hy #id #is #it #ja #jv #ka #kk #km #kn #ko #ku #ky #la #lo #lt #lv #mg #mk #ml #mn #mr #ms #my #ne #nl #no #om #or #pa #pl #ps #pt #ro #ru #sa #sd #si #sk #sl #so #sq #sr #su #sv #sw #ta #te #th #tl #tr #ug #uk #ur #uz #vi #xh #yi #zh #arxiv-2105.00572 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# XLM-RoBERTa-XL (xxlarge-sized model) \n\nXLM-RoBERTa-XL model pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. It was introduced in the paper Larger-Scale Transformers for Multilingual Masked Language Modeling by Naman Goyal, Jingfei Du, Myle Ott, Giri Anantharaman, Alexis Conneau and first released in this repository. \n\nDisclaimer: The team releasing XLM-RoBERTa-XL did not write a model card for this model so this model card has been written by the Hugging Face team."
] |
[
-0.06247327849268913,
-0.0030417500529438257,
-0.006325969472527504,
0.034344885498285294,
0.06629052013158798,
0.05736212059855461,
0.1303596794605255,
0.11633801460266113,
0.058336637914180756,
0.08433792740106583,
0.102188341319561,
0.07565192878246307,
0.11458214372396469,
0.15051302313804626,
0.00602504750713706,
-0.2569107413291931,
0.022386033087968826,
-0.031208585947752,
-0.019259123131632805,
0.10130716115236282,
0.09435983002185822,
-0.02009795978665352,
0.12886370718479156,
0.0016067246906459332,
0.0022234644275158644,
-0.018518537282943726,
-0.010382896289229393,
-0.04921151325106621,
0.03905529901385307,
0.09966877102851868,
0.02169724926352501,
0.0733298659324646,
0.06258644908666611,
-0.20893624424934387,
0.02760183997452259,
0.021958040073513985,
-0.07095493376255035,
0.029907483607530594,
0.048553723841905594,
-0.07329222559928894,
0.19592322409152985,
-0.0800134614109993,
-0.02364473044872284,
0.027277540415525436,
-0.16343487799167633,
-0.21199674904346466,
-0.10041844844818115,
0.11736238747835159,
0.047219082713127136,
0.041525084525346756,
-0.04609537497162819,
0.08396109938621521,
-0.038816262036561966,
0.052696388214826584,
0.170911967754364,
-0.2483552247285843,
-0.06017778068780899,
0.0723828375339508,
0.08170019835233688,
0.0023908931761980057,
-0.07091695070266724,
0.06157403066754341,
0.0734943151473999,
0.012487481348216534,
-0.03386927396059036,
-0.03979133442044258,
0.20002959668636322,
-0.020747611299157143,
-0.10784444957971573,
0.0051537551917135715,
0.12270768731832504,
0.03105456940829754,
-0.014608359895646572,
-0.03981705382466316,
-0.021016739308834076,
-0.04393145442008972,
-0.06265420466661453,
-0.018681390210986137,
0.017022166401147842,
-0.0015816697850823402,
0.03754265606403351,
0.015959633514285088,
-0.07455925643444061,
0.01118345744907856,
-0.03670014068484306,
0.14640997350215912,
0.011632120236754417,
0.02196987345814705,
-0.06107979267835617,
-0.01694926992058754,
-0.10341900587081909,
-0.12420675158500671,
-0.019799180328845978,
-0.02361523173749447,
-0.010296092368662357,
-0.0035836314782500267,
0.04098125919699669,
0.025442034006118774,
0.06906775385141373,
0.11752943694591522,
-0.07707935571670532,
0.08734678477048874,
0.08229666203260422,
0.046406809240579605,
0.0539943166077137,
0.07302461564540863,
-0.14342963695526123,
-0.04580513760447502,
-0.07923296093940735,
0.023152152076363564,
0.03213188797235489,
-0.03554593771696091,
-0.05254097282886505,
0.03939121961593628,
-0.03328600525856018,
0.016531312838196754,
0.019720347598195076,
0.0628264918923378,
-0.045672256499528885,
0.0025144952815026045,
0.13592249155044556,
-0.09066221863031387,
0.02053336426615715,
0.03161195293068886,
-0.0009985578944906592,
0.07998761534690857,
-0.029400523751974106,
-0.0003854601236525923,
0.002467537298798561,
0.11026069521903992,
-0.060887306928634644,
0.01045683491975069,
-0.016679178923368454,
-0.092393659055233,
0.03596866503357887,
-0.08805093914270401,
-0.014958033338189125,
-0.0964408814907074,
-0.07915981858968735,
-0.06616948544979095,
0.033292774111032486,
-0.06825054436922073,
0.0030210758559405804,
-0.03914825618267059,
-0.09256252646446228,
0.07750993967056274,
0.03223588317632675,
0.019167382270097733,
-0.08330903947353363,
0.05037519335746765,
-0.03852250799536705,
0.10684293508529663,
-0.0005120993591845036,
0.021109111607074738,
-0.0631922259926796,
0.08585890382528305,
-0.08446912467479706,
0.0807761549949646,
-0.07850152254104614,
-0.006074385717511177,
-0.1244739517569542,
-0.1018853560090065,
-0.05098949745297432,
0.04039083421230316,
0.023004377260804176,
0.18367046117782593,
-0.23478977382183075,
-0.11197149753570557,
0.23248036205768585,
-0.10041544586420059,
-0.027485553175210953,
0.12955455482006073,
0.030059926211833954,
0.024758314713835716,
0.06454843282699585,
0.111171655356884,
0.0641767755150795,
-0.08192873001098633,
-0.06730778515338898,
0.029923975467681885,
0.01738794706761837,
0.16821615397930145,
0.08879201114177704,
0.004644615575671196,
0.004384279251098633,
0.022645413875579834,
-0.07194488495588303,
-0.0031348159536719322,
-0.07209140062332153,
-0.07511448115110397,
0.06913306564092636,
-0.03091355971992016,
0.10554701834917068,
0.055401407182216644,
-0.011625297367572784,
-0.031223809346556664,
-0.07999216765165329,
-0.1070246696472168,
0.11555308103561401,
0.020297985523939133,
-0.012194671668112278,
-0.10797454416751862,
0.1058245450258255,
-0.009100411087274551,
0.027717921882867813,
-0.10819893330335617,
0.07950924336910248,
-0.01495121605694294,
-0.04322243854403496,
0.05574164539575577,
0.05543886125087738,
0.09194882959127426,
0.018437707796692848,
-0.06070574000477791,
-0.02948700077831745,
0.0567966103553772,
-0.025317858904600143,
-0.04005272686481476,
-0.22294989228248596,
0.0003397249092813581,
-0.055730175226926804,
0.1628773957490921,
-0.19247226417064667,
0.00698138726875186,
0.035381756722927094,
0.1048770546913147,
0.014094962738454342,
-0.0003396297979634255,
-0.01453623827546835,
0.049368392676115036,
-0.02381065860390663,
-0.012681595981121063,
0.04582609236240387,
-0.024659132584929466,
-0.09451772272586823,
0.05627445504069328,
-0.1028958186507225,
0.024719644337892532,
0.09323520213365555,
-0.02391963265836239,
-0.11722049117088318,
0.10046155005693436,
0.005991507321596146,
-0.018872683867812157,
0.03829936310648918,
0.021724125370383263,
0.10038286447525024,
0.010749845765531063,
0.0649908035993576,
-0.08618803322315216,
-0.014894203282892704,
0.06498943269252777,
-0.0768919289112091,
-0.09206774830818176,
0.17849524319171906,
0.0596381239593029,
-0.24649177491664886,
0.1444063037633896,
0.13646532595157623,
0.05408097431063652,
0.24390661716461182,
0.029507987201213837,
-0.040509242564439774,
-0.08795025944709778,
-0.006550498306751251,
0.02781863324344158,
0.06508607417345047,
-0.061856627464294434,
-0.012431432493031025,
0.0032585624139755964,
-0.03191781044006348,
0.003832681803032756,
-0.0781238004565239,
-0.031296927481889725,
-0.026263929903507233,
-0.06038260459899902,
-0.0208728164434433,
0.07398602366447449,
-0.05289330706000328,
0.11787036806344986,
0.03881392627954483,
-0.037575867027044296,
-0.04050979018211365,
-0.025431035086512566,
-0.08958584070205688,
0.1789804846048355,
-0.11330776661634445,
-0.2110825479030609,
-0.05190529674291611,
-0.060898471623659134,
-0.02742769755423069,
0.007092046085745096,
-0.0018346315482631326,
-0.1425878405570984,
-0.0218660905957222,
-0.022201796993613243,
0.04280909150838852,
-0.05417785048484802,
-0.04738572984933853,
0.011400630697607994,
-0.013643288053572178,
0.006140102166682482,
-0.06757132709026337,
-0.012839512899518013,
-0.030426032841205597,
-0.05557556822896004,
0.08045344799757004,
-0.0777270495891571,
0.08206089586019516,
0.12090103328227997,
0.02527698129415512,
0.00868650060147047,
0.014002549462020397,
0.211665078997612,
-0.14101959764957428,
0.034683361649513245,
0.0366332121193409,
0.0102137615904212,
0.06584062427282333,
0.14553694427013397,
0.04482826218008995,
-0.057662710547447205,
-0.05079168826341629,
0.008000248111784458,
-0.05434528365731239,
-0.17882400751113892,
-0.08157327026128769,
-0.0433107390999794,
0.0273650623857975,
0.02175888605415821,
0.06305505335330963,
-0.026368945837020874,
0.03710141032934189,
-0.0858982503414154,
-0.02462054416537285,
0.04051822051405907,
0.002665620530024171,
0.007075909525156021,
-0.011060954071581364,
0.049337029457092285,
-0.040734875947237015,
-0.03773646056652069,
0.09726115316152573,
-0.00555364228785038,
0.05884649232029915,
0.07482443004846573,
0.1357698291540146,
0.07589003443717957,
0.058834098279476166,
0.052636150270700455,
0.024740835651755333,
0.023408107459545135,
-0.00916662160307169,
-0.002725066151469946,
-0.06573651731014252,
-0.004340904764831066,
0.047634854912757874,
0.08220817893743515,
-0.014354618266224861,
-0.006143535487353802,
0.055107735097408295,
0.06202499568462372,
0.12914437055587769,
0.09157828986644745,
-0.17086215317249298,
-0.0324942022562027,
0.059280794113874435,
0.046682361513376236,
-0.06690497696399689,
-0.006745919585227966,
0.021404962986707687,
-0.09488952159881592,
0.1924930363893509,
0.02708286978304386,
0.055875808000564575,
-0.11225877702236176,
0.010992254130542278,
0.029782548546791077,
0.04376042261719704,
-0.011268261820077896,
0.06952883303165436,
-0.21554872393608093,
0.2559066414833069,
0.029180940240621567,
0.02260999009013176,
-0.03681269288063049,
-0.01720583438873291,
0.044165968894958496,
0.046372298151254654,
0.1774025559425354,
0.061715006828308105,
-0.09938844293355942,
-0.14885127544403076,
-0.053387630730867386,
-0.015557684935629368,
0.12333747744560242,
-0.0635046511888504,
0.07294414937496185,
0.018931489437818527,
-0.03677915781736374,
-0.04653048515319824,
0.06342760473489761,
-0.11633408814668655,
-0.08834396302700043,
0.12241668254137039,
-0.06902420520782471,
-0.08302948623895645,
-0.014875848777592182,
-0.08917254954576492,
-0.1712263524532318,
0.08609281480312347,
-0.09764930605888367,
-0.06696611642837524,
-0.08083529770374298,
0.0075656031258404255,
0.0938115119934082,
-0.13905370235443115,
-0.04157944396138191,
-0.01176021620631218,
0.0421774797141552,
-0.035333577543497086,
-0.06027448549866676,
0.05577290430665016,
-0.06615301221609116,
-0.1926334947347641,
-0.0015105963684618473,
0.09952281415462494,
0.08922448754310608,
0.07944653928279877,
0.0007855803123675287,
0.047053005546331406,
0.0225735642015934,
-0.11783922463655472,
0.1376083493232727,
0.07571666687726974,
-0.08936002105474472,
0.06908039003610611,
-0.027805205434560776,
-0.13998399674892426,
-0.09268765151500702,
-0.0575336255133152,
0.06064692884683609,
0.3200981318950653,
-0.05773920565843582,
0.1051451563835144,
0.17591723799705505,
-0.10867993533611298,
-0.22881697118282318,
-0.11089060455560684,
0.002874394180253148,
0.050183624029159546,
-0.038473568856716156,
-0.18128280341625214,
-0.03425021097064018,
0.05343560501933098,
-0.006840151268988848,
0.0762215331196785,
-0.2460140734910965,
-0.08852758258581161,
0.10327494889497757,
-0.018318193033337593,
0.07772952318191528,
-0.17085711658000946,
-0.08275922387838364,
-0.07975128293037415,
-0.05795326083898544,
-0.009620643220841885,
-0.006003994029015303,
0.07386379688978195,
-0.024100475013256073,
0.002128955675289035,
0.0019974319729954004,
0.005579217802733183,
0.18931877613067627,
0.005339342635124922,
0.028719063848257065,
-0.1121099516749382,
-0.12713958323001862,
0.031626686453819275,
0.008882337249815464,
0.03482441231608391,
-0.06514302641153336,
-0.026515033096075058,
-0.04184050112962723,
-0.006360624451190233,
-0.14074216783046722,
0.029747670516371727,
-0.06411789357662201,
-0.016624778509140015,
-0.058203645050525665,
0.09312915056943893,
0.03794632479548454,
-0.00258028251118958,
0.07865485548973083,
-0.07336794584989548,
0.1548120677471161,
0.04944056272506714,
0.17471124231815338,
0.028642108663916588,
-0.0026606041938066483,
-0.05134151875972748,
-0.022540584206581116,
0.036527518182992935,
-0.10650347918272018,
0.0038425209932029247,
0.09825508296489716,
0.05056876689195633,
0.12094522267580032,
0.021815119311213493,
-0.13457059860229492,
0.006422107107937336,
0.11595181375741959,
-0.09526339173316956,
-0.21467290818691254,
-0.01401043776422739,
-0.05547869950532913,
-0.02744322642683983,
-0.02432374656200409,
0.12590783834457397,
-0.04993051663041115,
-0.03312071040272713,
-0.01627095229923725,
0.1005307286977768,
-0.04254155233502388,
0.1391461342573166,
0.0560498982667923,
0.019143467769026756,
-0.1115531250834465,
0.04884212091565132,
0.06733661144971848,
0.010241859592497349,
0.014893448911607265,
0.11843407154083252,
-0.0747966468334198,
-0.08243238180875778,
-0.053678806871175766,
0.18351435661315918,
-0.022642631083726883,
-0.036205045878887177,
0.0065507967956364155,
-0.11931094527244568,
0.044364266097545624,
0.1521528661251068,
0.023061664775013924,
0.014934757724404335,
0.04975263774394989,
-0.02499816007912159,
0.016629058867692947,
0.06684955954551697,
0.08993896842002869,
-0.008388051763176918,
-0.07883404940366745,
0.05202610418200493,
-0.006020053755491972,
0.10928919166326523,
-0.02683410793542862,
-0.014843827113509178,
-0.13034477829933167,
-0.002880311803892255,
-0.07673059403896332,
0.04911332204937935,
-0.1349271684885025,
0.00018011752399615943,
-0.004087062552571297,
-0.03406836837530136,
-0.05902467668056488,
-0.06169356033205986,
-0.09629666060209274,
-0.020636271685361862,
-0.019897572696208954,
0.09685412794351578,
-0.068220354616642,
-0.03374597057700157,
0.046639565378427505,
-0.05427369102835655,
0.06038263812661171,
0.04799952358007431,
0.020627817139029503,
0.058997802436351776,
-0.16630510985851288,
-0.014362422749400139,
0.0008707016822881997,
0.020176703110337257,
0.006968237459659576,
-0.13125382363796234,
-0.023973548784852028,
-0.04606427252292633,
-0.02491449937224388,
0.03524618223309517,
-0.013692626729607582,
-0.04500569403171539,
0.10095706582069397,
-0.02184792421758175,
-0.05727582052350044,
-0.035146646201610565,
0.07807305455207825,
0.08599650114774704,
0.023590292781591415,
0.06273329257965088,
-0.07491624355316162,
0.06603683531284332,
-0.1382743865251541,
0.01981092058122158,
-0.03534752503037453,
-0.09699329733848572,
0.019856758415699005,
-0.02315664105117321,
0.07454037666320801,
-0.02992250956594944,
0.09880252927541733,
0.03169214725494385,
-0.03814083710312843,
-0.010079894214868546,
-0.021799908950924873,
-0.09176121652126312,
0.028767090290784836,
0.039149925112724304,
0.046842753887176514,
-0.03131279721856117,
-0.0649333968758583,
0.03333703801035881,
-0.05274258181452751,
0.05939468368887901,
0.10564038157463074,
0.07763362675905228,
0.1547354757785797,
0.057952094823122025,
0.015058784745633602,
-0.10833203047513962,
0.014153086580336094,
0.014484254643321037,
-0.10713177174329758,
0.04938783869147301,
-0.00014147711044643074,
0.05324389785528183,
0.20069539546966553,
-0.16601873934268951,
0.0763629823923111,
-0.0851583331823349,
-0.06972376257181168,
-0.12348733097314835,
-0.2279522716999054,
-0.08492587506771088,
-0.012177087366580963,
0.024624593555927277,
-0.09764304012060165,
0.04451988637447357,
0.06086843088269234,
0.049726732075214386,
-0.0017108841566368937,
0.02371671050786972,
-0.06693737208843231,
-0.07484743744134903,
0.1066933125257492,
0.013130951672792435,
-0.00924479030072689,
-0.08711951225996017,
0.01860460452735424,
0.020276468247175217,
-0.029006559401750565,
-0.00862431526184082,
0.02040119841694832,
-0.02198026143014431,
0.05456981807947159,
-0.056230317801237106,
-0.12265291064977646,
0.0019402005709707737,
0.0191938579082489,
0.07866062223911285,
0.07080946117639542,
0.02151460386812687,
0.0018899701535701752,
0.017350953072309494,
0.045441560447216034,
-0.010932782664895058,
-0.03744261711835861,
-0.11184096336364746,
0.12327960133552551,
-0.07709413766860962,
0.018578331917524338,
-0.01557885855436325,
-0.054504383355379105,
0.024445367977023125,
0.21853716671466827,
0.280845046043396,
-0.02868669666349888,
0.04607502371072769,
-0.011584012769162655,
0.029750237241387367,
0.04011337459087372,
-0.010571800172328949,
0.03829513490200043,
0.22872021794319153,
-0.10509636253118515,
0.05565553158521652,
-0.07236559689044952,
-0.0047697992995381355,
-0.016843093559145927,
0.05336308851838112,
0.040598101913928986,
0.004280619323253632,
-0.04579423367977142,
0.10796502232551575,
-0.1530371755361557,
-0.1326187402009964,
0.012844007462263107,
-0.15555182099342346,
-0.09172756224870682,
-0.036029595881700516,
-0.03584170341491699,
0.13940098881721497,
0.09339557588100433,
-0.01299364771693945,
-0.04815560579299927,
0.014269080013036728,
0.044424768537282944,
-0.10671672224998474,
-0.03552057594060898,
0.05389175936579704,
-0.03325005993247032,
0.10079047828912735,
-0.017887430265545845,
0.0587136410176754,
0.10679586976766586,
0.0023575846571475267,
-0.0357428714632988,
0.03514321148395538,
0.04845433309674263,
-0.0004229622718412429,
-0.0012663056841120124,
0.11537069827318192,
-0.013475059531629086,
-0.024376997724175453,
0.11006861180067062,
-0.03825730085372925,
0.06987374275922775,
0.09285034239292145,
-0.03835272789001465,
-0.0165121927857399,
0.1639259159564972,
-0.09930015355348587,
0.13787038624286652,
0.19806864857673645,
-0.017321346327662468,
-0.049699388444423676,
-0.018789302557706833,
0.031353458762168884,
-0.030101241543889046,
0.005438589490950108,
-0.04349799081683159,
-0.14674577116966248,
-0.008501471020281315,
-0.06736935675144196,
0.08153713494539261,
-0.10434875637292862,
-0.0270986370742321,
-0.019700586795806885,
-0.017715629190206528,
-0.03823797404766083,
0.09944828599691391,
0.08703698962926865,
0.023811213672161102,
-0.03972003608942032,
-0.08563485741615295,
0.02288019098341465,
0.12377107888460159,
-0.0910901203751564,
-0.05283612012863159
] |
null | null |
fairseq
|
# xm_transformer_600m-en_ar-multi_domain
[W2V2-Transformer](https://aclanthology.org/2021.acl-long.68/) speech-to-text translation model from fairseq S2T ([paper](https://arxiv.org/abs/2010.05171)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_to_text)):
- English-Arabic
- Trained on MuST-C, CoVoST 2, Multilingual LibriSpeech, Common Voice v7 and CCMatrix
- Speech synthesis with [facebook/tts_transformer-ar-cv7](https://huggingface.co/facebook/tts_transformer-ar-cv7)
## Usage
```python
from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub
from fairseq.models.speech_to_text.hub_interface import S2THubInterface
from fairseq.models.text_to_speech.hub_interface import TTSHubInterface
import IPython.display as ipd
import torchaudio
models, cfg, task = load_model_ensemble_and_task_from_hf_hub(
"facebook/xm_transformer_600m-en_ar-multi_domain",
arg_overrides={"config_yaml": "config.yaml"},
)
model = models[0]
generator = task.build_generator([model], cfg)
# requires 16000Hz mono channel audio
audio, _ = torchaudio.load("/path/to/an/audio/file")
sample = S2THubInterface.get_model_input(task, audio)
text = S2THubInterface.get_prediction(task, model, generator, sample)
# speech synthesis
tts_models, tts_cfg, tts_task = load_model_ensemble_and_task_from_hf_hub(
f"facebook/tts_transformer-ar-cv7",
arg_overrides={"vocoder": "griffin_lim", "fp16": False},
)
tts_model = tts_models[0]
TTSHubInterface.update_cfg_with_data_cfg(tts_cfg, tts_task.data_cfg)
tts_generator = tts_task.build_generator([tts_model], tts_cfg)
tts_sample = TTSHubInterface.get_model_input(tts_task, text)
wav, sr = TTSHubInterface.get_prediction(
tts_task, tts_model, tts_generator, tts_sample
)
ipd.Audio(wav, rate=sr)
```
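Beyond playing the result inline, you may want to write the synthesized audio to disk. A small hedged sketch (the filename is arbitrary, and `wav` is assumed to be the 1-D waveform tensor returned above):
```python
import torchaudio

# torchaudio.save expects a (channels, frames) tensor
waveform = wav.unsqueeze(0) if wav.dim() == 1 else wav
torchaudio.save("translation_ar.wav", waveform.cpu(), sr)
```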
## Citation
```bibtex
@inproceedings{li-etal-2021-multilingual,
title = "Multilingual Speech Translation from Efficient Finetuning of Pretrained Models",
author = "Li, Xian and
Wang, Changhan and
Tang, Yun and
Tran, Chau and
Tang, Yuqing and
Pino, Juan and
Baevski, Alexei and
Conneau, Alexis and
Auli, Michael",
booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.acl-long.68",
doi = "10.18653/v1/2021.acl-long.68",
pages = "827--838",
}
@inproceedings{wang-etal-2020-fairseq,
title = "Fairseq {S}2{T}: Fast Speech-to-Text Modeling with Fairseq",
author = "Wang, Changhan and
Tang, Yun and
Ma, Xutai and
Wu, Anne and
Okhonko, Dmytro and
Pino, Juan",
booktitle = "Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing: System Demonstrations",
month = dec,
year = "2020",
address = "Suzhou, China",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.aacl-demo.6",
pages = "33--39",
}
```
|
{"language": "en-ar", "library_name": "fairseq", "tags": ["fairseq", "audio", "audio-to-audio", "speech-to-speech-translation"], "datasets": ["must_c", "covost2"], "task": "audio-to-audio", "widget": [{"example_title": "Common Voice sample 1", "src": "https://huggingface.co/facebook/xm_transformer_600m-en_es-multi_domain/resolve/main/common_voice_en_18295850.mp3"}]}
|
audio-to-audio
|
facebook/xm_transformer_600m-en_ar-multi_domain
|
[
"fairseq",
"audio",
"audio-to-audio",
"speech-to-speech-translation",
"dataset:must_c",
"dataset:covost2",
"arxiv:2010.05171",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.05171"
] |
[
"en-ar"
] |
TAGS
#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #dataset-covost2 #arxiv-2010.05171 #has_space #region-us
|
# xm_transformer_600m-en_ar-multi_domain
W2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):
- English-Arabic
- Trained on MuST-C, CoVoST 2, Multilingual LibriSpeech, Common Voice v7 and CCMatrix
- Speech synthesis with facebook/tts_transformer-ar-cv7
## Usage
|
[
"# xm_transformer_600m-en_ar-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-Arabic\n- Trained on MuST-C, CoVoST 2, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-ar-cv7",
"## Usage"
] |
[
"TAGS\n#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #dataset-covost2 #arxiv-2010.05171 #has_space #region-us \n",
"# xm_transformer_600m-en_ar-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-Arabic\n- Trained on MuST-C, CoVoST 2, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-ar-cv7",
"## Usage"
] |
[
59,
97,
3
] |
[
"passage: TAGS\n#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #dataset-covost2 #arxiv-2010.05171 #has_space #region-us \n# xm_transformer_600m-en_ar-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-Arabic\n- Trained on MuST-C, CoVoST 2, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-ar-cv7## Usage"
] |
[
-0.13781318068504333,
0.057699501514434814,
-0.0032623254228383303,
-0.03870885819196701,
0.045947231352329254,
-0.061812739819288254,
0.10144266486167908,
0.04029031842947006,
-0.10396043211221695,
-0.023872850462794304,
-0.009536956436932087,
-0.00018824527796823531,
0.08107534050941467,
0.09032902121543884,
-0.04617816582322121,
-0.18189221620559692,
0.0444161631166935,
-0.06195599213242531,
-0.0807681530714035,
0.07418765872716904,
0.131181001663208,
-0.037582650780677795,
0.036082249134778976,
0.02613440901041031,
-0.0756760761141777,
0.028691399842500687,
0.02083406038582325,
-0.13624395430088043,
0.07395409792661667,
0.13227210938930511,
0.056372519582509995,
0.07915206253528595,
0.07666957378387451,
-0.15305711328983307,
0.0457763709127903,
-0.02285231091082096,
-0.006533468142151833,
-0.0014036523643881083,
-0.007526350673288107,
-0.003677494591102004,
0.10348471254110336,
-0.026981141418218613,
-0.05570453032851219,
0.07205495238304138,
-0.10833773016929626,
-0.04545893520116806,
-0.009107696823775768,
-0.01607336662709713,
0.05795471370220184,
0.05917186290025711,
-0.0556960292160511,
0.004051064141094685,
-0.08250360935926437,
0.08680784702301025,
0.016815219074487686,
-0.23403143882751465,
-0.02797458879649639,
-0.045057978481054306,
0.05943121388554573,
0.09661808609962463,
-0.04211711883544922,
0.08722220361232758,
0.004764944314956665,
-0.01398980151861906,
-0.1490427851676941,
-0.14665424823760986,
-0.17143407464027405,
-0.033507123589515686,
-0.11956160515546799,
0.02289438247680664,
0.2992020547389984,
0.05436025187373161,
-0.03825163096189499,
0.03822089359164238,
-0.018803991377353668,
0.08190834522247314,
-0.02758260816335678,
-0.04216337949037552,
-0.037786178290843964,
0.05349966511130333,
-0.05238230526447296,
-0.12073682248592377,
-0.12676052749156952,
-0.025401629507541656,
-0.14627063274383545,
0.10409008711576462,
-0.01883593201637268,
0.022317642346024513,
-0.027313942089676857,
-0.039989106357097626,
-0.06200384348630905,
-0.026381058618426323,
0.03111192025244236,
-0.05499374121427536,
-0.08039305359125137,
-0.0018606098601594567,
-0.019642626866698265,
-0.1555677205324173,
0.08503493666648865,
-0.1818453073501587,
-0.13142774999141693,
0.016309652477502823,
-0.08894241601228714,
0.0831131562590599,
0.00315784546546638,
0.005412133876234293,
-0.12244582921266556,
-0.047393910586833954,
0.01443463284522295,
-0.009512661956250668,
-0.018680313602089882,
-0.012698893435299397,
-0.12872140109539032,
-0.024545881897211075,
-0.14110803604125977,
0.08539001643657684,
-0.02443966642022133,
0.02668028697371483,
-0.03424714505672455,
-0.015692690387368202,
0.08720909059047699,
-0.05211577191948891,
-0.0327642485499382,
0.06611604243516922,
-0.0037085036747157574,
0.034292902797460556,
0.07334744185209274,
0.10144396871328354,
-0.06751451641321182,
-0.13862374424934387,
0.015761373564600945,
0.07411317527294159,
0.05019201338291168,
-0.09990113228559494,
0.054478719830513,
0.05156485736370087,
-0.04148951545357704,
-0.19070914387702942,
0.02035144343972206,
-0.04420442879199982,
-0.0795174390077591,
0.05498683080077171,
-0.030475787818431854,
-0.14516738057136536,
-0.010282930918037891,
0.03582022711634636,
-0.06809210777282715,
-0.03560508042573929,
-0.05930318310856819,
0.027274472638964653,
-0.076444111764431,
0.07691074907779694,
-0.14419132471084595,
0.06676336377859116,
0.01952831633388996,
-0.014810127206146717,
-0.09341803938150406,
0.176150381565094,
-0.0361962653696537,
-0.03804852440953255,
-0.10125251859426498,
-0.01699971780180931,
-0.07472916692495346,
0.051906656473875046,
-0.019607679918408394,
0.13568484783172607,
-0.24073836207389832,
-0.08188220858573914,
0.1214590072631836,
-0.005510612390935421,
-0.01321488432586193,
0.16614662110805511,
0.03520888090133667,
0.023409729823470116,
0.09460315108299255,
0.2026754468679428,
0.08333051204681396,
-0.1598474085330963,
0.008706184104084969,
0.1046549379825592,
-0.001036471570841968,
-0.012577956542372704,
0.09756239503622055,
-0.07899715006351471,
0.04297346621751785,
-0.04439273849129677,
0.13817954063415527,
0.09118560701608658,
-0.04406459629535675,
-0.051230255514383316,
0.07149186730384827,
-0.07942679524421692,
0.1066112145781517,
-0.06175735592842102,
0.0030411432962864637,
-0.02169383130967617,
-0.05081403627991676,
0.007006464991718531,
0.07489220052957535,
-0.07896581292152405,
0.10659730434417725,
-0.15281473100185394,
0.020795289427042007,
-0.09014252573251724,
0.03020302765071392,
-0.09844624996185303,
0.06515523046255112,
-0.07955870777368546,
0.1444394588470459,
0.22037111222743988,
0.1783154159784317,
-0.025514328852295876,
0.0027832360938191414,
-0.0714711919426918,
0.07256901264190674,
0.04620097205042839,
0.060464389622211456,
-0.03706168010830879,
-0.14566262066364288,
0.11669939011335373,
-0.06140206381678581,
0.1078258827328682,
-0.02680574730038643,
-0.046887800097465515,
0.15575538575649261,
0.0723935067653656,
0.0019202417461201549,
0.035384077578783035,
0.08896879851818085,
0.09708872437477112,
0.043975990265607834,
0.027390165254473686,
0.027840256690979004,
-0.010483972728252411,
-0.1402283012866974,
0.2551611363887787,
-0.18949079513549805,
0.0718584731221199,
0.1341422200202942,
-0.13457496464252472,
0.016535114496946335,
0.028550906106829643,
0.01828065514564514,
0.011396552436053753,
0.07323062419891357,
-0.026380637660622597,
0.2520655393600464,
-0.060080766677856445,
0.12118306756019592,
-0.07747456431388855,
0.07500654458999634,
0.01139812171459198,
-0.06404522061347961,
-0.020368732511997223,
0.10846446454524994,
-0.02669387310743332,
-0.20198647677898407,
0.037990882992744446,
0.20398300886154175,
0.019104551523923874,
0.2514587342739105,
-0.01798742450773716,
-0.008251946419477463,
-0.011820435523986816,
-0.0014829447027295828,
-0.04129030182957649,
0.02536223642528057,
-0.12878428399562836,
-0.027006562799215317,
0.004053815267980099,
0.06735394895076752,
0.08217913657426834,
-0.08148893713951111,
0.020868243649601936,
0.00538186589255929,
-0.11990507692098618,
-0.12627358734607697,
0.027674419805407524,
0.024686720222234726,
0.06817489117383957,
-0.047231175005435944,
-0.10067357122898102,
0.023062510415911674,
-0.04944497346878052,
-0.11860445141792297,
0.07511996477842331,
-0.22279344499111176,
-0.2998063266277313,
-0.08192280679941177,
-0.009356019087135792,
-0.031723752617836,
0.04972057789564133,
0.09377698600292206,
-0.07801167666912079,
0.0006507813232019544,
-0.04762145131826401,
0.11034385114908218,
-0.0559857003390789,
-0.0030665507074445486,
-0.01658984087407589,
-0.0037857454735785723,
0.008746672421693802,
-0.10042325407266617,
0.03187962621450424,
0.0004895366728305817,
-0.04359665885567665,
0.00203515961766243,
-0.07223749905824661,
0.049868907779455185,
0.17715582251548767,
0.059550296515226364,
-0.016020501032471657,
-0.09426066279411316,
0.1296730637550354,
-0.11309235543012619,
-0.00404055742546916,
0.10863623768091202,
-0.03656579554080963,
-0.006336280610412359,
0.1868702620267868,
0.012715311720967293,
0.001911736442707479,
-0.0012828422477468848,
-0.04968303442001343,
-0.04366084188222885,
-0.18000881373882294,
-0.1424030214548111,
-0.1412326991558075,
-0.004844406619668007,
-0.15415553748607635,
-0.0060270256362855434,
-0.024591190740466118,
-0.050157077610492706,
-0.04353715851902962,
-0.04648968577384949,
0.09794146567583084,
-0.026898328214883804,
0.2336466908454895,
-0.06134611368179321,
0.07781215012073517,
-0.08888345211744308,
-0.06587578356266022,
0.10954374074935913,
0.06993847340345383,
0.021374966949224472,
0.1157679334282875,
0.14970019459724426,
0.04490220546722412,
0.04998840391635895,
0.12055936455726624,
0.018804213032126427,
0.07200855761766434,
0.007932859472930431,
-0.03609312325716019,
-0.06355807930231094,
-0.003016029018908739,
0.03575543314218521,
0.38775205612182617,
-0.10367465764284134,
0.00008041538967518136,
0.015353645198047161,
0.06558545678853989,
0.045059964060783386,
0.0879867672920227,
-0.06693591177463531,
0.03228034824132919,
0.028054961934685707,
-0.03783052787184715,
-0.02057403326034546,
0.07656864076852798,
0.2111770063638687,
-0.02985311485826969,
0.1024874597787857,
0.10991587489843369,
0.061583012342453,
-0.03474514186382294,
0.023318346589803696,
-0.12879428267478943,
-0.043319933116436005,
-0.014807073399424553,
0.023633994162082672,
-0.17584228515625,
0.18328256905078888,
0.0910988301038742,
0.013632861897349358,
-0.0037143288645893335,
-0.006280945613980293,
0.04900128021836281,
0.13744710385799408,
0.1255287379026413,
0.011589855886995792,
-0.0701713114976883,
-0.11747144907712936,
-0.09426996111869812,
0.010692750103771687,
0.12951388955116272,
0.11024578660726547,
-0.031611524522304535,
0.044457804411649704,
-0.045684490352869034,
0.018489232286810875,
0.036966465413570404,
-0.2179509401321411,
-0.12630341947078705,
0.05720491707324982,
0.23607560992240906,
0.06934162229299545,
-0.0038768714293837547,
-0.05681418254971504,
-0.18011082708835602,
0.050589434802532196,
-0.12291037291288376,
-0.00826866365969181,
-0.09587492793798447,
-0.09334160387516022,
0.10293906927108765,
-0.04300616309046745,
0.018452847376465797,
0.009079809300601482,
-0.024484608322381973,
-0.08563355356454849,
-0.03784158080816269,
0.12124544382095337,
-0.07300102710723877,
-0.01562969759106636,
-0.013612921349704266,
0.2350330650806427,
0.0141261862590909,
0.10419899225234985,
0.042285747826099396,
-0.03215399011969566,
0.02896127849817276,
-0.03827986121177673,
-0.006490983068943024,
-0.0008969881455413997,
-0.07075153291225433,
0.10451776534318924,
0.02406657300889492,
-0.17979520559310913,
-0.04268058016896248,
-0.056078631430864334,
0.2376186102628708,
0.11731141060590744,
-0.02789088524878025,
0.11527804285287857,
0.2166184037923813,
-0.02576858550310135,
-0.26366326212882996,
-0.0622493177652359,
0.012866515666246414,
0.08277679979801178,
-0.07852601259946823,
-0.10998039692640305,
-0.024644920602440834,
-0.07014323025941849,
-0.005515393801033497,
-0.006366587243974209,
-0.1880519688129425,
-0.13372305035591125,
0.1710566133260727,
-0.12888599932193756,
0.14924731850624084,
-0.0322934053838253,
-0.05925839766860008,
-0.08105100691318512,
0.059232234954833984,
0.11338858306407928,
-0.222527876496315,
0.1245526596903801,
0.13101933896541595,
0.055699970573186874,
0.004021278116852045,
0.07432495802640915,
0.12475070357322693,
0.019538428634405136,
-0.04207424819469452,
-0.0006324044661596417,
0.00650151027366519,
0.03886941447854042,
0.06084500253200531,
-0.031048910692334175,
-0.01845609024167061,
-0.013947960920631886,
-0.07250425964593887,
-0.042718857526779175,
-0.061504021286964417,
0.05767998471856117,
0.06460046023130417,
-0.007483368273824453,
-0.038347356021404266,
-0.06191685050725937,
-0.014390993863344193,
0.01740957237780094,
0.08778989315032959,
-0.20639638602733612,
0.00232691690325737,
0.2061227262020111,
0.24233020842075348,
-0.1165817454457283,
0.10864616185426712,
0.005408952012658119,
-0.08296480774879456,
0.07283242046833038,
-0.05118108540773392,
0.035694945603609085,
0.0590478852391243,
-0.06051407381892204,
0.1210520938038826,
-0.014227633364498615,
-0.05588191747665405,
0.12688329815864563,
0.0815645307302475,
-0.03191158175468445,
-0.12754011154174805,
-0.05952160432934761,
0.0261639766395092,
0.05474806949496269,
0.03147238492965698,
0.26570749282836914,
-0.012591972947120667,
0.040169257670640945,
-0.0628635361790657,
-0.018347041681408882,
-0.14228399097919464,
0.16491208970546722,
0.08040593564510345,
-0.002521057613193989,
-0.11570128798484802,
0.10233435779809952,
0.05362074077129364,
-0.10193038731813431,
-0.02476540207862854,
0.06308040022850037,
-0.06852999329566956,
-0.10517997294664383,
-0.16860134899616241,
0.02218778431415558,
0.006211500149220228,
-0.14043688774108887,
0.009830188937485218,
-0.16598758101463318,
0.0020274000708013773,
0.2193080186843872,
-0.03971242159605026,
0.03953114151954651,
-0.07202738523483276,
-0.03190137818455696,
0.08127421885728836,
0.01149604469537735,
0.004701542668044567,
-0.05539879575371742,
-0.05256865546107292,
0.17878341674804688,
-0.05708760768175125,
0.147242471575737,
-0.05521715059876442,
-0.02146083302795887,
-0.007984984666109085,
0.06998235732316971,
-0.06826663762331009,
0.001804131199605763,
-0.05938923731446266,
0.005277858581393957,
0.02089955471456051,
-0.06108168885111809,
-0.051676470786333084,
0.0014423530083149672,
-0.07979479432106018,
0.04640520364046097,
-0.00975718256086111,
0.06421490013599396,
-0.06316099315881729,
-0.009684301912784576,
-0.008144276216626167,
0.029576845467090607,
0.11324167251586914,
0.08623728901147842,
-0.10297545790672302,
0.09826477617025375,
-0.21658702194690704,
-0.08378579467535019,
0.1362094283103943,
0.07670022547245026,
0.03812461346387863,
0.05843755602836609,
-0.006608947645872831,
0.11065497249364853,
0.06087835878133774,
-0.030142098665237427,
0.0401742123067379,
0.016020139679312706,
-0.013459019362926483,
-0.15003816783428192,
-0.010832737199962139,
-0.012360790744423866,
-0.01023805607110262,
0.11456974595785141,
0.0859532579779625,
0.06553356349468231,
-0.0618513748049736,
0.014162297360599041,
0.00806578528136015,
0.021829022094607353,
-0.006533414125442505,
-0.08212422579526901,
-0.01935880444943905,
-0.11688479036092758,
0.10569725930690765,
-0.02548913285136223,
0.05504130199551582,
-0.02417093887925148,
-0.007557050324976444,
0.006119115278124809,
-0.07092882692813873,
-0.030725961551070213,
0.044771622866392136,
0.06832829117774963,
0.10270160436630249,
-0.02592262625694275,
-0.08878976851701736,
-0.04347154498100281,
0.055715009570121765,
0.09796246141195297,
-0.025164272636175156,
0.07160653173923492,
0.14314448833465576,
0.12275142222642899,
0.07558512687683105,
-0.05887117609381676,
0.08708342909812927,
0.03980173543095589,
-0.09306278079748154,
-0.028168875724077225,
-0.11467303335666656,
-0.024231337010860443,
0.09024762362241745,
-0.061123933643102646,
-0.013285398483276367,
0.007303477264940739,
-0.06251636892557144,
-0.15057116746902466,
-0.013572045601904392,
-0.0761757642030716,
-0.08518926799297333,
-0.010277587920427322,
-0.06728744506835938,
0.06441056728363037,
0.024331990629434586,
0.06623513251543045,
0.03293995559215546,
0.13970893621444702,
-0.11306258291006088,
-0.16543295979499817,
0.11106254160404205,
-0.060882069170475006,
0.12697046995162964,
-0.03460981696844101,
-0.041531454771757126,
0.14561650156974792,
0.00517697399482131,
0.060180094093084335,
0.01638958230614662,
-0.05737556889653206,
0.056854184716939926,
-0.11400149762630463,
-0.039694029837846756,
-0.03944722190499306,
0.023724131286144257,
0.05734283849596977,
0.11942467093467712,
0.12362030148506165,
-0.060771577060222626,
0.04567765071988106,
0.02526923455297947,
-0.04592284932732582,
-0.15411415696144104,
-0.09710703045129776,
-0.08732078224420547,
-0.0312911719083786,
0.13826899230480194,
-0.07478068023920059,
-0.06518292427062988,
-0.05072021484375,
0.08921661972999573,
0.2941422164440155,
-0.1877366453409195,
0.057880010455846786,
-0.013382346369326115,
0.028910428285598755,
-0.02860296703875065,
0.019504593685269356,
0.06855367124080658,
0.10990435630083084,
0.04193473607301712,
-0.06722420454025269,
-0.03978363052010536,
-0.031313396990299225,
-0.029664598405361176,
0.03953990712761879,
-0.06939228624105453,
-0.05738522857427597,
0.05691688880324364,
0.14735952019691467,
-0.11958280205726624,
-0.11769334226846695,
-0.12088951468467712,
-0.10254781693220139,
-0.03430129960179329,
0.008475111797451973,
0.025563674047589302,
0.2055465579032898,
-0.042759258300065994,
-0.06793706864118576,
-0.06006703898310661,
0.03046172484755516,
0.011204113252460957,
-0.10166426748037338,
-0.01456631999462843,
0.027358196675777435,
-0.19133174419403076,
-0.09485597163438797,
-0.020030278712511063,
0.14038576185703278,
-0.007482519373297691,
0.09135676920413971,
0.08337479829788208,
0.1659044325351715,
0.06725703924894333,
-0.043758682906627655,
0.03953327611088753,
0.09554649144411087,
-0.04847080633044243,
0.2486894428730011,
0.05051245912909508,
-0.03819217532873154,
0.09336716681718826,
-0.01229847501963377,
-0.0518050491809845,
-0.01528877578675747,
0.012589825317263603,
-0.10005888342857361,
0.08368895947933197,
-0.006774569395929575,
-0.008111897855997086,
0.02986162155866623,
0.027766436338424683,
-0.04891323298215866,
0.005060364026576281,
-0.04189019277691841,
-0.07489248365163803,
-0.14183269441127777,
-0.04218011349439621,
-0.12179606407880783,
0.09472768753767014,
-0.06620080024003983,
0.00422448618337512,
-0.12417741864919662,
0.0323159396648407,
-0.016148537397384644,
0.09869187325239182,
0.02683223783969879,
-0.04530687630176544,
0.020254675298929214,
-0.07996811717748642,
0.06633488833904266,
0.07364758849143982,
-0.07780735939741135,
-0.06822843849658966
] |
null | null |
fairseq
|
# xm_transformer_600m-en_es-multi_domain
[W2V2-Transformer](https://aclanthology.org/2021.acl-long.68/) speech-to-text translation model from fairseq S2T ([paper](https://arxiv.org/abs/2010.05171)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_to_text)):
- English-Spanish
- Trained on MuST-C, EuroParl-ST, VoxPopuli, Multilingual LibriSpeech, Common Voice v7 and CCMatrix
- Speech synthesis with [facebook/tts_transformer-es-css10](https://huggingface.co/facebook/tts_transformer-es-css10)
## Usage
```python
from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub
from fairseq.models.speech_to_text.hub_interface import S2THubInterface
from fairseq.models.text_to_speech.hub_interface import TTSHubInterface
import IPython.display as ipd
import torchaudio
models, cfg, task = load_model_ensemble_and_task_from_hf_hub(
"facebook/xm_transformer_600m-en_es-multi_domain",
arg_overrides={"config_yaml": "config.yaml"},
)
model = models[0]
generator = task.build_generator([model], cfg)
# requires 16000Hz mono channel audio
audio, _ = torchaudio.load("/path/to/an/audio/file")
sample = S2THubInterface.get_model_input(task, audio)
text = S2THubInterface.get_prediction(task, model, generator, sample)
# speech synthesis
tts_models, tts_cfg, tts_task = load_model_ensemble_and_task_from_hf_hub(
f"facebook/tts_transformer-es-css10",
arg_overrides={"vocoder": "griffin_lim", "fp16": False},
)
tts_model = tts_models[0]
TTSHubInterface.update_cfg_with_data_cfg(tts_cfg, tts_task.data_cfg)
tts_generator = tts_task.build_generator([tts_model], tts_cfg)
tts_sample = TTSHubInterface.get_model_input(tts_task, text)
wav, sr = TTSHubInterface.get_prediction(
tts_task, tts_model, tts_generator, tts_sample
)
ipd.Audio(wav, rate=sr)
```
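To keep the synthesized Spanish audio rather than only playing it inline, it can also be written to disk. The snippet below is a minimal sketch, assuming `wav` and `sr` come from the TTS call above; the output file name is illustrative.
```python
import torchaudio

# TTSHubInterface returns a 1-D waveform tensor; torchaudio.save expects (channels, samples).
if wav.dim() == 1:
    wav = wav.unsqueeze(0)
torchaudio.save("en_es_translation.wav", wav.cpu(), sr)  # illustrative output path
```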
## Citation
```bibtex
@inproceedings{li-etal-2021-multilingual,
title = "Multilingual Speech Translation from Efficient Finetuning of Pretrained Models",
author = "Li, Xian and
Wang, Changhan and
Tang, Yun and
Tran, Chau and
Tang, Yuqing and
Pino, Juan and
Baevski, Alexei and
Conneau, Alexis and
Auli, Michael",
booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.acl-long.68",
doi = "10.18653/v1/2021.acl-long.68",
pages = "827--838",
}
@inproceedings{wang-etal-2020-fairseq,
title = "Fairseq {S}2{T}: Fast Speech-to-Text Modeling with Fairseq",
author = "Wang, Changhan and
Tang, Yun and
Ma, Xutai and
Wu, Anne and
Okhonko, Dmytro and
Pino, Juan",
booktitle = "Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing: System Demonstrations",
month = dec,
year = "2020",
address = "Suzhou, China",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.aacl-demo.6",
pages = "33--39",
}
```
|
{"language": "en-es", "library_name": "fairseq", "tags": ["fairseq", "audio", "audio-to-audio", "speech-to-speech-translation"], "datasets": ["must_c", "europarl_st", "voxpopuli"], "task": "audio-to-audio", "widget": [{"example_title": "Common Voice sample 1", "src": "https://huggingface.co/facebook/xm_transformer_600m-en_es-multi_domain/resolve/main/common_voice_en_18295850.mp3"}]}
|
audio-to-audio
|
facebook/xm_transformer_600m-en_es-multi_domain
|
[
"fairseq",
"audio",
"audio-to-audio",
"speech-to-speech-translation",
"dataset:must_c",
"dataset:europarl_st",
"dataset:voxpopuli",
"arxiv:2010.05171",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.05171"
] |
[
"en-es"
] |
TAGS
#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #dataset-europarl_st #dataset-voxpopuli #arxiv-2010.05171 #has_space #region-us
|
# xm_transformer_600m-en_es-multi_domain
W2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):
- English-Spanish
- Trained on MuST-C, EuroParl-ST, VoxPopuli, Multilingual LibriSpeech, Common Voice v7 and CCMatrix
- Speech synthesis with facebook/tts_transformer-es-css10
## Usage
|
[
"# xm_transformer_600m-en_es-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-Spanish\n- Trained on MuST-C, EuroParl-ST, VoxPopuli, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-es-css10",
"## Usage"
] |
[
"TAGS\n#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #dataset-europarl_st #dataset-voxpopuli #arxiv-2010.05171 #has_space #region-us \n",
"# xm_transformer_600m-en_es-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-Spanish\n- Trained on MuST-C, EuroParl-ST, VoxPopuli, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-es-css10",
"## Usage"
] |
[
68,
103,
3
] |
[
"passage: TAGS\n#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #dataset-europarl_st #dataset-voxpopuli #arxiv-2010.05171 #has_space #region-us \n# xm_transformer_600m-en_es-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-Spanish\n- Trained on MuST-C, EuroParl-ST, VoxPopuli, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-es-css10## Usage"
] |
[
-0.1642054319381714,
0.08602429181337357,
-0.004270542878657579,
-0.003303993260487914,
0.04439609497785568,
-0.06145546957850456,
0.09148839861154556,
0.03614772856235504,
-0.03347483277320862,
0.027145789936184883,
-0.03747959062457085,
0.05016554519534111,
0.03870770335197449,
0.07378629595041275,
-0.02999776415526867,
-0.18361897766590118,
0.07272117584943771,
-0.03759702667593956,
-0.05610246956348419,
0.04744834452867508,
0.12483517825603485,
-0.04398114234209061,
0.021008361130952835,
0.036407895386219025,
-0.0396704263985157,
0.07707815617322922,
0.019197622314095497,
-0.12409570068120956,
0.10877470672130585,
0.07261112332344055,
-0.005301275290548801,
0.043623216450214386,
0.07801823318004608,
-0.12915055453777313,
0.026355532929301262,
-0.01975368894636631,
-0.0126816276460886,
0.007278745993971825,
0.0402938574552536,
-0.06035709008574486,
0.10097038000822067,
-0.014388780109584332,
-0.03808893635869026,
0.06802809983491898,
-0.0982217863202095,
-0.12208304554224014,
-0.02722293511033058,
-0.06970983743667603,
0.03377517685294151,
0.05862151458859444,
-0.07703245431184769,
0.00377984088845551,
-0.07915600389242172,
0.08867966383695602,
0.04291939362883568,
-0.2413986623287201,
-0.016048144549131393,
-0.08417577296495438,
0.0670165941119194,
0.09431762993335724,
-0.010056236758828163,
0.08125539869070053,
0.030918581411242485,
0.008000274188816547,
-0.1299908459186554,
-0.1274440735578537,
-0.19713172316551208,
-0.03528568521142006,
-0.12852521240711212,
0.043845247477293015,
0.3630221486091614,
0.050453413277864456,
-0.0462348572909832,
0.0004250112106092274,
0.0026498911902308464,
0.10243400931358337,
-0.02549297735095024,
-0.04347687587141991,
-0.0048045204021036625,
0.028544139117002487,
-0.010855896398425102,
-0.09397619962692261,
-0.09021174907684326,
-0.025999249890446663,
-0.09277385473251343,
0.09491205215454102,
0.008984961546957493,
0.0059022558853030205,
0.011273718439042568,
-0.031332001090049744,
-0.11649166792631149,
-0.01890152506530285,
0.043245844542980194,
-0.012482396326959133,
-0.05574183911085129,
0.003937960602343082,
0.005420398898422718,
-0.21126815676689148,
0.08781085908412933,
-0.1154920905828476,
-0.08842441439628601,
-0.00844979751855135,
-0.09325726330280304,
0.06237635388970375,
0.07345615327358246,
-0.020255960524082184,
-0.1299145519733429,
-0.11313658952713013,
0.02085583098232746,
-0.044072117656469345,
0.011853868141770363,
-0.0008516045054420829,
-0.15196020901203156,
-0.04651794582605362,
-0.1164136677980423,
0.08890829980373383,
0.007146943360567093,
-0.022658513858914375,
-0.07637802511453629,
-0.012287406250834465,
0.04311806336045265,
-0.06095389276742935,
-0.004370085429400206,
0.03422938659787178,
-0.022802885621786118,
0.11201875656843185,
0.031111741438508034,
0.06979542970657349,
-0.08759834617376328,
-0.07409002631902695,
-0.001052539679221809,
0.047657161951065063,
0.020783651620149612,
-0.1349027454853058,
0.06502953171730042,
-0.02101357840001583,
-0.010706430301070213,
-0.11423680931329727,
-0.04896969348192215,
-0.0749153345823288,
-0.06945064663887024,
0.005826725624501705,
-0.042284831404685974,
-0.17385783791542053,
-0.005910919979214668,
0.00782537367194891,
-0.07575591653585434,
-0.06621517986059189,
-0.07965663075447083,
0.021086394786834717,
-0.09170296043157578,
0.0894741639494896,
-0.1503632813692093,
0.06206845864653587,
-0.025003110989928246,
-0.04246140643954277,
-0.16890601813793182,
0.17199207842350006,
-0.08316609263420105,
-0.052834074944257736,
-0.09109321981668472,
-0.05111108347773552,
-0.05991131812334061,
0.10263437777757645,
0.011760813184082508,
0.13142654299736023,
-0.27654263377189636,
-0.08218971639871597,
0.1253078132867813,
-0.03849916532635689,
-0.00903607252985239,
0.1800074279308319,
0.022544261068105698,
0.054181359708309174,
0.12027890980243683,
0.2762272357940674,
0.03281320631504059,
-0.16963286697864532,
-0.022558894008398056,
0.05615970119833946,
-0.00432487390935421,
-0.04200058802962303,
0.07008827477693558,
-0.10223568975925446,
0.09656322002410889,
-0.027928946539759636,
0.09216045588254929,
0.03904528170824051,
-0.04202945902943611,
-0.004852024372667074,
0.059074386954307556,
-0.06601886451244354,
0.11111456900835037,
-0.05250509828329086,
-0.014999755658209324,
-0.06441805511713028,
-0.04391765221953392,
0.15543457865715027,
0.054453279823064804,
-0.0962945967912674,
0.09554953873157501,
-0.12081344425678253,
0.08201903104782104,
-0.09480636566877365,
0.026185618713498116,
-0.1288297474384308,
0.07030539214611053,
-0.0560116246342659,
0.12221168726682663,
0.2203451246023178,
0.19163186848163605,
-0.02636704407632351,
0.0018679059576243162,
-0.07435711473226547,
0.057110901921987534,
0.015907879918813705,
0.041250888258218765,
-0.01151877362281084,
-0.18892425298690796,
0.08157261461019516,
-0.07911376655101776,
0.03246213123202324,
-0.00727114686742425,
-0.04635658860206604,
0.18271194398403168,
0.07670789957046509,
0.0019061975181102753,
0.004072186537086964,
0.05823987349867821,
0.10308559983968735,
0.017748424783349037,
0.04090498015284538,
0.02132730931043625,
-0.014585225842893124,
-0.08174815028905869,
0.21079446375370026,
-0.11229893565177917,
0.05881838500499725,
0.12878501415252686,
-0.14053024351596832,
-0.00002121725628967397,
0.05619528144598007,
0.019165970385074615,
0.026198776438832283,
0.007711581885814667,
-0.09257210791110992,
0.23359423875808716,
-0.043781910091638565,
0.11126518994569778,
-0.11690643429756165,
0.022849300876259804,
0.005011091940104961,
-0.06293857097625732,
-0.03375069797039032,
0.13437829911708832,
-0.03306398540735245,
-0.08586960285902023,
0.06420547515153885,
0.12136486172676086,
0.0036951196379959583,
0.2782931923866272,
-0.030483022332191467,
-0.02753852866590023,
-0.011509057134389877,
-0.005993807688355446,
-0.036049604415893555,
0.07479383051395416,
-0.2060449719429016,
-0.014903235249221325,
0.010356590151786804,
0.09043317288160324,
0.09240605682134628,
-0.08248884975910187,
0.01861000619828701,
-0.01933283545076847,
-0.12825898826122284,
-0.11924847960472107,
0.04415978118777275,
0.01449876744300127,
0.08349396288394928,
-0.06763911992311478,
-0.08606389164924622,
0.01045791245996952,
-0.049539413303136826,
-0.11180097609758377,
0.06432174891233444,
-0.2484617829322815,
-0.2336869239807129,
-0.11790275573730469,
0.012483233585953712,
-0.06466156989336014,
0.0527455136179924,
0.09622152149677277,
-0.08548247069120407,
-0.042287860065698624,
-0.03588464856147766,
0.11568433791399002,
-0.024325549602508545,
-0.05701671540737152,
-0.01672428846359253,
0.015072181820869446,
0.0036415860522538424,
-0.12168572843074799,
0.02390265092253685,
-0.023371951654553413,
0.006244075484573841,
-0.05490792542695999,
-0.03887683525681496,
0.09592702239751816,
0.16007624566555023,
0.0505833774805069,
-0.015538929961621761,
-0.06615135818719864,
0.16482673585414886,
-0.11032374948263168,
0.01595795713365078,
0.1551622450351715,
-0.00019965824321843684,
-0.02227301523089409,
0.14085060358047485,
0.029436886310577393,
-0.02328556403517723,
-0.004722088109701872,
-0.02730361931025982,
-0.04323248192667961,
-0.24167989194393158,
-0.17272703349590302,
-0.1279173046350479,
-0.0015982090262696147,
-0.14421729743480682,
-0.003733621211722493,
-0.04963601008057594,
-0.032734841108322144,
-0.01161602046340704,
-0.03430340066552162,
0.1081535816192627,
-0.005693409591913223,
0.2724480628967285,
-0.08905553072690964,
0.07902733236551285,
-0.07858946919441223,
-0.07653001695871353,
0.13820548355579376,
0.0686064064502716,
0.032350752502679825,
0.09005212783813477,
0.16585883498191833,
0.052467480301856995,
0.016646305099129677,
0.05395236238837242,
0.05563196912407875,
0.04229646176099777,
0.030878083780407906,
-0.03698199987411499,
-0.06458210200071335,
0.04017401114106178,
-0.00123835529666394,
0.3188548684120178,
-0.07590343803167343,
-0.024724051356315613,
-0.010529895313084126,
0.0786907747387886,
0.06735768169164658,
0.09258738905191422,
-0.06043008342385292,
0.03496188670396805,
0.028677066788077354,
-0.024599550291895866,
-0.03811899572610855,
0.08935405313968658,
0.2414177805185318,
-0.028912991285324097,
0.11258593946695328,
0.12445098161697388,
0.05150695890188217,
-0.0554606057703495,
0.034393660724163055,
-0.11757597327232361,
0.01142664160579443,
-0.017096051946282387,
0.04298596456646919,
-0.11796743422746658,
0.18150992691516876,
0.07039755582809448,
0.03785715252161026,
-0.014259718358516693,
0.013651572167873383,
0.012923958711326122,
0.09267987310886383,
0.16886241734027863,
0.009912578389048576,
-0.07294364273548126,
-0.05026544630527496,
-0.06265261024236679,
-0.019015641883015633,
0.10128577798604965,
0.0950014665722847,
-0.0275181345641613,
0.04962058737874031,
-0.0528595894575119,
-0.0034741079434752464,
-0.05343160033226013,
-0.19456173479557037,
-0.13125963509082794,
0.029101166874170303,
0.22694604098796844,
0.07559416443109512,
0.01571987196803093,
-0.08210552483797073,
-0.19960255920886993,
0.06900395452976227,
-0.0675804391503334,
-0.024682415649294853,
-0.11222286522388458,
-0.09582334011793137,
0.16377606987953186,
-0.01754515990614891,
0.03464841470122337,
0.050466425716876984,
0.01807505637407303,
-0.0934399962425232,
-0.0018837061943486333,
0.12692071497440338,
-0.10224106907844543,
-0.009859680198132992,
-0.025765491649508476,
0.2380838841199875,
0.04642932862043381,
0.08550014346837997,
0.05201404541730881,
-0.0049527958035469055,
0.03694656118750572,
-0.05987739562988281,
0.004962265957146883,
-0.06468415260314941,
-0.055860232561826706,
0.16251932084560394,
-0.0669737160205841,
-0.22666656970977783,
-0.009346854873001575,
-0.05404483154416084,
0.2146802693605423,
0.12286530435085297,
-0.024210292845964432,
0.10786101967096329,
0.2298506647348404,
-0.07297557592391968,
-0.26486825942993164,
0.024413401260972023,
0.018804900348186493,
0.07543956488370895,
-0.01662815921008587,
-0.14441843330860138,
0.01346964854747057,
0.005032172426581383,
-0.0019961618818342686,
0.01690300926566124,
-0.21476997435092926,
-0.13191939890384674,
0.12401942163705826,
-0.09162568300962448,
0.11285648494958878,
0.01031044963747263,
-0.05614316463470459,
-0.0926724225282669,
0.04549616947770119,
0.07748597115278244,
-0.1844038963317871,
0.08753954619169235,
0.12377266585826874,
0.07776331156492233,
0.029394051060080528,
0.061180923134088516,
0.12262450158596039,
0.09804755449295044,
-0.03146260976791382,
0.03472239151597023,
0.05743240937590599,
0.07585936039686203,
0.07041378319263458,
0.0483982153236866,
0.0014146126341074705,
-0.037747424095869064,
-0.06310074776411057,
-0.02023572288453579,
-0.06035218760371208,
0.07798311114311218,
0.04246681556105614,
-0.004696625750511885,
-0.03694326430559158,
-0.03656230866909027,
-0.01622598059475422,
0.015177030116319656,
0.09479376673698425,
-0.17407388985157013,
0.015638673678040504,
0.15052688121795654,
0.2523576021194458,
-0.10606516152620316,
0.014141429215669632,
0.03552347421646118,
-0.06267077475786209,
0.029280465096235275,
0.017264164984226227,
0.04078304395079613,
0.08833557367324829,
-0.03247895464301109,
0.11136075109243393,
-0.006316259503364563,
-0.08738098293542862,
0.09793267399072647,
0.0610513798892498,
-0.04684948921203613,
-0.10478859394788742,
-0.045009441673755646,
-0.03622793033719063,
0.06388000398874283,
0.06126636639237404,
0.2532970607280731,
0.007561000529676676,
0.0155308423563838,
-0.052939947694540024,
-0.019375303760170937,
-0.10971285402774811,
0.1847604662179947,
0.06988104432821274,
-0.0016365689225494862,
-0.12836337089538574,
0.09031803905963898,
0.02389725111424923,
-0.10064860433340073,
0.012669299729168415,
0.03604809567332268,
-0.047623954713344574,
-0.07894846051931381,
-0.12998515367507935,
0.044278066605329514,
-0.013479040004312992,
-0.1346961259841919,
-0.06543764472007751,
-0.17531707882881165,
0.018379753455519676,
0.20939582586288452,
-0.015901664271950722,
-0.0035475664772093296,
-0.07928772270679474,
-0.04317323490977287,
0.040246061980724335,
0.03834269568324089,
0.06096907705068588,
-0.037887733429670334,
-0.05640420317649841,
0.13103103637695312,
-0.05215096101164818,
0.07807362079620361,
-0.04171314463019371,
-0.01010892353951931,
-0.03430918604135513,
0.06118430197238922,
-0.05245402082800865,
-0.0160131324082613,
-0.06547136604785919,
-0.02286359667778015,
0.003727838397026062,
-0.04493577033281326,
-0.03645900636911392,
0.016303373500704765,
-0.10220839828252792,
0.05092034861445427,
0.017801087349653244,
0.07388341426849365,
-0.06820835918188095,
0.009794416837394238,
-0.004903023596853018,
-0.0025936041492968798,
0.10010766983032227,
0.06725618243217468,
-0.08963882178068161,
0.09652888774871826,
-0.20173977315425873,
-0.07627345621585846,
0.13629867136478424,
0.09358397871255875,
0.06502308696508408,
0.037745434790849686,
-0.011056337505578995,
0.1192423552274704,
0.07915515452623367,
-0.02053646184504032,
-0.014163314364850521,
-0.017101271077990532,
0.047269415110349655,
-0.11951056867837906,
-0.057293038815259933,
-0.01225695013999939,
0.02755935676395893,
0.15442699193954468,
0.06810587644577026,
0.10347294062376022,
-0.0848102942109108,
0.023447873070836067,
-0.003407109063118696,
0.04139629378914833,
0.0007098169880919158,
-0.08159316331148148,
-0.0580294243991375,
-0.053506314754486084,
0.07723581790924072,
-0.02126772329211235,
0.1010572761297226,
0.03140631690621376,
0.019964879378676414,
0.013885466381907463,
-0.017751283943653107,
-0.06851925700902939,
0.03972361609339714,
0.01995156705379486,
0.04223726689815521,
-0.017849303781986237,
-0.11608557403087616,
-0.014150120317935944,
0.06420867890119553,
0.0847042128443718,
-0.03193248063325882,
0.024261029437184334,
0.14317351579666138,
0.1114640161395073,
0.10245466232299805,
-0.031765494495630264,
0.021242212504148483,
0.04917934536933899,
-0.03084302693605423,
0.014818674884736538,
-0.07447419315576553,
0.09889864921569824,
0.016450179740786552,
-0.09193353354930878,
0.04584386944770813,
0.04996290057897568,
-0.06310465931892395,
-0.17231091856956482,
-0.02555501088500023,
-0.055784013122320175,
-0.09059794992208481,
-0.01578863337635994,
-0.1199120357632637,
0.07949543744325638,
-0.014674752950668335,
0.07774128764867783,
0.011492627672851086,
0.09364809840917587,
-0.18218083679676056,
-0.14530983567237854,
0.10921741276979446,
-0.044805970042943954,
0.11819479614496231,
-0.05272998660802841,
-0.03221190720796585,
0.10949622839689255,
0.043416187167167664,
0.032473839819431305,
0.03529532998800278,
-0.06858501583337784,
0.003415798768401146,
-0.09754887223243713,
-0.025101276114583015,
-0.02891283482313156,
0.02082500420510769,
0.06284160166978836,
0.13606569170951843,
0.11085616052150726,
-0.09468423575162888,
0.04070940613746643,
0.04108995199203491,
-0.05504760891199112,
-0.10825494676828384,
-0.0865391194820404,
-0.1154782623052597,
-0.013325106352567673,
0.15264953672885895,
-0.05008368194103241,
-0.09188738465309143,
-0.04654092714190483,
0.07676707208156586,
0.2822396755218506,
-0.11893629282712936,
0.021398551762104034,
-0.026950186118483543,
0.015706101432442665,
0.002991933375597,
-0.027429470792412758,
0.05675492063164711,
0.19071845710277557,
0.051583945751190186,
-0.004957858938723803,
-0.08322371542453766,
-0.0006059461738914251,
-0.06117694452404976,
0.037049420177936554,
-0.07519442588090897,
-0.07878370583057404,
0.06752647459506989,
0.19133946299552917,
-0.12297689914703369,
-0.15819966793060303,
-0.13745538890361786,
-0.1227029338479042,
-0.06383102387189865,
-0.024300871416926384,
0.03145529702305794,
0.18161208927631378,
-0.001600764342583716,
-0.05732650309801102,
-0.06614704430103302,
0.07186321169137955,
0.015361984260380268,
-0.1019567996263504,
-0.05537829548120499,
0.015611534006893635,
-0.20220813155174255,
-0.015816086903214455,
-0.03648356348276138,
0.13636329770088196,
0.015346932224929333,
0.07170747220516205,
0.048953935503959656,
0.10229621082544327,
0.02691766992211342,
-0.06691129505634308,
0.05694212019443512,
0.06406997889280319,
-0.04237331822514534,
0.2515687942504883,
0.0252162367105484,
-0.07611813396215439,
0.08300482481718063,
0.00675138970836997,
-0.08679363131523132,
-0.03603712469339371,
0.05631572753190994,
-0.11936618387699127,
0.07583966851234436,
0.00007994714542292058,
-0.012649544514715672,
0.005065135657787323,
0.0008785019745118916,
-0.017911113798618317,
0.02790890820324421,
0.0045970226638019085,
-0.03536537289619446,
-0.1952827423810959,
-0.03220364451408386,
-0.07128728926181793,
0.03874114900827408,
-0.12864266335964203,
-0.0021336409263312817,
-0.12275154143571854,
0.027115797623991966,
-0.048446666449308395,
0.0618148148059845,
0.06654037535190582,
-0.027973229065537453,
0.025227736681699753,
-0.014433907344937325,
0.05367591232061386,
0.0911674052476883,
-0.050096042454242706,
-0.03215154632925987
] |
null | null |
fairseq
|
# xm_transformer_600m-en_fr-multi_domain
[W2V2-Transformer](https://aclanthology.org/2021.acl-long.68/) speech-to-text translation model from fairseq S2T ([paper](https://arxiv.org/abs/2010.05171)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_to_text)):
- English-French
- Trained on MuST-C, EuroParl-ST, VoxPopuli, LibriTrans, Multilingual LibriSpeech, Common Voice v7 and CCMatrix
- Speech synthesis with [facebook/tts_transformer-fr-cv7_css10](https://huggingface.co/facebook/tts_transformer-fr-cv7_css10)
## Usage
```python
from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub
from fairseq.models.speech_to_text.hub_interface import S2THubInterface
from fairseq.models.text_to_speech.hub_interface import TTSHubInterface
import IPython.display as ipd
import torchaudio
models, cfg, task = load_model_ensemble_and_task_from_hf_hub(
"facebook/xm_transformer_600m-en_fr-multi_domain",
arg_overrides={"config_yaml": "config.yaml"},
)
model = models[0]
generator = task.build_generator([model], cfg)
# requires 16000Hz mono channel audio
audio, _ = torchaudio.load("/path/to/an/audio/file")
sample = S2THubInterface.get_model_input(task, audio)
text = S2THubInterface.get_prediction(task, model, generator, sample)
# speech synthesis
tts_models, tts_cfg, tts_task = load_model_ensemble_and_task_from_hf_hub(
f"facebook/tts_transformer-fr-cv7_css10",
arg_overrides={"vocoder": "griffin_lim", "fp16": False},
)
tts_model = tts_models[0]
TTSHubInterface.update_cfg_with_data_cfg(tts_cfg, tts_task.data_cfg)
tts_generator = tts_task.build_generator([tts_model], tts_cfg)
tts_sample = TTSHubInterface.get_model_input(tts_task, text)
wav, sr = TTSHubInterface.get_prediction(
tts_task, tts_model, tts_generator, tts_sample
)
ipd.Audio(wav, rate=sr)
```
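As a hedged follow-up, the synthesized French waveform can be saved to a file instead of only being rendered in a notebook. This assumes `wav` and `sr` from the TTS call above; the file name is illustrative.
```python
import torchaudio

# The TTS hub interface yields a 1-D tensor; torchaudio.save expects (channels, samples).
if wav.dim() == 1:
    wav = wav.unsqueeze(0)
torchaudio.save("en_fr_translation.wav", wav.cpu(), sr)  # illustrative output path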
## Citation
```bibtex
@inproceedings{li-etal-2021-multilingual,
title = "Multilingual Speech Translation from Efficient Finetuning of Pretrained Models",
author = "Li, Xian and
Wang, Changhan and
Tang, Yun and
Tran, Chau and
Tang, Yuqing and
Pino, Juan and
Baevski, Alexei and
Conneau, Alexis and
Auli, Michael",
booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.acl-long.68",
doi = "10.18653/v1/2021.acl-long.68",
pages = "827--838",
}
@inproceedings{wang-etal-2020-fairseq,
title = "Fairseq {S}2{T}: Fast Speech-to-Text Modeling with Fairseq",
author = "Wang, Changhan and
Tang, Yun and
Ma, Xutai and
Wu, Anne and
Okhonko, Dmytro and
Pino, Juan",
booktitle = "Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing: System Demonstrations",
month = dec,
year = "2020",
address = "Suzhou, China",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.aacl-demo.6",
pages = "33--39",
}
```
|
{"language": "en-fr", "library_name": "fairseq", "tags": ["fairseq", "audio", "audio-to-audio", "speech-to-speech-translation"], "datasets": ["must_c", "europarl_st", "voxpopuli", "libritrans"], "task": "audio-to-audio", "widget": [{"example_title": "Common Voice sample 1", "src": "https://huggingface.co/facebook/xm_transformer_600m-en_es-multi_domain/resolve/main/common_voice_en_18295850.mp3"}]}
|
audio-to-audio
|
facebook/xm_transformer_600m-en_fr-multi_domain
|
[
"fairseq",
"audio",
"audio-to-audio",
"speech-to-speech-translation",
"dataset:must_c",
"dataset:europarl_st",
"dataset:voxpopuli",
"dataset:libritrans",
"arxiv:2010.05171",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.05171"
] |
[
"en-fr"
] |
TAGS
#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #dataset-europarl_st #dataset-voxpopuli #dataset-libritrans #arxiv-2010.05171 #has_space #region-us
|
# xm_transformer_600m-en_fr-multi_domain
W2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):
- English-French
- Trained on MuST-C, EuroParl-ST, VoxPopuli, LibriTrans, Multilingual LibriSpeech, Common Voice v7 and CCMatrix
- Speech synthesis with facebook/tts_transformer-fr-cv7_css10
## Usage
|
[
"# xm_transformer_600m-en_fr-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-French\n- Trained on MuST-C, EuroParl-ST, VoxPopuli, LibriTrans, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-fr-cv7_css10",
"## Usage"
] |
[
"TAGS\n#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #dataset-europarl_st #dataset-voxpopuli #dataset-libritrans #arxiv-2010.05171 #has_space #region-us \n",
"# xm_transformer_600m-en_fr-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-French\n- Trained on MuST-C, EuroParl-ST, VoxPopuli, LibriTrans, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-fr-cv7_css10",
"## Usage"
] |
[
74,
111,
3
] |
[
"passage: TAGS\n#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #dataset-europarl_st #dataset-voxpopuli #dataset-libritrans #arxiv-2010.05171 #has_space #region-us \n# xm_transformer_600m-en_fr-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-French\n- Trained on MuST-C, EuroParl-ST, VoxPopuli, LibriTrans, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-fr-cv7_css10## Usage"
] |
[
-0.14886996150016785,
0.12020964175462723,
-0.004542685113847256,
0.005446355789899826,
0.03075806237757206,
-0.0534285269677639,
0.10122858732938766,
0.04585479572415352,
-0.024940788745880127,
0.02644120156764984,
-0.009047036059200764,
0.040512535721063614,
0.039648160338401794,
0.10473325103521347,
-0.03473398834466934,
-0.19398190081119537,
0.08503898233175278,
-0.03740394487977028,
-0.04564012587070465,
0.059214163571596146,
0.12995536625385284,
-0.047178689390420914,
0.035945285111665726,
0.042485255748033524,
-0.026688463985919952,
0.06746078282594681,
0.01566716656088829,
-0.10575298219919205,
0.10989667475223541,
0.0874059870839119,
-0.004368754103779793,
0.04362984374165535,
0.08259446918964386,
-0.17147983610630035,
0.02455795556306839,
-0.009274152107536793,
-0.017417943105101585,
0.010105833411216736,
0.03735288977622986,
-0.06306970119476318,
0.14730742573738098,
-0.0378517284989357,
-0.038418788462877274,
0.07189968228340149,
-0.0849718302488327,
-0.10629181563854218,
-0.03789588436484337,
-0.05590096861124039,
-0.013179692439734936,
0.06237279251217842,
-0.08230037987232208,
0.03233041241765022,
-0.09997798502445221,
0.09435895830392838,
0.07866232842206955,
-0.2383623868227005,
-0.01611444726586342,
-0.08226016163825989,
0.06333676725625992,
0.0852794349193573,
-0.04103470966219902,
0.0857023224234581,
0.020274970680475235,
0.010309768840670586,
-0.12132368236780167,
-0.119560606777668,
-0.1846439242362976,
-0.015081518329679966,
-0.1345367133617401,
0.0480693019926548,
0.36515215039253235,
0.03878115490078926,
-0.05569680407643318,
-0.023746127262711525,
0.005787074100226164,
0.1151893362402916,
-0.03873594105243683,
-0.05008441582322121,
0.0008305356022901833,
0.02264552377164364,
0.0021724198013544083,
-0.1116187795996666,
-0.08595429360866547,
-0.019096635282039642,
-0.10104542970657349,
0.07905630022287369,
0.012221797369420528,
0.008368873037397861,
0.011161324568092823,
-0.03785296157002449,
-0.10221856087446213,
-0.01700950600206852,
0.01933041773736477,
-0.042538344860076904,
-0.049380287528038025,
-0.0010264641605317593,
-0.007204093970358372,
-0.16719074547290802,
0.08905768394470215,
-0.07878043502569199,
-0.09323301911354065,
-0.002583274617791176,
-0.1170496940612793,
0.052494440227746964,
0.07474789768457413,
-0.024809014052152634,
-0.1495668739080429,
-0.10036783665418625,
-0.0015959918964654207,
-0.03885310888290405,
0.00687108188867569,
0.007158951368182898,
-0.12265083193778992,
-0.05006849393248558,
-0.12537072598934174,
0.08763428777456284,
-0.002924713073298335,
-0.00748972874134779,
-0.07061998546123505,
-0.008840146474540234,
0.016797030344605446,
-0.06534174829721451,
0.0016253340290859342,
0.05249844118952751,
-0.04483271762728691,
0.11982102692127228,
0.028673389926552773,
0.061674878001213074,
-0.09371055662631989,
-0.060313012450933456,
-0.021789265796542168,
0.06091040372848511,
0.023623179644346237,
-0.13878042995929718,
0.06404896080493927,
-0.0351950004696846,
-0.017501818016171455,
-0.09737147390842438,
-0.07320586591959,
-0.048595182597637177,
-0.06119195744395256,
-0.006932090502232313,
-0.002946811728179455,
-0.16493703424930573,
-0.016618946567177773,
0.01425093598663807,
-0.06546367704868317,
-0.05847333371639252,
-0.08901136368513107,
0.018857654184103012,
-0.12047170847654343,
0.08512891083955765,
-0.1480148583650589,
0.05639580637216568,
-0.040871113538742065,
-0.05071508511900902,
-0.1316620409488678,
0.15359973907470703,
-0.10248962044715881,
-0.06571884453296661,
-0.09289327263832092,
-0.05777839198708534,
-0.06331495195627213,
0.09510497003793716,
0.011349417269229889,
0.15987975895404816,
-0.30270302295684814,
-0.08375830203294754,
0.16907785832881927,
-0.05481747165322304,
-0.011794714257121086,
0.18585073947906494,
0.007875360548496246,
0.025786863639950752,
0.11447528749704361,
0.29060038924217224,
0.026855837553739548,
-0.1847831904888153,
-0.0506444089114666,
0.06310833245515823,
0.017163006588816643,
-0.011061054654419422,
0.06518900394439697,
-0.10041030496358871,
0.11823515594005585,
-0.022455889731645584,
0.07679257541894913,
0.037782855331897736,
-0.04533587396144867,
-0.017731500789523125,
0.04025750979781151,
-0.06625135242938995,
0.14502964913845062,
-0.03820054233074188,
-0.03290342912077904,
-0.07178669422864914,
-0.0379013828933239,
0.1428421437740326,
0.062479712069034576,
-0.09752346575260162,
0.08299914747476578,
-0.11192759871482849,
0.0841870978474617,
-0.03509965538978577,
0.03215119615197182,
-0.12760159373283386,
0.04814926162362099,
-0.052193671464920044,
0.1095881387591362,
0.19324105978012085,
0.1927633285522461,
0.0012272781459614635,
0.00330191757529974,
-0.07096368819475174,
0.07322538644075394,
0.021471919491887093,
0.03293396532535553,
-0.006643021013587713,
-0.20360569655895233,
0.08817093819379807,
-0.08249157667160034,
0.062025390565395355,
-0.061711981892585754,
-0.038507331162691116,
0.1886063814163208,
0.11234904080629349,
0.013026279397308826,
-0.001434662495739758,
0.05081980302929878,
0.07945974171161652,
0.01654025912284851,
0.04493454098701477,
0.010852609761059284,
-0.03573717176914215,
-0.05504490062594414,
0.20988760888576508,
-0.08400324732065201,
0.0669746920466423,
0.1415245085954666,
-0.10361931473016739,
0.000499637913890183,
0.041294410824775696,
-0.002062889514490962,
0.03320572152733803,
-0.0019589653238654137,
-0.08915682882070541,
0.18410491943359375,
-0.027611128985881805,
0.09392331540584564,
-0.11260637640953064,
0.017693227156996727,
0.009939566254615784,
-0.05949673056602478,
-0.04507002979516983,
0.13423360884189606,
-0.025557920336723328,
-0.0810505598783493,
0.085269995033741,
0.14322590827941895,
0.005238577723503113,
0.29349279403686523,
-0.041873812675476074,
-0.039103005081415176,
-0.014177204109728336,
-0.0013348805950954556,
-0.04767424240708351,
0.0946059376001358,
-0.19801092147827148,
-0.02653048001229763,
0.01852777786552906,
0.07017894834280014,
0.07798218727111816,
-0.06987690925598145,
0.004771301057189703,
-0.02219020202755928,
-0.1302596777677536,
-0.0955948680639267,
0.04616696387529373,
0.018817832693457603,
0.10361380130052567,
-0.07525146752595901,
-0.09849397838115692,
-0.008851666003465652,
-0.047026120126247406,
-0.10863156616687775,
0.08410009741783142,
-0.2509559392929077,
-0.22691276669502258,
-0.0693424716591835,
-0.010783936828374863,
-0.09064117819070816,
0.03110947459936142,
0.06739141792058945,
-0.0959286093711853,
-0.05836820974946022,
-0.05142999440431595,
0.09458182007074356,
-0.02710813842713833,
-0.052199412137269974,
-0.05055975541472435,
0.009262481704354286,
-0.00778506463393569,
-0.12481125444173813,
0.016742639243602753,
-0.03454448655247688,
0.015018323436379433,
-0.05454785376787186,
-0.03252970427274704,
0.09969421476125717,
0.12469799816608429,
0.0373900830745697,
-0.012151040136814117,
-0.06737872958183289,
0.18443667888641357,
-0.10753205418586731,
0.012832733802497387,
0.12390332669019699,
0.006465442478656769,
-0.016845542937517166,
0.14732792973518372,
0.02865312434732914,
-0.020398233085870743,
-0.009078123606741428,
-0.010446044616401196,
-0.02745666727423668,
-0.23426491022109985,
-0.18733680248260498,
-0.13083723187446594,
0.018789149820804596,
-0.12265841662883759,
-0.00773036340251565,
-0.040529731661081314,
-0.033746398985385895,
-0.02141745202243328,
-0.040398795157670975,
0.10900787264108658,
-0.010722551494836807,
0.2840464413166046,
-0.08488761633634567,
0.09895387291908264,
-0.07558426260948181,
-0.08228909224271774,
0.14672750234603882,
0.036536626517772675,
0.0748334750533104,
0.07435964047908783,
0.18127146363258362,
0.05490335822105408,
0.03696269914507866,
0.03227777034044266,
0.06003532558679581,
0.04854394495487213,
0.02319113351404667,
-0.027035051956772804,
-0.07487929612398148,
0.03683140501379967,
0.002618540544062853,
0.3012266159057617,
-0.08815460652112961,
-0.009307991713285446,
-0.013050872832536697,
0.10284595936536789,
0.052563123404979706,
0.11632471531629562,
-0.06645554304122925,
0.04199760779738426,
0.024587124586105347,
-0.026279807090759277,
-0.03728969767689705,
0.09640471637248993,
0.2515650689601898,
-0.03756324574351311,
0.1078089103102684,
0.10790012776851654,
0.04000483825802803,
-0.05530572310090065,
0.030009806156158447,
-0.12297618389129639,
-0.0032932066824287176,
-0.022267479449510574,
0.04553578048944473,
-0.15397685766220093,
0.19738991558551788,
0.06851481646299362,
0.02606145665049553,
-0.03789445012807846,
0.009374157525599003,
0.04315739870071411,
0.09175644814968109,
0.1877126693725586,
0.011473261751234531,
-0.06392858922481537,
-0.030545709654688835,
-0.05866114795207977,
-0.01235134620219469,
0.11175937205553055,
0.10102713853120804,
-0.024857310578227043,
0.033032480627298355,
-0.05151808261871338,
0.0008479435346089303,
-0.03184767812490463,
-0.22421535849571228,
-0.15116725862026215,
0.04090133309364319,
0.20549847185611725,
0.030450737103819847,
0.014347279444336891,
-0.09164029359817505,
-0.17003317177295685,
0.08552038669586182,
-0.051441751420497894,
-0.025155311450362206,
-0.11614377796649933,
-0.07070545107126236,
0.1595948338508606,
-0.012078501284122467,
0.03588169068098068,
0.036530617624521255,
0.02861277014017105,
-0.09836170077323914,
-0.019139284268021584,
0.10055653750896454,
-0.09636808931827545,
-0.01644040457904339,
-0.030238592997193336,
0.2282901257276535,
0.053269293159246445,
0.08772928267717361,
0.06204371526837349,
0.014119187369942665,
0.029073838144540787,
-0.07881486415863037,
0.0006292083999142051,
-0.05707506462931633,
-0.04911834001541138,
0.1564280390739441,
-0.07526642084121704,
-0.23619882762432098,
-0.013263493776321411,
-0.03713177517056465,
0.23362652957439423,
0.1155359148979187,
-0.03299012780189514,
0.11359154433012009,
0.21226327121257782,
-0.07498318701982498,
-0.2800143361091614,
0.031936679035425186,
0.0014942474663257599,
0.051078375428915024,
-0.011355554684996605,
-0.16737087070941925,
0.023242024704813957,
0.018860122188925743,
0.002395446877926588,
0.03533099591732025,
-0.19541628658771515,
-0.12572284042835236,
0.14455339312553406,
-0.0673055574297905,
0.10180553048849106,
-0.0026711777318269014,
-0.049671534448862076,
-0.0740353912115097,
0.022628791630268097,
0.11927319318056107,
-0.21041245758533478,
0.07694488018751144,
0.11014130711555481,
0.0685562789440155,
0.03260287642478943,
0.046975456178188324,
0.12422088533639908,
0.08898339420557022,
-0.03145551681518555,
0.018184516578912735,
0.04632391780614853,
0.08516047894954681,
0.08321192115545273,
0.06315498799085617,
-0.032278578728437424,
-0.02714964561164379,
-0.06676958501338959,
-0.016549400985240936,
-0.07129453867673874,
0.08874800056219101,
0.02026250772178173,
-0.02191556990146637,
-0.05705585703253746,
-0.009722758084535599,
-0.003677203319966793,
0.017914993688464165,
0.06541720777750015,
-0.1586574763059616,
0.000999235431663692,
0.15286582708358765,
0.23665659129619598,
-0.07868222892284393,
0.032420042902231216,
0.03837338089942932,
-0.05607867240905762,
0.012613081373274326,
0.006483275443315506,
0.03823406994342804,
0.10653144121170044,
-0.020979830995202065,
0.12649184465408325,
-0.003817650955170393,
-0.07875500619411469,
0.06541474163532257,
0.0555635504424572,
-0.08476287871599197,
-0.12288562208414078,
-0.043465252965688705,
-0.05875110998749733,
0.03248671442270279,
0.04900306090712547,
0.2608565390110016,
-0.005177509039640427,
0.022788500413298607,
-0.05908385291695595,
-0.005649568047374487,
-0.09747011959552765,
0.1825810968875885,
0.06995680183172226,
-0.004984838888049126,
-0.13846337795257568,
0.07267481833696365,
0.007405505981296301,
-0.08597331494092941,
0.01932283118367195,
0.038114190101623535,
-0.06534040719270706,
-0.08451949059963226,
-0.13232529163360596,
0.0489429272711277,
0.002389164874330163,
-0.11839509755373001,
-0.05145654454827309,
-0.1747829020023346,
0.025035327300429344,
0.22133934497833252,
-0.009526832960546017,
0.009445251896977425,
-0.07320491224527359,
-0.06547068804502487,
0.03750661388039589,
0.06451689451932907,
0.057168155908584595,
-0.051824901252985,
-0.031194597482681274,
0.11458560079336166,
-0.060659848153591156,
0.06884736567735672,
-0.03692653030157089,
-0.009156365878880024,
-0.04751390591263771,
0.04791722819209099,
-0.07674145698547363,
-0.014867592602968216,
-0.06756340712308884,
-0.017785487696528435,
0.00033134379191324115,
-0.05184919387102127,
-0.03815845772624016,
0.00217044516466558,
-0.1099856048822403,
0.047213923186063766,
0.03275134041905403,
0.09015176445245743,
-0.11728036403656006,
0.020583854988217354,
0.007304730359464884,
-0.005955718457698822,
0.1012185662984848,
0.08427292853593826,
-0.08329839259386063,
0.10972895473241806,
-0.17917855083942413,
-0.08987651765346527,
0.11255722492933273,
0.09537776559591293,
0.06865999847650528,
0.037435006350278854,
-0.013725913129746914,
0.1121663823723793,
0.08700253069400787,
0.00006409444904420525,
0.00379966851323843,
-0.009044690988957882,
0.05568607524037361,
-0.11531016230583191,
-0.05891592428088188,
-0.009057991206645966,
0.04226567596197128,
0.13895244896411896,
0.06719193607568741,
0.10835934430360794,
-0.07582884281873703,
0.012254832312464714,
-0.013073140755295753,
0.04608554393053055,
0.003990969154983759,
-0.09108518809080124,
-0.028079915791749954,
-0.052423566579818726,
0.072726771235466,
-0.01821400411427021,
0.13062287867069244,
0.025010205805301666,
0.003926682285964489,
0.02078142575919628,
-0.006970182992517948,
-0.07357645779848099,
0.046611715108156204,
0.038573767989873886,
0.043807875365018845,
-0.027245718985795975,
-0.10686033219099045,
-0.02057499997317791,
0.0506155826151371,
0.0974510982632637,
0.005553881637752056,
0.05279950052499771,
0.12156600505113602,
0.10236014425754547,
0.07684855908155441,
-0.054118864238262177,
0.045587509870529175,
0.05009814724326134,
-0.041212450712919235,
0.036187343299388885,
-0.08398989588022232,
0.1281692534685135,
0.010136566124856472,
-0.11005289852619171,
0.039517905563116074,
0.03433775529265404,
-0.08095204830169678,
-0.17282989621162415,
-0.02088548243045807,
-0.052542898803949356,
-0.07131139934062958,
-0.008924047462642193,
-0.13782238960266113,
0.0867474377155304,
0.0019128060666844249,
0.10379230976104736,
0.001981097273528576,
0.08579760044813156,
-0.18110494315624237,
-0.1348332315683365,
0.11535561829805374,
-0.03741531819105148,
0.09177166223526001,
-0.07395435869693756,
-0.042940590530633926,
0.07576147466897964,
0.020375235006213188,
0.029698429629206657,
0.03252388536930084,
-0.05430406704545021,
-0.004996852017939091,
-0.10656300187110901,
-0.02279892936348915,
-0.03230181708931923,
0.020109912380576134,
0.0405673086643219,
0.14612774550914764,
0.10873355716466904,
-0.08502940088510513,
0.039564888924360275,
0.05401989817619324,
-0.051204971969127655,
-0.10445842891931534,
-0.11369673907756805,
-0.1201060488820076,
-0.006087185349315405,
0.14877866208553314,
-0.04465993493795395,
-0.08774737268686295,
-0.04354793578386307,
0.10487050563097,
0.29834362864494324,
-0.13132105767726898,
0.027136903256177902,
-0.015264587476849556,
0.017180901020765305,
0.023618655279278755,
-0.03763361647725105,
0.060203295201063156,
0.1979006826877594,
0.031089942902326584,
-0.002304587047547102,
-0.07073043286800385,
-0.017404630780220032,
-0.055300332605838776,
0.04448781907558441,
-0.05205514281988144,
-0.07560429722070694,
0.03913633152842522,
0.18046057224273682,
-0.12360639125108719,
-0.12816013395786285,
-0.13679683208465576,
-0.1367802917957306,
-0.07152827084064484,
-0.03397977724671364,
-0.0013403550256043673,
0.17101389169692993,
-0.004248787183314562,
-0.04963718354701996,
-0.06800343841314316,
0.0969126895070076,
0.019560864195227623,
-0.12016932666301727,
-0.07008375972509384,
0.029476281255483627,
-0.1935567706823349,
0.021602587774395943,
-0.034625936299562454,
0.13017794489860535,
0.03131892532110214,
0.0711653083562851,
0.04180903732776642,
0.07378185540437698,
0.04417882859706879,
-0.07063314318656921,
0.024849707260727882,
0.062238480895757675,
-0.040630895644426346,
0.22392332553863525,
0.033953916281461716,
-0.067303866147995,
0.09165406972169876,
0.01057463325560093,
-0.09924615174531937,
-0.027178892865777016,
0.07647079974412918,
-0.12585054337978363,
0.07496026903390884,
0.033403389155864716,
-0.0005294379661791027,
0.01459002960473299,
-0.021725619211792946,
-0.022440556436777115,
0.023715505376458168,
-0.02152278833091259,
-0.029919950291514397,
-0.17985226213932037,
-0.030921543017029762,
-0.04768839478492737,
0.037417810410261154,
-0.14523184299468994,
-0.0013951730215921998,
-0.11241250485181808,
0.036096811294555664,
-0.040434181690216064,
0.07578609883785248,
0.082199327647686,
-0.027750175446271896,
0.02021925151348114,
-0.006665779743343592,
0.06241084262728691,
0.08363281190395355,
-0.02416641265153885,
-0.022367136552929878
] |
null | null |
fairseq
|
# xm_transformer_600m-en_ru-multi_domain
[W2V2-Transformer](https://aclanthology.org/2021.acl-long.68/) speech-to-text translation model from fairseq S2T ([paper](https://arxiv.org/abs/2010.05171)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_to_text)):
- English-Russian
- Trained on MuST-C, Multilingual LibriSpeech, Common Voice v7 and CCMatrix
- Speech synthesis with [facebook/tts_transformer-ru-cv7_css10](https://huggingface.co/facebook/tts_transformer-ru-cv7_css10)
## Usage
```python
from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub
from fairseq.models.speech_to_text.hub_interface import S2THubInterface
from fairseq.models.text_to_speech.hub_interface import TTSHubInterface
import IPython.display as ipd
import torchaudio
models, cfg, task = load_model_ensemble_and_task_from_hf_hub(
"facebook/xm_transformer_600m-en_ru-multi_domain",
arg_overrides={"config_yaml": "config.yaml"},
)
model = models[0]
generator = task.build_generator([model], cfg)
# requires 16000Hz mono channel audio
audio, _ = torchaudio.load("/path/to/an/audio/file")
sample = S2THubInterface.get_model_input(task, audio)
text = S2THubInterface.get_prediction(task, model, generator, sample)
# speech synthesis
tts_models, tts_cfg, tts_task = load_model_ensemble_and_task_from_hf_hub(
f"facebook/tts_transformer-ru-cv7_css10",
arg_overrides={"vocoder": "griffin_lim", "fp16": False},
)
tts_model = tts_models[0]
TTSHubInterface.update_cfg_with_data_cfg(tts_cfg, tts_task.data_cfg)
tts_generator = tts_task.build_generator([tts_model], tts_cfg)
tts_sample = TTSHubInterface.get_model_input(tts_task, text)
wav, sr = TTSHubInterface.get_prediction(
tts_task, tts_model, tts_generator, tts_sample
)
ipd.Audio(wav, rate=sr)
```
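If the synthesized Russian audio should be persisted rather than only played inline, a minimal sketch follows; it assumes `wav` and `sr` from the TTS call above, and the output path is illustrative.
```python
import torchaudio

# TTSHubInterface returns a 1-D waveform tensor; torchaudio.save expects (channels, samples).
if wav.dim() == 1:
    wav = wav.unsqueeze(0)
torchaudio.save("en_ru_translation.wav", wav.cpu(), sr)  # illustrative output path
```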
## Citation
```bibtex
@inproceedings{li-etal-2021-multilingual,
title = "Multilingual Speech Translation from Efficient Finetuning of Pretrained Models",
author = "Li, Xian and
Wang, Changhan and
Tang, Yun and
Tran, Chau and
Tang, Yuqing and
Pino, Juan and
Baevski, Alexei and
Conneau, Alexis and
Auli, Michael",
booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.acl-long.68",
doi = "10.18653/v1/2021.acl-long.68",
pages = "827--838",
}
@inproceedings{wang-etal-2020-fairseq,
title = "Fairseq {S}2{T}: Fast Speech-to-Text Modeling with Fairseq",
author = "Wang, Changhan and
Tang, Yun and
Ma, Xutai and
Wu, Anne and
Okhonko, Dmytro and
Pino, Juan",
booktitle = "Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing: System Demonstrations",
month = dec,
year = "2020",
address = "Suzhou, China",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.aacl-demo.6",
pages = "33--39",
}
```
|
{"language": "en-ru", "library_name": "fairseq", "tags": ["fairseq", "audio", "audio-to-audio", "speech-to-speech-translation"], "datasets": ["must_c"], "task": "audio-to-audio", "widget": [{"example_title": "Common Voice sample 1", "src": "https://huggingface.co/facebook/xm_transformer_600m-en_es-multi_domain/resolve/main/common_voice_en_18295850.mp3"}]}
|
audio-to-audio
|
facebook/xm_transformer_600m-en_ru-multi_domain
|
[
"fairseq",
"audio",
"audio-to-audio",
"speech-to-speech-translation",
"dataset:must_c",
"arxiv:2010.05171",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.05171"
] |
[
"en-ru"
] |
TAGS
#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #arxiv-2010.05171 #has_space #region-us
|
# xm_transformer_600m-en_ru-multi_domain
W2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):
- English-Russian
- Trained on MuST-C, Multilingual LibriSpeech, Common Voice v7 and CCMatrix
- Speech synthesis with facebook/tts_transformer-ru-cv7_css10
## Usage
|
[
"# xm_transformer_600m-en_ru-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-Russian\n- Trained on MuST-C, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-ru-cv7_css10",
"## Usage"
] |
[
"TAGS\n#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #arxiv-2010.05171 #has_space #region-us \n",
"# xm_transformer_600m-en_ru-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-Russian\n- Trained on MuST-C, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-ru-cv7_css10",
"## Usage"
] |
[
52,
96,
3
] |
[
"passage: TAGS\n#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #arxiv-2010.05171 #has_space #region-us \n# xm_transformer_600m-en_ru-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-Russian\n- Trained on MuST-C, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-ru-cv7_css10## Usage"
] |
[
-0.11441917717456818,
0.041015155613422394,
-0.0042375619523227215,
-0.03058096393942833,
0.0906679555773735,
-0.05924299359321594,
0.09428911656141281,
0.03360533341765404,
-0.04623179882764816,
-0.022870007902383804,
-0.004925580695271492,
-0.042656153440475464,
0.023462776094675064,
0.0874769538640976,
-0.018298357725143433,
-0.21977382898330688,
0.0460239015519619,
-0.07030040770769119,
-0.1045193076133728,
0.057413455098867416,
0.1270555853843689,
-0.03570684418082237,
0.010856772772967815,
0.015708841383457184,
-0.07445482909679413,
0.09182831645011902,
0.06371237337589264,
-0.10904350131750107,
0.06489045172929764,
0.11559446901082993,
0.023359742015600204,
0.07101060450077057,
0.07106510549783707,
-0.1260608285665512,
0.03175721690058708,
-0.021839875727891922,
-0.03657805919647217,
-0.014620821923017502,
0.038475628942251205,
-0.04790349677205086,
0.17424717545509338,
-0.050294723361730576,
-0.0809575617313385,
0.08549147099256516,
-0.0987115278840065,
-0.09210652858018875,
-0.005457174498587847,
0.043139006942510605,
0.019450591877102852,
0.09834296256303787,
-0.08047686517238617,
-0.016040679067373276,
-0.07387010753154755,
0.10132972151041031,
0.06690508872270584,
-0.3075767457485199,
-0.00896897166967392,
-0.0318196602165699,
0.04896382614970207,
0.10120390355587006,
-0.01670641079545021,
0.10899181663990021,
0.015479025430977345,
-0.007894567213952541,
-0.15050694346427917,
-0.13234202563762665,
-0.14408522844314575,
0.006784925702959299,
-0.14769347012043,
0.059463825076818466,
0.2969745099544525,
0.04832969605922699,
-0.015489950776100159,
0.043022844940423965,
-0.012868447229266167,
0.05317489802837372,
-0.007924810983240604,
-0.03159485012292862,
-0.035332970321178436,
0.04398888722062111,
-0.072853684425354,
-0.15957000851631165,
-0.11235596239566803,
-0.03413916006684303,
-0.09717147797346115,
0.16109001636505127,
-0.01235516183078289,
0.039777737110853195,
-0.02511642500758171,
-0.03793756663799286,
-0.07117611169815063,
-0.008811633102595806,
0.050996892154216766,
-0.08300923556089401,
-0.09502580016851425,
0.03631015866994858,
-0.06040515750646591,
-0.16275402903556824,
0.08163746446371078,
-0.15921185910701752,
-0.12288698554039001,
0.014285142533481121,
-0.05470237508416176,
0.08285581320524216,
0.011161865666508675,
0.04963075742125511,
-0.13547389209270477,
-0.06185917183756828,
0.004316121805459261,
-0.055780332535505295,
-0.017409175634384155,
-0.010693671181797981,
-0.17188233137130737,
-0.07907471060752869,
-0.13152991235256195,
0.0817359983921051,
-0.044994618743658066,
0.02786850929260254,
-0.0195456575602293,
-0.013909845612943172,
0.04086804389953613,
-0.06436693668365479,
-0.013234289363026619,
0.04554137587547302,
0.0007404536008834839,
0.10321842133998871,
0.029029568657279015,
0.06611645966768265,
-0.07932262867689133,
-0.11845941096544266,
0.0454375334084034,
0.07786640524864197,
0.005357849877327681,
-0.13399973511695862,
0.02966153994202614,
0.016457129269838333,
-0.002236431697383523,
-0.19135576486587524,
0.02060696855187416,
-0.05128378048539162,
-0.07563916593790054,
0.04060906171798706,
-0.004285051953047514,
-0.1117069199681282,
-0.05021913722157478,
0.013384574092924595,
-0.05671660602092743,
-0.09305907785892487,
-0.050272874534130096,
0.0010667142923921347,
-0.09153703600168228,
0.09089638292789459,
-0.12481760233640671,
0.06826458871364594,
-0.014126808382570744,
-0.03705396503210068,
-0.08721998333930969,
0.1861840933561325,
-0.07684502005577087,
-0.050230178982019424,
-0.043980032205581665,
-0.041380204260349274,
-0.08589047938585281,
0.09113958477973938,
-0.0025050309486687183,
0.12210235744714737,
-0.2392406463623047,
-0.07097381353378296,
0.15685395896434784,
-0.03841278702020645,
0.04215484485030174,
0.16455036401748657,
0.032044436782598495,
-0.0052761598490178585,
0.12272798269987106,
0.23986434936523438,
0.10848134756088257,
-0.13711577653884888,
-0.019918207079172134,
0.0718202218413353,
-0.08157464116811752,
-0.001189687754958868,
0.08032017946243286,
-0.03525391221046448,
0.035334739834070206,
-0.03477765619754791,
0.12137365341186523,
0.058307942003011703,
-0.04543042927980423,
-0.034048888832330704,
0.03400655463337898,
-0.07101590931415558,
0.11450377851724625,
-0.08080636709928513,
0.010265549644827843,
-0.05865049362182617,
-0.0714661031961441,
0.08390691876411438,
0.07402268052101135,
-0.06809142231941223,
0.10448385775089264,
-0.16693715751171112,
0.026067381724715233,
-0.030031565576791763,
0.058963317424058914,
-0.13251137733459473,
0.1296723484992981,
-0.0702798068523407,
0.14405915141105652,
0.22416198253631592,
0.218764066696167,
-0.0021815523505210876,
-0.026616228744387627,
-0.08844472467899323,
0.04005308821797371,
-0.008543276228010654,
0.04513232037425041,
-0.007529107388108969,
-0.15525594353675842,
0.08829852938652039,
-0.07566985487937927,
-0.017491593956947327,
-0.03260050341486931,
-0.0343942865729332,
0.15138830244541168,
0.06120423600077629,
-0.0058309040032327175,
0.06501110643148422,
0.04374322667717934,
0.08742529153823853,
0.01832493767142296,
0.038656797260046005,
0.0304058026522398,
-0.02445555105805397,
-0.10966577380895615,
0.22765803337097168,
-0.08519624918699265,
0.09998901188373566,
0.13101424276828766,
-0.12838110327720642,
0.005263130646198988,
0.08679326623678207,
0.016633305698633194,
0.030473917722702026,
0.07681416720151901,
-0.05506385117769241,
0.24256838858127594,
-0.060657184571027756,
0.07304663956165314,
-0.06367231905460358,
0.08224093914031982,
0.020024610683321953,
-0.0898653119802475,
-0.02802678570151329,
0.12154950946569443,
-0.05820329859852791,
-0.22864650189876556,
0.03130641207098961,
0.17116062343120575,
-0.0008698753663338721,
0.3166733980178833,
-0.022369736805558205,
-0.030874023213982582,
-0.024361880496144295,
-0.017960844561457634,
-0.05903249979019165,
0.063910111784935,
-0.10617305338382721,
-0.05135134980082512,
-0.01413784734904766,
0.06774817407131195,
0.10103888064622879,
-0.06470189988613129,
0.005236983299255371,
-0.012768016196787357,
-0.1360217183828354,
-0.13872230052947998,
0.049625296145677567,
0.011207420378923416,
0.07054100930690765,
-0.04750654473900795,
-0.05751447007060051,
0.024910079315304756,
-0.05778641626238823,
-0.10937053710222244,
0.038220930844545364,
-0.23775753378868103,
-0.28733670711517334,
-0.12320596724748611,
0.026077648624777794,
-0.038341958075761795,
0.050067026168107986,
0.09842344373464584,
-0.14073333144187927,
-0.0049974313005805016,
-0.028861092403531075,
0.1593138426542282,
-0.0644623413681984,
-0.02713971957564354,
-0.020135806873440742,
0.04799941927194595,
0.013237876817584038,
-0.0626382976770401,
0.0075715030543506145,
0.001812931033782661,
-0.01444646529853344,
0.004269767552614212,
-0.051507435739040375,
0.043465614318847656,
0.15174125134944916,
0.025377072393894196,
-0.010692729614675045,
-0.08294564485549927,
0.09959354251623154,
-0.09390340745449066,
0.01265703048557043,
0.14134103059768677,
-0.020302094519138336,
-0.014836248941719532,
0.1408650130033493,
-0.0022528988774865866,
0.025762716308236122,
0.025755805894732475,
-0.039998073130846024,
-0.05068260803818703,
-0.2119809240102768,
-0.156220480799675,
-0.119463711977005,
0.04218490049242973,
-0.17599299550056458,
-0.009480302222073078,
-0.059372320771217346,
-0.06049351021647453,
-0.03602764010429382,
-0.0886605754494667,
0.09186892956495285,
-0.02301313541829586,
0.23013819754123688,
-0.09262728691101074,
0.10604165494441986,
-0.1041262298822403,
-0.04286893829703331,
0.13689982891082764,
0.003124985145404935,
0.08327971398830414,
0.11848212778568268,
0.08067607134580612,
0.046949345618486404,
0.048937179148197174,
0.12307008355855942,
0.02091359533369541,
0.09845693409442902,
0.01101701520383358,
-0.012910529971122742,
-0.06094151735305786,
0.05693982169032097,
0.06483706831932068,
0.34265565872192383,
-0.11603614687919617,
0.00010589318844722584,
0.047685954719781876,
0.07795558124780655,
0.10115621984004974,
0.09384793788194656,
-0.014905400574207306,
-0.02866760827600956,
0.02736942283809185,
-0.018619481474161148,
-0.021180713549256325,
0.10985670238733292,
0.23939108848571777,
-0.01594126597046852,
0.08782191574573517,
0.10016577690839767,
0.04939751327037811,
0.03958188742399216,
0.05000690370798111,
-0.09700572490692139,
0.003995815757662058,
-0.013479648157954216,
0.052918441593647,
-0.17086169123649597,
0.1808178722858429,
0.05995306745171547,
0.018647445365786552,
0.008453226648271084,
-0.018815992400050163,
0.044616807252168655,
0.14786820113658905,
0.11046084016561508,
0.019154831767082214,
-0.13065055012702942,
-0.09529728442430496,
-0.057732000946998596,
-0.009767022915184498,
0.15816520154476166,
0.12770918011665344,
-0.004769980441778898,
0.013621477410197258,
-0.05551811307668686,
0.013533304445445538,
0.005476898979395628,
-0.23647987842559814,
-0.12553420662879944,
0.026373468339443207,
0.21625003218650818,
0.0809473991394043,
0.0008140085265040398,
-0.06595218181610107,
-0.1495285928249359,
0.07503722608089447,
-0.04654542729258537,
-0.012303984723985195,
-0.08099626004695892,
-0.07405615597963333,
0.1291685700416565,
-0.05002671480178833,
0.028017712756991386,
0.04075006768107414,
-0.005832181312143803,
-0.0689731314778328,
-0.06696893274784088,
0.10747665911912918,
-0.0714554712176323,
-0.034962017089128494,
0.026669930666685104,
0.24057483673095703,
0.04676787182688713,
0.10290144383907318,
0.014733968302607536,
-0.012069394811987877,
0.04854368045926094,
-0.0499948114156723,
0.03038543276488781,
-0.028922537341713905,
-0.10200679302215576,
0.1420445740222931,
-0.05934765934944153,
-0.21884724497795105,
-0.06884189695119858,
-0.056142836809158325,
0.23346412181854248,
0.10244735330343246,
-0.08990676701068878,
0.14037443697452545,
0.16491937637329102,
-0.037974610924720764,
-0.2779448926448822,
-0.01145038940012455,
0.0022011331748217344,
0.1200726330280304,
-0.0085682338103652,
-0.150419220328331,
-0.036966804414987564,
-0.08981021493673325,
0.0024412597995251417,
0.0037036973517388105,
-0.14620786905288696,
-0.12341010570526123,
0.14162929356098175,
-0.1460825353860855,
0.1390746831893921,
-0.010123368352651596,
-0.03803441673517227,
-0.09201786667108536,
0.08976437896490097,
0.0867270678281784,
-0.20808681845664978,
0.11887956410646439,
0.12087827920913696,
0.07242351025342941,
0.03380666673183441,
0.06797628849744797,
0.08062346279621124,
0.007823511958122253,
-0.02862561121582985,
0.005985863972455263,
-0.006375783588737249,
0.08600030839443207,
0.06441226601600647,
-0.0014567238977178931,
-0.01110748015344143,
0.016720179468393326,
-0.017940662801265717,
-0.04599335417151451,
-0.051743727177381516,
0.06685575842857361,
0.06140443682670593,
-0.017043467611074448,
-0.05347958579659462,
-0.008930093608796597,
-0.024558698758482933,
-0.005884240847080946,
0.09316641837358475,
-0.20277953147888184,
-0.04337875172495842,
0.152400940656662,
0.207050159573555,
-0.04969607666134834,
0.09712789952754974,
0.023518195375800133,
-0.08734266459941864,
0.07010326534509659,
-0.05529286339879036,
0.043085597455501556,
0.06763748079538345,
-0.054083384573459625,
0.1129302829504013,
-0.010679974220693111,
-0.06978138536214828,
0.09654375910758972,
0.07192715257406235,
-0.07167309522628784,
-0.12298943102359772,
-0.09618350863456726,
-0.023254314437508583,
0.10160171240568161,
0.07553578913211823,
0.26560911536216736,
-0.045739490538835526,
0.035182707011699677,
-0.05179965868592262,
-0.02209216170012951,
-0.1456756740808487,
0.12754122912883759,
0.07280470430850983,
-0.005141494330018759,
-0.08927489817142487,
0.09931047260761261,
-0.0010732802329584956,
-0.09115840494632721,
0.009666796773672104,
0.03730624541640282,
-0.0627240315079689,
-0.10631908476352692,
-0.15806305408477783,
0.0072605242021381855,
0.01928587444126606,
-0.14082221686840057,
-0.01976439356803894,
-0.16753323376178741,
0.02840694971382618,
0.20102259516716003,
-0.009076938033103943,
0.0018917128909379244,
-0.0875990241765976,
-0.029887471348047256,
0.07333345711231232,
0.023548081517219543,
0.05004790425300598,
-0.08304474502801895,
-0.11680465936660767,
0.151625394821167,
-0.04494405537843704,
0.12786227464675903,
-0.0708930641412735,
-0.014410407282412052,
-0.023388748988509178,
0.052665624767541885,
-0.13705091178417206,
-0.036354243755340576,
-0.04945747181773186,
-0.020889762789011,
-0.002031563315540552,
-0.06453283876180649,
-0.05834493786096573,
0.04217972606420517,
-0.09362082928419113,
0.03972432017326355,
-0.017221417278051376,
0.054695699363946915,
-0.03914174064993858,
-0.001955826999619603,
-0.0324370414018631,
0.018922653049230576,
0.10458371788263321,
0.1488138735294342,
-0.09250736236572266,
0.11298181861639023,
-0.1433844119310379,
-0.05227179452776909,
0.12637175619602203,
0.07633183151483536,
0.04008780047297478,
0.0722162052989006,
-0.012212925590574741,
0.09673701226711273,
0.06628469377756119,
-0.0206318236887455,
0.016895394772291183,
0.015341023914515972,
0.009076269343495369,
-0.16087983548641205,
-0.0021239137277007103,
-0.01763008162379265,
-0.00289943628013134,
0.12399128079414368,
0.1311187893152237,
0.12747952342033386,
-0.1146484762430191,
0.05950279161334038,
0.020054277032613754,
0.02970687486231327,
-0.010227056220173836,
-0.06627906113862991,
-0.05274766683578491,
-0.09579474478960037,
0.08936883509159088,
-0.042131975293159485,
0.11174468696117401,
-0.009607755579054356,
0.012576643377542496,
0.006128201726824045,
-0.09716689586639404,
-0.04773801565170288,
0.05189581587910652,
0.04889987036585808,
0.0661720409989357,
-0.033185914158821106,
-0.1233043298125267,
-0.03287278115749359,
0.032587818801403046,
0.08378999680280685,
-0.03174334019422531,
0.09784587472677231,
0.113236203789711,
0.12206506729125977,
0.09442319720983505,
-0.05669739842414856,
0.056267816573381424,
0.022969624027609825,
-0.11013862490653992,
-0.009775021113455296,
-0.12788844108581543,
0.07990781962871552,
0.09114030748605728,
-0.04715889319777489,
0.03680470213294029,
0.023071497678756714,
-0.060399170964956284,
-0.14386437833309174,
-0.029089221730828285,
-0.08190257102251053,
-0.12110265344381332,
-0.0021245162934064865,
-0.08425775915384293,
0.07891201972961426,
-0.04332736134529114,
0.07461385428905487,
-0.017624082043766975,
0.1266205757856369,
-0.14342261850833893,
-0.17322173714637756,
0.14450252056121826,
-0.05432136356830597,
0.09133516997098923,
0.016565760597586632,
-0.04638830944895744,
0.14089885354042053,
-0.022189944982528687,
0.011132254265248775,
0.022094452753663063,
-0.07727649062871933,
0.0020592797081917524,
-0.10591049492359161,
-0.03398989140987396,
-0.029642906039953232,
0.05154211074113846,
0.08000457286834717,
0.11117931455373764,
0.11650010943412781,
-0.10118629038333893,
0.028324974700808525,
0.07777811586856842,
-0.058454837650060654,
-0.1541461944580078,
-0.07786751538515091,
-0.04769331216812134,
-0.007852774113416672,
0.1600216180086136,
-0.08820700645446777,
-0.06999123096466064,
-0.07915571331977844,
0.09597048163414001,
0.32573869824409485,
-0.13179992139339447,
0.04696568474173546,
-0.03873917832970619,
0.02960594743490219,
-0.013013346120715141,
0.042699482291936874,
0.04262014105916023,
0.09778061509132385,
0.08456183969974518,
-0.08185884356498718,
-0.10257121920585632,
-0.008004803210496902,
-0.037406060844659805,
0.05299168452620506,
-0.04639073461294174,
-0.09423419088125229,
0.05925996974110603,
0.16931520402431488,
-0.14045482873916626,
-0.12151853740215302,
-0.13153362274169922,
-0.11865051090717316,
-0.03540489822626114,
0.032243914902210236,
0.07309631258249283,
0.19382146000862122,
0.026215801015496254,
-0.07023513317108154,
-0.055738575756549835,
0.033413879573345184,
0.026628650724887848,
-0.09819155186414719,
0.020822158083319664,
0.010015993379056454,
-0.21072612702846527,
-0.11620859056711197,
-0.03396032378077507,
0.14607208967208862,
0.00510348379611969,
0.10242540389299393,
0.06625828891992569,
0.155251607298851,
0.024443550035357475,
-0.10529804229736328,
0.022556401789188385,
0.08911912888288498,
-0.05623501539230347,
0.23588471114635468,
0.06465037167072296,
-0.07253339141607285,
0.07231227308511734,
-0.01713324896991253,
-0.06355701386928558,
-0.034546926617622375,
0.052635133266448975,
-0.11820976436138153,
0.06128234043717384,
0.03891071304678917,
-0.018083618953824043,
-0.012281404808163643,
0.0032613689545542,
-0.026677384972572327,
0.038913294672966,
-0.020216651260852814,
-0.04719754308462143,
-0.1370783895254135,
-0.05129624530673027,
-0.08507893234491348,
0.08240372687578201,
-0.08801860362291336,
-0.005129749421030283,
-0.10172947496175766,
0.05599759891629219,
-0.021342089399695396,
0.10342464596033096,
0.03770047798752785,
-0.021655840799212456,
0.049683231860399246,
-0.061830949038267136,
0.0784987285733223,
0.07786447554826736,
-0.10378352552652359,
-0.0687655508518219
] |
null | null |
fairseq
|
# xm_transformer_600m-en_tr-multi_domain
[W2V2-Transformer](https://aclanthology.org/2021.acl-long.68/) speech-to-text translation model from fairseq S2T ([paper](https://arxiv.org/abs/2010.05171)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_to_text)):
- English-Turkish
- Trained on MuST-C, CoVoST 2, Multilingual LibriSpeech, Common Voice v7 and CCMatrix
- Speech synthesis with [facebook/tts_transformer-tr-cv7](https://huggingface.co/facebook/tts_transformer-tr-cv7)
## Usage
```python
from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub
from fairseq.models.speech_to_text.hub_interface import S2THubInterface
from fairseq.models.text_to_speech.hub_interface import TTSHubInterface
import IPython.display as ipd
import torchaudio
models, cfg, task = load_model_ensemble_and_task_from_hf_hub(
"facebook/xm_transformer_600m-en_tr-multi_domain",
arg_overrides={"config_yaml": "config.yaml"},
)
model = models[0]
generator = task.build_generator(model, cfg)
# requires 16000Hz mono channel audio
audio, _ = torchaudio.load("/path/to/an/audio/file")
sample = S2THubInterface.get_model_input(task, audio)
text = S2THubInterface.get_prediction(task, model, generator, sample)
# speech synthesis
tts_models, tts_cfg, tts_task = load_model_ensemble_and_task_from_hf_hub(
f"facebook/tts_transformer-tr-cv7",
arg_overrides={"vocoder": "griffin_lim", "fp16": False},
)
tts_model = tts_models[0]
TTSHubInterface.update_cfg_with_data_cfg(tts_cfg, tts_task.data_cfg)
tts_generator = tts_task.build_generator([tts_model], tts_cfg)
tts_sample = TTSHubInterface.get_model_input(tts_task, text)
wav, sr = TTSHubInterface.get_prediction(
tts_task, tts_model, tts_generator, tts_sample
)
ipd.Audio(wav, rate=sr)
```
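The snippet above assumes the input file is already 16 kHz mono, as noted in the comment. If it is not, a minimal preprocessing sketch along these lines may help; the downmixing and resampling steps are assumptions rather than part of the original card, and `task`, `model` and `generator` are the objects loaded above:
```python
import torchaudio

# Hypothetical preprocessing sketch (not part of the original card):
# convert an arbitrary input file to the 16 kHz mono tensor the model expects.
audio, orig_sr = torchaudio.load("/path/to/an/audio/file")
if audio.size(0) > 1:  # downmix stereo/multichannel to mono
    audio = audio.mean(dim=0, keepdim=True)
if orig_sr != 16000:  # resample to 16 kHz
    audio = torchaudio.functional.resample(audio, orig_sr, 16000)
sample = S2THubInterface.get_model_input(task, audio)
text = S2THubInterface.get_prediction(task, model, generator, sample)
```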
## Citation
```bibtex
@inproceedings{li-etal-2021-multilingual,
title = "Multilingual Speech Translation from Efficient Finetuning of Pretrained Models",
author = "Li, Xian and
Wang, Changhan and
Tang, Yun and
Tran, Chau and
Tang, Yuqing and
Pino, Juan and
Baevski, Alexei and
Conneau, Alexis and
Auli, Michael",
booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.acl-long.68",
doi = "10.18653/v1/2021.acl-long.68",
pages = "827--838",
}
@inproceedings{wang-etal-2020-fairseq,
title = "Fairseq {S}2{T}: Fast Speech-to-Text Modeling with Fairseq",
author = "Wang, Changhan and
Tang, Yun and
Ma, Xutai and
Wu, Anne and
Okhonko, Dmytro and
Pino, Juan",
booktitle = "Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing: System Demonstrations",
month = dec,
year = "2020",
address = "Suzhou, China",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.aacl-demo.6",
pages = "33--39",
}
```
|
{"language": "en-tr", "library_name": "fairseq", "tags": ["fairseq", "audio", "audio-to-audio", "speech-to-speech-translation"], "datasets": ["must_c", "covost2"], "task": "audio-to-audio", "widget": [{"example_title": "Common Voice sample 1", "src": "https://huggingface.co/facebook/xm_transformer_600m-en_es-multi_domain/resolve/main/common_voice_en_18295850.mp3"}]}
|
audio-to-audio
|
facebook/xm_transformer_600m-en_tr-multi_domain
|
[
"fairseq",
"audio",
"audio-to-audio",
"speech-to-speech-translation",
"dataset:must_c",
"dataset:covost2",
"arxiv:2010.05171",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.05171"
] |
[
"en-tr"
] |
TAGS
#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #dataset-covost2 #arxiv-2010.05171 #has_space #region-us
|
# xm_transformer_600m-en_tr-multi_domain
W2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):
- English-Turkish
- Trained on MuST-C, CoVoST 2, Multilingual LibriSpeech, Common Voice v7 and CCMatrix
- Speech synthesis with facebook/tts_transformer-tr-cv7
## Usage
|
[
"# xm_transformer_600m-en_tr-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-Turkish\n- Trained on MuST-C, CoVoST 2, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-tr-cv7",
"## Usage"
] |
[
"TAGS\n#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #dataset-covost2 #arxiv-2010.05171 #has_space #region-us \n",
"# xm_transformer_600m-en_tr-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-Turkish\n- Trained on MuST-C, CoVoST 2, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-tr-cv7",
"## Usage"
] |
[
59,
98,
3
] |
[
"passage: TAGS\n#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #dataset-covost2 #arxiv-2010.05171 #has_space #region-us \n# xm_transformer_600m-en_tr-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-Turkish\n- Trained on MuST-C, CoVoST 2, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-tr-cv7## Usage"
] |
[
-0.1290048211812973,
0.013473445549607277,
-0.0039613437838852406,
-0.04826812818646431,
0.05142645910382271,
-0.04296405240893364,
0.1091817319393158,
0.03226778656244278,
-0.10838039219379425,
-0.018567223101854324,
-0.01852063462138176,
0.009838908910751343,
0.07894682884216309,
0.06872943043708801,
-0.0641917809844017,
-0.20882655680179596,
0.049232400953769684,
-0.06764369457960129,
-0.07075347006320953,
0.08652149885892868,
0.12020565569400787,
-0.030925849452614784,
0.01615816541016102,
0.043044447898864746,
-0.09805614501237869,
0.03955855965614319,
0.0307087991386652,
-0.14284811913967133,
0.0768781304359436,
0.08546994626522064,
0.033791497349739075,
0.051690880209207535,
0.03981928527355194,
-0.11775672435760498,
0.042828720062971115,
-0.027293696999549866,
-0.004833134822547436,
-0.00538627989590168,
0.022977767512202263,
-0.05570128560066223,
0.1595398187637329,
-0.05018909275531769,
-0.0580892488360405,
0.05676532909274101,
-0.10415735095739365,
-0.05778166279196739,
-0.04832083731889725,
0.0009843608131632209,
0.05899112671613693,
0.06196186691522598,
-0.07262628525495529,
0.0012969112722203135,
-0.06676656007766724,
0.09735637903213501,
0.016813082620501518,
-0.2955603301525116,
-0.051989469677209854,
-0.08515094220638275,
0.023953359574079514,
0.0889076441526413,
0.0028546445537358522,
0.1078455001115799,
-0.010326182469725609,
-0.013166951946914196,
-0.16395792365074158,
-0.14635761082172394,
-0.17771711945533752,
-0.06899376213550568,
-0.10704436153173447,
0.04486291483044624,
0.3171370029449463,
0.028592990711331367,
-0.011549212969839573,
0.024802254512906075,
-0.007968886755406857,
0.06272739917039871,
-0.019100267440080643,
-0.0497148223221302,
-0.06116481125354767,
0.040740661323070526,
-0.028852002695202827,
-0.10864045470952988,
-0.12478701025247574,
-0.015131814405322075,
-0.15038946270942688,
0.1571144014596939,
0.0034909346140921116,
0.019710492342710495,
-0.03998834639787674,
-0.012217595241963863,
-0.05875403806567192,
-0.03209150582551956,
0.04556306079030037,
-0.018148021772503853,
-0.10611877590417862,
0.012943517416715622,
-0.02457384020090103,
-0.19307617843151093,
0.09880876541137695,
-0.17710290849208832,
-0.09046100825071335,
0.0034340503625571728,
-0.0783703476190567,
0.0915747582912445,
-0.009234984405338764,
-0.003369895974174142,
-0.11134427040815353,
-0.08443846553564072,
0.006217253394424915,
-0.05627414956688881,
-0.0006419284036383033,
-0.018671726807951927,
-0.13185951113700867,
-0.05066265910863876,
-0.11881176382303238,
0.11446567624807358,
-0.016394997015595436,
0.030824793502688408,
0.005325677338987589,
-0.0016846340149641037,
0.041511233896017075,
-0.04949573799967766,
-0.04100595414638519,
0.0496041439473629,
-0.00452176621183753,
0.07697295397520065,
0.028854239732027054,
0.08938739448785782,
-0.07148744910955429,
-0.0805470421910286,
0.04509202390909195,
0.08636295795440674,
0.06014778092503548,
-0.10328467190265656,
0.036553774029016495,
0.01728031598031521,
-0.04072609171271324,
-0.2004321962594986,
0.019226791337132454,
-0.040758345276117325,
-0.0924825519323349,
0.031171495094895363,
0.0007280210847966373,
-0.1315733641386032,
-0.025437699630856514,
0.02921220287680626,
-0.056120455265045166,
-0.03847973048686981,
-0.06498567759990692,
0.03229488059878349,
-0.09049290418624878,
0.07089557498693466,
-0.1402311623096466,
0.06479835510253906,
-0.006143859121948481,
-0.02862459607422352,
-0.0807843878865242,
0.17301422357559204,
-0.019682496786117554,
-0.017393702641129494,
-0.09150528162717819,
-0.021905342116951942,
-0.08306626975536346,
0.034787118434906006,
-0.021299175918102264,
0.10266660153865814,
-0.24277906119823456,
-0.06600718945264816,
0.11918114125728607,
-0.02042025700211525,
-0.009247630834579468,
0.1690780520439148,
0.04306604340672493,
0.043876323848962784,
0.12929502129554749,
0.2664199471473694,
0.11043938994407654,
-0.13186533749103546,
-0.005326037760823965,
0.055190518498420715,
-0.02996782213449478,
0.0035389831755310297,
0.07626146078109741,
-0.08318920433521271,
0.033925049006938934,
-0.031151073053479195,
0.10480119287967682,
0.09031769633293152,
-0.031211204826831818,
-0.03533695265650749,
0.0741075798869133,
-0.07068698853254318,
0.11931197345256805,
-0.09853893518447876,
0.018650544807314873,
-0.06369931250810623,
-0.042437635362148285,
-0.008132231421768665,
0.07866548001766205,
-0.08208966255187988,
0.09189024567604065,
-0.13146542012691498,
0.0754101499915123,
-0.08632905781269073,
0.049056921154260635,
-0.10772927105426788,
0.09235648810863495,
-0.08169945329427719,
0.15701332688331604,
0.24021542072296143,
0.1875608116388321,
-0.025827188044786453,
-0.005199512001127005,
-0.051504943519830704,
0.08084016293287277,
0.05453930422663689,
0.0672188326716423,
-0.03050832822918892,
-0.14327391982078552,
0.1290937066078186,
-0.048946067690849304,
0.12624163925647736,
0.0014648105716332793,
-0.03535056859254837,
0.1870097666978836,
0.07651595026254654,
-0.0010898590553551912,
0.03557101637125015,
0.05331112816929817,
0.08858612179756165,
0.04391402751207352,
0.028952350839972496,
0.012598925270140171,
-0.007688700687140226,
-0.11129865795373917,
0.255644291639328,
-0.215228870511055,
0.05205255001783371,
0.13588443398475647,
-0.10769527405500412,
-0.015424149110913277,
0.03545493260025978,
0.01596015691757202,
0.010108902119100094,
0.06215536594390869,
-0.04462311044335365,
0.2638537883758545,
-0.05683370679616928,
0.10881013423204422,
-0.06572853773832321,
0.04337522014975548,
0.009278750978410244,
-0.0674496665596962,
0.001096053165383637,
0.11429598182439804,
-0.06158839538693428,
-0.2149149775505066,
0.03849557414650917,
0.20030292868614197,
0.020305542275309563,
0.2705134451389313,
-0.006973988376557827,
-0.008199071511626244,
-0.04281897842884064,
0.01554650254547596,
-0.023635683581233025,
0.04026219993829727,
-0.15941385924816132,
-0.030399002134799957,
0.007387576159089804,
0.0717775896191597,
0.09665749967098236,
-0.07564512640237808,
0.018408335745334625,
0.007833346724510193,
-0.12219052016735077,
-0.10702864080667496,
0.06266309320926666,
0.02364722080528736,
0.08353843539953232,
-0.06567005068063736,
-0.0775194764137268,
0.012917806394398212,
-0.04695219546556473,
-0.109166219830513,
0.04592575505375862,
-0.2375052273273468,
-0.2989049553871155,
-0.09122645854949951,
0.02149893157184124,
-0.03502202033996582,
0.02043028362095356,
0.10268664360046387,
-0.08378785103559494,
0.016408368945121765,
-0.02459334209561348,
0.1219736710190773,
-0.06467664241790771,
0.0026233838871121407,
-0.06032942607998848,
-0.0006735015194863081,
-0.015022845938801765,
-0.08156867325305939,
0.011039583943784237,
-0.015629932284355164,
-0.040205761790275574,
0.007612396031618118,
-0.049178555607795715,
0.067998968064785,
0.19041556119918823,
0.017011048272252083,
-0.00037734193028882146,
-0.1019500121474266,
0.08562937378883362,
-0.13707990944385529,
0.0008400503429584205,
0.0976814478635788,
-0.032490622252225876,
-0.008362030610442162,
0.1667206734418869,
0.006796759553253651,
0.016898225992918015,
-0.008317439816892147,
-0.05686052516102791,
-0.045357782393693924,
-0.19515836238861084,
-0.14409305155277252,
-0.144033744931221,
0.007362031377851963,
-0.16233783960342407,
-0.012024774216115475,
-0.00165794906206429,
-0.03540995717048645,
-0.05003424733877182,
-0.050557710230350494,
0.11242187023162842,
-0.030133137479424477,
0.27981168031692505,
-0.04775763675570488,
0.08206416666507721,
-0.09453857690095901,
-0.07275746017694473,
0.12010417133569717,
0.0405225045979023,
0.07600224018096924,
0.11720915138721466,
0.13648951053619385,
0.04590188339352608,
0.06073929741978645,
0.09648890048265457,
0.023584026843309402,
0.06942632049322128,
0.03230052813887596,
-0.015540829859673977,
-0.0868089571595192,
0.03793424367904663,
0.01866021566092968,
0.35082319378852844,
-0.10580850392580032,
0.004826557356864214,
0.0438404306769371,
0.10414029657840729,
0.050245631486177444,
0.0811634212732315,
-0.0407346710562706,
0.015008412301540375,
-0.008204258047044277,
-0.04233943298459053,
0.008499499410390854,
0.0648379847407341,
0.23520450294017792,
-0.020346516743302345,
0.1135522797703743,
0.08186577260494232,
0.05427107214927673,
-0.0385727696120739,
0.02672787569463253,
-0.12097222357988358,
-0.03981076180934906,
-0.008035203441977501,
0.050547223538160324,
-0.10773956030607224,
0.18869051337242126,
0.07634612917900085,
0.03450336307287216,
-0.03285379335284233,
-0.013102115131914616,
0.06857816129922867,
0.10687649250030518,
0.10348072648048401,
0.011230346746742725,
-0.07991406321525574,
-0.1019560694694519,
-0.12456156313419342,
0.0021599852479994297,
0.1388990879058838,
0.0741947665810585,
-0.01819593831896782,
0.028335796669125557,
-0.044192638248205185,
0.002583075547590852,
-0.04028809070587158,
-0.2224181443452835,
-0.1416713446378708,
0.07290101051330566,
0.24296091496944427,
0.046887803822755814,
0.012073991820216179,
-0.059488482773303986,
-0.16154444217681885,
0.07384859025478363,
-0.10318450629711151,
-0.019154587760567665,
-0.0934450775384903,
-0.08883959800004959,
0.10011307895183563,
-0.05682908371090889,
-0.008003921248018742,
0.019781699404120445,
-0.026718249544501305,
-0.09567801654338837,
-0.03863498941063881,
0.09961477667093277,
-0.06471583247184753,
-0.02612072415649891,
-0.0021169139072299004,
0.25394943356513977,
0.03755870461463928,
0.11146853864192963,
0.06720064580440521,
-0.02068384550511837,
0.04310851916670799,
-0.05435722693800926,
-0.030161982402205467,
-0.011312962509691715,
-0.081039659678936,
0.13075093924999237,
0.04397406801581383,
-0.20499113202095032,
-0.0758090540766716,
-0.03204725682735443,
0.24189266562461853,
0.11257010698318481,
-0.05158727243542671,
0.11031043529510498,
0.1459457129240036,
-0.028283873572945595,
-0.222585529088974,
-0.025903675705194473,
0.0007248403271660209,
0.09076648950576782,
-0.05924355611205101,
-0.08312004804611206,
-0.00339264003559947,
-0.011670473031699657,
-0.00996549241244793,
0.03778179734945297,
-0.24512024223804474,
-0.13154654204845428,
0.15638914704322815,
-0.11948344856500626,
0.1258101910352707,
-0.024954333901405334,
-0.06203342601656914,
-0.06164490804076195,
0.09304258227348328,
0.08351410925388336,
-0.2870851755142212,
0.11516760289669037,
0.11441029608249664,
0.09118863195180893,
0.01730623096227646,
0.07195636630058289,
0.13825547695159912,
0.032430730760097504,
-0.03893522545695305,
-0.012322559952735901,
-0.013670402579009533,
0.08207668364048004,
0.06426049768924713,
0.001837223069742322,
-0.05313655361533165,
-0.0007725844043307006,
-0.04368980601429939,
-0.02858281321823597,
-0.04987542703747749,
0.08961611986160278,
0.052133966237306595,
0.0008729277178645134,
-0.05824822932481766,
-0.06289096176624298,
-0.03320532664656639,
0.010115884244441986,
0.03605903685092926,
-0.1968236118555069,
-0.014581426046788692,
0.1371450573205948,
0.21324856579303741,
-0.09029905498027802,
0.09878545999526978,
-0.01311812549829483,
-0.07609696686267853,
0.05228916555643082,
0.01877642422914505,
0.004782652482390404,
0.08926864713430405,
-0.05053948611021042,
0.12070819735527039,
-0.025166187435388565,
-0.0656261071562767,
0.1430167257785797,
0.0849393829703331,
-0.06410738825798035,
-0.1427796632051468,
-0.06667274981737137,
0.007316129747778177,
0.07524764537811279,
0.04170721024274826,
0.27359631657600403,
-0.034575264900922775,
0.053207073360681534,
-0.06061147153377533,
-0.033945709466934204,
-0.12880118191242218,
0.1758194863796234,
0.04979531094431877,
-0.019629377871751785,
-0.11468765884637833,
0.07417840510606766,
0.046241290867328644,
-0.07987271249294281,
-0.014058283530175686,
0.05213256552815437,
-0.07317456603050232,
-0.09783727675676346,
-0.17038574814796448,
0.03079264983534813,
0.04357830807566643,
-0.15112943947315216,
0.003934832755476236,
-0.17766621708869934,
-0.00269867479801178,
0.20074090361595154,
-0.026284491643309593,
0.02827714942395687,
-0.09778750687837601,
-0.033710382878780365,
0.06311862915754318,
0.018603593111038208,
0.05813886970281601,
-0.04605233296751976,
-0.06582297384738922,
0.21877452731132507,
-0.04507466405630112,
0.11101128906011581,
-0.05002830922603607,
-0.038986921310424805,
0.036599963903427124,
0.049367066472768784,
-0.08703335374593735,
-0.0060705034993588924,
-0.08533742278814316,
0.0015520303277298808,
0.041450802236795425,
-0.048167310655117035,
-0.05498009920120239,
0.0034441854804754257,
-0.08303133398294449,
0.05301225185394287,
-0.02222616784274578,
0.043627768754959106,
-0.08047556132078171,
-0.010688677430152893,
-0.014018824324011803,
0.021526159718632698,
0.12633691728115082,
0.1345735341310501,
-0.10758659988641739,
0.10561251640319824,
-0.18870654702186584,
-0.08815029263496399,
0.14552396535873413,
0.08375301957130432,
0.05846544727683067,
0.04891901835799217,
-0.018576186150312424,
0.11640184372663498,
0.07350916415452957,
0.010932303965091705,
0.04262394830584526,
-0.004327353555709124,
0.007447998970746994,
-0.16156111657619476,
-0.012160984799265862,
-0.021691499277949333,
-0.004753004293888807,
0.10804518312215805,
0.08960270881652832,
0.08983670175075531,
-0.08440601825714111,
0.05259818583726883,
0.01554725132882595,
0.030139751732349396,
-0.02317783422768116,
-0.10522632300853729,
-0.005679157096892595,
-0.09826657921075821,
0.1009335145354271,
-0.028145141899585724,
0.052789345383644104,
-0.0007386556244455278,
0.011150622740387917,
-0.008346637710928917,
-0.06721106171607971,
-0.032950110733509064,
0.027158496901392937,
0.061838068068027496,
0.1024753674864769,
-0.03327704593539238,
-0.12935656309127808,
-0.04255122318863869,
0.06015508249402046,
0.016705520451068878,
0.024881256744265556,
0.043600887060165405,
0.1267174780368805,
0.11169056594371796,
0.1036987155675888,
-0.050410714000463486,
0.05512407422065735,
-0.0018442600267007947,
-0.11137320846319199,
0.00040763182914815843,
-0.08178596198558807,
0.023613398894667625,
0.10753390192985535,
-0.07569698244333267,
-0.03211107850074768,
0.025755232200026512,
-0.07374687492847443,
-0.13194401562213898,
-0.05032376945018768,
-0.05808299034833908,
-0.07591979950666428,
-0.028710059821605682,
-0.07237677276134491,
0.059855829924345016,
-0.004514285828918219,
0.08466821908950806,
0.010145219974219799,
0.167387917637825,
-0.14472298324108124,
-0.14657969772815704,
0.14652207493782043,
-0.07024696469306946,
0.11507686227560043,
-0.04513655602931976,
-0.06579746305942535,
0.10308986157178879,
0.006642828695476055,
0.0645398423075676,
0.03336729481816292,
-0.06379292905330658,
0.004142525605857372,
-0.13237348198890686,
-0.041258569806814194,
-0.034654609858989716,
0.026608267799019814,
0.07289397716522217,
0.10096806287765503,
0.148917555809021,
-0.07818271964788437,
0.02522706426680088,
0.02219262346625328,
-0.03908383473753929,
-0.15429551899433136,
-0.0715911015868187,
-0.12963318824768066,
-0.02392197586596012,
0.11899063736200333,
-0.055375467985868454,
-0.07751668989658356,
-0.03916076570749283,
0.09503433108329773,
0.34430992603302,
-0.132339209318161,
0.07228026539087296,
-0.030045870691537857,
0.02989780530333519,
-0.0236374419182539,
0.015516147017478943,
0.04496478661894798,
0.1191849336028099,
0.07628903537988663,
-0.039330460131168365,
-0.06485896557569504,
-0.025848301127552986,
-0.04410041868686676,
0.002336656441912055,
-0.06153386831283569,
-0.09068913012742996,
0.08031819760799408,
0.20067811012268066,
-0.10818284749984741,
-0.13948412239551544,
-0.11943823844194412,
-0.09834645688533783,
-0.06939518451690674,
0.03181997686624527,
0.04331209138035774,
0.18343278765678406,
-0.007872003130614758,
-0.06995385140180588,
-0.06862697005271912,
0.044519152492284775,
0.024774568155407906,
-0.11858464777469635,
-0.024102194234728813,
0.03535350784659386,
-0.19980059564113617,
-0.09215684980154037,
-0.025986986234784126,
0.13877366483211517,
0.002973240567371249,
0.10243067890405655,
0.08062240481376648,
0.14816415309906006,
0.057478152215480804,
-0.03153024613857269,
0.061832815408706665,
0.08520818501710892,
-0.04489511623978615,
0.19411318004131317,
0.08603131026029587,
-0.05222038924694061,
0.08356526494026184,
-0.04462480545043945,
-0.020713165402412415,
-0.027010483667254448,
0.02258826233446598,
-0.0700807273387909,
0.0690067932009697,
0.030840599909424782,
-0.005720901302993298,
0.04440345615148544,
0.040395841002464294,
-0.08051610738039017,
0.03276276960968971,
-0.040170129388570786,
-0.06540843099355698,
-0.15889036655426025,
-0.07786780595779419,
-0.09014873951673508,
0.09243045747280121,
-0.06481476128101349,
-0.012919479049742222,
-0.12326083332300186,
0.02597302757203579,
-0.012951736338436604,
0.1058211699128151,
0.03142699971795082,
-0.04660749435424805,
0.02096724510192871,
-0.043784283101558685,
0.05858183652162552,
0.08896230906248093,
-0.09588153660297394,
-0.04605288803577423
] |
null | null |
fairseq
|
# xm_transformer_600m-en_vi-multi_domain
[W2V2-Transformer](https://aclanthology.org/2021.acl-long.68/) speech-to-text translation model from fairseq S2T ([paper](https://arxiv.org/abs/2010.05171)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_to_text)):
- English-Vietnamese
- Trained on MuST-C, Multilingual LibriSpeech, Common Voice v7 and CCMatrix
- Speech synthesis with [facebook/tts_transformer-vi-cv7](https://huggingface.co/facebook/tts_transformer-vi-cv7)
## Usage
```python
from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub
from fairseq.models.speech_to_text.hub_interface import S2THubInterface
from fairseq.models.text_to_speech.hub_interface import TTSHubInterface
import IPython.display as ipd
import torchaudio
models, cfg, task = load_model_ensemble_and_task_from_hf_hub(
"facebook/xm_transformer_600m-en_vi-multi_domain",
arg_overrides={"config_yaml": "config.yaml"},
)
model = models[0]
generator = task.build_generator(model, cfg)
# requires 16000Hz mono channel audio
audio, _ = torchaudio.load("/path/to/an/audio/file")
sample = S2THubInterface.get_model_input(task, audio)
text = S2THubInterface.get_prediction(task, model, generator, sample)
# speech synthesis
tts_models, tts_cfg, tts_task = load_model_ensemble_and_task_from_hf_hub(
f"facebook/tts_transformer-vi-cv7",
arg_overrides={"vocoder": "griffin_lim", "fp16": False},
)
tts_model = tts_models[0]
TTSHubInterface.update_cfg_with_data_cfg(tts_cfg, tts_task.data_cfg)
tts_generator = tts_task.build_generator([tts_model], tts_cfg)
tts_sample = TTSHubInterface.get_model_input(tts_task, text)
wav, sr = TTSHubInterface.get_prediction(
tts_task, tts_model, tts_generator, tts_sample
)
ipd.Audio(wav, rate=sr)
```
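If you would rather write the synthesized speech to disk than play it inline, a minimal sketch could look like the following; the file name and tensor handling are assumptions, not part of the original card, and `wav` and `sr` come from the snippet above:
```python
import torch
import torchaudio

# Hypothetical sketch (not from the original card): save the synthesized
# audio to a WAV file instead of playing it with IPython.
wav_tensor = torch.as_tensor(wav, dtype=torch.float32).cpu()
if wav_tensor.dim() == 1:  # torchaudio.save expects a [channels, frames] tensor
    wav_tensor = wav_tensor.unsqueeze(0)
torchaudio.save("translation_vi.wav", wav_tensor, sample_rate=sr)
```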
## Citation
```bibtex
@inproceedings{li-etal-2021-multilingual,
title = "Multilingual Speech Translation from Efficient Finetuning of Pretrained Models",
author = "Li, Xian and
Wang, Changhan and
Tang, Yun and
Tran, Chau and
Tang, Yuqing and
Pino, Juan and
Baevski, Alexei and
Conneau, Alexis and
Auli, Michael",
booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.acl-long.68",
doi = "10.18653/v1/2021.acl-long.68",
pages = "827--838",
}
@inproceedings{wang-etal-2020-fairseq,
title = "Fairseq {S}2{T}: Fast Speech-to-Text Modeling with Fairseq",
author = "Wang, Changhan and
Tang, Yun and
Ma, Xutai and
Wu, Anne and
Okhonko, Dmytro and
Pino, Juan",
booktitle = "Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing: System Demonstrations",
month = dec,
year = "2020",
address = "Suzhou, China",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.aacl-demo.6",
pages = "33--39",
}
```
|
{"language": "en-vi", "library_name": "fairseq", "tags": ["fairseq", "audio", "audio-to-audio", "speech-to-speech-translation"], "datasets": ["must_c"], "task": "audio-to-audio", "widget": [{"example_title": "Common Voice sample 1", "src": "https://huggingface.co/facebook/xm_transformer_600m-en_es-multi_domain/resolve/main/common_voice_en_18295850.mp3"}]}
|
audio-to-audio
|
facebook/xm_transformer_600m-en_vi-multi_domain
|
[
"fairseq",
"audio",
"audio-to-audio",
"speech-to-speech-translation",
"dataset:must_c",
"arxiv:2010.05171",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.05171"
] |
[
"en-vi"
] |
TAGS
#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #arxiv-2010.05171 #has_space #region-us
|
# xm_transformer_600m-en_vi-multi_domain
W2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):
- English-Vietnamese
- Trained on MuST-C, Multilingual LibriSpeech, Common Voice v7 and CCMatrix
- Speech synthesis with facebook/tts_transformer-vi-cv7
## Usage
|
[
"# xm_transformer_600m-en_vi-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-Vietnamese\n- Trained on MuST-C, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-vi-cv7",
"## Usage"
] |
[
"TAGS\n#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #arxiv-2010.05171 #has_space #region-us \n",
"# xm_transformer_600m-en_vi-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-Vietnamese\n- Trained on MuST-C, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-vi-cv7",
"## Usage"
] |
[
52,
94,
3
] |
[
"passage: TAGS\n#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #arxiv-2010.05171 #has_space #region-us \n# xm_transformer_600m-en_vi-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-Vietnamese\n- Trained on MuST-C, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-vi-cv7## Usage"
] |
[
-0.10522110760211945,
0.07279615104198456,
-0.003259617369621992,
-0.042873844504356384,
0.07640158385038376,
-0.09745340049266815,
0.03202252835035324,
0.06775195151567459,
-0.1002088114619255,
-0.024275152012705803,
-0.039276883006095886,
0.01835273951292038,
0.04946822300553322,
0.06510508060455322,
-0.031190995126962662,
-0.24325557053089142,
0.06475652009248734,
-0.0025131376460194588,
-0.06883729249238968,
0.0645965039730072,
0.09896507114171982,
-0.04416443035006523,
0.056860968470573425,
0.04769120737910271,
-0.05335717275738716,
0.07448003441095352,
0.027539623901247978,
-0.10775663703680038,
0.07452002167701721,
0.09075073897838593,
0.010489375330507755,
0.06708882004022598,
0.07888105511665344,
-0.1331956535577774,
0.02571144513785839,
-0.006443788297474384,
-0.003546658204868436,
-0.019781911745667458,
-0.006055973004549742,
0.014308146201074123,
0.14431163668632507,
-0.03309589996933937,
-0.061416562646627426,
0.0826115608215332,
-0.09930245578289032,
-0.11210259795188904,
-0.001517037395387888,
-0.03159330412745476,
0.06415490061044693,
0.060718927532434464,
-0.07454204559326172,
0.022661790251731873,
-0.08169781416654587,
0.08459901064634323,
0.0548444539308548,
-0.26491132378578186,
0.01107102446258068,
-0.046782635152339935,
0.11679810285568237,
0.06549558788537979,
-0.03766629472374916,
0.09985516965389252,
0.03906804695725441,
0.008024631068110466,
-0.13304120302200317,
-0.1535562425851822,
-0.1452474594116211,
0.010686756111681461,
-0.13765834271907806,
0.07704907655715942,
0.2813820540904999,
0.055902089923620224,
-0.04114298149943352,
0.045082513242959976,
-0.03304369002580643,
0.05369770899415016,
-0.01799064874649048,
-0.07221036404371262,
0.010366463102400303,
0.060969699174165726,
-0.00544883543625474,
-0.15270985662937164,
-0.08929158747196198,
-0.04936585947871208,
-0.1183466762304306,
0.03183428943157196,
-0.007629521656781435,
0.008192547596991062,
-0.06704499572515488,
-0.05296207219362259,
-0.09333645552396774,
0.0040993341244757175,
0.020252810791134834,
-0.050696928054094315,
-0.095243439078331,
0.025638362392783165,
0.001512809656560421,
-0.17212805151939392,
0.06445125490427017,
-0.1943238526582718,
-0.1273188292980194,
0.013885651715099812,
-0.1253037303686142,
0.06050695851445198,
0.01758737303316593,
0.05501430854201317,
-0.18284328281879425,
-0.03926518186926842,
0.007848117500543594,
-0.030926255509257317,
-0.023108700290322304,
-0.0021429264452308416,
-0.15058046579360962,
-0.05009513348340988,
-0.12470926344394684,
0.06217679753899574,
-0.03868570178747177,
0.03598044067621231,
-0.032451216131448746,
-0.032907936722040176,
0.07314415276050568,
-0.05372786149382591,
-0.01934754103422165,
0.037891287356615067,
0.009545310400426388,
0.1355278342962265,
0.0795358195900917,
0.08939337730407715,
-0.08301308006048203,
-0.11890515685081482,
0.019151445478200912,
0.06541323661804199,
0.0031987489201128483,
-0.11556892842054367,
0.036080002784729004,
0.03522004559636116,
-0.0321219302713871,
-0.12964770197868347,
-0.013445533812046051,
-0.033946793526411057,
-0.07910554111003876,
0.049150750041007996,
-0.036298491060733795,
-0.13487936556339264,
-0.036879803985357285,
-0.008103355765342712,
-0.045877449214458466,
-0.06805267184972763,
-0.036617059260606766,
0.011966087855398655,
-0.06314604729413986,
0.09969162940979004,
-0.15495672821998596,
0.08415766060352325,
0.00013064281665720046,
0.014020866714417934,
-0.10412394255399704,
0.16263441741466522,
-0.03987370803952217,
-0.04120051488280296,
-0.0531952828168869,
-0.059927359223365784,
-0.08642000705003738,
0.08195151388645172,
-0.02845391072332859,
0.11176829785108566,
-0.2566540837287903,
-0.07568526268005371,
0.1519482433795929,
-0.04084179177880287,
0.006757755763828754,
0.1765233427286148,
0.05117111653089523,
0.008995819836854935,
0.09769104421138763,
0.24015265703201294,
0.07591814547777176,
-0.17632970213890076,
0.02236456796526909,
0.09834326058626175,
-0.07204881310462952,
-0.012602260336279869,
0.1003243699669838,
-0.04137232154607773,
0.008076018653810024,
-0.0349109061062336,
0.09079854935407639,
0.05073406919836998,
-0.04606347903609276,
-0.029527099803090096,
0.037458281964063644,
-0.06746813654899597,
0.13312996923923492,
-0.07491960376501083,
0.02155277132987976,
-0.004959458485245705,
-0.02365076169371605,
0.16878968477249146,
0.08476804196834564,
-0.0779113695025444,
0.10303015261888504,
-0.18228860199451447,
0.010076286271214485,
-0.04936162382364273,
0.052958786487579346,
-0.0995582640171051,
0.1547553539276123,
-0.05385780334472656,
0.09871004521846771,
0.20604975521564484,
0.21366000175476074,
-0.03368726000189781,
-0.003282905323430896,
-0.07164923846721649,
0.051545899361371994,
0.02054288238286972,
0.06731501966714859,
0.008432148024439812,
-0.11574758589267731,
0.10292576253414154,
-0.06461658328771591,
-0.028434354811906815,
-0.05849924311041832,
-0.03763803094625473,
0.1659705936908722,
0.046781253069639206,
0.006274048704653978,
0.055294282734394073,
0.0934988483786583,
0.08695381134748459,
0.026917535811662674,
0.035891637206077576,
0.02623477764427662,
-0.02122008614242077,
-0.08649490773677826,
0.2723061442375183,
-0.12565046548843384,
0.09820438921451569,
0.1332882195711136,
-0.16121040284633636,
0.0371537022292614,
0.08939153701066971,
0.01625053957104683,
0.004086208064109087,
0.022721804678440094,
-0.05322210490703583,
0.2103751003742218,
-0.06381779164075851,
0.13186009228229523,
-0.08559637516736984,
0.07680036872625351,
0.013132592663168907,
-0.07770741730928421,
-0.013722199946641922,
0.10534282773733139,
-0.0018307218560948968,
-0.1965649574995041,
0.04411416873335838,
0.1758728325366974,
0.021514037624001503,
0.282576322555542,
-0.019687434658408165,
-0.007489013019949198,
-0.04631338268518448,
0.024432910606265068,
-0.050884541124105453,
0.03246929869055748,
-0.20479823648929596,
-0.02861444652080536,
-0.015488454140722752,
0.08636406064033508,
0.0964130163192749,
-0.0807267427444458,
0.00932059995830059,
0.0008100842824205756,
-0.12593358755111694,
-0.12336300313472748,
0.041128262877464294,
0.021326545625925064,
0.05533238872885704,
-0.04968609660863876,
-0.03793657198548317,
0.017398200929164886,
-0.046245284378528595,
-0.11833469569683075,
0.054816510528326035,
-0.24382618069648743,
-0.3383673131465912,
-0.10638833045959473,
-0.01389278843998909,
-0.028347479179501534,
0.0442221537232399,
0.0849129930138588,
-0.1368069052696228,
-0.023046553134918213,
-0.02847580797970295,
0.13431993126869202,
-0.07114451378583908,
-0.02770642191171646,
0.0007712653605267406,
0.03369009122252464,
0.01125786080956459,
-0.07316336780786514,
0.020109299570322037,
-0.0016588743310421705,
-0.03186767175793648,
-0.0391310416162014,
-0.08735284209251404,
0.03776826709508896,
0.18870888650417328,
0.032700132578611374,
-0.012738524004817009,
-0.07021423429250717,
0.11399758607149124,
-0.08312027901411057,
0.007541291881352663,
0.15426267683506012,
-0.011491972021758556,
-0.010050909593701363,
0.13881216943264008,
-0.013227434828877449,
-0.0041208830662071705,
0.029656097292900085,
-0.01420405600219965,
-0.04182256758213043,
-0.1972956508398056,
-0.12615543603897095,
-0.13823409378528595,
0.057340819388628006,
-0.1651475876569748,
0.0035449787974357605,
-0.040730834007263184,
-0.06809669733047485,
0.012511657550930977,
0.017479661852121353,
0.08700107783079147,
-0.035173553973436356,
0.26409199833869934,
-0.11465117335319519,
0.0701591894030571,
-0.09618639945983887,
-0.06557780504226685,
0.13269339501857758,
0.0762915387749672,
0.03885399177670479,
0.12709249556064606,
0.12226716428995132,
0.0756971463561058,
0.04811651259660721,
0.12868545949459076,
0.014360110275447369,
0.011204089038074017,
0.012541396543383598,
-0.044204890727996826,
-0.06099574640393257,
0.07933184504508972,
0.030136482790112495,
0.3222064673900604,
-0.09859944880008698,
0.006455456372350454,
0.06619461625814438,
0.06358221173286438,
0.06479004770517349,
0.08459736406803131,
-0.002429484622552991,
0.05392605438828468,
0.047214508056640625,
-0.01675783097743988,
-0.011854643933475018,
0.12019796669483185,
0.2769932448863983,
-0.03593544661998749,
0.10621900856494904,
0.11103150993585587,
0.05470172315835953,
-0.014703456312417984,
0.0601046048104763,
-0.09856873005628586,
-0.012438427656888962,
0.010707646608352661,
0.03214988857507706,
-0.16782359778881073,
0.16592241823673248,
0.05691716447472572,
-0.015723375603556633,
0.009149083867669106,
-0.009436358697712421,
0.029377566650509834,
0.19750042259693146,
0.15193572640419006,
0.03181619569659233,
-0.06436936557292938,
-0.0637521892786026,
-0.04606027901172638,
-0.01238142792135477,
0.12124533206224442,
0.17159847915172577,
-0.05508377030491829,
0.012069606222212315,
-0.07748252898454666,
0.006314604543149471,
0.00425483426079154,
-0.18729573488235474,
-0.08873353153467178,
0.0317087359726429,
0.2033872902393341,
0.06984762102365494,
0.003912594169378281,
-0.05004898086190224,
-0.1547478586435318,
0.1133013367652893,
-0.06647299230098724,
0.030055241659283638,
-0.08302099257707596,
-0.10401280224323273,
0.11552823334932327,
-0.025427022948861122,
0.02825782261788845,
0.02076341025531292,
-0.01126613188534975,
-0.06536584347486496,
-0.03557644784450531,
0.1457371711730957,
-0.05320090800523758,
-0.0196673646569252,
0.007283693645149469,
0.19640874862670898,
-0.008470139466226101,
0.09519549459218979,
0.030232924968004227,
-0.04822508245706558,
0.07337908446788788,
-0.07888008654117584,
-0.015440747141838074,
0.020727215334773064,
-0.11538619548082352,
0.10289975255727768,
-0.07771435379981995,
-0.2295357584953308,
-0.020842812955379486,
-0.07436111569404602,
0.21455644071102142,
0.09363069385290146,
-0.051263462752103806,
0.131844624876976,
0.20935672521591187,
-0.06533919274806976,
-0.2667875289916992,
-0.02082798071205616,
-0.035981882363557816,
0.08988569676876068,
-0.010271679610013962,
-0.1191430389881134,
-0.017901185899972916,
-0.05747024714946747,
0.012857560068368912,
-0.05977070331573486,
-0.172199547290802,
-0.1369933784008026,
0.11959268152713776,
-0.09485374391078949,
0.13970591127872467,
-0.010191796347498894,
-0.07110267132520676,
-0.06846541911363602,
0.03151679039001465,
0.1258779764175415,
-0.1974349170923233,
0.09440439939498901,
0.1240619570016861,
0.03637654706835747,
0.016430795192718506,
0.06523110717535019,
0.08892224729061127,
0.016103383153676987,
-0.029133301228284836,
0.01979709230363369,
-0.03769217059016228,
0.05742473900318146,
0.08093560487031937,
0.04177843779325485,
-0.005786381661891937,
0.008151217363774776,
-0.03103291615843773,
-0.03878794610500336,
-0.06971124559640884,
0.033800698816776276,
0.05938918888568878,
-0.010819675400853157,
-0.04705396667122841,
-0.0003439058200456202,
-0.023501016199588776,
0.0013142549432814121,
0.06354057043790817,
-0.19417858123779297,
-0.06302279978990555,
0.17638923227787018,
0.22728362679481506,
-0.08908185362815857,
0.12659470736980438,
0.011575380340218544,
-0.07067154347896576,
0.07200285792350769,
-0.038039665669202805,
0.06661269068717957,
0.061081092804670334,
-0.035570304840803146,
0.16557858884334564,
-0.008228280581533909,
-0.06041892245411873,
0.10807336866855621,
0.05569658800959587,
-0.026795344427227974,
-0.16655629873275757,
-0.09102535247802734,
0.0010331141529604793,
0.1173432245850563,
0.08049967139959335,
0.21626883745193481,
-0.022611796855926514,
0.02537636086344719,
-0.040815021842718124,
-0.013303062878549099,
-0.15839147567749023,
0.1195434108376503,
0.05131392553448677,
0.0030323085375130177,
-0.10217048227787018,
0.08775580674409866,
0.04739006608724594,
-0.12914292514324188,
-0.0014294707216322422,
0.08280985057353973,
-0.05905143916606903,
-0.08516686409711838,
-0.14174708724021912,
0.043055251240730286,
0.062094371765851974,
-0.12546849250793457,
-0.022067759186029434,
-0.1618274301290512,
0.02923225797712803,
0.2269609421491623,
-0.019105903804302216,
0.03916551545262337,
-0.09034845232963562,
-0.05636833980679512,
0.05146876350045204,
0.014160603284835815,
0.0297782514244318,
-0.060786984860897064,
-0.11683370172977448,
0.12149625271558762,
-0.0026988242752850056,
0.1727183759212494,
-0.0659690722823143,
-0.035199377685785294,
-0.05211864784359932,
0.050895027816295624,
-0.10842035710811615,
-0.0011005204869434237,
-0.061037857085466385,
-0.022370923310518265,
0.0035330969840288162,
-0.07107678800821304,
-0.05917530506849289,
0.031626228243112564,
-0.0960245206952095,
0.05193178728222847,
-0.0069239442236721516,
0.051409296691417694,
-0.04821872338652611,
-0.005009708926081657,
0.024522162973880768,
0.02651715651154518,
0.08714142441749573,
0.051252253353595734,
-0.11908091604709625,
0.08645965903997421,
-0.1400906890630722,
-0.11025451123714447,
0.15520727634429932,
0.09148236364126205,
0.06125829741358757,
0.05691121518611908,
-0.009424601681530476,
0.11416961252689362,
0.09048476815223694,
-0.04003971815109253,
0.034648213535547256,
0.0005825223634019494,
-0.02431831881403923,
-0.1686331331729889,
-0.03906542807817459,
-0.012588820420205593,
-0.011411508545279503,
0.09732232987880707,
0.0881020650267601,
0.06874404102563858,
-0.08292336016893387,
0.03043188899755478,
0.004157767165452242,
0.015041285194456577,
-0.02778809145092964,
-0.06098643317818642,
-0.018897028639912605,
-0.08746100217103958,
0.061070818454027176,
-0.012819868512451649,
0.10002626478672028,
-0.022367389872670174,
0.011084320954978466,
0.008062674663960934,
-0.09273014217615128,
-0.07802803814411163,
0.0494849719107151,
0.06968817859888077,
0.0587591677904129,
-0.018119242042303085,
-0.10570689290761948,
-0.03621984273195267,
0.01996891386806965,
0.13255582749843597,
-0.062202081084251404,
0.0768236592411995,
0.08479315042495728,
0.11303933709859848,
0.09967197477817535,
-0.004670761059969664,
-0.03260215371847153,
0.06616680324077606,
-0.13055501878261566,
0.010432089678943157,
-0.1267816722393036,
0.049532320350408554,
0.06787288933992386,
-0.0589660182595253,
0.059323396533727646,
0.04102436080574989,
-0.0775618702173233,
-0.17675673961639404,
-0.07800887525081635,
-0.08277582377195358,
-0.1366599202156067,
-0.003922868054360151,
-0.07636042684316635,
0.10704786330461502,
-0.07605467736721039,
0.09750007838010788,
0.003769409377127886,
0.12927916646003723,
-0.13706979155540466,
-0.15949873626232147,
0.1001828983426094,
-0.06546435505151749,
0.09154423326253891,
-0.026433318853378296,
-0.0005132628139108419,
0.15889008343219757,
0.0014739860780537128,
0.021901091560721397,
0.009883440099656582,
-0.08208241313695908,
0.01923074945807457,
-0.0947345420718193,
-0.007440063636749983,
-0.03667571768164635,
0.0022038789466023445,
0.04351857677102089,
0.09292511641979218,
0.12306426465511322,
-0.1062798798084259,
0.04587247595191002,
0.04785146191716194,
-0.07340682297945023,
-0.16802969574928284,
-0.1216910257935524,
-0.029073594138026237,
-0.025082647800445557,
0.14375613629817963,
-0.07781905680894852,
-0.041912227869033813,
-0.0807800143957138,
0.15771938860416412,
0.2541217505931854,
-0.14619192481040955,
0.028061525896191597,
-0.024212488904595375,
0.025317665189504623,
-0.02414042130112648,
0.014670277945697308,
0.0861838236451149,
0.13153406977653503,
0.0783454030752182,
-0.09333010762929916,
-0.08866817504167557,
-0.034691985696554184,
-0.013479088433086872,
0.027571801096200943,
-0.027853993698954582,
-0.059826914221048355,
0.05702498182654381,
0.1414753794670105,
-0.13846349716186523,
-0.0969025194644928,
-0.1535325050354004,
-0.10988461971282959,
-0.03460449352860451,
0.0039774030447006226,
0.06058438867330551,
0.17788873612880707,
-0.04068078473210335,
-0.0570925809442997,
-0.05190229415893555,
0.07228566706180573,
0.02524656429886818,
-0.12407755851745605,
-0.0012261384399607778,
0.02952277846634388,
-0.1963167041540146,
-0.09885712713003159,
-0.017717041075229645,
0.07192093878984451,
-0.016750551760196686,
0.09641803056001663,
0.06669718027114868,
0.1729339063167572,
-0.0055855740793049335,
-0.06803788989782333,
-0.0015097908908501267,
0.09876906126737595,
-0.047354795038700104,
0.2712574601173401,
0.004215161316096783,
-0.0681568831205368,
0.07180511951446533,
-0.024573663249611855,
-0.08092562109231949,
-0.0332423634827137,
0.025704864412546158,
-0.11244455724954605,
0.07116179913282394,
0.01063520647585392,
-0.005028826650232077,
0.03837202861905098,
-0.012461688369512558,
-0.011983301490545273,
-0.02660423517227173,
-0.04094691574573517,
-0.050145458430051804,
-0.13356588780879974,
-0.030907439067959785,
-0.09642384201288223,
0.08841930329799652,
-0.0968959629535675,
-0.013403073884546757,
-0.09888350963592529,
0.04865746572613716,
-0.04759383946657181,
0.08970887213945389,
0.04740016162395477,
-0.003777859266847372,
0.02762414887547493,
-0.10993918776512146,
0.0709877610206604,
0.07508867979049683,
-0.08574124425649643,
-0.0622849278151989
] |
null | null |
fairseq
|
# xm_transformer_600m-en_zh-multi_domain
[W2V2-Transformer](https://aclanthology.org/2021.acl-long.68/) speech-to-text translation model from fairseq S2T ([paper](https://arxiv.org/abs/2010.05171)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_to_text)):
- English-Chinese
- Trained on MuST-C, CoVoST 2, Multilingual LibriSpeech, Common Voice v7 and CCMatrix
- Speech synthesis with [facebook/tts_transformer-zh-cv7_css10](https://huggingface.co/facebook/tts_transformer-zh-cv7_css10)
## Usage
```python
from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub
from fairseq.models.speech_to_text.hub_interface import S2THubInterface
from fairseq.models.text_to_speech.hub_interface import TTSHubInterface
import IPython.display as ipd
import torchaudio
models, cfg, task = load_model_ensemble_and_task_from_hf_hub(
"facebook/xm_transformer_600m-en_zh-multi_domain",
arg_overrides={"config_yaml": "config.yaml"},
)
model = models[0]
generator = task.build_generator(model, cfg)
# requires 16000Hz mono channel audio
audio, _ = torchaudio.load("/path/to/an/audio/file")
sample = S2THubInterface.get_model_input(task, audio)
text = S2THubInterface.get_prediction(task, model, generator, sample)
# speech synthesis
tts_models, tts_cfg, tts_task = load_model_ensemble_and_task_from_hf_hub(
f"facebook/tts_transformer-zh-cv7_css10",
arg_overrides={"vocoder": "griffin_lim", "fp16": False},
)
tts_model = tts_models[0]
TTSHubInterface.update_cfg_with_data_cfg(tts_cfg, tts_task.data_cfg)
tts_generator = tts_task.build_generator([tts_model], tts_cfg)
tts_sample = TTSHubInterface.get_model_input(tts_task, text)
wav, sr = TTSHubInterface.get_prediction(
tts_task, tts_model, tts_generator, tts_sample
)
ipd.Audio(wav, rate=sr)
```
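For convenience, the cascaded speech-to-text-to-speech pipeline above can be wrapped in a small helper. This is only a sketch under the assumption that the models, generators and tasks loaded in the snippet above are in scope; the function name is hypothetical and the input is expected to be 16 kHz mono audio:
```python
# Hypothetical helper (not from the original card): run the full
# English speech -> Chinese text -> Chinese speech cascade in one call.
def translate_speech(path):
    audio, _ = torchaudio.load(path)
    sample = S2THubInterface.get_model_input(task, audio)
    text = S2THubInterface.get_prediction(task, model, generator, sample)
    tts_sample = TTSHubInterface.get_model_input(tts_task, text)
    wav, sr = TTSHubInterface.get_prediction(
        tts_task, tts_model, tts_generator, tts_sample
    )
    return text, wav, sr

chinese_text, wav, sr = translate_speech("/path/to/an/audio/file")
```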
## Citation
```bibtex
@inproceedings{li-etal-2021-multilingual,
title = "Multilingual Speech Translation from Efficient Finetuning of Pretrained Models",
author = "Li, Xian and
Wang, Changhan and
Tang, Yun and
Tran, Chau and
Tang, Yuqing and
Pino, Juan and
Baevski, Alexei and
Conneau, Alexis and
Auli, Michael",
booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.acl-long.68",
doi = "10.18653/v1/2021.acl-long.68",
pages = "827--838",
}
@inproceedings{wang-etal-2020-fairseq,
title = "Fairseq {S}2{T}: Fast Speech-to-Text Modeling with Fairseq",
author = "Wang, Changhan and
Tang, Yun and
Ma, Xutai and
Wu, Anne and
Okhonko, Dmytro and
Pino, Juan",
booktitle = "Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing: System Demonstrations",
month = dec,
year = "2020",
address = "Suzhou, China",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.aacl-demo.6",
pages = "33--39",
}
```
|
{"language": "en-zh", "library_name": "fairseq", "tags": ["fairseq", "audio", "audio-to-audio", "speech-to-speech-translation"], "datasets": ["must_c", "covost2"], "task": "audio-to-audio", "widget": [{"example_title": "Common Voice sample 1", "src": "https://huggingface.co/facebook/xm_transformer_600m-en_es-multi_domain/resolve/main/common_voice_en_18295850.mp3"}]}
|
audio-to-audio
|
facebook/xm_transformer_600m-en_zh-multi_domain
|
[
"fairseq",
"audio",
"audio-to-audio",
"speech-to-speech-translation",
"dataset:must_c",
"dataset:covost2",
"arxiv:2010.05171",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.05171"
] |
[
"en-zh"
] |
TAGS
#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #dataset-covost2 #arxiv-2010.05171 #has_space #region-us
|
# xm_transformer_600m-en_zh-multi_domain
W2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):
- English-Chinese
- Trained on MuST-C, CoVoST 2, Multilingual LibriSpeech, Common Voice v7 and CCMatrix
- Speech synthesis with facebook/tts_transformer-zh-cv7_css10
## Usage
|
[
"# xm_transformer_600m-en_zh-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-Chinese\n- Trained on MuST-C, CoVoST 2, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-zh-cv7_css10",
"## Usage"
] |
[
"TAGS\n#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #dataset-covost2 #arxiv-2010.05171 #has_space #region-us \n",
"# xm_transformer_600m-en_zh-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-Chinese\n- Trained on MuST-C, CoVoST 2, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-zh-cv7_css10",
"## Usage"
] |
[
59,
101,
3
] |
[
"passage: TAGS\n#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-must_c #dataset-covost2 #arxiv-2010.05171 #has_space #region-us \n# xm_transformer_600m-en_zh-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- English-Chinese\n- Trained on MuST-C, CoVoST 2, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/tts_transformer-zh-cv7_css10## Usage"
] |
[
-0.14758509397506714,
0.05939028784632683,
-0.002448016544803977,
-0.04038853570818901,
0.06438861042261124,
-0.08270903676748276,
0.09486079961061478,
0.05111633986234665,
-0.08339823782444,
0.001496913842856884,
-0.05368470400571823,
-0.0002973366645164788,
0.08431719243526459,
0.06804629415273666,
-0.05031644180417061,
-0.2329082489013672,
0.05446471646428108,
0.009564674459397793,
-0.07246590405702591,
0.07242948561906815,
0.11676857620477676,
-0.07242989540100098,
0.050434309989213943,
0.06820672750473022,
-0.05827499181032181,
0.012140663340687752,
0.014331869781017303,
-0.12372175604104996,
0.07708539068698883,
0.10139735788106918,
0.03960851952433586,
0.08244069665670395,
0.07213719934225082,
-0.11960719525814056,
0.03477245196700096,
-0.03914642333984375,
-0.01129829790443182,
0.0019258724059909582,
0.01592826470732689,
0.025103893131017685,
0.11207792907953262,
-0.011757095344364643,
-0.054508794099092484,
0.10337461531162262,
-0.07825379073619843,
-0.05112265422940254,
-0.03934047371149063,
0.0302711334079504,
0.07074297219514847,
0.0749402791261673,
-0.07473226636648178,
0.02618018537759781,
-0.09133128076791763,
0.07646737992763519,
0.033490851521492004,
-0.29607677459716797,
-0.0076860529370605946,
0.0025933505967259407,
0.10246047377586365,
0.04977378249168396,
-0.05164601653814316,
0.05971984565258026,
0.015230005607008934,
-0.02952299825847149,
-0.16391682624816895,
-0.15604771673679352,
-0.11690708249807358,
-0.015576294623315334,
-0.12876500189304352,
0.10428626090288162,
0.30955538153648376,
0.06014933064579964,
-0.05108758807182312,
0.014814520254731178,
-0.007637575268745422,
0.04227215796709061,
-0.03137170523405075,
-0.07449839264154434,
-0.026499543339014053,
0.05564019829034805,
-0.055126361548900604,
-0.13982336223125458,
-0.11612941324710846,
-0.028208591043949127,
-0.10546895116567612,
0.06546203047037125,
-0.016312723979353905,
-0.005059082061052322,
-0.060383331030607224,
-0.05043048411607742,
-0.031240468844771385,
-0.022687140852212906,
0.020711472257971764,
-0.04258127883076668,
-0.10675612092018127,
0.0075299241580069065,
0.0008048860472626984,
-0.15108752250671387,
0.0763714462518692,
-0.14837481081485748,
-0.12422502785921097,
0.03054671548306942,
-0.07744356244802475,
0.06600052118301392,
0.016167663037776947,
0.04076124727725983,
-0.16305625438690186,
-0.04844321683049202,
0.031892258673906326,
-0.03925409913063049,
-0.009939984418451786,
-0.022031966596841812,
-0.15363332629203796,
-0.05182928591966629,
-0.12059485167264938,
0.07694876939058304,
-0.03967202454805374,
0.024153156206011772,
-0.038911327719688416,
-0.041148122400045395,
0.11672230809926987,
-0.027115561068058014,
-0.02150479331612587,
0.07589925825595856,
-0.006668712943792343,
0.11600223183631897,
0.04453935846686363,
0.11476917564868927,
-0.052964456379413605,
-0.09923349320888519,
-0.001534926355816424,
0.07476035505533218,
0.041330236941576004,
-0.08778467774391174,
0.04205525666475296,
0.030472520738840103,
-0.03346722200512886,
-0.15804271399974823,
0.01775067113339901,
-0.051277581602334976,
-0.1183592826128006,
0.050479236990213394,
-0.05984073504805565,
-0.1340285986661911,
-0.05063600838184357,
-0.010154667310416698,
-0.05521174892783165,
-0.063643679022789,
-0.0416598916053772,
0.007692319341003895,
-0.07447826862335205,
0.05491720512509346,
-0.16885627806186676,
0.063919298350811,
0.0035340061876922846,
0.00048667355440557003,
-0.09145216643810272,
0.16274015605449677,
-0.026819679886102676,
-0.01281251385807991,
-0.05673080310225487,
-0.0389341376721859,
-0.04994593933224678,
0.04536167159676552,
-0.025387872010469437,
0.12253330647945404,
-0.25263404846191406,
-0.0707850456237793,
0.10747029632329941,
-0.038554221391677856,
-0.017043014988303185,
0.16672952473163605,
0.03424530848860741,
0.025814153254032135,
0.09266132861375809,
0.2117312252521515,
0.09764710813760757,
-0.17178206145763397,
-0.00519773131236434,
0.06736300140619278,
-0.04427186772227287,
-0.023291349411010742,
0.09720494598150253,
-0.05450323596596718,
0.033936288207769394,
-0.03742356598377228,
0.11355365812778473,
0.04518066719174385,
-0.06415842473506927,
-0.05408742278814316,
0.04726004973053932,
-0.08195515722036362,
0.10919932276010513,
-0.07321306318044662,
0.029476914554834366,
-0.019510138779878616,
-0.0099226338788867,
0.037025876343250275,
0.09097228199243546,
-0.0928373709321022,
0.11079714447259903,
-0.17250563204288483,
0.03414379805326462,
0.010793390683829784,
0.032541424036026,
-0.1041114553809166,
0.10712571442127228,
-0.04301672801375389,
0.13594791293144226,
0.21675270795822144,
0.1860349178314209,
-0.05556422844529152,
0.0027184525970369577,
-0.07090552896261215,
0.05565543845295906,
0.04029593989253044,
0.0592588372528553,
-0.01653015986084938,
-0.13245360553264618,
0.09114152938127518,
-0.046308405697345734,
0.14177127182483673,
-0.0831017717719078,
-0.040286868810653687,
0.16425417363643646,
0.06368810683488846,
-0.002325895009562373,
0.04624934867024422,
0.10210810601711273,
0.10787296295166016,
0.010147607885301113,
0.0455026850104332,
0.015836220234632492,
0.021442212164402008,
-0.1747622936964035,
0.2789030075073242,
-0.1440993845462799,
0.009808970615267754,
0.1332527995109558,
-0.13459980487823486,
-0.0018880071584135294,
0.0071000210009515285,
0.011363147757947445,
0.0009752252954058349,
0.0410454086959362,
-0.01295792032033205,
0.2542343735694885,
-0.08684194087982178,
0.12590736150741577,
-0.07991974800825119,
0.08343970775604248,
0.003261430887505412,
-0.06469620764255524,
0.0017080316320061684,
0.11894384026527405,
-0.027458056807518005,
-0.19412128627300262,
0.0419108085334301,
0.11190960556268692,
0.009839330799877644,
0.2514253258705139,
-0.021272242069244385,
0.008881141431629658,
-0.047538403421640396,
0.013761758804321289,
-0.03457760065793991,
0.0509362667798996,
-0.15869225561618805,
-0.036240123212337494,
0.001431487500667572,
0.04430621117353439,
0.09307412058115005,
-0.10334550589323044,
0.011763256043195724,
-0.01174086146056652,
-0.11948247998952866,
-0.09289944916963577,
0.03875049948692322,
0.014585316181182861,
0.05882469192147255,
-0.06600763648748398,
-0.07339952886104584,
0.0219443179666996,
-0.04072754830121994,
-0.14730560779571533,
0.04109542816877365,
-0.22667862474918365,
-0.3064209520816803,
-0.06735929846763611,
0.0026989115867763758,
-0.024003496393561363,
0.04473000392317772,
0.077213354408741,
-0.09765292704105377,
-0.012688721530139446,
-0.04063645005226135,
0.06282752752304077,
-0.06395076215267181,
0.00022642996918875724,
-0.02535664662718773,
0.029567023739218712,
-0.015458820387721062,
-0.07285169512033463,
0.03658897429704666,
-0.026564767584204674,
-0.03843642771244049,
0.008224336430430412,
-0.07893882691860199,
0.06204262748360634,
0.21281377971172333,
0.07463277131319046,
-0.019376059994101524,
-0.10109619796276093,
0.07630573213100433,
-0.10969790816307068,
0.004273290280252695,
0.1354914903640747,
-0.02419467829167843,
-0.014899207279086113,
0.1272972822189331,
-0.005939586088061333,
0.013080835342407227,
0.027464764192700386,
-0.06310297548770905,
-0.04541122540831566,
-0.20550537109375,
-0.14869330823421478,
-0.1486068069934845,
0.04865176975727081,
-0.17784255743026733,
-0.007339687552303076,
0.04637119174003601,
-0.04408452287316322,
-0.0241625364869833,
-0.02076699212193489,
0.07934659719467163,
-0.026916328817605972,
0.19056034088134766,
-0.04812217876315117,
0.08383616805076599,
-0.08196252584457397,
-0.05481331795454025,
0.10405205190181732,
0.0435636043548584,
0.053275443613529205,
0.1278802752494812,
0.153967022895813,
0.04275590926408768,
0.11618859320878983,
0.16233320534229279,
0.005388688296079636,
0.05151296779513359,
0.009411646053195,
-0.030528057366609573,
-0.06907214224338531,
0.04438309744000435,
0.05754528194665909,
0.39518409967422485,
-0.11173324286937714,
0.0008455977658741176,
0.07614313811063766,
0.05218356475234032,
0.0924801304936409,
0.10147079825401306,
-0.003234763164073229,
0.010288350284099579,
0.011188790202140808,
0.00024197107995860279,
0.02250106818974018,
0.11031276732683182,
0.24779894948005676,
-0.05352167785167694,
0.11786814779043198,
0.1116921529173851,
0.0647391602396965,
-0.02167276293039322,
0.02913154661655426,
-0.1315031349658966,
-0.023939087986946106,
-0.0030380948446691036,
0.015040285885334015,
-0.1748044490814209,
0.15844953060150146,
0.05789493769407272,
0.0012935290578752756,
-0.020293164998292923,
-0.019977275282144547,
0.05001522973179817,
0.12549574673175812,
0.12296275794506073,
0.010259483009576797,
-0.05993276089429855,
-0.09944621473550797,
-0.09887509047985077,
-0.006706675514578819,
0.13299888372421265,
0.1316620409488678,
-0.02994507923722267,
0.04927285015583038,
-0.07350364327430725,
0.013711716048419476,
0.016975637525320053,
-0.21523453295230865,
-0.08700720220804214,
0.05929311737418175,
0.2221563458442688,
0.06321360915899277,
0.014711279422044754,
-0.050112996250391006,
-0.16075222194194794,
0.051028281450271606,
-0.09976445138454437,
0.015830542892217636,
-0.0887385755777359,
-0.0897909551858902,
0.12071341276168823,
-0.04066571220755577,
0.0444687120616436,
0.026912152767181396,
0.00042585530900396407,
-0.07156538218259811,
-0.017689529806375504,
0.12317756563425064,
-0.05638331547379494,
-0.06854599714279175,
-0.002809700323268771,
0.2400878518819809,
0.0373135581612587,
0.1221788302063942,
0.05578618496656418,
-0.04563358798623085,
0.050152163952589035,
-0.07888533174991608,
-0.0249504242092371,
-0.014005531556904316,
-0.07450131326913834,
0.12980221211910248,
-0.030085070058703423,
-0.2099212408065796,
-0.07471707463264465,
-0.08955246955156326,
0.21597076952457428,
0.13510815799236298,
-0.049699731171131134,
0.10439105331897736,
0.1948409378528595,
-0.01958688721060753,
-0.24980394542217255,
-0.03338022902607918,
-0.009068475104868412,
0.09316375851631165,
-0.08254518359899521,
-0.11853059381246567,
-0.0014557657996192575,
-0.04148687794804573,
0.0005012066103518009,
-0.0017490559257566929,
-0.16423074901103973,
-0.16304023563861847,
0.1293986588716507,
-0.13220608234405518,
0.08620840311050415,
-0.039761047810316086,
-0.0918920487165451,
-0.051088202744722366,
0.07280367612838745,
0.1342185139656067,
-0.22948330640792847,
0.12020611763000488,
0.13028128445148468,
0.01933748833835125,
0.008501709438860416,
0.061891306191682816,
0.09861302375793457,
0.05645296722650528,
-0.04592824727296829,
0.029354598373174667,
-0.04494939744472504,
0.06799565255641937,
0.08449508994817734,
0.023486696183681488,
0.008997842669487,
0.014397927559912205,
-0.03558805584907532,
-0.026794053614139557,
-0.04553390294313431,
0.04077473282814026,
0.050252072513103485,
-0.014474524185061455,
-0.10274848341941833,
-0.0056248498149216175,
-0.03148967772722244,
0.020940294489264488,
0.11070387810468674,
-0.1821013242006302,
-0.04772304743528366,
0.17407125234603882,
0.23480962216854095,
-0.11108925193548203,
0.14216944575309753,
-0.01907861791551113,
-0.09478326886892319,
0.06816820055246353,
0.00005607058483292349,
0.03858141228556633,
0.06401809304952621,
-0.027006814256310463,
0.14014583826065063,
0.0005538691766560078,
-0.060073450207710266,
0.14747576415538788,
0.036675021052360535,
-0.058427877724170685,
-0.14008957147598267,
-0.09158764779567719,
-0.0030837662052363157,
0.09851344674825668,
0.07435638457536697,
0.23431546986103058,
-0.03010750189423561,
0.023742685094475746,
-0.0519120879471302,
-0.02541169710457325,
-0.12781426310539246,
0.1403523087501526,
0.07218945771455765,
-0.0007626807200722396,
-0.12440915405750275,
0.05434134602546692,
0.06546052545309067,
-0.052089232951402664,
-0.024254875257611275,
0.07691352069377899,
-0.0816076397895813,
-0.08679196983575821,
-0.19216081500053406,
0.046510182321071625,
0.01199845876544714,
-0.13949589431285858,
-0.014375291764736176,
-0.18405823409557343,
0.005972518585622311,
0.2198657989501953,
-0.021645914763212204,
0.04131309315562248,
-0.07050017267465591,
-0.05061953887343407,
0.03871815651655197,
-0.014172296039760113,
0.031548306345939636,
-0.03699784725904465,
-0.09451562911272049,
0.21399977803230286,
-0.010058822110295296,
0.15672479569911957,
-0.06323691457509995,
-0.03831208497285843,
-0.04858347401022911,
0.06305283308029175,
-0.06987987458705902,
0.01828211359679699,
-0.08665553480386734,
-0.03278559073805809,
-0.00020970204786863178,
-0.08185030519962311,
-0.060717932879924774,
0.03288158401846886,
-0.07440176606178284,
0.042716801166534424,
-0.017441315576434135,
0.052248019725084305,
-0.04482237622141838,
-0.00992822740226984,
0.00983322225511074,
0.009141044691205025,
0.11558176577091217,
0.11624991148710251,
-0.10989858210086823,
0.08283325284719467,
-0.18494637310504913,
-0.09217043966054916,
0.15221984684467316,
0.10527478903532028,
0.053611379116773605,
0.05614691600203514,
-0.04152042418718338,
0.1092894971370697,
0.08426439017057419,
-0.03125998005270958,
0.041173454374074936,
0.015305308625102043,
-0.050174932926893234,
-0.2022559940814972,
-0.021467572078108788,
-0.03242192044854164,
-0.012427467852830887,
0.13067710399627686,
0.09726320952177048,
0.08332131057977676,
-0.08058003336191177,
0.05889645591378212,
-0.01124777551740408,
0.017449816688895226,
-0.03139466792345047,
-0.06073640659451485,
-0.006529169622808695,
-0.10022898018360138,
0.0959482192993164,
-0.02349879965186119,
0.06850733608007431,
-0.0768352746963501,
-0.006643611937761307,
-0.011419066227972507,
-0.04912181571125984,
-0.029022131115198135,
0.06245124340057373,
0.12135210633277893,
0.11691858619451523,
-0.02629687450826168,
-0.0568050816655159,
-0.011490748263895512,
0.03208906948566437,
0.11534016579389572,
0.003664966905489564,
0.020858554169535637,
0.09855937957763672,
0.12787121534347534,
0.10852768272161484,
-0.05408474802970886,
0.07128208875656128,
0.05143404006958008,
-0.16520023345947266,
-0.03732535243034363,
-0.09909358620643616,
0.04570185765624046,
0.07919133454561234,
-0.06962712854146957,
0.03325880691409111,
0.03428340703248978,
-0.09185608476400375,
-0.143037348985672,
-0.027970895171165466,
-0.09521479904651642,
-0.09766244888305664,
-0.014801960438489914,
-0.06730373203754425,
0.04308473691344261,
-0.04283137246966362,
0.07591826468706131,
0.006466154009103775,
0.10817968100309372,
-0.09803242981433868,
-0.14613297581672668,
0.11798858642578125,
-0.05180499702692032,
0.09483347833156586,
-0.05009928345680237,
-0.05497756600379944,
0.13487541675567627,
0.0005635732086375356,
0.04387262836098671,
0.007428754586726427,
-0.06263872236013412,
0.058958809822797775,
-0.06651204079389572,
-0.033629029989242554,
-0.01799730211496353,
-0.01922142319381237,
0.074678935110569,
0.08090823143720627,
0.11924876272678375,
-0.06659553945064545,
0.043501630425453186,
0.014382787048816681,
-0.04528800770640373,
-0.1360819786787033,
-0.1102658361196518,
-0.07388786226511002,
-0.025250693783164024,
0.10829945653676987,
-0.04081675410270691,
-0.05659859627485275,
-0.034561026841402054,
0.13701431453227997,
0.29147231578826904,
-0.18905030190944672,
0.05626462772488594,
-0.02289390377700329,
0.0225993599742651,
-0.04197810962796211,
0.0038782719057053328,
0.07830728590488434,
0.15481123328208923,
0.07362901419401169,
-0.07960521429777145,
-0.08547420054674149,
-0.037495750933885574,
-0.04336264729499817,
0.0234407726675272,
-0.040983133018016815,
-0.07094057649374008,
0.05458105355501175,
0.1569930762052536,
-0.15887925028800964,
-0.09548319876194,
-0.12134493887424469,
-0.11771111190319061,
-0.03926297649741173,
0.025517186149954796,
-0.019143298268318176,
0.20331153273582458,
-0.024982284754514694,
-0.06702344119548798,
-0.016142157837748528,
0.04317924752831459,
0.02760016918182373,
-0.09576348960399628,
0.02556615136563778,
0.04510236904025078,
-0.2026999592781067,
-0.07671278715133667,
-0.028682151809334755,
0.11066252738237381,
-0.001630179351195693,
0.1205255389213562,
0.0736178383231163,
0.15734247863292694,
0.038142114877700806,
-0.08123300224542618,
0.041283946484327316,
0.08625015616416931,
-0.03728004917502403,
0.24662558734416962,
0.039667367935180664,
-0.05497584864497185,
0.09291281551122665,
0.018420163542032242,
-0.037018004804849625,
-0.044594667851924896,
0.02015095017850399,
-0.112064428627491,
0.08623987436294556,
-0.0207811426371336,
-0.037815745919942856,
0.026722826063632965,
0.04276784136891365,
-0.04743032902479172,
-0.02031335048377514,
-0.05672699958086014,
-0.07600253820419312,
-0.1713806390762329,
-0.028482913970947266,
-0.12457162141799927,
0.08069323003292084,
-0.05127258971333504,
0.00006645598477916792,
-0.11656273901462555,
0.012103651650249958,
-0.042675189673900604,
0.0619434230029583,
0.01983441226184368,
-0.03336706385016441,
0.02516239508986473,
-0.06061239540576935,
0.05486170947551727,
0.07245399057865143,
-0.0955488458275795,
-0.08813117444515228
] |
null | null |
fairseq
|
# xm_transformer_600m-es_en-multi_domain
[W2V2-Transformer](https://aclanthology.org/2021.acl-long.68/) speech-to-text translation model from fairseq S2T ([paper](https://arxiv.org/abs/2010.05171)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_to_text)):
- Spanish-English
- Trained on mTEDx, CoVoST 2, EuroParl-ST, VoxPopuli, Multilingual LibriSpeech, Common Voice v7 and CCMatrix
- Speech synthesis with [facebook/fastspeech2-en-ljspeech](https://huggingface.co/facebook/fastspeech2-en-ljspeech)
## Usage
```python
from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub
from fairseq.models.speech_to_text.hub_interface import S2THubInterface
from fairseq.models.text_to_speech.hub_interface import TTSHubInterface
import IPython.display as ipd
import torchaudio

models, cfg, task = load_model_ensemble_and_task_from_hf_hub(
    "facebook/xm_transformer_600m-es_en-multi_domain",
    arg_overrides={"config_yaml": "config.yaml"},
)
model = models[0]
generator = task.build_generator([model], cfg)

# requires 16000Hz mono channel audio
audio, _ = torchaudio.load("/path/to/an/audio/file")

sample = S2THubInterface.get_model_input(task, audio)
text = S2THubInterface.get_prediction(task, model, generator, sample)

# speech synthesis
tts_models, tts_cfg, tts_task = load_model_ensemble_and_task_from_hf_hub(
    "facebook/fastspeech2-en-ljspeech",
    arg_overrides={"vocoder": "griffin_lim", "fp16": False},
)
tts_model = tts_models[0]
TTSHubInterface.update_cfg_with_data_cfg(tts_cfg, tts_task.data_cfg)
tts_generator = tts_task.build_generator([tts_model], tts_cfg)

tts_sample = TTSHubInterface.get_model_input(tts_task, text)
wav, sr = TTSHubInterface.get_prediction(
    tts_task, tts_model, tts_generator, tts_sample
)
ipd.Audio(wav, rate=sr)
```
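To keep the synthesized English audio rather than only playing it inline, it can be written to disk with torchaudio. This is a minimal sketch assuming `wav` is the float waveform tensor and `sr` the sample rate returned above; the output filename is a placeholder:
```python
import torchaudio

# torchaudio.save expects a (channels, time) tensor, so add a channel
# dimension if the generator returned a 1-D waveform
out = wav if wav.dim() == 2 else wav.unsqueeze(0)
torchaudio.save("translation_en.wav", out.cpu(), sample_rate=sr)
```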
## Citation
```bibtex
@inproceedings{li-etal-2021-multilingual,
title = "Multilingual Speech Translation from Efficient Finetuning of Pretrained Models",
author = "Li, Xian and
Wang, Changhan and
Tang, Yun and
Tran, Chau and
Tang, Yuqing and
Pino, Juan and
Baevski, Alexei and
Conneau, Alexis and
Auli, Michael",
booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.acl-long.68",
doi = "10.18653/v1/2021.acl-long.68",
pages = "827--838",
}
@inproceedings{wang-etal-2020-fairseq,
title = "Fairseq {S}2{T}: Fast Speech-to-Text Modeling with Fairseq",
author = "Wang, Changhan and
Tang, Yun and
Ma, Xutai and
Wu, Anne and
Okhonko, Dmytro and
Pino, Juan",
booktitle = "Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing: System Demonstrations",
month = dec,
year = "2020",
address = "Suzhou, China",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.aacl-demo.6",
pages = "33--39",
}
@inproceedings{wang-etal-2021-fairseq,
title = "fairseq S{\^{}}2: A Scalable and Integrable Speech Synthesis Toolkit",
author = "Wang, Changhan and
Hsu, Wei-Ning and
Adi, Yossi and
Polyak, Adam and
Lee, Ann and
Chen, Peng-Jen and
Gu, Jiatao and
Pino, Juan",
booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations",
month = nov,
year = "2021",
address = "Online and Punta Cana, Dominican Republic",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.emnlp-demo.17",
doi = "10.18653/v1/2021.emnlp-demo.17",
pages = "143--152",
}
```
|
{"language": "es-en", "library_name": "fairseq", "tags": ["fairseq", "audio", "audio-to-audio", "speech-to-speech-translation"], "datasets": ["mtedx", "covost2", "europarl_st", "voxpopuli"], "task": "audio-to-audio", "widget": [{"example_title": "Common Voice sample 1", "src": "https://huggingface.co/facebook/xm_transformer_600m-es_en-multi_domain/resolve/main/common_voice_es_19966634.flac"}]}
|
audio-to-audio
|
facebook/xm_transformer_600m-es_en-multi_domain
|
[
"fairseq",
"audio",
"audio-to-audio",
"speech-to-speech-translation",
"dataset:mtedx",
"dataset:covost2",
"dataset:europarl_st",
"dataset:voxpopuli",
"arxiv:2010.05171",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.05171"
] |
[
"es-en"
] |
TAGS
#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-mtedx #dataset-covost2 #dataset-europarl_st #dataset-voxpopuli #arxiv-2010.05171 #has_space #region-us
|
# xm_transformer_600m-es_en-multi_domain
W2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):
- Spanish-English
- Trained on mTEDx, CoVoST 2, EuroParl-ST, VoxPopuli, Multilingual LibriSpeech, Common Voice v7 and CCMatrix
- Speech synthesis with facebook/fastspeech2-en-ljspeech
## Usage
|
[
"# xm_transformer_600m-es_en-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- Spanish-English\n- Trained on mTEDx, CoVoST 2, EuroParl-ST, VoxPopuli, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/fastspeech2-en-ljspeech",
"## Usage"
] |
[
"TAGS\n#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-mtedx #dataset-covost2 #dataset-europarl_st #dataset-voxpopuli #arxiv-2010.05171 #has_space #region-us \n",
"# xm_transformer_600m-es_en-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- Spanish-English\n- Trained on mTEDx, CoVoST 2, EuroParl-ST, VoxPopuli, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/fastspeech2-en-ljspeech",
"## Usage"
] |
[
74,
104,
3
] |
[
"passage: TAGS\n#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-mtedx #dataset-covost2 #dataset-europarl_st #dataset-voxpopuli #arxiv-2010.05171 #has_space #region-us \n# xm_transformer_600m-es_en-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- Spanish-English\n- Trained on mTEDx, CoVoST 2, EuroParl-ST, VoxPopuli, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/fastspeech2-en-ljspeech## Usage"
] |
[
-0.16295483708381653,
0.07829774916172028,
-0.004650667309761047,
-0.008533249609172344,
0.018770704045891762,
-0.058536868542432785,
0.10757331550121307,
0.05815373361110687,
-0.027019931003451347,
0.04727327823638916,
-0.01299404539167881,
0.056166842579841614,
0.02929912507534027,
0.11326760053634644,
-0.04145918786525726,
-0.1706715077161789,
0.07505687326192856,
-0.060652729123830795,
-0.01845661737024784,
0.0412820540368557,
0.11447002738714218,
-0.07620644569396973,
0.029118066653609276,
0.04414825513958931,
-0.04515417665243149,
0.037093453109264374,
0.02329561859369278,
-0.12335667759180069,
0.09710830450057983,
0.07076460123062134,
0.00746562983840704,
0.04891392961144447,
0.07365964353084564,
-0.14377446472644806,
0.028527479618787766,
-0.011261732317507267,
-0.006568757351487875,
0.018700527027249336,
0.041597142815589905,
-0.04981929063796997,
0.10703887790441513,
-0.029791900888085365,
-0.020256411284208298,
0.06955400854349136,
-0.09779331088066101,
-0.09632574021816254,
-0.052446987479925156,
-0.07204979658126831,
0.021694965660572052,
0.07245438545942307,
-0.07602231204509735,
0.047214165329933167,
-0.10649608075618744,
0.06396093964576721,
0.04527875781059265,
-0.2184300422668457,
-0.040482137352228165,
-0.06784220039844513,
0.07611114531755447,
0.08701777458190918,
-0.03195924311876297,
0.09633270651102066,
0.008747274056077003,
-0.002852404722943902,
-0.1300806701183319,
-0.13361835479736328,
-0.2157900631427765,
-0.04511590301990509,
-0.13756071031093597,
0.053469203412532806,
0.33792760968208313,
0.04363821819424629,
-0.04781003296375275,
-0.038920193910598755,
0.017563246190547943,
0.1045774519443512,
-0.06141515076160431,
-0.048424895852804184,
-0.005100163631141186,
0.01693052239716053,
-0.003759638639166951,
-0.09068711847066879,
-0.11894334107637405,
0.008272288367152214,
-0.09800940752029419,
0.11744580417871475,
-0.008558453060686588,
0.006228166166692972,
0.003198581049218774,
-0.03004523180425167,
-0.09731870889663696,
-0.03159413859248161,
0.039169520139694214,
-0.03091837838292122,
-0.046101637184619904,
0.002738154726102948,
0.009462217800319195,
-0.18652889132499695,
0.10006913542747498,
-0.11873564124107361,
-0.07549413293600082,
-0.020380955189466476,
-0.11122655123472214,
0.0670647993683815,
0.06770893186330795,
-0.008987871930003166,
-0.11384361237287521,
-0.0941111147403717,
0.016583846881985664,
-0.029289450496435165,
0.030233129858970642,
0.005746493581682444,
-0.14411653578281403,
-0.02454477921128273,
-0.1338333785533905,
0.10004378110170364,
0.030681151896715164,
-0.033259592950344086,
-0.09212566912174225,
-0.02704751305282116,
0.039048932492733,
-0.03562290221452713,
0.01793203316628933,
0.05859976261854172,
-0.023778116330504417,
0.10916581004858017,
0.009158714674413204,
0.0855996310710907,
-0.06963669508695602,
-0.04769337549805641,
-0.021492915228009224,
0.06037751957774162,
0.03113391622900963,
-0.12119410932064056,
0.08523955196142197,
-0.024211063981056213,
-0.011190752498805523,
-0.11727437376976013,
-0.013161679729819298,
-0.08528334647417068,
-0.07478854060173035,
-0.0163936298340559,
-0.02467941679060459,
-0.1865222305059433,
-0.013540449552237988,
0.02779274620115757,
-0.06777762621641159,
0.011615514755249023,
-0.08365748822689056,
0.017085516825318336,
-0.139910027384758,
0.06285037100315094,
-0.12817730009555817,
0.04916714131832123,
-0.02314315363764763,
-0.05969186872243881,
-0.14217430353164673,
0.13836991786956787,
-0.10200754553079605,
-0.0430268794298172,
-0.09725941717624664,
-0.028895597904920578,
-0.04602070897817612,
0.08877067267894745,
-0.011339218355715275,
0.1525110900402069,
-0.2577741742134094,
-0.07993830740451813,
0.12951980531215668,
-0.04300651699304581,
-0.009145112708210945,
0.16321630775928497,
0.011329668574035168,
0.07378509640693665,
0.09837578982114792,
0.2748964726924896,
0.04842511564493179,
-0.2486744076013565,
-0.02964642457664013,
0.04453995078802109,
0.015902571380138397,
-0.029832342639565468,
0.07233926653862,
-0.11536203324794769,
0.12657791376113892,
-0.028966916725039482,
0.08121171593666077,
0.033689625561237335,
-0.036172837018966675,
-0.03165881708264351,
0.06705166399478912,
-0.0509737990796566,
0.11090616136789322,
-0.03690231591463089,
-0.013463719747960567,
-0.06568324565887451,
-0.03560979291796684,
0.08218041807413101,
0.06717728823423386,
-0.07051889598369598,
0.07718232274055481,
-0.11243046075105667,
0.07598729431629181,
-0.06893216818571091,
0.01581256464123726,
-0.10407806932926178,
0.02635473571717739,
-0.03755616396665573,
0.13279104232788086,
0.20352381467819214,
0.19254086911678314,
-0.021383201703429222,
-0.016518892720341682,
-0.0558529756963253,
0.06801638007164001,
0.024651873856782913,
0.03366793692111969,
-0.0379677414894104,
-0.20923815667629242,
0.11645644903182983,
-0.08956614136695862,
0.10002174973487854,
-0.03149239718914032,
-0.05315877124667168,
0.1885068118572235,
0.08938658982515335,
0.03224153071641922,
0.003132415935397148,
0.062013912945985794,
0.10249358415603638,
0.040435466915369034,
0.04141440615057945,
0.0342935174703598,
-0.02126518078148365,
-0.08064403384923935,
0.2291642129421234,
-0.12186004966497421,
0.04149213060736656,
0.13028141856193542,
-0.09819526970386505,
0.0006319666863419116,
0.005619762931019068,
-0.002419511554762721,
0.03577972576022148,
-0.015339416451752186,
-0.06297475844621658,
0.22315101325511932,
-0.028611423447728157,
0.09492571651935577,
-0.11865276098251343,
0.006971235387027264,
0.01703053154051304,
-0.05092662200331688,
-0.04283737391233444,
0.1439668983221054,
0.0013018198078498244,
-0.09352850914001465,
0.06590498983860016,
0.14984390139579773,
0.023420920595526695,
0.2863064408302307,
-0.03291454166173935,
-0.018547600135207176,
-0.014024578966200352,
-0.02631419152021408,
-0.03187687695026398,
0.09248381853103638,
-0.18070252239704132,
0.0004902672953903675,
0.02460549958050251,
0.07383447885513306,
0.07729308307170868,
-0.09207494556903839,
-0.005335627123713493,
-0.03871360793709755,
-0.1034359261393547,
-0.07013849914073944,
0.05342169851064682,
0.021291323006153107,
0.09192357957363129,
-0.0592879019677639,
-0.11030327528715134,
-0.00463957991451025,
-0.050187837332487106,
-0.11087916791439056,
0.06924883276224136,
-0.25281134247779846,
-0.24406760931015015,
-0.07932575047016144,
0.013900774531066418,
-0.05663934722542763,
0.031746771186590195,
0.08081956952810287,
-0.0789143517613411,
-0.05061051994562149,
-0.029660770669579506,
0.10395477712154388,
-0.023981722071766853,
-0.05472283065319061,
-0.04059760645031929,
0.017691005021333694,
-0.029084306210279465,
-0.11887090653181076,
0.009870102629065514,
-0.04730527102947235,
0.01703798584640026,
-0.04044421389698982,
-0.030173975974321365,
0.08771863579750061,
0.15115313231945038,
0.07846583425998688,
-0.026233013719320297,
-0.07433277368545532,
0.13780264556407928,
-0.15815728902816772,
0.005608076695352793,
0.13670602440834045,
0.015103152953088284,
-0.03238573297858238,
0.14062531292438507,
0.04029848426580429,
-0.02300838753581047,
-0.025432702153921127,
-0.02814376913011074,
-0.014895299449563026,
-0.24987362325191498,
-0.2046937644481659,
-0.12490212172269821,
0.02010076493024826,
-0.10525136440992355,
-0.013690785504877567,
-0.03621172532439232,
-0.028223242610692978,
-0.020100655034184456,
-0.05722346901893616,
0.09348016232252121,
-0.014673680067062378,
0.271192342042923,
-0.07710341364145279,
0.08512984961271286,
-0.06714963912963867,
-0.07636824250221252,
0.14228419959545135,
0.03374866768717766,
0.05552727356553078,
0.10145925730466843,
0.14775289595127106,
0.05607796087861061,
0.015229861252009869,
0.04535803198814392,
0.03521398454904556,
0.06774184852838516,
0.008249342441558838,
-0.04117272049188614,
-0.08172652870416641,
0.04371770843863487,
0.01612447202205658,
0.3269907832145691,
-0.10938306897878647,
-0.029391886666417122,
-0.0020478665828704834,
0.08815658837556839,
0.06513819098472595,
0.1089838519692421,
-0.03569692000746727,
0.041406191885471344,
0.021541494876146317,
-0.04447445645928383,
-0.017546983435750008,
0.07730165868997574,
0.24727977812290192,
-0.07066696882247925,
0.1218450590968132,
0.1144246831536293,
0.04213731363415718,
-0.047271549701690674,
0.028821811079978943,
-0.15317906439304352,
-0.011731781996786594,
-0.003980903420597315,
0.043601810932159424,
-0.1214819848537445,
0.21518342196941376,
0.08224524557590485,
0.033863365650177,
-0.050644535571336746,
-0.0005959051777608693,
0.025748714804649353,
0.06799889355897903,
0.17590169608592987,
0.006055403966456652,
-0.04740428552031517,
-0.005255487747490406,
-0.10081958025693893,
-0.002444009529426694,
0.10156989097595215,
0.07303173094987869,
-0.030805397778749466,
0.053592219948768616,
-0.05848727002739906,
-0.0029622954316437244,
-0.06500721722841263,
-0.18540720641613007,
-0.1344740241765976,
0.04651493579149246,
0.1989094465970993,
0.03537945821881294,
0.005372083745896816,
-0.09694510698318481,
-0.1626998484134674,
0.05769268795847893,
-0.053414251655340195,
-0.009186148643493652,
-0.12545092403888702,
-0.06566839665174484,
0.14221708476543427,
-0.013377971947193146,
0.025184789672493935,
0.03918490931391716,
0.031693555414676666,
-0.1128452941775322,
0.005503327120095491,
0.1256418228149414,
-0.11028497666120529,
-0.05253878980875015,
-0.05027084797620773,
0.25601184368133545,
0.057655688375234604,
0.09677412360906601,
0.06439777463674545,
0.008226349018514156,
0.01946382224559784,
-0.07088729739189148,
0.02368491142988205,
-0.011123121716082096,
-0.03261217847466469,
0.12951159477233887,
-0.049353938549757004,
-0.22247281670570374,
-0.024509817361831665,
-0.06179110333323479,
0.22325237095355988,
0.16855210065841675,
-0.031169429421424866,
0.13362474739551544,
0.19448192417621613,
-0.07026766985654831,
-0.28096652030944824,
0.018635664135217667,
0.03551798313856125,
0.05529234558343887,
-0.05623719096183777,
-0.14098037779331207,
0.029691696166992188,
0.037068892270326614,
-0.01603274792432785,
0.035651158541440964,
-0.2218981236219406,
-0.13869363069534302,
0.11212890595197678,
-0.09538203477859497,
0.12668976187705994,
0.010966015048325062,
-0.050579287111759186,
-0.04594646394252777,
0.05567111819982529,
0.13505586981773376,
-0.22350212931632996,
0.09157349914312363,
0.10824046283960342,
0.07932090759277344,
0.028346164152026176,
0.050925418734550476,
0.13285088539123535,
0.07621737569570541,
-0.04919949546456337,
0.04436003789305687,
0.018392642959952354,
0.046675119549036026,
0.06721844524145126,
0.030062105506658554,
0.012190730310976505,
-0.0563860721886158,
-0.08295916020870209,
-0.0049593704752624035,
-0.07567138224840164,
0.09289733320474625,
0.016208836808800697,
-0.01094544492661953,
-0.060118168592453,
-0.01620868220925331,
-0.028723178431391716,
0.01639433018863201,
0.1318347156047821,
-0.1675858199596405,
0.011898119002580643,
0.15590110421180725,
0.2688066065311432,
-0.11314089596271515,
0.03419556841254234,
0.01386203058063984,
-0.08115842938423157,
0.022705400362610817,
0.004651980008929968,
0.0208648182451725,
0.10080010443925858,
-0.031202038750052452,
0.09698738902807236,
-0.018987329676747322,
-0.09717357903718948,
0.09240412712097168,
0.06558165699243546,
-0.06731197237968445,
-0.12079596519470215,
-0.041707661002874374,
0.0038292112294584513,
0.032967664301395416,
0.02018723264336586,
0.28418445587158203,
0.011435229331254959,
-0.00256301648914814,
-0.05088558420538902,
-0.014981593005359173,
-0.08700758218765259,
0.19207166135311127,
0.07859539240598679,
0.000565975671634078,
-0.13104234635829926,
0.08065379410982132,
0.031193483620882034,
-0.10849540680646896,
0.009328585118055344,
0.0628698468208313,
-0.05889248102903366,
-0.0904724970459938,
-0.10589365661144257,
0.08179106563329697,
-0.03847667574882507,
-0.1071588546037674,
-0.044429708272218704,
-0.16166731715202332,
0.017042430117726326,
0.20900504291057587,
-0.020970700308680534,
-0.0031905563082545996,
-0.0718425065279007,
-0.05581264942884445,
0.05370160937309265,
0.0479184053838253,
0.038183990865945816,
-0.047394346445798874,
-0.026328187435865402,
0.14377838373184204,
-0.05612380430102348,
0.07010111957788467,
-0.04146001115441322,
-0.006541860289871693,
-0.05551571026444435,
0.04378092288970947,
-0.03512749820947647,
0.002296601887792349,
-0.05616633966565132,
-0.021453427150845528,
0.004771960433572531,
-0.05777791142463684,
-0.039065949618816376,
-0.006164954975247383,
-0.11923582851886749,
0.06366399675607681,
0.008359686471521854,
0.08059877157211304,
-0.07622934132814407,
0.0045601557940244675,
-0.00012358702952042222,
0.011435518972575665,
0.10188888013362885,
0.06633853912353516,
-0.05125212296843529,
0.10218311846256256,
-0.19601452350616455,
-0.06786220520734787,
0.13599266111850739,
0.09389428049325943,
0.0764787495136261,
0.03635212779045105,
-0.023981407284736633,
0.10069601982831955,
0.08647076785564423,
0.0049109430983662605,
-0.011747905053198338,
-0.03187453746795654,
0.07373546808958054,
-0.08601424843072891,
-0.0681830570101738,
-0.027147192507982254,
0.046705372631549835,
0.12823693454265594,
0.04156539589166641,
0.1061151772737503,
-0.08306843042373657,
0.0332249253988266,
-0.0033576448913663626,
0.05573411285877228,
-0.014148572459816933,
-0.09352447092533112,
-0.016695205122232437,
-0.05770792067050934,
0.08874659240245819,
-0.02336682565510273,
0.08677644282579422,
0.042289718985557556,
0.017874281853437424,
0.02107963338494301,
-0.02385718934237957,
-0.07290332764387131,
0.04920317977666855,
-0.008531700819730759,
0.06976207345724106,
-0.009901751764118671,
-0.10454874485731125,
-0.01957586221396923,
0.08594807982444763,
0.09702951461076736,
0.027374669909477234,
0.04307292774319649,
0.17907781898975372,
0.11294906586408615,
0.10087668150663376,
-0.0686182826757431,
0.02636256255209446,
0.049350496381521225,
-0.06701790541410446,
0.02670018933713436,
-0.0963975042104721,
0.06771979480981827,
-0.005867911968380213,
-0.1053127869963646,
0.012224128469824791,
0.024641718715429306,
-0.05003855749964714,
-0.17139597237110138,
-0.003964426927268505,
-0.06400805711746216,
-0.07797104120254517,
-0.021582622081041336,
-0.11778897792100906,
0.08025926351547241,
-0.043968621641397476,
0.0946488231420517,
-0.0016147588612511754,
0.06929881870746613,
-0.16780473291873932,
-0.1389879435300827,
0.13574333488941193,
-0.0289804395288229,
0.12760652601718903,
-0.09333953261375427,
-0.03150486573576927,
0.06347881257534027,
0.04128744453191757,
0.04535277187824249,
0.03360865265130997,
-0.05580136179924011,
0.007032215129584074,
-0.10356844961643219,
-0.03433554619550705,
-0.020004767924547195,
0.011555210687220097,
0.059605952352285385,
0.14741010963916779,
0.10263333469629288,
-0.04996861517429352,
0.04484648257493973,
0.07344049215316772,
-0.039083562791347504,
-0.10252203047275543,
-0.09704169631004333,
-0.09178562462329865,
-0.02884209342300892,
0.14420826733112335,
-0.05463092401623726,
-0.07033250480890274,
-0.043071504682302475,
0.13214902579784393,
0.3028101325035095,
-0.14010992646217346,
0.03837836533784866,
-0.0009922452736645937,
0.013144578784704208,
-0.0015842011198401451,
-0.022265013307332993,
0.03401662036776543,
0.2002333104610443,
0.018136346712708473,
0.0025580187793821096,
-0.04769059270620346,
0.002767621772363782,
-0.07243460416793823,
0.016194457188248634,
-0.06043794006109238,
-0.07634085416793823,
0.05233090743422508,
0.17170365154743195,
-0.1083475649356842,
-0.1205982118844986,
-0.12706008553504944,
-0.15207648277282715,
-0.09526805579662323,
-0.03393620252609253,
0.005829977802932262,
0.17522023618221283,
0.02045734040439129,
-0.04626047983765602,
-0.07813993096351624,
0.06815822422504425,
-0.00010085738904308528,
-0.0983361303806305,
-0.07554084807634354,
0.04181511700153351,
-0.21115124225616455,
0.013666016049683094,
-0.03946390375494957,
0.1076161190867424,
0.0338328517973423,
0.0648951455950737,
0.034822963178157806,
0.07592447102069855,
0.056926533579826355,
-0.0746041014790535,
0.05972389504313469,
0.055339742451906204,
-0.041746508330106735,
0.24226504564285278,
0.06397871673107147,
-0.07374154031276703,
0.08385748416185379,
-0.0034244246780872345,
-0.0809912234544754,
-0.011309544555842876,
0.0792652890086174,
-0.1288171261548996,
0.07544060051441193,
0.01853608526289463,
-0.010503677651286125,
-0.00835702195763588,
0.0016220599645748734,
-0.005030088592320681,
0.02506297640502453,
0.005483277607709169,
-0.04848705977201462,
-0.1856595277786255,
-0.025056492537260056,
-0.05095686390995979,
0.03517621383070946,
-0.12478934973478317,
0.013735240325331688,
-0.12534698843955994,
0.03258664906024933,
-0.04260607063770294,
0.053959816694259644,
0.05099565163254738,
-0.04347280412912369,
0.02236686646938324,
-0.053242236375808716,
0.06575163453817368,
0.0733945220708847,
-0.023425932973623276,
-0.01607334055006504
] |
null | null |
fairseq
|
# xm_transformer_600m-fr_en-multi_domain
[W2V2-Transformer](https://aclanthology.org/2021.acl-long.68/) speech-to-text translation model from fairseq S2T ([paper](https://arxiv.org/abs/2010.05171)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_to_text)):
- French-English
- Trained on mTEDx, CoVoST 2, EuroParl-ST, VoxPopuli, Multilingual LibriSpeech, Common Voice v7 and CCMatrix
- Speech synthesis with [facebook/fastspeech2-en-ljspeech](https://huggingface.co/facebook/fastspeech2-en-ljspeech)
## Usage
```python
from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub
from fairseq.models.speech_to_text.hub_interface import S2THubInterface
from fairseq.models.text_to_speech.hub_interface import TTSHubInterface
import IPython.display as ipd
import torchaudio

models, cfg, task = load_model_ensemble_and_task_from_hf_hub(
    "facebook/xm_transformer_600m-fr_en-multi_domain",
    arg_overrides={"config_yaml": "config.yaml"},
)
model = models[0]
generator = task.build_generator([model], cfg)

# requires 16000Hz mono channel audio
audio, _ = torchaudio.load("/path/to/an/audio/file")

sample = S2THubInterface.get_model_input(task, audio)
text = S2THubInterface.get_prediction(task, model, generator, sample)

# speech synthesis
tts_models, tts_cfg, tts_task = load_model_ensemble_and_task_from_hf_hub(
    "facebook/fastspeech2-en-ljspeech",
    arg_overrides={"vocoder": "griffin_lim", "fp16": False},
)
tts_model = tts_models[0]
TTSHubInterface.update_cfg_with_data_cfg(tts_cfg, tts_task.data_cfg)
tts_generator = tts_task.build_generator([tts_model], tts_cfg)

tts_sample = TTSHubInterface.get_model_input(tts_task, text)
wav, sr = TTSHubInterface.get_prediction(
    tts_task, tts_model, tts_generator, tts_sample
)
ipd.Audio(wav, rate=sr)
```
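For repeated use, the two stages above can be wrapped in a small helper that reuses the objects already created in the snippet. This is only a convenience sketch, not part of the fairseq API, and the input path is a placeholder:
```python
def translate_speech(audio_path):
    """French speech -> English text -> English speech, reusing the
    task/model/generator and tts_* objects built above."""
    audio, _ = torchaudio.load(audio_path)  # expected to be 16kHz mono
    s2t_sample = S2THubInterface.get_model_input(task, audio)
    english_text = S2THubInterface.get_prediction(task, model, generator, s2t_sample)

    tts_input = TTSHubInterface.get_model_input(tts_task, english_text)
    wav, sr = TTSHubInterface.get_prediction(
        tts_task, tts_model, tts_generator, tts_input
    )
    return english_text, wav, sr

text_en, wav, sr = translate_speech("/path/to/a/french/clip.wav")
ipd.Audio(wav, rate=sr)
```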
## Citation
```bibtex
@inproceedings{li-etal-2021-multilingual,
title = "Multilingual Speech Translation from Efficient Finetuning of Pretrained Models",
author = "Li, Xian and
Wang, Changhan and
Tang, Yun and
Tran, Chau and
Tang, Yuqing and
Pino, Juan and
Baevski, Alexei and
Conneau, Alexis and
Auli, Michael",
booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.acl-long.68",
doi = "10.18653/v1/2021.acl-long.68",
pages = "827--838",
}
@inproceedings{wang-etal-2020-fairseq,
title = "Fairseq {S}2{T}: Fast Speech-to-Text Modeling with Fairseq",
author = "Wang, Changhan and
Tang, Yun and
Ma, Xutai and
Wu, Anne and
Okhonko, Dmytro and
Pino, Juan",
booktitle = "Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing: System Demonstrations",
month = dec,
year = "2020",
address = "Suzhou, China",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.aacl-demo.6",
pages = "33--39",
}
@inproceedings{wang-etal-2021-fairseq,
title = "fairseq S{\^{}}2: A Scalable and Integrable Speech Synthesis Toolkit",
author = "Wang, Changhan and
Hsu, Wei-Ning and
Adi, Yossi and
Polyak, Adam and
Lee, Ann and
Chen, Peng-Jen and
Gu, Jiatao and
Pino, Juan",
booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations",
month = nov,
year = "2021",
address = "Online and Punta Cana, Dominican Republic",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.emnlp-demo.17",
doi = "10.18653/v1/2021.emnlp-demo.17",
pages = "143--152",
}
```
|
{"language": "fr-en", "library_name": "fairseq", "tags": ["fairseq", "audio", "audio-to-audio", "speech-to-speech-translation"], "datasets": ["mtedx", "covost2", "europarl_st", "voxpopuli"], "task": "audio-to-audio", "widget": [{"example_title": "Common Voice sample 1", "src": "https://huggingface.co/facebook/xm_transformer_600m-fr_en-multi_domain/resolve/main/common_voice_fr_19731305.mp3"}]}
|
audio-to-audio
|
facebook/xm_transformer_600m-fr_en-multi_domain
|
[
"fairseq",
"audio",
"audio-to-audio",
"speech-to-speech-translation",
"dataset:mtedx",
"dataset:covost2",
"dataset:europarl_st",
"dataset:voxpopuli",
"arxiv:2010.05171",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.05171"
] |
[
"fr-en"
] |
TAGS
#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-mtedx #dataset-covost2 #dataset-europarl_st #dataset-voxpopuli #arxiv-2010.05171 #has_space #region-us
|
# xm_transformer_600m-fr_en-multi_domain
W2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):
- French-English
- Trained on mTEDx, CoVoST 2, EuroParl-ST, VoxPopuli, Multilingual LibriSpeech, Common Voice v7 and CCMatrix
- Speech synthesis with facebook/fastspeech2-en-ljspeech
## Usage
|
[
"# xm_transformer_600m-fr_en-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- French-English\n- Trained on mTEDx, CoVoST 2, EuroParl-ST, VoxPopuli, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/fastspeech2-en-ljspeech",
"## Usage"
] |
[
"TAGS\n#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-mtedx #dataset-covost2 #dataset-europarl_st #dataset-voxpopuli #arxiv-2010.05171 #has_space #region-us \n",
"# xm_transformer_600m-fr_en-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- French-English\n- Trained on mTEDx, CoVoST 2, EuroParl-ST, VoxPopuli, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/fastspeech2-en-ljspeech",
"## Usage"
] |
[
74,
104,
3
] |
[
"passage: TAGS\n#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-mtedx #dataset-covost2 #dataset-europarl_st #dataset-voxpopuli #arxiv-2010.05171 #has_space #region-us \n# xm_transformer_600m-fr_en-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- French-English\n- Trained on mTEDx, CoVoST 2, EuroParl-ST, VoxPopuli, Multilingual LibriSpeech, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/fastspeech2-en-ljspeech## Usage"
] |
[
-0.15418685972690582,
0.0839628279209137,
-0.004394817166030407,
-0.008799498900771141,
0.006397781893610954,
-0.06472437083721161,
0.11978844553232193,
0.05817558988928795,
-0.027079686522483826,
0.04079011082649231,
-0.003133694641292095,
0.031899452209472656,
0.02287191152572632,
0.12896966934204102,
-0.02524019032716751,
-0.16618414223194122,
0.07739294320344925,
-0.056533265858888626,
-0.022052861750125885,
0.04373498633503914,
0.12129680812358856,
-0.07184100151062012,
0.033877696841955185,
0.05764397233724594,
-0.043252840638160706,
0.03566504269838333,
0.01815413124859333,
-0.10546411573886871,
0.09888261556625366,
0.06918221712112427,
-0.0011353855952620506,
0.044446252286434174,
0.07965637743473053,
-0.15823189914226532,
0.02891518734395504,
-0.008533851243555546,
-0.011067292653024197,
0.01399290468543768,
0.0392272025346756,
-0.05047786980867386,
0.10165920108556747,
-0.02587812766432762,
-0.023858429864048958,
0.07085488736629486,
-0.07371250540018082,
-0.09355258196592331,
-0.050021473318338394,
-0.06060617044568062,
0.00868894625455141,
0.08112479001283646,
-0.07364002615213394,
0.04655841737985611,
-0.11594551056623459,
0.07351378351449966,
0.04607105255126953,
-0.22530557215213776,
-0.03651927411556244,
-0.06784605979919434,
0.0743703618645668,
0.07641824334859848,
-0.050832830369472504,
0.09470508247613907,
-0.0016671960474923253,
-0.00004062889638589695,
-0.12005974352359772,
-0.1278674155473709,
-0.2023608684539795,
-0.045405223965644836,
-0.1423795372247696,
0.0633435845375061,
0.344633013010025,
0.05010272562503815,
-0.05636361241340637,
-0.039720047265291214,
0.022849811241030693,
0.09402384608983994,
-0.07388163357973099,
-0.048779841512441635,
0.0027660054620355368,
0.015877434983849525,
-0.011370976455509663,
-0.09561454504728317,
-0.11232021450996399,
0.001305907848291099,
-0.09533453732728958,
0.10340932756662369,
-0.008319815620779991,
0.012543252669274807,
0.00100589613430202,
-0.031774938106536865,
-0.12521040439605713,
-0.015584822744131088,
0.027758194133639336,
-0.039814293384552,
-0.04842143505811691,
-0.0009671385632827878,
0.00913110189139843,
-0.16434019804000854,
0.09842006862163544,
-0.09455884248018265,
-0.07256893813610077,
-0.01366942934691906,
-0.11979832500219345,
0.06394728273153305,
0.06313987076282501,
-0.0063768960535526276,
-0.11799365282058716,
-0.10640984028577805,
0.015428954735398293,
-0.01880851946771145,
0.018772607669234276,
0.016171086579561234,
-0.12908759713172913,
-0.022908341139554977,
-0.13445129990577698,
0.08345235884189606,
0.01956949383020401,
-0.03556661307811737,
-0.08580026775598526,
-0.021685156971216202,
0.047571826726198196,
-0.03553689643740654,
0.01885986514389515,
0.05845239758491516,
-0.028302984312176704,
0.11302582919597626,
0.01724732108414173,
0.09422319382429123,
-0.07490599155426025,
-0.04266754165291786,
-0.019473988562822342,
0.06701980531215668,
0.036217059940099716,
-0.11923251301050186,
0.07560313493013382,
-0.0025330018252134323,
-0.019929224625229836,
-0.10982782393693924,
-0.013300816528499126,
-0.0758972242474556,
-0.06363381445407867,
-0.0066185458563268185,
-0.028191687539219856,
-0.18620193004608154,
-0.028113452717661858,
0.030753331258893013,
-0.05533453822135925,
0.006617488339543343,
-0.08374472707509995,
0.0007934800814837217,
-0.16357232630252838,
0.06352830678224564,
-0.13242483139038086,
0.05545594170689583,
-0.016392605379223824,
-0.062436673790216446,
-0.1393766850233078,
0.13222797214984894,
-0.10720857232809067,
-0.05917756259441376,
-0.0981421247124672,
-0.04224257916212082,
-0.049927208572626114,
0.07699626684188843,
-0.0047154854983091354,
0.14720258116722107,
-0.2618728280067444,
-0.08252902328968048,
0.13484381139278412,
-0.048545099794864655,
-0.008289158344268799,
0.19075362384319305,
0.011298571713268757,
0.052893951535224915,
0.08839081227779388,
0.2691422998905182,
0.06462176144123077,
-0.25316867232322693,
-0.03625749424099922,
0.06034747511148453,
0.014972484670579433,
-0.022032758221030235,
0.07678371667861938,
-0.10722114145755768,
0.11620447784662247,
-0.03659103438258171,
0.0953282043337822,
0.033623889088630676,
-0.04374222084879875,
-0.03229134902358055,
0.05815303325653076,
-0.05482522025704384,
0.12506140768527985,
-0.03633503243327141,
-0.01930948905646801,
-0.0695500299334526,
-0.04398383945226669,
0.06868968158960342,
0.07835385948419571,
-0.07794298976659775,
0.0779462456703186,
-0.10177085548639297,
0.06686923652887344,
-0.04484257102012634,
0.010260126553475857,
-0.09786930680274963,
0.01956211030483246,
-0.03148249536752701,
0.11271228641271591,
0.20075415074825287,
0.19794408977031708,
-0.011914188042283058,
-0.005393380764871836,
-0.061246998608112335,
0.0674998089671135,
0.016985271126031876,
0.028579512611031532,
-0.035203102976083755,
-0.21065722405910492,
0.12135709822177887,
-0.09019142389297485,
0.1169329285621643,
-0.033167678862810135,
-0.053891588002443314,
0.18417814373970032,
0.09697958081960678,
0.03243039920926094,
0.001950203673914075,
0.055704161524772644,
0.09300022572278976,
0.03933718055486679,
0.05102868750691414,
0.02872699871659279,
-0.029226642102003098,
-0.07040721923112869,
0.23279507458209991,
-0.1017676517367363,
0.04828689247369766,
0.13531626760959625,
-0.09644143283367157,
-0.0005361377843655646,
0.011932811699807644,
-0.011322182603180408,
0.03280971571803093,
-0.00871855579316616,
-0.057456932961940765,
0.21197238564491272,
-0.024782419204711914,
0.0949479416012764,
-0.10699149966239929,
-0.002496499801054597,
0.01920515112578869,
-0.041087787598371506,
-0.039614804089069366,
0.14193688333034515,
0.001202202751301229,
-0.08096781373023987,
0.05931901931762695,
0.16111700236797333,
0.017522580921649933,
0.2780076265335083,
-0.03581259027123451,
-0.017393507063388824,
-0.015181390568614006,
-0.02193419449031353,
-0.04252581298351288,
0.09172746539115906,
-0.18094199895858765,
-0.007010720670223236,
0.026637239381670952,
0.06787753105163574,
0.0688641145825386,
-0.09044210612773895,
0.004253394436091185,
-0.04575395956635475,
-0.11163395643234253,
-0.06523032486438751,
0.05696636810898781,
0.020802125334739685,
0.09600622206926346,
-0.06166942045092583,
-0.12833617627620697,
-0.007985393516719341,
-0.05445313826203346,
-0.10869671404361725,
0.07108418643474579,
-0.2567192018032074,
-0.2336946278810501,
-0.06280447542667389,
0.01339420024305582,
-0.077315554022789,
0.020596694201231003,
0.07339078187942505,
-0.07406125962734222,
-0.057097457349300385,
-0.04941530525684357,
0.07971861213445663,
-0.03718610107898712,
-0.041518762707710266,
-0.04613631218671799,
0.011011778376996517,
-0.025932030752301216,
-0.12710930407047272,
0.006996144540607929,
-0.052625708281993866,
0.016805721446871758,
-0.04292163997888565,
-0.020326117053627968,
0.07431590557098389,
0.13890163600444794,
0.06171141564846039,
-0.024823101237416267,
-0.07137171924114227,
0.15113013982772827,
-0.1512606143951416,
0.0072032855823636055,
0.12052325904369354,
0.02276342734694481,
-0.023910265415906906,
0.14245113730430603,
0.042640138417482376,
-0.01333186961710453,
-0.01674988679587841,
-0.01961677335202694,
-0.008940359577536583,
-0.23563317954540253,
-0.2098556011915207,
-0.1253538429737091,
0.03503868728876114,
-0.10665272921323776,
-0.010546659119427204,
-0.03328123316168785,
-0.04343912750482559,
-0.022548088803887367,
-0.056097328662872314,
0.10227391868829727,
-0.01616509258747101,
0.2707817852497101,
-0.07380605489015579,
0.09386329352855682,
-0.06985843181610107,
-0.0768301859498024,
0.14154832065105438,
0.01864844560623169,
0.05293136462569237,
0.09705892950296402,
0.14074359834194183,
0.054880205541849136,
0.01820085383951664,
0.05045260116457939,
0.03993001952767372,
0.06974250823259354,
0.009600351564586163,
-0.03396251052618027,
-0.08066630363464355,
0.057630643248558044,
0.023433391004800797,
0.3338819444179535,
-0.11535030603408813,
-0.025433208793401718,
0.0059425407089293,
0.09486064314842224,
0.06621900945901871,
0.1167336255311966,
-0.031083814799785614,
0.04337257519364357,
0.00963984802365303,
-0.05293986201286316,
-0.016855815425515175,
0.08944887667894363,
0.24965840578079224,
-0.07399697601795197,
0.11372547596693039,
0.11491453647613525,
0.03791939094662666,
-0.035473234951496124,
0.026600021868944168,
-0.1353529840707779,
-0.007910986430943012,
-0.00788365863263607,
0.03364775329828262,
-0.11838119477033615,
0.20753180980682373,
0.07774268090724945,
0.027065202593803406,
-0.047310736030340195,
0.0016032835701480508,
0.038625121116638184,
0.08609161525964737,
0.18508554995059967,
0.011369072832167149,
-0.047292813658714294,
0.005600855685770512,
-0.09697765111923218,
-0.006068112328648567,
0.11326394230127335,
0.07909032702445984,
-0.03908340260386467,
0.05082770437002182,
-0.07377662509679794,
-0.0007422530907206237,
-0.0647260844707489,
-0.20293694734573364,
-0.1297810971736908,
0.05091938003897667,
0.21157652139663696,
0.03122941218316555,
0.004725171718746424,
-0.10171730071306229,
-0.1694340705871582,
0.06807728111743927,
-0.04594241455197334,
0.0030103581957519054,
-0.1199687197804451,
-0.07768233865499496,
0.14347676932811737,
-0.01221559103578329,
0.039387963712215424,
0.03337710723280907,
0.04010644555091858,
-0.11485104262828827,
0.0020245136693120003,
0.11296524852514267,
-0.11501900851726532,
-0.04445602744817734,
-0.04607945680618286,
0.26159486174583435,
0.06212152913212776,
0.0972844734787941,
0.06041141599416733,
0.011103224940598011,
0.009606058709323406,
-0.08120019733905792,
0.024939214810729027,
-0.020053565502166748,
-0.037173956632614136,
0.13255371153354645,
-0.048889853060245514,
-0.22773891687393188,
-0.022645486518740654,
-0.04838449880480766,
0.23430293798446655,
0.17233408987522125,
-0.03715512529015541,
0.1353977471590042,
0.1893138289451599,
-0.0584925077855587,
-0.2975143492221832,
0.022493496537208557,
0.03771643340587616,
0.04480905458331108,
-0.06714197993278503,
-0.136694073677063,
0.046108826994895935,
0.021328726783394814,
-0.016000157222151756,
0.040521733462810516,
-0.20405277609825134,
-0.1449967920780182,
0.11631935834884644,
-0.10207580029964447,
0.12275463342666626,
0.019685497507452965,
-0.030540291219949722,
-0.04246770218014717,
0.04902533441781998,
0.1426762342453003,
-0.23288589715957642,
0.08751771599054337,
0.11000595986843109,
0.07002807408571243,
0.02029310166835785,
0.04983682930469513,
0.1209757998585701,
0.09597181528806686,
-0.042814090847969055,
0.0528431236743927,
0.027619464322924614,
0.045354120433330536,
0.07297240197658539,
0.04085210710763931,
0.010528136044740677,
-0.05228935927152634,
-0.08009520918130875,
0.002185638528317213,
-0.07477131485939026,
0.09939850121736526,
0.015243221074342728,
-0.0125739686191082,
-0.06864631175994873,
-0.008574147708714008,
-0.02086891047656536,
0.024242399260401726,
0.10593929886817932,
-0.15700563788414001,
-0.00023321359185501933,
0.1631060689687729,
0.262322336435318,
-0.11011172086000443,
0.028834711760282516,
0.02958543784916401,
-0.08427134156227112,
0.01634823903441429,
0.00018841320706997067,
0.031214889138936996,
0.10105524957180023,
-0.031207110732793808,
0.10659115016460419,
-0.014906475320458412,
-0.0886930525302887,
0.07463766634464264,
0.052344854921102524,
-0.08539512753486633,
-0.12563160061836243,
-0.046231914311647415,
-0.005271242931485176,
0.030961215496063232,
0.02161855250597,
0.2801247835159302,
-0.0021901261061429977,
0.0051320684142410755,
-0.05477812513709068,
-0.013508643954992294,
-0.09395508468151093,
0.17991921305656433,
0.0727076381444931,
0.001995073165744543,
-0.1297653168439865,
0.07162000983953476,
0.0198748167604208,
-0.10391132533550262,
0.0064568668603897095,
0.0667988732457161,
-0.06230025738477707,
-0.09484928846359253,
-0.10889498144388199,
0.08258374035358429,
-0.01774776168167591,
-0.10197269171476364,
-0.032212547957897186,
-0.16083663702011108,
0.02468448132276535,
0.20927694439888,
-0.01651662029325962,
-0.0058422633446753025,
-0.06613337248563766,
-0.06363465636968613,
0.042921919375658035,
0.06652386486530304,
0.044436972588300705,
-0.05033937096595764,
-0.00829419307410717,
0.13280808925628662,
-0.06017817184329033,
0.08265336602926254,
-0.03929710015654564,
-0.008851375430822372,
-0.042650219053030014,
0.03488808497786522,
-0.030987637117505074,
-0.006219846662133932,
-0.0494794137775898,
-0.013411586172878742,
0.006510667037218809,
-0.06024668738245964,
-0.03785991296172142,
-0.002947706263512373,
-0.12066400051116943,
0.054586660116910934,
0.018496068194508553,
0.07833494991064072,
-0.08835102617740631,
0.008122083730995655,
0.00839751772582531,
0.007273150607943535,
0.1070052832365036,
0.07812287658452988,
-0.04518090933561325,
0.09285128116607666,
-0.18682777881622314,
-0.08927838504314423,
0.11433601379394531,
0.09553229063749313,
0.06833498179912567,
0.0258625578135252,
-0.025958918035030365,
0.0951947346329689,
0.09203436225652695,
0.0033228122629225254,
-0.012650342658162117,
-0.019314328208565712,
0.0654553771018982,
-0.10096406191587448,
-0.06234567239880562,
-0.03439236804842949,
0.044813498854637146,
0.11769537627696991,
0.05738852545619011,
0.1085181012749672,
-0.0779019296169281,
0.030671782791614532,
-0.00034434819826856256,
0.0543527714908123,
-0.010111206211149693,
-0.10164064913988113,
-0.014403652399778366,
-0.06329844892024994,
0.08538375049829483,
-0.02740630879998207,
0.09142833948135376,
0.03803180530667305,
0.005840519443154335,
0.027652747929096222,
-0.03845304623246193,
-0.08451402932405472,
0.05399663373827934,
-0.02051897905766964,
0.06494132429361343,
-0.013922237791121006,
-0.10439110547304153,
-0.03069484420120716,
0.0900266170501709,
0.10840734094381332,
0.03542453795671463,
0.04123418405652046,
0.16672086715698242,
0.11077025532722473,
0.08623063564300537,
-0.0736067146062851,
0.02386164665222168,
0.05500297248363495,
-0.07035017758607864,
0.03185877203941345,
-0.11003904789686203,
0.08496059477329254,
-0.016000112518668175,
-0.10661067813634872,
0.016156475991010666,
0.028170324862003326,
-0.05852598324418068,
-0.17312824726104736,
0.01652481034398079,
-0.06779811531305313,
-0.07510819286108017,
-0.01730639487504959,
-0.11732035875320435,
0.08926859498023987,
-0.0442693755030632,
0.11206676065921783,
0.011929661966860294,
0.07151001691818237,
-0.19166839122772217,
-0.12237446010112762,
0.14469310641288757,
-0.03826546296477318,
0.11179866641759872,
-0.08893255889415741,
-0.038864750415086746,
0.0625961646437645,
0.020780431106686592,
0.04371502622961998,
0.02884409949183464,
-0.056334689259529114,
0.012267803773283958,
-0.10279606282711029,
-0.03510337322950363,
-0.023922566324472427,
0.006897240411490202,
0.05929195135831833,
0.14489243924617767,
0.10831443965435028,
-0.05879291892051697,
0.047047343105077744,
0.057019684463739395,
-0.042058397084474564,
-0.11200819909572601,
-0.10543185472488403,
-0.09308192878961563,
-0.029335785657167435,
0.14121125638484955,
-0.05376935750246048,
-0.05868185684084892,
-0.04434826597571373,
0.12717100977897644,
0.31250086426734924,
-0.13812723755836487,
0.04138907417654991,
0.006891354918479919,
0.011323212645947933,
0.0038609111215919256,
-0.03689068183302879,
0.04752262681722641,
0.18376056849956512,
0.011734924279153347,
0.003690468380227685,
-0.04766957461833954,
-0.008691013790667057,
-0.06803717464208603,
0.015469285659492016,
-0.052929092198610306,
-0.07502300292253494,
0.045037854462862015,
0.1589311808347702,
-0.10667909681797028,
-0.10841359198093414,
-0.12728634476661682,
-0.14957918226718903,
-0.08552966266870499,
-0.029890025034546852,
-0.014958132989704609,
0.17750194668769836,
0.015113289467990398,
-0.051142368465662,
-0.07742001116275787,
0.08057566732168198,
-0.006928494665771723,
-0.0960567519068718,
-0.07295308262109756,
0.03848963603377342,
-0.207822784781456,
0.022182736545801163,
-0.031153479591012,
0.11613328754901886,
0.03339724615216255,
0.0560850128531456,
0.03903768211603165,
0.06235494837164879,
0.05717185139656067,
-0.07808363437652588,
0.03778364509344101,
0.07411875575780869,
-0.04880886524915695,
0.25073501467704773,
0.05030728504061699,
-0.05769900977611542,
0.08638256043195724,
-0.01924186199903488,
-0.09406942129135132,
-0.008777123875916004,
0.08111896365880966,
-0.12793539464473724,
0.08125822246074677,
0.020884808152914047,
-0.005884589161723852,
-0.004410506691783667,
-0.005030312575399876,
-0.017755847424268723,
0.017651615664362907,
-0.015802379697561264,
-0.04455080255866051,
-0.17593133449554443,
-0.006512799765914679,
-0.058398835361003876,
0.0391526035964489,
-0.1248406246304512,
0.005773428361862898,
-0.12648947536945343,
0.04438961669802666,
-0.03778583183884621,
0.05591367557644844,
0.0541299432516098,
-0.05337587371468544,
0.018293756991624832,
-0.05092751607298851,
0.0678434893488884,
0.06875970959663391,
-0.014397100545465946,
-0.018681753426790237
] |
null | null |
fairseq
|
# xm_transformer_600m-ru_en-multi_domain
[W2V2-Transformer](https://aclanthology.org/2021.acl-long.68/) speech-to-text translation model from fairseq S2T ([paper](https://arxiv.org/abs/2010.05171)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_to_text)):
- Russian-English
- Trained on mTEDx, CoVoST 2, OpenSTT, Common Voice v7 and CCMatrix
- Speech synthesis with [facebook/fastspeech2-en-ljspeech](https://huggingface.co/facebook/fastspeech2-en-ljspeech)
## Usage
```python
from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub
from fairseq.models.speech_to_text.hub_interface import S2THubInterface
from fairseq.models.text_to_speech.hub_interface import TTSHubInterface
import IPython.display as ipd
import torchaudio

models, cfg, task = load_model_ensemble_and_task_from_hf_hub(
    "facebook/xm_transformer_600m-ru_en-multi_domain",
    arg_overrides={"config_yaml": "config.yaml"},
)
model = models[0]
generator = task.build_generator([model], cfg)
# requires 16000Hz mono channel audio
audio, _ = torchaudio.load("/path/to/an/audio/file")
sample = S2THubInterface.get_model_input(task, audio)
text = S2THubInterface.get_prediction(task, model, generator, sample)

# speech synthesis
tts_models, tts_cfg, tts_task = load_model_ensemble_and_task_from_hf_hub(
    "facebook/fastspeech2-en-ljspeech",
    arg_overrides={"vocoder": "griffin_lim", "fp16": False},
)
tts_model = tts_models[0]
TTSHubInterface.update_cfg_with_data_cfg(tts_cfg, tts_task.data_cfg)
tts_generator = tts_task.build_generator([tts_model], tts_cfg)
tts_sample = TTSHubInterface.get_model_input(tts_task, text)
wav, sr = TTSHubInterface.get_prediction(
    tts_task, tts_model, tts_generator, tts_sample
)
ipd.Audio(wav, rate=sr)
```
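The snippet above expects 16 kHz mono input, so recordings in other formats need to be converted first. Below is a minimal, hedged preprocessing sketch that reuses `task`, `model`, `generator`, `wav`, and `sr` from the snippet above; the input path is a placeholder, and writing the output assumes `wav` is a 1-D tensor.
```python
import torchaudio
import torchaudio.functional as F

# Placeholder path; replace with a real recording.
waveform, orig_sr = torchaudio.load("/path/to/an/audio/file")

# Downmix to mono if the recording has more than one channel.
if waveform.shape[0] > 1:
    waveform = waveform.mean(dim=0, keepdim=True)

# Resample to the 16 kHz rate the model expects.
if orig_sr != 16000:
    waveform = F.resample(waveform, orig_freq=orig_sr, new_freq=16000)

sample = S2THubInterface.get_model_input(task, waveform)
text = S2THubInterface.get_prediction(task, model, generator, sample)

# Optionally write the synthesized speech to disk instead of playing it inline
# (assumes `wav` from the snippet above is a 1-D tensor).
torchaudio.save("translated_speech.wav", wav.unsqueeze(0).cpu(), sr)
```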
## Citation
```bibtex
@inproceedings{li-etal-2021-multilingual,
title = "Multilingual Speech Translation from Efficient Finetuning of Pretrained Models",
author = "Li, Xian and
Wang, Changhan and
Tang, Yun and
Tran, Chau and
Tang, Yuqing and
Pino, Juan and
Baevski, Alexei and
Conneau, Alexis and
Auli, Michael",
booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.acl-long.68",
doi = "10.18653/v1/2021.acl-long.68",
pages = "827--838",
}
@inproceedings{wang-etal-2020-fairseq,
title = "Fairseq {S}2{T}: Fast Speech-to-Text Modeling with Fairseq",
author = "Wang, Changhan and
Tang, Yun and
Ma, Xutai and
Wu, Anne and
Okhonko, Dmytro and
Pino, Juan",
booktitle = "Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing: System Demonstrations",
month = dec,
year = "2020",
address = "Suzhou, China",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.aacl-demo.6",
pages = "33--39",
}
@inproceedings{wang-etal-2021-fairseq,
title = "fairseq S{\^{}}2: A Scalable and Integrable Speech Synthesis Toolkit",
author = "Wang, Changhan and
Hsu, Wei-Ning and
Adi, Yossi and
Polyak, Adam and
Lee, Ann and
Chen, Peng-Jen and
Gu, Jiatao and
Pino, Juan",
booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations",
month = nov,
year = "2021",
address = "Online and Punta Cana, Dominican Republic",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.emnlp-demo.17",
doi = "10.18653/v1/2021.emnlp-demo.17",
pages = "143--152",
}
```
|
{"language": "ru-en", "library_name": "fairseq", "tags": ["fairseq", "audio", "audio-to-audio", "speech-to-speech-translation"], "datasets": ["mtedx", "covost2"], "task": "audio-to-audio", "widget": [{"example_title": "Common Voice sample 1", "src": "https://huggingface.co/facebook/xm_transformer_600m-ru_en-multi_domain/resolve/main/common_voice_ru_18945535.flac"}]}
|
audio-to-audio
|
facebook/xm_transformer_600m-ru_en-multi_domain
|
[
"fairseq",
"audio",
"audio-to-audio",
"speech-to-speech-translation",
"dataset:mtedx",
"dataset:covost2",
"arxiv:2010.05171",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.05171"
] |
[
"ru-en"
] |
TAGS
#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-mtedx #dataset-covost2 #arxiv-2010.05171 #has_space #region-us
|
# xm_transformer_600m-ru_en-multi_domain
W2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):
- Russian-English
- Trained on mTEDx, CoVoST 2, OpenSTT, Common Voice v7 and CCMatrix
- Speech synthesis with facebook/fastspeech2-en-ljspeech
## Usage
|
[
"# xm_transformer_600m-ru_en-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- Russian-English\n- Trained on mTEDx, CoVoST 2, OpenSTT, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/fastspeech2-en-ljspeech",
"## Usage"
] |
[
"TAGS\n#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-mtedx #dataset-covost2 #arxiv-2010.05171 #has_space #region-us \n",
"# xm_transformer_600m-ru_en-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- Russian-English\n- Trained on mTEDx, CoVoST 2, OpenSTT, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/fastspeech2-en-ljspeech",
"## Usage"
] |
[
58,
89,
3
] |
[
"passage: TAGS\n#fairseq #audio #audio-to-audio #speech-to-speech-translation #dataset-mtedx #dataset-covost2 #arxiv-2010.05171 #has_space #region-us \n# xm_transformer_600m-ru_en-multi_domain\n\nW2V2-Transformer speech-to-text translation model from fairseq S2T (paper/code):\n- Russian-English\n- Trained on mTEDx, CoVoST 2, OpenSTT, Common Voice v7 and CCMatrix\n- Speech synthesis with facebook/fastspeech2-en-ljspeech## Usage"
] |
[
-0.14277034997940063,
0.027694476768374443,
-0.004204226192086935,
-0.03422290086746216,
0.05018642917275429,
-0.07203462719917297,
0.1154492199420929,
0.055206723511219025,
-0.018337996676564217,
-0.010680412873625755,
0.014064528979361057,
-0.01499759778380394,
0.018880510702729225,
0.13806591928005219,
-0.05371055006980896,
-0.17988689243793488,
0.05308961495757103,
-0.08146598935127258,
-0.057267509400844574,
0.040670476853847504,
0.12812386453151703,
-0.07538440823554993,
0.0021895524114370346,
0.037987612187862396,
-0.05663829296827316,
0.049227919429540634,
0.05601831153035164,
-0.11724166572093964,
0.04886705428361893,
0.10054735094308853,
-0.0016888108802959323,
0.0905534103512764,
0.04588015004992485,
-0.10147631168365479,
0.03704609349370003,
-0.02690376713871956,
-0.015490836463868618,
-0.004057746846228838,
0.04952315241098404,
-0.03191360458731651,
0.09230753034353256,
-0.05756285786628723,
-0.06217341125011444,
0.0714072659611702,
-0.08481945097446442,
-0.10752130299806595,
-0.030319303274154663,
-0.024522945284843445,
0.03431392088532448,
0.10463601350784302,
-0.07453577220439911,
0.03519643470644951,
-0.07500136643648148,
0.08345524221658707,
0.033084332942962646,
-0.27026742696762085,
-0.03291850909590721,
-0.02474040724337101,
0.04676998406648636,
0.10149276256561279,
-0.02252170257270336,
0.10518338531255722,
-0.007027823012322187,
-0.03503502160310745,
-0.1270064264535904,
-0.1250012367963791,
-0.14221619069576263,
-0.016605624929070473,
-0.154526486992836,
0.08091180771589279,
0.31506776809692383,
0.056460391730070114,
-0.023487752303481102,
-0.02885494939982891,
0.008779752999544144,
0.07454615831375122,
-0.034181371331214905,
-0.031587839126586914,
-0.035242579877376556,
0.03126607835292816,
-0.07236242294311523,
-0.13639305531978607,
-0.14063484966754913,
-0.00870828703045845,
-0.08641856908798218,
0.2107313573360443,
-0.026046913117170334,
0.055441346019506454,
-0.041167035698890686,
-0.04640743136405945,
-0.09891243278980255,
-0.022954238578677177,
0.051791608333587646,
-0.0873490422964096,
-0.06055387109518051,
0.04143708571791649,
-0.042345862835645676,
-0.20138786733150482,
0.09875509142875671,
-0.17255984246730804,
-0.10841044783592224,
0.013872163370251656,
-0.03831317275762558,
0.09012816101312637,
0.035822365432977676,
0.030312689021229744,
-0.07800497114658356,
-0.0707455426454544,
0.015710067003965378,
-0.024538591504096985,
-0.0022540672216564417,
-0.001443023793399334,
-0.1626642793416977,
-0.061139192432165146,
-0.15457561612129211,
0.06955690681934357,
0.002703282982110977,
0.013660687953233719,
-0.022570528090000153,
-0.03246576711535454,
0.11011376976966858,
-0.03551897034049034,
0.005979800131171942,
0.058537472039461136,
-0.019714774563908577,
0.11220631748437881,
0.016309231519699097,
0.09772248566150665,
-0.059891991317272186,
-0.0777004286646843,
0.04606544226408005,
0.07744695246219635,
0.030460244044661522,
-0.13094519078731537,
0.03937086462974548,
0.027378076687455177,
0.0026669746730476618,
-0.1963435858488083,
0.04492003098130226,
-0.0798887088894844,
-0.0652308315038681,
0.0401877798140049,
-0.031339071691036224,
-0.14466995000839233,
-0.04924287647008896,
0.0442037507891655,
-0.059760451316833496,
-0.03545133396983147,
-0.059650883078575134,
-0.006819507572799921,
-0.15670278668403625,
0.0768246278166771,
-0.14382173120975494,
0.07180117815732956,
0.0028127969708293676,
-0.06734669953584671,
-0.07777856290340424,
0.15842923521995544,
-0.09221144020557404,
-0.041697315871715546,
-0.05337924882769585,
-0.019079608842730522,
-0.08825021237134933,
0.06401267647743225,
0.004553050268441439,
0.15051457285881042,
-0.20958803594112396,
-0.05858978256583214,
0.10210713744163513,
-0.025348100811243057,
0.012814272195100784,
0.15754932165145874,
0.028280554339289665,
0.03382520377635956,
0.09306338429450989,
0.23285849392414093,
0.09834956377744675,
-0.22913922369480133,
-0.010248325765132904,
0.06137377768754959,
-0.048232320696115494,
-0.03489416837692261,
0.08412590622901917,
-0.06684152036905289,
0.06192398816347122,
-0.032839108258485794,
0.16077306866645813,
0.05134449154138565,
-0.04115621745586395,
-0.04259050264954567,
0.06353393942117691,
-0.06639844924211502,
0.0965263620018959,
-0.08629801124334335,
0.017057335004210472,
-0.06828482449054718,
-0.052630096673965454,
-0.0030088392086327076,
0.08423729985952377,
-0.05222288519144058,
0.09381943196058273,
-0.1501827985048294,
0.007199012208729982,
-0.056782182306051254,
0.04519389197230339,
-0.09426818042993546,
0.09445574134588242,
-0.037690550088882446,
0.12166791409254074,
0.2007952779531479,
0.19453482329845428,
0.0006077439757063985,
-0.04877981171011925,
-0.060953252017498016,
0.04600366950035095,
0.03337343782186508,
0.03307696804404259,
-0.06103609874844551,
-0.18415896594524384,
0.13867883384227753,
-0.0862770825624466,
0.04365735873579979,
-0.030539032071828842,
-0.05268343538045883,
0.16402772068977356,
0.037424132227897644,
0.022864574566483498,
0.04512732848525047,
0.06760440766811371,
0.07516404986381531,
0.04564327001571655,
0.047612085938453674,
0.030327335000038147,
0.00019163508841302246,
-0.12832048535346985,
0.28040897846221924,
-0.11562275886535645,
0.058720193803310394,
0.12856517732143402,
-0.10584793239831924,
0.014800751581788063,
0.011503010056912899,
-0.002754582790657878,
0.024858981370925903,
0.05891085043549538,
-0.0181119404733181,
0.23619990050792694,
-0.06123722344636917,
0.0718432366847992,
-0.08031734824180603,
0.053940802812576294,
0.027223341166973114,
-0.0679628998041153,
-0.03785130754113197,
0.12221592664718628,
-0.021978314965963364,
-0.21668440103530884,
0.018522797152400017,
0.19894035160541534,
0.018631242215633392,
0.31086960434913635,
-0.012286818586289883,
-0.007600987330079079,
-0.012356028892099857,
-0.04115005210042,
-0.06232502683997154,
0.0858120247721672,
-0.11171633750200272,
-0.025958577170968056,
0.011002941988408566,
0.07416602224111557,
0.10474522411823273,
-0.07544565945863724,
0.0021844743750989437,
-0.021622831001877785,
-0.10442967712879181,
-0.11119254678487778,
0.07950848340988159,
0.012833632528781891,
0.06924024224281311,
-0.04929828271269798,
-0.07171350717544556,
0.010889247059822083,
-0.07179147005081177,
-0.12256059795618057,
0.01446884498000145,
-0.24423648416996002,
-0.25964999198913574,
-0.07807094603776932,
0.03360168635845184,
-0.03257486969232559,
0.035467952489852905,
0.10594222694635391,
-0.10520847886800766,
-0.02091451734304428,
-0.030385242775082588,
0.1061936542391777,
-0.0690261721611023,
0.0039052527863532305,
-0.017201419919729233,
0.03120378404855728,
-0.006636142730712891,
-0.06768627464771271,
0.0023035435006022453,
-0.0304581206291914,
-0.004400656092911959,
0.03576333448290825,
-0.035980965942144394,
0.03619585558772087,
0.14909237623214722,
0.06032504886388779,
-0.01886221393942833,
-0.09157311171293259,
0.1094723716378212,
-0.14159083366394043,
0.002663999330252409,
0.12472482770681381,
-0.004455827176570892,
-0.007959720678627491,
0.12776166200637817,
0.014402386732399464,
0.014794210903346539,
0.02289450913667679,
-0.060425739735364914,
-0.031699374318122864,
-0.22612929344177246,
-0.17654193937778473,
-0.12274561077356339,
0.05129695683717728,
-0.1327499896287918,
-0.014747943729162216,
-0.045567769557237625,
-0.042004287242889404,
-0.024501265957951546,
-0.09980957955121994,
0.09916014224290848,
-0.02224423736333847,
0.203498974442482,
-0.06436310708522797,
0.1061972975730896,
-0.08340878784656525,
-0.028859099373221397,
0.1348787397146225,
-0.022404003888368607,
0.05184873938560486,
0.12308290600776672,
0.055607523769140244,
0.047924142330884933,
0.03508285805583,
0.11861494928598404,
0.01558875571936369,
0.1021847277879715,
0.005704120267182589,
-0.027353012934327126,
-0.0906921848654747,
0.05943070724606514,
0.08703446388244629,
0.36068713665008545,
-0.14784441888332367,
0.0031856168061494827,
0.06067444756627083,
0.07348746061325073,
0.09811341017484665,
0.09410690516233444,
0.009772051125764847,
-0.018006546422839165,
0.0018695654580369592,
-0.032226625829935074,
0.004788458812981844,
0.09839094430208206,
0.25200825929641724,
-0.034325551241636276,
0.11978896707296371,
0.11286754906177521,
0.052749596536159515,
0.04319480061531067,
0.035213321447372437,
-0.1275206357240677,
-0.015511038713157177,
-0.014877605251967907,
0.025726178660988808,
-0.15414147078990936,
0.18326163291931152,
0.07239776849746704,
0.0256077591329813,
-0.02750670537352562,
-0.02758723869919777,
0.052027445286512375,
0.08802179247140884,
0.11998975276947021,
0.013145184144377708,
-0.08857591450214386,
-0.023290902376174927,
-0.11774255335330963,
0.00362770794890821,
0.15023843944072723,
0.11828675121068954,
-0.019696328788995743,
0.04557119309902191,
-0.06377696245908737,
0.015621660277247429,
0.00323038874194026,
-0.21287238597869873,
-0.10765347629785538,
0.03981607407331467,
0.2470599263906479,
0.027898188680410385,
0.010211796499788761,
-0.07430069148540497,
-0.187456876039505,
0.02342287264764309,
-0.030791833996772766,
-0.01787916198372841,
-0.10071289539337158,
-0.08429960161447525,
0.11970313638448715,
-0.04111158847808838,
0.03646199777722359,
0.03983741253614426,
0.008100052364170551,
-0.08849313110113144,
-0.04279866814613342,
0.1262507140636444,
-0.10414393991231918,
-0.06055818870663643,
-0.0065303463488817215,
0.2497691959142685,
0.06168239191174507,
0.10509199649095535,
0.016832470893859863,
0.003001305740326643,
0.01023807842284441,
-0.0548212006688118,
0.06677194684743881,
0.029858103021979332,
-0.10673284530639648,
0.1257634460926056,
-0.0007038443582132459,
-0.2097257524728775,
-0.07160631567239761,
-0.08423828333616257,
0.21878942847251892,
0.1844896376132965,
-0.08848597854375839,
0.15733008086681366,
0.15159285068511963,
-0.033489253371953964,
-0.2931022047996521,
0.01084684208035469,
0.0433543361723423,
0.09537187218666077,
-0.07453488558530807,
-0.13412801921367645,
-0.029028454795479774,
-0.07853502035140991,
-0.03202856704592705,
0.04042747989296913,
-0.11736265569925308,
-0.15317103266716003,
0.11767279356718063,
-0.15302862226963043,
0.18006454408168793,
0.009106011129915714,
-0.04255790263414383,
-0.04598430171608925,
0.09089416265487671,
0.13195069134235382,
-0.2090141326189041,
0.11251606047153473,
0.1189250573515892,
0.06227100268006325,
0.03402135148644447,
0.055092811584472656,
0.08781801909208298,
0.014286164194345474,
-0.03494958579540253,
0.021646780893206596,
0.004074322059750557,
-0.008579186163842678,
0.05136736109852791,
0.01315014436841011,
0.02990751340985298,
-0.0068909223191440105,
-0.054933007806539536,
-0.022419657558202744,
-0.05765078216791153,
0.059316541999578476,
0.059925902634859085,
-0.022400232031941414,
-0.06634582579135895,
-0.020318107679486275,
-0.013710175640881062,
0.002104957588016987,
0.1613936722278595,
-0.19082385301589966,
-0.04466578736901283,
0.18439175188541412,
0.22847461700439453,
-0.1067858636379242,
0.12888163328170776,
0.02225372940301895,
-0.10550758242607117,
0.06880733370780945,
-0.07318483293056488,
0.03149012476205826,
0.07740721851587296,
-0.05459383502602577,
0.09360214322805405,
-0.024451877921819687,
-0.08241914957761765,
0.08992040902376175,
0.07189950346946716,
-0.09891872107982635,
-0.11986328661441803,
-0.09042006731033325,
0.06026207655668259,
0.09788469225168228,
0.028939727693796158,
0.2807060182094574,
-0.043682776391506195,
0.03132592886686325,
-0.060589976608753204,
-0.01303535234183073,
-0.10246329009532928,
0.1499902307987213,
0.0778665617108345,
-0.00906466320157051,
-0.09100612252950668,
0.10393794625997543,
0.011466522701084614,
-0.09911055862903595,
0.0006299479282461107,
0.05782865732908249,
-0.05862775444984436,
-0.10306808352470398,
-0.1382078230381012,
0.02446814626455307,
0.013439824804663658,
-0.12188288569450378,
-0.0004975663032382727,
-0.14036309719085693,
0.02039600908756256,
0.16848410665988922,
-0.022238805890083313,
-0.0121547756716609,
-0.061520226299762726,
-0.01584695838391781,
0.07647696882486343,
0.026320183649659157,
0.030817542225122452,
-0.07942871749401093,
-0.06871739029884338,
0.14474378526210785,
-0.06327413767576218,
0.11461696028709412,
-0.06703339517116547,
-0.009575079195201397,
-0.008067099377512932,
0.04406937211751938,
-0.061659786850214005,
-0.008521510288119316,
-0.04035181924700737,
-0.02353585697710514,
-0.0005000833189114928,
-0.07710589468479156,
-0.05320005118846893,
0.02480742521584034,
-0.11141566932201385,
0.04936414211988449,
-0.022144222632050514,
0.04616568610072136,
-0.02128247544169426,
-0.022605476900935173,
-0.03596116974949837,
0.024157334119081497,
0.11511360853910446,
0.15296633541584015,
-0.0729416012763977,
0.09044097363948822,
-0.15324832499027252,
-0.06384466588497162,
0.14325806498527527,
0.09531307965517044,
0.04641150310635567,
0.06450703740119934,
-0.0409240797162056,
0.08684335649013519,
0.07085461914539337,
-0.026483874768018723,
0.004762357100844383,
-0.001542637008242309,
0.0281300600618124,
-0.13079863786697388,
0.004142236430197954,
-0.03501938655972481,
-0.0006090571987442672,
0.09949754178524017,
0.12722629308700562,
0.13057029247283936,
-0.10525549948215485,
0.06386154890060425,
0.01862947642803192,
0.04281456023454666,
-0.004326719790697098,
-0.09654074907302856,
-0.0060315122827887535,
-0.08344980329275131,
0.11340103298425674,
-0.05669214203953743,
0.08407124131917953,
0.009226874448359013,
-0.046603914350271225,
0.014711814001202583,
-0.11753116548061371,
-0.051274195313453674,
0.03466640040278435,
0.0006094707641750574,
0.07905387133359909,
-0.020199596881866455,
-0.11861886084079742,
-0.033217791467905045,
0.08790480345487595,
0.0933929905295372,
0.0014433411415666342,
0.06404649466276169,
0.19478271901607513,
0.1352284848690033,
0.10495279729366302,
-0.07811546325683594,
0.06133435666561127,
-0.003348646918311715,
-0.12750156223773956,
0.010521741583943367,
-0.13159091770648956,
0.028251247480511665,
0.04820284619927406,
-0.05537909269332886,
0.015699315816164017,
-0.0004887568647973239,
-0.031084343791007996,
-0.16714991629123688,
-0.0026783852372318506,
-0.08783422410488129,
-0.10262961685657501,
-0.02201816625893116,
-0.08038508147001266,
0.07551991939544678,
-0.10854701697826385,
0.09469946473836899,
-0.0011158599518239498,
0.09991581737995148,
-0.13674378395080566,
-0.15249933302402496,
0.17582017183303833,
-0.06043335050344467,
0.09536247700452805,
-0.022607242688536644,
-0.037606604397296906,
0.12330376356840134,
-0.0023610126227140427,
0.027364231646060944,
0.021562399342656136,
-0.09253857284784317,
0.01085895299911499,
-0.07668125629425049,
-0.04617529734969139,
-0.029877878725528717,
0.037506766617298126,
0.08719111979007721,
0.09963952004909515,
0.11645843833684921,
-0.04485911875963211,
0.028436655178666115,
0.08234983682632446,
-0.04574965685606003,
-0.15231117606163025,
-0.07209684699773788,
-0.08404333144426346,
-0.024257458746433258,
0.15146072208881378,
-0.08316867798566818,
-0.055012382566928864,
-0.0739707499742508,
0.11100119352340698,
0.3468850553035736,
-0.15894567966461182,
0.06909149885177612,
-0.022740671411156654,
0.02757350169122219,
-0.02562115341424942,
0.01781439408659935,
0.03298972174525261,
0.11062061041593552,
0.07290580868721008,
-0.028722381219267845,
-0.07759664207696915,
0.0157651137560606,
-0.03406474366784096,
0.02471843548119068,
-0.040462661534547806,
-0.09353533387184143,
0.04757431522011757,
0.14799708127975464,
-0.1210608184337616,
-0.09272260218858719,
-0.17867113649845123,
-0.14464929699897766,
-0.05640029162168503,
0.023469656705856323,
0.025781266391277313,
0.20873796939849854,
0.03939134627580643,
-0.07335659861564636,
-0.03870636224746704,
-0.004376796539872885,
-0.0037510378751903772,
-0.06093358248472214,
0.014108098112046719,
0.025884496048092842,
-0.22974060475826263,
-0.09329146891832352,
-0.014866250567138195,
0.13420265913009644,
0.003169155912473798,
0.08241917937994003,
0.051439616829156876,
0.12994010746479034,
0.039484914392232895,
-0.10279621928930283,
0.02863677777349949,
0.08748932927846909,
-0.053492557257413864,
0.2517731785774231,
0.09008925408124924,
-0.04603173956274986,
0.06626085937023163,
-0.018712962046265602,
-0.07219095528125763,
-0.0036032714415341616,
0.06695473939180374,
-0.11348526179790497,
0.05856441333889961,
0.011291828006505966,
-0.02447417564690113,
-0.03027423471212387,
0.0341988280415535,
-0.022288745269179344,
0.024919893592596054,
-0.03823941573500633,
-0.06508166342973709,
-0.14990001916885376,
-0.03778315335512161,
-0.08354498445987701,
0.09163863956928253,
-0.06594904512166977,
0.018526794388890266,
-0.12388123571872711,
0.04604854807257652,
-0.0123476292937994,
0.06924700736999512,
0.011915478855371475,
-0.043959878385066986,
0.03081241063773632,
-0.07743342965841293,
0.08503924310207367,
0.060990117490291595,
-0.06895741820335388,
-0.059777770191431046
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-cola-3
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0002
- Matthews Correlation: 1.0
Label 0 : "AIMX"
Label 1 : "OWNX"
Label 2 : "CONT"
Label 3 : "BASE"
Label 4 : "MISC"
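As a hedged illustration (not part of the original training setup), the checkpoint can be queried with the standard `transformers` text-classification pipeline. Whether the pipeline returns the label names above or generic `LABEL_i` ids depends on the checkpoint's `id2label` config, so the manual mapping below is an assumption.
```python
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="fadhilarkan/distilbert-base-uncased-finetuned-cola-3",
)

# Assumed index-to-name mapping, taken from the labels listed above.
id2label = {0: "AIMX", 1: "OWNX", 2: "CONT", 3: "BASE", 4: "MISC"}

result = classifier("We propose a new method for sentence classification.")[0]
label = result["label"]
if label.startswith("LABEL_"):  # fall back to the manual mapping for generic ids
    label = id2label[int(label.split("_")[-1])]
print(label, result["score"])
```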
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| No log | 1.0 | 192 | 0.0060 | 1.0 |
| No log | 2.0 | 384 | 0.0019 | 1.0 |
| 0.0826 | 3.0 | 576 | 0.0010 | 1.0 |
| 0.0826 | 4.0 | 768 | 0.0006 | 1.0 |
| 0.0826 | 5.0 | 960 | 0.0005 | 1.0 |
| 0.001 | 6.0 | 1152 | 0.0004 | 1.0 |
| 0.001 | 7.0 | 1344 | 0.0003 | 1.0 |
| 0.0005 | 8.0 | 1536 | 0.0003 | 1.0 |
| 0.0005 | 9.0 | 1728 | 0.0002 | 1.0 |
| 0.0005 | 10.0 | 1920 | 0.0002 | 1.0 |
### Framework versions
- Transformers 4.12.3
- Pytorch 1.10.0+cu111
- Datasets 1.15.1
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["matthews_correlation"], "model-index": [{"name": "distilbert-base-uncased-finetuned-cola-3", "results": []}]}
|
text-classification
|
fadhilarkan/distilbert-base-uncased-finetuned-cola-3
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
distilbert-base-uncased-finetuned-cola-3
========================================
This model is a fine-tuned version of distilbert-base-uncased on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 0.0002
* Matthews Correlation: 1.0
Label 0 : "AIMX"
Label 1 : "OWNX"
Label 2 : "CONT"
Label 3 : "BASE"
Label 4 : "MISC"
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 10
### Training results
### Framework versions
* Transformers 4.12.3
* Pytorch 1.10.0+cu111
* Datasets 1.15.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.10.0+cu111\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.10.0+cu111\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
57,
98,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10### Training results### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.10.0+cu111\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
-0.09556034207344055,
0.07805211842060089,
-0.0021067378111183643,
0.11713996529579163,
0.1783677339553833,
0.022473713383078575,
0.12161892652511597,
0.12117628008127213,
-0.110373854637146,
0.013630779460072517,
0.118468277156353,
0.18625356256961823,
0.0090566985309124,
0.10382425785064697,
-0.05703836306929588,
-0.2577759027481079,
-0.013524509035050869,
0.05270823836326599,
-0.07964864373207092,
0.14152126014232635,
0.09730467200279236,
-0.13233299553394318,
0.077690988779068,
0.00476899603381753,
-0.22485913336277008,
0.006691040936857462,
0.011269869282841682,
-0.06507560610771179,
0.15173886716365814,
0.019689977169036865,
0.13065245747566223,
0.008844478987157345,
0.07851602882146835,
-0.1807430535554886,
0.00961084384471178,
0.047398194670677185,
0.007037738338112831,
0.08919806033372879,
0.05608683452010155,
-0.013173866085708141,
0.11447194963693619,
-0.08453802019357681,
0.05794636160135269,
0.02307230420410633,
-0.12458401173353195,
-0.21827435493469238,
-0.07880919426679611,
0.023196635767817497,
0.06655897200107574,
0.10795116424560547,
-0.0024592536501586437,
0.1285507082939148,
-0.09859257191419601,
0.09701904654502869,
0.21520061790943146,
-0.2733873724937439,
-0.06533929705619812,
0.030302654951810837,
0.011079906485974789,
0.07334402203559875,
-0.11208351701498032,
-0.025603050366044044,
0.055906616151332855,
0.0502963550388813,
0.13870041072368622,
-0.030023757368326187,
-0.1264047771692276,
0.013685381039977074,
-0.14369335770606995,
-0.034080229699611664,
0.14124943315982819,
0.023970844224095345,
-0.02725897915661335,
-0.04599114507436752,
-0.05895756185054779,
-0.16349387168884277,
-0.039694398641586304,
-0.009159311652183533,
0.04437078535556793,
-0.033555492758750916,
-0.05325113236904144,
0.0008155679097399116,
-0.10855171084403992,
-0.0674397423863411,
-0.07005853950977325,
0.14453943073749542,
0.04057980701327324,
0.007113757077604532,
-0.03198385611176491,
0.11214929819107056,
0.02751036547124386,
-0.1304769068956375,
0.03438757732510567,
0.029752595350146294,
0.003674536943435669,
-0.049193430691957474,
-0.06688958406448364,
-0.059155985713005066,
0.006300509441643953,
0.11466404050588608,
-0.06044260412454605,
0.0494660921394825,
0.029578184708952904,
0.04678598791360855,
-0.09995048493146896,
0.1968921720981598,
-0.029277663677930832,
-0.0016804905608296394,
0.006590354721993208,
0.05093665048480034,
0.004495055414736271,
-0.008297054097056389,
-0.1185772716999054,
0.0015390112530440092,
0.11276838183403015,
0.020907023921608925,
-0.0684475228190422,
0.07602579891681671,
-0.05651255324482918,
-0.02549816109240055,
0.015366672538220882,
-0.10124894976615906,
0.03066578134894371,
0.0009704747353680432,
-0.08508987724781036,
-0.00999740045517683,
0.03101615235209465,
0.012121295556426048,
-0.03010832890868187,
0.11677926778793335,
-0.07932276278734207,
0.04085098206996918,
-0.09931379556655884,
-0.11041594296693802,
0.014606026001274586,
-0.08025963604450226,
0.02332577481865883,
-0.10207315534353256,
-0.16245539486408234,
-0.01750250533223152,
0.05720105767250061,
-0.022038307040929794,
-0.05754924938082695,
-0.05695141479372978,
-0.07465028017759323,
0.011654476635158062,
-0.014061539433896542,
0.1448276787996292,
-0.05793115124106407,
0.11250613629817963,
0.03579079359769821,
0.06668465584516525,
-0.04696211963891983,
0.0663025826215744,
-0.0986185297369957,
0.00001552505091240164,
-0.18668168783187866,
0.05283701792359352,
-0.05030929297208786,
0.0780450850725174,
-0.08703940361738205,
-0.11463645845651627,
0.012483831495046616,
-0.0025723075959831476,
0.06706702709197998,
0.09561631083488464,
-0.16663026809692383,
-0.08210770040750504,
0.15058745443820953,
-0.0661979615688324,
-0.10859732329845428,
0.11561521887779236,
-0.05904791131615639,
0.057908546179533005,
0.07961659133434296,
0.16742712259292603,
0.08636988699436188,
-0.06709348410367966,
0.02754930406808853,
0.006050212308764458,
0.04535974562168121,
-0.06989184767007828,
0.056460656225681305,
0.0022777109406888485,
-0.006472328212112188,
0.036651354283094406,
-0.027531063184142113,
0.07020660489797592,
-0.09422630816698074,
-0.09662216901779175,
-0.03970649093389511,
-0.09773749113082886,
0.06366705149412155,
0.08125269412994385,
0.09086702018976212,
-0.09115944057703018,
-0.06816452741622925,
0.07980289310216904,
0.07564093172550201,
-0.06324686110019684,
0.033139802515506744,
-0.04829319193959236,
0.05728553608059883,
-0.02592947520315647,
-0.017186470329761505,
-0.20175664126873016,
0.003579274984076619,
0.009800445288419724,
-0.0021118447184562683,
0.022105224430561066,
0.019158340990543365,
0.07007618993520737,
0.05530593916773796,
-0.05763894319534302,
-0.023445766419172287,
-0.022614210844039917,
-0.005982686765491962,
-0.1357903927564621,
-0.19189834594726562,
-0.01794586516916752,
-0.016856173053383827,
0.13668203353881836,
-0.20243683457374573,
0.042535826563835144,
-0.013600833714008331,
0.061127182096242905,
-0.00011835969053208828,
-0.0008980310522019863,
-0.04270073398947716,
0.08915959298610687,
-0.03608373925089836,
-0.04596663638949394,
0.07942758500576019,
0.004211812280118465,
-0.08519711345434189,
-0.04008161276578903,
-0.0931938886642456,
0.16648182272911072,
0.14381714165210724,
-0.13299806416034698,
-0.080294169485569,
-0.002929230220615864,
-0.05977094918489456,
-0.0314052514731884,
-0.03653065487742424,
0.03790585324168205,
0.1865789145231247,
-0.01427310612052679,
0.15958940982818604,
-0.06968540698289871,
-0.049452271312475204,
0.01764691434800625,
-0.034420445561409,
0.03238707408308983,
0.1270889937877655,
0.11029697954654694,
-0.07095804065465927,
0.14584891498088837,
0.15448947250843048,
-0.09794506430625916,
0.1313195824623108,
-0.04531214386224747,
-0.0634150356054306,
-0.0052458071149885654,
-0.017027201130986214,
-0.0012771558249369264,
0.07797569781541824,
-0.14465713500976562,
-0.007508931681513786,
0.02001344971358776,
0.020729342475533485,
0.02692278102040291,
-0.23085705935955048,
-0.03387957438826561,
0.028026286512613297,
-0.040370404720306396,
-0.003264216473326087,
-0.019535087049007416,
0.010931523516774178,
0.10834680497646332,
-0.000922244566027075,
-0.08126360923051834,
0.04392674192786217,
0.005697738379240036,
-0.08150065690279007,
0.2227540910243988,
-0.09450290352106094,
-0.1691312938928604,
-0.12657777965068817,
-0.07263422757387161,
-0.0385587178170681,
0.010869196616113186,
0.061100877821445465,
-0.10552682727575302,
-0.02284509316086769,
-0.051205702126026154,
0.02533833682537079,
-0.006028817966580391,
0.034941017627716064,
0.005279145203530788,
0.010865168645977974,
0.06849615275859833,
-0.11295577138662338,
-0.008404228836297989,
-0.05214663967490196,
-0.06259025633335114,
0.05570565164089203,
0.03044436313211918,
0.10406597703695297,
0.1695568561553955,
-0.023152602836489677,
0.006897163111716509,
-0.03130432963371277,
0.2182617336511612,
-0.06766421347856522,
-0.027940241619944572,
0.13462407886981964,
-0.00816596020013094,
0.055299025028944016,
0.10071038454771042,
0.07234703749418259,
-0.08975053578615189,
0.014620262198150158,
0.021667497232556343,
-0.0331890843808651,
-0.235690638422966,
-0.05492894724011421,
-0.05661378800868988,
-0.019739164039492607,
0.09815586358308792,
0.026742221787571907,
0.056539397686719894,
0.06671971827745438,
0.0400962233543396,
0.08761612325906754,
-0.03198970854282379,
0.04700940102338791,
0.12577128410339355,
0.04162386804819107,
0.12214410305023193,
-0.049451470375061035,
-0.0695132166147232,
0.028647636994719505,
-0.0193755142390728,
0.21600638329982758,
0.007544140797108412,
0.13374805450439453,
0.05979343131184578,
0.17528565227985382,
0.003859640331938863,
0.0901840329170227,
0.0010920797940343618,
-0.04228311777114868,
-0.007249949965626001,
-0.04099958762526512,
-0.048150982707738876,
0.011590815149247646,
-0.06159132719039917,
0.05669073387980461,
-0.12275858968496323,
-0.019181393086910248,
0.05713694915175438,
0.2508997917175293,
0.023137493059039116,
-0.3224985599517822,
-0.08612488955259323,
-0.0013583182590082288,
-0.03308222070336342,
-0.02072451449930668,
0.018936412408947945,
0.08285534381866455,
-0.10037960112094879,
0.027454396709799767,
-0.0695660263299942,
0.09578591585159302,
-0.05172184482216835,
0.04812736436724663,
0.06770417094230652,
0.0789208933711052,
0.011689284816384315,
0.08884916454553604,
-0.32858186960220337,
0.26943278312683105,
0.002263992326334119,
0.07032039016485214,
-0.08175311982631683,
-0.00036306135007180274,
0.0352669432759285,
0.06498800218105316,
0.054271332919597626,
-0.013664857484400272,
-0.010121556930243969,
-0.19614863395690918,
-0.055831216275691986,
0.023043310269713402,
0.0917072519659996,
-0.028606470674276352,
0.0839170441031456,
-0.027246221899986267,
0.006744374521076679,
0.07749010622501373,
-0.04154794290661812,
-0.05294172465801239,
-0.10022495687007904,
-0.009723914787173271,
0.014743674546480179,
-0.03651370108127594,
-0.06128797307610512,
-0.11629974842071533,
-0.13131926953792572,
0.14688466489315033,
-0.026848483830690384,
-0.03986747935414314,
-0.11086200177669525,
0.08296462893486023,
0.07482369244098663,
-0.0858897939324379,
0.05247654765844345,
0.00561444042250514,
0.05971970409154892,
0.02544144168496132,
-0.07657540589570999,
0.10322374850511551,
-0.06177733466029167,
-0.15914879739284515,
-0.05199509486556053,
0.11287356168031693,
0.03819432854652405,
0.06345521658658981,
-0.012566390447318554,
0.008454293012619019,
-0.03843201324343681,
-0.09134405851364136,
0.01483391597867012,
-0.012314366176724434,
0.08240001648664474,
0.02843143604695797,
-0.06699798256158829,
-0.0013703759759664536,
-0.06915643066167831,
-0.03265896439552307,
0.19929233193397522,
0.2167070358991623,
-0.09370847046375275,
0.02544708549976349,
0.040185924619436264,
-0.0725896954536438,
-0.20676471292972565,
0.048115141689777374,
0.0647047683596611,
0.0035170710179954767,
0.03061099164187908,
-0.18528041243553162,
0.12468897551298141,
0.09493854641914368,
-0.010122096166014671,
0.10544408857822418,
-0.33977413177490234,
-0.12722384929656982,
0.1254938542842865,
0.14389750361442566,
0.1094142347574234,
-0.14262495934963226,
-0.02236398681998253,
-0.029377354308962822,
-0.11392601579427719,
0.10677415132522583,
-0.07838708907365799,
0.12286162376403809,
-0.03501812368631363,
0.07798603177070618,
0.0036390500608831644,
-0.06041761860251427,
0.1174272745847702,
0.02372990921139717,
0.09753820300102234,
-0.061845529824495316,
-0.03561647981405258,
0.045122865587472916,
-0.034075718373060226,
0.01539558358490467,
-0.0794287919998169,
0.027323413640260696,
-0.09553362429141998,
-0.0197804793715477,
-0.08378588408231735,
0.045116253197193146,
-0.03788011148571968,
-0.056632667779922485,
-0.034896500408649445,
0.021066203713417053,
0.045606814324855804,
-0.012765043415129185,
0.12054754793643951,
0.02278381772339344,
0.14838294684886932,
0.11536911129951477,
0.0762493759393692,
-0.053183771669864655,
-0.06283087283372879,
-0.016597459092736244,
-0.01397030707448721,
0.05752783268690109,
-0.1493009775876999,
0.027681535109877586,
0.14684593677520752,
0.020976833999156952,
0.13690195977687836,
0.08634558320045471,
-0.016480477526783943,
0.0013006353983655572,
0.06411198526620865,
-0.16150890290737152,
-0.07497817277908325,
-0.010012631304562092,
-0.06135164946317673,
-0.10202880948781967,
0.049226757138967514,
0.0802275687456131,
-0.06962132453918457,
-0.012390341609716415,
-0.007011774927377701,
0.00269442331045866,
-0.06172718107700348,
0.20704631507396698,
0.06546936184167862,
0.05040401592850685,
-0.10822685807943344,
0.08012934774160385,
0.06094514578580856,
-0.07865089923143387,
-0.002164010191336274,
0.07785661518573761,
-0.08719474822282791,
-0.04959849268198013,
0.10932941734790802,
0.1687365621328354,
-0.04809458553791046,
-0.043822191655635834,
-0.13786186277866364,
-0.129017174243927,
0.07997864484786987,
0.15764963626861572,
0.12181251496076584,
0.01810179278254509,
-0.06423474848270416,
0.009700224734842777,
-0.1244455873966217,
0.0815032422542572,
0.039021510630846024,
0.06383579224348068,
-0.1340242326259613,
0.1742929071187973,
0.015584192238748074,
0.04877764359116554,
-0.021974530071020126,
0.021719006821513176,
-0.10246710479259491,
0.022209888324141502,
-0.11212967336177826,
-0.02752099744975567,
-0.015144890174269676,
0.00682470528408885,
-0.00955585204064846,
-0.05056159570813179,
-0.04613422602415085,
0.013150534592568874,
-0.12154456228017807,
-0.02147367037832737,
0.026958042755723,
0.05827387049794197,
-0.11345680803060532,
-0.043726541101932526,
0.025209588930010796,
-0.061932336539030075,
0.06392641365528107,
0.05440819635987282,
0.009444166906177998,
0.0680813416838646,
-0.13876761496067047,
-0.0029209950007498264,
0.08173337578773499,
0.015093238092958927,
0.06194562464952469,
-0.08420602977275848,
-0.0067267282865941525,
0.005385236814618111,
0.06681817024946213,
0.02448071725666523,
0.0779937133193016,
-0.14360862970352173,
0.005698604509234428,
-0.03551139682531357,
-0.0824250876903534,
-0.07015236467123032,
0.03544328734278679,
0.08339241147041321,
0.009481306187808514,
0.1958424150943756,
-0.07767083495855331,
0.043252695351839066,
-0.20980706810951233,
0.00041468857671134174,
-0.019721129909157753,
-0.12012454122304916,
-0.12529999017715454,
-0.0701192170381546,
0.0637119933962822,
-0.051146697252988815,
0.13162541389465332,
0.03318427503108978,
0.03916747868061066,
0.026906460523605347,
-0.006209552753716707,
0.0004453198052942753,
0.027775239199399948,
0.21254529058933258,
0.03538767620921135,
-0.03423648700118065,
0.06920802593231201,
0.05667848140001297,
0.09922054409980774,
0.1153043881058693,
0.1916305273771286,
0.15626010298728943,
-0.02376456931233406,
0.09178198873996735,
0.022733312100172043,
-0.047139015048742294,
-0.15313786268234253,
0.04682576283812523,
-0.050764765590429306,
0.09924725443124771,
-0.024942627176642418,
0.21693110466003418,
0.05260614678263664,
-0.16713950037956238,
0.05349664017558098,
-0.05349605157971382,
-0.09459717571735382,
-0.1030881255865097,
-0.035837866365909576,
-0.07850270718336105,
-0.1383875608444214,
-0.004493194632232189,
-0.10666213184595108,
0.011473587714135647,
0.10762526094913483,
0.005083993077278137,
-0.03141387552022934,
0.16149866580963135,
0.03497380390763283,
0.01570090278983116,
0.06610387563705444,
-0.0021981131285429,
-0.02890540473163128,
-0.117701955139637,
-0.06297197937965393,
-0.018291717395186424,
-0.0058380174450576305,
0.03740764036774635,
-0.05450373515486717,
-0.07328589260578156,
0.037810999900102615,
-0.030131494626402855,
-0.09950963407754898,
0.018587881699204445,
0.021023960784077644,
0.06585615873336792,
0.04975156858563423,
0.007895459420979023,
0.008172912523150444,
-0.01110062561929226,
0.21637268364429474,
-0.07555282860994339,
-0.09062347561120987,
-0.08825574815273285,
0.26799678802490234,
0.050293367356061935,
-0.007123897783458233,
0.0346154160797596,
-0.06087297573685646,
-0.0004811020044144243,
0.26281481981277466,
0.20377573370933533,
-0.08510349690914154,
-0.008055216632783413,
0.006707715801894665,
-0.009042750112712383,
-0.012489554472267628,
0.12117348611354828,
0.1459304690361023,
0.0376087948679924,
-0.1049448773264885,
-0.038312774151563644,
-0.05585673823952675,
-0.017992133274674416,
-0.045290812849998474,
0.06608225405216217,
0.03798222169280052,
-0.00048317271284759045,
-0.036708712577819824,
0.06433653831481934,
-0.07012129575014114,
-0.0921628400683403,
0.05540255457162857,
-0.20935675501823425,
-0.16130171716213226,
-0.018556097522377968,
0.10811524838209152,
-0.00021055502293165773,
0.05965680256485939,
-0.023394767194986343,
0.0011714318534359336,
0.06737249344587326,
-0.023418273776769638,
-0.08876517415046692,
-0.07302337884902954,
0.09162282198667526,
-0.11785074323415756,
0.18005475401878357,
-0.04252246022224426,
0.0704759880900383,
0.12048054486513138,
0.07245752215385437,
-0.06212114915251732,
0.061459992080926895,
0.03296166658401489,
-0.06643841415643692,
0.03999544307589531,
0.08310592919588089,
-0.029080405831336975,
0.03580754995346069,
0.0364837683737278,
-0.1218988448381424,
0.032072558999061584,
-0.07009761780500412,
-0.05440477654337883,
-0.041996635496616364,
-0.043330252170562744,
-0.05355597659945488,
0.11858376115560532,
0.22038497030735016,
-0.02553378976881504,
0.011790585704147816,
-0.0759049579501152,
0.0007484169909730554,
0.04757663980126381,
0.016001509502530098,
-0.07893349230289459,
-0.23261138796806335,
0.0029817826580256224,
0.05285237357020378,
-0.017904141917824745,
-0.2266920953989029,
-0.0889548733830452,
-0.0016265064477920532,
-0.07622618228197098,
-0.10240919888019562,
0.08156115561723709,
0.07113739848136902,
0.05170882120728493,
-0.05117121338844299,
-0.0724811777472496,
-0.08120547980070114,
0.15952250361442566,
-0.1556859165430069,
-0.08754152804613113
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-cola-4
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0011
- Matthews Correlation: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
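For reference, the hyperparameters above map onto `transformers.TrainingArguments` roughly as in the hedged sketch below. The base checkpoint and metric come from this card, while `num_labels`, the tokenized datasets, and the output directory name are assumptions filled in for illustration.
```python
import numpy as np
from sklearn.metrics import matthews_corrcoef
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Base checkpoint from the card; num_labels is an assumption (the card does not state it).
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)

args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-cola-4",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=10,
    seed=42,
    lr_scheduler_type="linear",  # Adam betas/epsilon use the defaults listed above
    evaluation_strategy="epoch",
)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {"matthews_correlation": matthews_corrcoef(labels, preds)}

# `train_ds` and `eval_ds` are hypothetical tokenized datasets prepared elsewhere.
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    tokenizer=tokenizer,
    compute_metrics=compute_metrics,
)
# trainer.train()
```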
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| No log | 1.0 | 104 | 0.0243 | 1.0 |
| No log | 2.0 | 208 | 0.0074 | 1.0 |
| No log | 3.0 | 312 | 0.0041 | 1.0 |
| No log | 4.0 | 416 | 0.0028 | 1.0 |
| 0.0929 | 5.0 | 520 | 0.0021 | 1.0 |
| 0.0929 | 6.0 | 624 | 0.0016 | 1.0 |
| 0.0929 | 7.0 | 728 | 0.0014 | 1.0 |
| 0.0929 | 8.0 | 832 | 0.0012 | 1.0 |
| 0.0929 | 9.0 | 936 | 0.0012 | 1.0 |
| 0.0021 | 10.0 | 1040 | 0.0011 | 1.0 |
### Framework versions
- Transformers 4.12.3
- Pytorch 1.10.0+cu111
- Datasets 1.15.1
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["matthews_correlation"], "model-index": [{"name": "distilbert-base-uncased-finetuned-cola-4", "results": []}]}
|
text-classification
|
fadhilarkan/distilbert-base-uncased-finetuned-cola-4
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
distilbert-base-uncased-finetuned-cola-4
========================================
This model is a fine-tuned version of distilbert-base-uncased on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 0.0011
* Matthews Correlation: 1.0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 10
### Training results
### Framework versions
* Transformers 4.12.3
* Pytorch 1.10.0+cu111
* Datasets 1.15.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.10.0+cu111\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.10.0+cu111\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
57,
98,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10### Training results### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.10.0+cu111\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
-0.09556034207344055,
0.07805211842060089,
-0.0021067378111183643,
0.11713996529579163,
0.1783677339553833,
0.022473713383078575,
0.12161892652511597,
0.12117628008127213,
-0.110373854637146,
0.013630779460072517,
0.118468277156353,
0.18625356256961823,
0.0090566985309124,
0.10382425785064697,
-0.05703836306929588,
-0.2577759027481079,
-0.013524509035050869,
0.05270823836326599,
-0.07964864373207092,
0.14152126014232635,
0.09730467200279236,
-0.13233299553394318,
0.077690988779068,
0.00476899603381753,
-0.22485913336277008,
0.006691040936857462,
0.011269869282841682,
-0.06507560610771179,
0.15173886716365814,
0.019689977169036865,
0.13065245747566223,
0.008844478987157345,
0.07851602882146835,
-0.1807430535554886,
0.00961084384471178,
0.047398194670677185,
0.007037738338112831,
0.08919806033372879,
0.05608683452010155,
-0.013173866085708141,
0.11447194963693619,
-0.08453802019357681,
0.05794636160135269,
0.02307230420410633,
-0.12458401173353195,
-0.21827435493469238,
-0.07880919426679611,
0.023196635767817497,
0.06655897200107574,
0.10795116424560547,
-0.0024592536501586437,
0.1285507082939148,
-0.09859257191419601,
0.09701904654502869,
0.21520061790943146,
-0.2733873724937439,
-0.06533929705619812,
0.030302654951810837,
0.011079906485974789,
0.07334402203559875,
-0.11208351701498032,
-0.025603050366044044,
0.055906616151332855,
0.0502963550388813,
0.13870041072368622,
-0.030023757368326187,
-0.1264047771692276,
0.013685381039977074,
-0.14369335770606995,
-0.034080229699611664,
0.14124943315982819,
0.023970844224095345,
-0.02725897915661335,
-0.04599114507436752,
-0.05895756185054779,
-0.16349387168884277,
-0.039694398641586304,
-0.009159311652183533,
0.04437078535556793,
-0.033555492758750916,
-0.05325113236904144,
0.0008155679097399116,
-0.10855171084403992,
-0.0674397423863411,
-0.07005853950977325,
0.14453943073749542,
0.04057980701327324,
0.007113757077604532,
-0.03198385611176491,
0.11214929819107056,
0.02751036547124386,
-0.1304769068956375,
0.03438757732510567,
0.029752595350146294,
0.003674536943435669,
-0.049193430691957474,
-0.06688958406448364,
-0.059155985713005066,
0.006300509441643953,
0.11466404050588608,
-0.06044260412454605,
0.0494660921394825,
0.029578184708952904,
0.04678598791360855,
-0.09995048493146896,
0.1968921720981598,
-0.029277663677930832,
-0.0016804905608296394,
0.006590354721993208,
0.05093665048480034,
0.004495055414736271,
-0.008297054097056389,
-0.1185772716999054,
0.0015390112530440092,
0.11276838183403015,
0.020907023921608925,
-0.0684475228190422,
0.07602579891681671,
-0.05651255324482918,
-0.02549816109240055,
0.015366672538220882,
-0.10124894976615906,
0.03066578134894371,
0.0009704747353680432,
-0.08508987724781036,
-0.00999740045517683,
0.03101615235209465,
0.012121295556426048,
-0.03010832890868187,
0.11677926778793335,
-0.07932276278734207,
0.04085098206996918,
-0.09931379556655884,
-0.11041594296693802,
0.014606026001274586,
-0.08025963604450226,
0.02332577481865883,
-0.10207315534353256,
-0.16245539486408234,
-0.01750250533223152,
0.05720105767250061,
-0.022038307040929794,
-0.05754924938082695,
-0.05695141479372978,
-0.07465028017759323,
0.011654476635158062,
-0.014061539433896542,
0.1448276787996292,
-0.05793115124106407,
0.11250613629817963,
0.03579079359769821,
0.06668465584516525,
-0.04696211963891983,
0.0663025826215744,
-0.0986185297369957,
0.00001552505091240164,
-0.18668168783187866,
0.05283701792359352,
-0.05030929297208786,
0.0780450850725174,
-0.08703940361738205,
-0.11463645845651627,
0.012483831495046616,
-0.0025723075959831476,
0.06706702709197998,
0.09561631083488464,
-0.16663026809692383,
-0.08210770040750504,
0.15058745443820953,
-0.0661979615688324,
-0.10859732329845428,
0.11561521887779236,
-0.05904791131615639,
0.057908546179533005,
0.07961659133434296,
0.16742712259292603,
0.08636988699436188,
-0.06709348410367966,
0.02754930406808853,
0.006050212308764458,
0.04535974562168121,
-0.06989184767007828,
0.056460656225681305,
0.0022777109406888485,
-0.006472328212112188,
0.036651354283094406,
-0.027531063184142113,
0.07020660489797592,
-0.09422630816698074,
-0.09662216901779175,
-0.03970649093389511,
-0.09773749113082886,
0.06366705149412155,
0.08125269412994385,
0.09086702018976212,
-0.09115944057703018,
-0.06816452741622925,
0.07980289310216904,
0.07564093172550201,
-0.06324686110019684,
0.033139802515506744,
-0.04829319193959236,
0.05728553608059883,
-0.02592947520315647,
-0.017186470329761505,
-0.20175664126873016,
0.003579274984076619,
0.009800445288419724,
-0.0021118447184562683,
0.022105224430561066,
0.019158340990543365,
0.07007618993520737,
0.05530593916773796,
-0.05763894319534302,
-0.023445766419172287,
-0.022614210844039917,
-0.005982686765491962,
-0.1357903927564621,
-0.19189834594726562,
-0.01794586516916752,
-0.016856173053383827,
0.13668203353881836,
-0.20243683457374573,
0.042535826563835144,
-0.013600833714008331,
0.061127182096242905,
-0.00011835969053208828,
-0.0008980310522019863,
-0.04270073398947716,
0.08915959298610687,
-0.03608373925089836,
-0.04596663638949394,
0.07942758500576019,
0.004211812280118465,
-0.08519711345434189,
-0.04008161276578903,
-0.0931938886642456,
0.16648182272911072,
0.14381714165210724,
-0.13299806416034698,
-0.080294169485569,
-0.002929230220615864,
-0.05977094918489456,
-0.0314052514731884,
-0.03653065487742424,
0.03790585324168205,
0.1865789145231247,
-0.01427310612052679,
0.15958940982818604,
-0.06968540698289871,
-0.049452271312475204,
0.01764691434800625,
-0.034420445561409,
0.03238707408308983,
0.1270889937877655,
0.11029697954654694,
-0.07095804065465927,
0.14584891498088837,
0.15448947250843048,
-0.09794506430625916,
0.1313195824623108,
-0.04531214386224747,
-0.0634150356054306,
-0.0052458071149885654,
-0.017027201130986214,
-0.0012771558249369264,
0.07797569781541824,
-0.14465713500976562,
-0.007508931681513786,
0.02001344971358776,
0.020729342475533485,
0.02692278102040291,
-0.23085705935955048,
-0.03387957438826561,
0.028026286512613297,
-0.040370404720306396,
-0.003264216473326087,
-0.019535087049007416,
0.010931523516774178,
0.10834680497646332,
-0.000922244566027075,
-0.08126360923051834,
0.04392674192786217,
0.005697738379240036,
-0.08150065690279007,
0.2227540910243988,
-0.09450290352106094,
-0.1691312938928604,
-0.12657777965068817,
-0.07263422757387161,
-0.0385587178170681,
0.010869196616113186,
0.061100877821445465,
-0.10552682727575302,
-0.02284509316086769,
-0.051205702126026154,
0.02533833682537079,
-0.006028817966580391,
0.034941017627716064,
0.005279145203530788,
0.010865168645977974,
0.06849615275859833,
-0.11295577138662338,
-0.008404228836297989,
-0.05214663967490196,
-0.06259025633335114,
0.05570565164089203,
0.03044436313211918,
0.10406597703695297,
0.1695568561553955,
-0.023152602836489677,
0.006897163111716509,
-0.03130432963371277,
0.2182617336511612,
-0.06766421347856522,
-0.027940241619944572,
0.13462407886981964,
-0.00816596020013094,
0.055299025028944016,
0.10071038454771042,
0.07234703749418259,
-0.08975053578615189,
0.014620262198150158,
0.021667497232556343,
-0.0331890843808651,
-0.235690638422966,
-0.05492894724011421,
-0.05661378800868988,
-0.019739164039492607,
0.09815586358308792,
0.026742221787571907,
0.056539397686719894,
0.06671971827745438,
0.0400962233543396,
0.08761612325906754,
-0.03198970854282379,
0.04700940102338791,
0.12577128410339355,
0.04162386804819107,
0.12214410305023193,
-0.049451470375061035,
-0.0695132166147232,
0.028647636994719505,
-0.0193755142390728,
0.21600638329982758,
0.007544140797108412,
0.13374805450439453,
0.05979343131184578,
0.17528565227985382,
0.003859640331938863,
0.0901840329170227,
0.0010920797940343618,
-0.04228311777114868,
-0.007249949965626001,
-0.04099958762526512,
-0.048150982707738876,
0.011590815149247646,
-0.06159132719039917,
0.05669073387980461,
-0.12275858968496323,
-0.019181393086910248,
0.05713694915175438,
0.2508997917175293,
0.023137493059039116,
-0.3224985599517822,
-0.08612488955259323,
-0.0013583182590082288,
-0.03308222070336342,
-0.02072451449930668,
0.018936412408947945,
0.08285534381866455,
-0.10037960112094879,
0.027454396709799767,
-0.0695660263299942,
0.09578591585159302,
-0.05172184482216835,
0.04812736436724663,
0.06770417094230652,
0.0789208933711052,
0.011689284816384315,
0.08884916454553604,
-0.32858186960220337,
0.26943278312683105,
0.002263992326334119,
0.07032039016485214,
-0.08175311982631683,
-0.00036306135007180274,
0.0352669432759285,
0.06498800218105316,
0.054271332919597626,
-0.013664857484400272,
-0.010121556930243969,
-0.19614863395690918,
-0.055831216275691986,
0.023043310269713402,
0.0917072519659996,
-0.028606470674276352,
0.0839170441031456,
-0.027246221899986267,
0.006744374521076679,
0.07749010622501373,
-0.04154794290661812,
-0.05294172465801239,
-0.10022495687007904,
-0.009723914787173271,
0.014743674546480179,
-0.03651370108127594,
-0.06128797307610512,
-0.11629974842071533,
-0.13131926953792572,
0.14688466489315033,
-0.026848483830690384,
-0.03986747935414314,
-0.11086200177669525,
0.08296462893486023,
0.07482369244098663,
-0.0858897939324379,
0.05247654765844345,
0.00561444042250514,
0.05971970409154892,
0.02544144168496132,
-0.07657540589570999,
0.10322374850511551,
-0.06177733466029167,
-0.15914879739284515,
-0.05199509486556053,
0.11287356168031693,
0.03819432854652405,
0.06345521658658981,
-0.012566390447318554,
0.008454293012619019,
-0.03843201324343681,
-0.09134405851364136,
0.01483391597867012,
-0.012314366176724434,
0.08240001648664474,
0.02843143604695797,
-0.06699798256158829,
-0.0013703759759664536,
-0.06915643066167831,
-0.03265896439552307,
0.19929233193397522,
0.2167070358991623,
-0.09370847046375275,
0.02544708549976349,
0.040185924619436264,
-0.0725896954536438,
-0.20676471292972565,
0.048115141689777374,
0.0647047683596611,
0.0035170710179954767,
0.03061099164187908,
-0.18528041243553162,
0.12468897551298141,
0.09493854641914368,
-0.010122096166014671,
0.10544408857822418,
-0.33977413177490234,
-0.12722384929656982,
0.1254938542842865,
0.14389750361442566,
0.1094142347574234,
-0.14262495934963226,
-0.02236398681998253,
-0.029377354308962822,
-0.11392601579427719,
0.10677415132522583,
-0.07838708907365799,
0.12286162376403809,
-0.03501812368631363,
0.07798603177070618,
0.0036390500608831644,
-0.06041761860251427,
0.1174272745847702,
0.02372990921139717,
0.09753820300102234,
-0.061845529824495316,
-0.03561647981405258,
0.045122865587472916,
-0.034075718373060226,
0.01539558358490467,
-0.0794287919998169,
0.027323413640260696,
-0.09553362429141998,
-0.0197804793715477,
-0.08378588408231735,
0.045116253197193146,
-0.03788011148571968,
-0.056632667779922485,
-0.034896500408649445,
0.021066203713417053,
0.045606814324855804,
-0.012765043415129185,
0.12054754793643951,
0.02278381772339344,
0.14838294684886932,
0.11536911129951477,
0.0762493759393692,
-0.053183771669864655,
-0.06283087283372879,
-0.016597459092736244,
-0.01397030707448721,
0.05752783268690109,
-0.1493009775876999,
0.027681535109877586,
0.14684593677520752,
0.020976833999156952,
0.13690195977687836,
0.08634558320045471,
-0.016480477526783943,
0.0013006353983655572,
0.06411198526620865,
-0.16150890290737152,
-0.07497817277908325,
-0.010012631304562092,
-0.06135164946317673,
-0.10202880948781967,
0.049226757138967514,
0.0802275687456131,
-0.06962132453918457,
-0.012390341609716415,
-0.007011774927377701,
0.00269442331045866,
-0.06172718107700348,
0.20704631507396698,
0.06546936184167862,
0.05040401592850685,
-0.10822685807943344,
0.08012934774160385,
0.06094514578580856,
-0.07865089923143387,
-0.002164010191336274,
0.07785661518573761,
-0.08719474822282791,
-0.04959849268198013,
0.10932941734790802,
0.1687365621328354,
-0.04809458553791046,
-0.043822191655635834,
-0.13786186277866364,
-0.129017174243927,
0.07997864484786987,
0.15764963626861572,
0.12181251496076584,
0.01810179278254509,
-0.06423474848270416,
0.009700224734842777,
-0.1244455873966217,
0.0815032422542572,
0.039021510630846024,
0.06383579224348068,
-0.1340242326259613,
0.1742929071187973,
0.015584192238748074,
0.04877764359116554,
-0.021974530071020126,
0.021719006821513176,
-0.10246710479259491,
0.022209888324141502,
-0.11212967336177826,
-0.02752099744975567,
-0.015144890174269676,
0.00682470528408885,
-0.00955585204064846,
-0.05056159570813179,
-0.04613422602415085,
0.013150534592568874,
-0.12154456228017807,
-0.02147367037832737,
0.026958042755723,
0.05827387049794197,
-0.11345680803060532,
-0.043726541101932526,
0.025209588930010796,
-0.061932336539030075,
0.06392641365528107,
0.05440819635987282,
0.009444166906177998,
0.0680813416838646,
-0.13876761496067047,
-0.0029209950007498264,
0.08173337578773499,
0.015093238092958927,
0.06194562464952469,
-0.08420602977275848,
-0.0067267282865941525,
0.005385236814618111,
0.06681817024946213,
0.02448071725666523,
0.0779937133193016,
-0.14360862970352173,
0.005698604509234428,
-0.03551139682531357,
-0.0824250876903534,
-0.07015236467123032,
0.03544328734278679,
0.08339241147041321,
0.009481306187808514,
0.1958424150943756,
-0.07767083495855331,
0.043252695351839066,
-0.20980706810951233,
0.00041468857671134174,
-0.019721129909157753,
-0.12012454122304916,
-0.12529999017715454,
-0.0701192170381546,
0.0637119933962822,
-0.051146697252988815,
0.13162541389465332,
0.03318427503108978,
0.03916747868061066,
0.026906460523605347,
-0.006209552753716707,
0.0004453198052942753,
0.027775239199399948,
0.21254529058933258,
0.03538767620921135,
-0.03423648700118065,
0.06920802593231201,
0.05667848140001297,
0.09922054409980774,
0.1153043881058693,
0.1916305273771286,
0.15626010298728943,
-0.02376456931233406,
0.09178198873996735,
0.022733312100172043,
-0.047139015048742294,
-0.15313786268234253,
0.04682576283812523,
-0.050764765590429306,
0.09924725443124771,
-0.024942627176642418,
0.21693110466003418,
0.05260614678263664,
-0.16713950037956238,
0.05349664017558098,
-0.05349605157971382,
-0.09459717571735382,
-0.1030881255865097,
-0.035837866365909576,
-0.07850270718336105,
-0.1383875608444214,
-0.004493194632232189,
-0.10666213184595108,
0.011473587714135647,
0.10762526094913483,
0.005083993077278137,
-0.03141387552022934,
0.16149866580963135,
0.03497380390763283,
0.01570090278983116,
0.06610387563705444,
-0.0021981131285429,
-0.02890540473163128,
-0.117701955139637,
-0.06297197937965393,
-0.018291717395186424,
-0.0058380174450576305,
0.03740764036774635,
-0.05450373515486717,
-0.07328589260578156,
0.037810999900102615,
-0.030131494626402855,
-0.09950963407754898,
0.018587881699204445,
0.021023960784077644,
0.06585615873336792,
0.04975156858563423,
0.007895459420979023,
0.008172912523150444,
-0.01110062561929226,
0.21637268364429474,
-0.07555282860994339,
-0.09062347561120987,
-0.08825574815273285,
0.26799678802490234,
0.050293367356061935,
-0.007123897783458233,
0.0346154160797596,
-0.06087297573685646,
-0.0004811020044144243,
0.26281481981277466,
0.20377573370933533,
-0.08510349690914154,
-0.008055216632783413,
0.006707715801894665,
-0.009042750112712383,
-0.012489554472267628,
0.12117348611354828,
0.1459304690361023,
0.0376087948679924,
-0.1049448773264885,
-0.038312774151563644,
-0.05585673823952675,
-0.017992133274674416,
-0.045290812849998474,
0.06608225405216217,
0.03798222169280052,
-0.00048317271284759045,
-0.036708712577819824,
0.06433653831481934,
-0.07012129575014114,
-0.0921628400683403,
0.05540255457162857,
-0.20935675501823425,
-0.16130171716213226,
-0.018556097522377968,
0.10811524838209152,
-0.00021055502293165773,
0.05965680256485939,
-0.023394767194986343,
0.0011714318534359336,
0.06737249344587326,
-0.023418273776769638,
-0.08876517415046692,
-0.07302337884902954,
0.09162282198667526,
-0.11785074323415756,
0.18005475401878357,
-0.04252246022224426,
0.0704759880900383,
0.12048054486513138,
0.07245752215385437,
-0.06212114915251732,
0.061459992080926895,
0.03296166658401489,
-0.06643841415643692,
0.03999544307589531,
0.08310592919588089,
-0.029080405831336975,
0.03580754995346069,
0.0364837683737278,
-0.1218988448381424,
0.032072558999061584,
-0.07009761780500412,
-0.05440477654337883,
-0.041996635496616364,
-0.043330252170562744,
-0.05355597659945488,
0.11858376115560532,
0.22038497030735016,
-0.02553378976881504,
0.011790585704147816,
-0.0759049579501152,
0.0007484169909730554,
0.04757663980126381,
0.016001509502530098,
-0.07893349230289459,
-0.23261138796806335,
0.0029817826580256224,
0.05285237357020378,
-0.017904141917824745,
-0.2266920953989029,
-0.0889548733830452,
-0.0016265064477920532,
-0.07622618228197098,
-0.10240919888019562,
0.08156115561723709,
0.07113739848136902,
0.05170882120728493,
-0.05117121338844299,
-0.0724811777472496,
-0.08120547980070114,
0.15952250361442566,
-0.1556859165430069,
-0.08754152804613113
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-cola
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unspecified dataset (recorded as "None" by the Trainer).
It achieves the following results on the evaluation set:
- Loss: 0.0008
- Matthews Correlation: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| No log | 1.0 | 130 | 0.0166 | 1.0 |
| No log | 2.0 | 260 | 0.0054 | 1.0 |
| No log | 3.0 | 390 | 0.0029 | 1.0 |
| 0.0968 | 4.0 | 520 | 0.0019 | 1.0 |
| 0.0968 | 5.0 | 650 | 0.0014 | 1.0 |
| 0.0968 | 6.0 | 780 | 0.0011 | 1.0 |
| 0.0968 | 7.0 | 910 | 0.0010 | 1.0 |
| 0.0018 | 8.0 | 1040 | 0.0008 | 1.0 |
| 0.0018 | 9.0 | 1170 | 0.0008 | 1.0 |
| 0.0018 | 10.0 | 1300 | 0.0008 | 1.0 |
### Framework versions
- Transformers 4.12.3
- Pytorch 1.10.0+cu111
- Datasets 1.15.1
- Tokenizers 0.10.3
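As a usage sketch (not included in the original card), the published checkpoint can be queried through the standard `pipeline` API; the input sentence below is made up, and the meaning of `LABEL_0`/`LABEL_1` depends on how the classification head was configured:
```python
from transformers import pipeline

# Load the fine-tuned checkpoint for text classification.
classifier = pipeline(
    "text-classification",
    model="fadhilarkan/distilbert-base-uncased-finetuned-cola",
)

# Returns a list of {"label": ..., "score": ...} dicts.
print(classifier("The book was read by the whole class."))
```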
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["matthews_correlation"], "model-index": [{"name": "distilbert-base-uncased-finetuned-cola", "results": []}]}
|
text-classification
|
fadhilarkan/distilbert-base-uncased-finetuned-cola
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
distilbert-base-uncased-finetuned-cola
======================================
This model is a fine-tuned version of distilbert-base-uncased on an unspecified dataset (recorded as "None" by the Trainer).
It achieves the following results on the evaluation set:
* Loss: 0.0008
* Matthews Correlation: 1.0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 10
### Training results
### Framework versions
* Transformers 4.12.3
* Pytorch 1.10.0+cu111
* Datasets 1.15.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.10.0+cu111\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.10.0+cu111\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
57,
98,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10### Training results### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.10.0+cu111\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
-0.09556034207344055,
0.07805211842060089,
-0.0021067378111183643,
0.11713996529579163,
0.1783677339553833,
0.022473713383078575,
0.12161892652511597,
0.12117628008127213,
-0.110373854637146,
0.013630779460072517,
0.118468277156353,
0.18625356256961823,
0.0090566985309124,
0.10382425785064697,
-0.05703836306929588,
-0.2577759027481079,
-0.013524509035050869,
0.05270823836326599,
-0.07964864373207092,
0.14152126014232635,
0.09730467200279236,
-0.13233299553394318,
0.077690988779068,
0.00476899603381753,
-0.22485913336277008,
0.006691040936857462,
0.011269869282841682,
-0.06507560610771179,
0.15173886716365814,
0.019689977169036865,
0.13065245747566223,
0.008844478987157345,
0.07851602882146835,
-0.1807430535554886,
0.00961084384471178,
0.047398194670677185,
0.007037738338112831,
0.08919806033372879,
0.05608683452010155,
-0.013173866085708141,
0.11447194963693619,
-0.08453802019357681,
0.05794636160135269,
0.02307230420410633,
-0.12458401173353195,
-0.21827435493469238,
-0.07880919426679611,
0.023196635767817497,
0.06655897200107574,
0.10795116424560547,
-0.0024592536501586437,
0.1285507082939148,
-0.09859257191419601,
0.09701904654502869,
0.21520061790943146,
-0.2733873724937439,
-0.06533929705619812,
0.030302654951810837,
0.011079906485974789,
0.07334402203559875,
-0.11208351701498032,
-0.025603050366044044,
0.055906616151332855,
0.0502963550388813,
0.13870041072368622,
-0.030023757368326187,
-0.1264047771692276,
0.013685381039977074,
-0.14369335770606995,
-0.034080229699611664,
0.14124943315982819,
0.023970844224095345,
-0.02725897915661335,
-0.04599114507436752,
-0.05895756185054779,
-0.16349387168884277,
-0.039694398641586304,
-0.009159311652183533,
0.04437078535556793,
-0.033555492758750916,
-0.05325113236904144,
0.0008155679097399116,
-0.10855171084403992,
-0.0674397423863411,
-0.07005853950977325,
0.14453943073749542,
0.04057980701327324,
0.007113757077604532,
-0.03198385611176491,
0.11214929819107056,
0.02751036547124386,
-0.1304769068956375,
0.03438757732510567,
0.029752595350146294,
0.003674536943435669,
-0.049193430691957474,
-0.06688958406448364,
-0.059155985713005066,
0.006300509441643953,
0.11466404050588608,
-0.06044260412454605,
0.0494660921394825,
0.029578184708952904,
0.04678598791360855,
-0.09995048493146896,
0.1968921720981598,
-0.029277663677930832,
-0.0016804905608296394,
0.006590354721993208,
0.05093665048480034,
0.004495055414736271,
-0.008297054097056389,
-0.1185772716999054,
0.0015390112530440092,
0.11276838183403015,
0.020907023921608925,
-0.0684475228190422,
0.07602579891681671,
-0.05651255324482918,
-0.02549816109240055,
0.015366672538220882,
-0.10124894976615906,
0.03066578134894371,
0.0009704747353680432,
-0.08508987724781036,
-0.00999740045517683,
0.03101615235209465,
0.012121295556426048,
-0.03010832890868187,
0.11677926778793335,
-0.07932276278734207,
0.04085098206996918,
-0.09931379556655884,
-0.11041594296693802,
0.014606026001274586,
-0.08025963604450226,
0.02332577481865883,
-0.10207315534353256,
-0.16245539486408234,
-0.01750250533223152,
0.05720105767250061,
-0.022038307040929794,
-0.05754924938082695,
-0.05695141479372978,
-0.07465028017759323,
0.011654476635158062,
-0.014061539433896542,
0.1448276787996292,
-0.05793115124106407,
0.11250613629817963,
0.03579079359769821,
0.06668465584516525,
-0.04696211963891983,
0.0663025826215744,
-0.0986185297369957,
0.00001552505091240164,
-0.18668168783187866,
0.05283701792359352,
-0.05030929297208786,
0.0780450850725174,
-0.08703940361738205,
-0.11463645845651627,
0.012483831495046616,
-0.0025723075959831476,
0.06706702709197998,
0.09561631083488464,
-0.16663026809692383,
-0.08210770040750504,
0.15058745443820953,
-0.0661979615688324,
-0.10859732329845428,
0.11561521887779236,
-0.05904791131615639,
0.057908546179533005,
0.07961659133434296,
0.16742712259292603,
0.08636988699436188,
-0.06709348410367966,
0.02754930406808853,
0.006050212308764458,
0.04535974562168121,
-0.06989184767007828,
0.056460656225681305,
0.0022777109406888485,
-0.006472328212112188,
0.036651354283094406,
-0.027531063184142113,
0.07020660489797592,
-0.09422630816698074,
-0.09662216901779175,
-0.03970649093389511,
-0.09773749113082886,
0.06366705149412155,
0.08125269412994385,
0.09086702018976212,
-0.09115944057703018,
-0.06816452741622925,
0.07980289310216904,
0.07564093172550201,
-0.06324686110019684,
0.033139802515506744,
-0.04829319193959236,
0.05728553608059883,
-0.02592947520315647,
-0.017186470329761505,
-0.20175664126873016,
0.003579274984076619,
0.009800445288419724,
-0.0021118447184562683,
0.022105224430561066,
0.019158340990543365,
0.07007618993520737,
0.05530593916773796,
-0.05763894319534302,
-0.023445766419172287,
-0.022614210844039917,
-0.005982686765491962,
-0.1357903927564621,
-0.19189834594726562,
-0.01794586516916752,
-0.016856173053383827,
0.13668203353881836,
-0.20243683457374573,
0.042535826563835144,
-0.013600833714008331,
0.061127182096242905,
-0.00011835969053208828,
-0.0008980310522019863,
-0.04270073398947716,
0.08915959298610687,
-0.03608373925089836,
-0.04596663638949394,
0.07942758500576019,
0.004211812280118465,
-0.08519711345434189,
-0.04008161276578903,
-0.0931938886642456,
0.16648182272911072,
0.14381714165210724,
-0.13299806416034698,
-0.080294169485569,
-0.002929230220615864,
-0.05977094918489456,
-0.0314052514731884,
-0.03653065487742424,
0.03790585324168205,
0.1865789145231247,
-0.01427310612052679,
0.15958940982818604,
-0.06968540698289871,
-0.049452271312475204,
0.01764691434800625,
-0.034420445561409,
0.03238707408308983,
0.1270889937877655,
0.11029697954654694,
-0.07095804065465927,
0.14584891498088837,
0.15448947250843048,
-0.09794506430625916,
0.1313195824623108,
-0.04531214386224747,
-0.0634150356054306,
-0.0052458071149885654,
-0.017027201130986214,
-0.0012771558249369264,
0.07797569781541824,
-0.14465713500976562,
-0.007508931681513786,
0.02001344971358776,
0.020729342475533485,
0.02692278102040291,
-0.23085705935955048,
-0.03387957438826561,
0.028026286512613297,
-0.040370404720306396,
-0.003264216473326087,
-0.019535087049007416,
0.010931523516774178,
0.10834680497646332,
-0.000922244566027075,
-0.08126360923051834,
0.04392674192786217,
0.005697738379240036,
-0.08150065690279007,
0.2227540910243988,
-0.09450290352106094,
-0.1691312938928604,
-0.12657777965068817,
-0.07263422757387161,
-0.0385587178170681,
0.010869196616113186,
0.061100877821445465,
-0.10552682727575302,
-0.02284509316086769,
-0.051205702126026154,
0.02533833682537079,
-0.006028817966580391,
0.034941017627716064,
0.005279145203530788,
0.010865168645977974,
0.06849615275859833,
-0.11295577138662338,
-0.008404228836297989,
-0.05214663967490196,
-0.06259025633335114,
0.05570565164089203,
0.03044436313211918,
0.10406597703695297,
0.1695568561553955,
-0.023152602836489677,
0.006897163111716509,
-0.03130432963371277,
0.2182617336511612,
-0.06766421347856522,
-0.027940241619944572,
0.13462407886981964,
-0.00816596020013094,
0.055299025028944016,
0.10071038454771042,
0.07234703749418259,
-0.08975053578615189,
0.014620262198150158,
0.021667497232556343,
-0.0331890843808651,
-0.235690638422966,
-0.05492894724011421,
-0.05661378800868988,
-0.019739164039492607,
0.09815586358308792,
0.026742221787571907,
0.056539397686719894,
0.06671971827745438,
0.0400962233543396,
0.08761612325906754,
-0.03198970854282379,
0.04700940102338791,
0.12577128410339355,
0.04162386804819107,
0.12214410305023193,
-0.049451470375061035,
-0.0695132166147232,
0.028647636994719505,
-0.0193755142390728,
0.21600638329982758,
0.007544140797108412,
0.13374805450439453,
0.05979343131184578,
0.17528565227985382,
0.003859640331938863,
0.0901840329170227,
0.0010920797940343618,
-0.04228311777114868,
-0.007249949965626001,
-0.04099958762526512,
-0.048150982707738876,
0.011590815149247646,
-0.06159132719039917,
0.05669073387980461,
-0.12275858968496323,
-0.019181393086910248,
0.05713694915175438,
0.2508997917175293,
0.023137493059039116,
-0.3224985599517822,
-0.08612488955259323,
-0.0013583182590082288,
-0.03308222070336342,
-0.02072451449930668,
0.018936412408947945,
0.08285534381866455,
-0.10037960112094879,
0.027454396709799767,
-0.0695660263299942,
0.09578591585159302,
-0.05172184482216835,
0.04812736436724663,
0.06770417094230652,
0.0789208933711052,
0.011689284816384315,
0.08884916454553604,
-0.32858186960220337,
0.26943278312683105,
0.002263992326334119,
0.07032039016485214,
-0.08175311982631683,
-0.00036306135007180274,
0.0352669432759285,
0.06498800218105316,
0.054271332919597626,
-0.013664857484400272,
-0.010121556930243969,
-0.19614863395690918,
-0.055831216275691986,
0.023043310269713402,
0.0917072519659996,
-0.028606470674276352,
0.0839170441031456,
-0.027246221899986267,
0.006744374521076679,
0.07749010622501373,
-0.04154794290661812,
-0.05294172465801239,
-0.10022495687007904,
-0.009723914787173271,
0.014743674546480179,
-0.03651370108127594,
-0.06128797307610512,
-0.11629974842071533,
-0.13131926953792572,
0.14688466489315033,
-0.026848483830690384,
-0.03986747935414314,
-0.11086200177669525,
0.08296462893486023,
0.07482369244098663,
-0.0858897939324379,
0.05247654765844345,
0.00561444042250514,
0.05971970409154892,
0.02544144168496132,
-0.07657540589570999,
0.10322374850511551,
-0.06177733466029167,
-0.15914879739284515,
-0.05199509486556053,
0.11287356168031693,
0.03819432854652405,
0.06345521658658981,
-0.012566390447318554,
0.008454293012619019,
-0.03843201324343681,
-0.09134405851364136,
0.01483391597867012,
-0.012314366176724434,
0.08240001648664474,
0.02843143604695797,
-0.06699798256158829,
-0.0013703759759664536,
-0.06915643066167831,
-0.03265896439552307,
0.19929233193397522,
0.2167070358991623,
-0.09370847046375275,
0.02544708549976349,
0.040185924619436264,
-0.0725896954536438,
-0.20676471292972565,
0.048115141689777374,
0.0647047683596611,
0.0035170710179954767,
0.03061099164187908,
-0.18528041243553162,
0.12468897551298141,
0.09493854641914368,
-0.010122096166014671,
0.10544408857822418,
-0.33977413177490234,
-0.12722384929656982,
0.1254938542842865,
0.14389750361442566,
0.1094142347574234,
-0.14262495934963226,
-0.02236398681998253,
-0.029377354308962822,
-0.11392601579427719,
0.10677415132522583,
-0.07838708907365799,
0.12286162376403809,
-0.03501812368631363,
0.07798603177070618,
0.0036390500608831644,
-0.06041761860251427,
0.1174272745847702,
0.02372990921139717,
0.09753820300102234,
-0.061845529824495316,
-0.03561647981405258,
0.045122865587472916,
-0.034075718373060226,
0.01539558358490467,
-0.0794287919998169,
0.027323413640260696,
-0.09553362429141998,
-0.0197804793715477,
-0.08378588408231735,
0.045116253197193146,
-0.03788011148571968,
-0.056632667779922485,
-0.034896500408649445,
0.021066203713417053,
0.045606814324855804,
-0.012765043415129185,
0.12054754793643951,
0.02278381772339344,
0.14838294684886932,
0.11536911129951477,
0.0762493759393692,
-0.053183771669864655,
-0.06283087283372879,
-0.016597459092736244,
-0.01397030707448721,
0.05752783268690109,
-0.1493009775876999,
0.027681535109877586,
0.14684593677520752,
0.020976833999156952,
0.13690195977687836,
0.08634558320045471,
-0.016480477526783943,
0.0013006353983655572,
0.06411198526620865,
-0.16150890290737152,
-0.07497817277908325,
-0.010012631304562092,
-0.06135164946317673,
-0.10202880948781967,
0.049226757138967514,
0.0802275687456131,
-0.06962132453918457,
-0.012390341609716415,
-0.007011774927377701,
0.00269442331045866,
-0.06172718107700348,
0.20704631507396698,
0.06546936184167862,
0.05040401592850685,
-0.10822685807943344,
0.08012934774160385,
0.06094514578580856,
-0.07865089923143387,
-0.002164010191336274,
0.07785661518573761,
-0.08719474822282791,
-0.04959849268198013,
0.10932941734790802,
0.1687365621328354,
-0.04809458553791046,
-0.043822191655635834,
-0.13786186277866364,
-0.129017174243927,
0.07997864484786987,
0.15764963626861572,
0.12181251496076584,
0.01810179278254509,
-0.06423474848270416,
0.009700224734842777,
-0.1244455873966217,
0.0815032422542572,
0.039021510630846024,
0.06383579224348068,
-0.1340242326259613,
0.1742929071187973,
0.015584192238748074,
0.04877764359116554,
-0.021974530071020126,
0.021719006821513176,
-0.10246710479259491,
0.022209888324141502,
-0.11212967336177826,
-0.02752099744975567,
-0.015144890174269676,
0.00682470528408885,
-0.00955585204064846,
-0.05056159570813179,
-0.04613422602415085,
0.013150534592568874,
-0.12154456228017807,
-0.02147367037832737,
0.026958042755723,
0.05827387049794197,
-0.11345680803060532,
-0.043726541101932526,
0.025209588930010796,
-0.061932336539030075,
0.06392641365528107,
0.05440819635987282,
0.009444166906177998,
0.0680813416838646,
-0.13876761496067047,
-0.0029209950007498264,
0.08173337578773499,
0.015093238092958927,
0.06194562464952469,
-0.08420602977275848,
-0.0067267282865941525,
0.005385236814618111,
0.06681817024946213,
0.02448071725666523,
0.0779937133193016,
-0.14360862970352173,
0.005698604509234428,
-0.03551139682531357,
-0.0824250876903534,
-0.07015236467123032,
0.03544328734278679,
0.08339241147041321,
0.009481306187808514,
0.1958424150943756,
-0.07767083495855331,
0.043252695351839066,
-0.20980706810951233,
0.00041468857671134174,
-0.019721129909157753,
-0.12012454122304916,
-0.12529999017715454,
-0.0701192170381546,
0.0637119933962822,
-0.051146697252988815,
0.13162541389465332,
0.03318427503108978,
0.03916747868061066,
0.026906460523605347,
-0.006209552753716707,
0.0004453198052942753,
0.027775239199399948,
0.21254529058933258,
0.03538767620921135,
-0.03423648700118065,
0.06920802593231201,
0.05667848140001297,
0.09922054409980774,
0.1153043881058693,
0.1916305273771286,
0.15626010298728943,
-0.02376456931233406,
0.09178198873996735,
0.022733312100172043,
-0.047139015048742294,
-0.15313786268234253,
0.04682576283812523,
-0.050764765590429306,
0.09924725443124771,
-0.024942627176642418,
0.21693110466003418,
0.05260614678263664,
-0.16713950037956238,
0.05349664017558098,
-0.05349605157971382,
-0.09459717571735382,
-0.1030881255865097,
-0.035837866365909576,
-0.07850270718336105,
-0.1383875608444214,
-0.004493194632232189,
-0.10666213184595108,
0.011473587714135647,
0.10762526094913483,
0.005083993077278137,
-0.03141387552022934,
0.16149866580963135,
0.03497380390763283,
0.01570090278983116,
0.06610387563705444,
-0.0021981131285429,
-0.02890540473163128,
-0.117701955139637,
-0.06297197937965393,
-0.018291717395186424,
-0.0058380174450576305,
0.03740764036774635,
-0.05450373515486717,
-0.07328589260578156,
0.037810999900102615,
-0.030131494626402855,
-0.09950963407754898,
0.018587881699204445,
0.021023960784077644,
0.06585615873336792,
0.04975156858563423,
0.007895459420979023,
0.008172912523150444,
-0.01110062561929226,
0.21637268364429474,
-0.07555282860994339,
-0.09062347561120987,
-0.08825574815273285,
0.26799678802490234,
0.050293367356061935,
-0.007123897783458233,
0.0346154160797596,
-0.06087297573685646,
-0.0004811020044144243,
0.26281481981277466,
0.20377573370933533,
-0.08510349690914154,
-0.008055216632783413,
0.006707715801894665,
-0.009042750112712383,
-0.012489554472267628,
0.12117348611354828,
0.1459304690361023,
0.0376087948679924,
-0.1049448773264885,
-0.038312774151563644,
-0.05585673823952675,
-0.017992133274674416,
-0.045290812849998474,
0.06608225405216217,
0.03798222169280052,
-0.00048317271284759045,
-0.036708712577819824,
0.06433653831481934,
-0.07012129575014114,
-0.0921628400683403,
0.05540255457162857,
-0.20935675501823425,
-0.16130171716213226,
-0.018556097522377968,
0.10811524838209152,
-0.00021055502293165773,
0.05965680256485939,
-0.023394767194986343,
0.0011714318534359336,
0.06737249344587326,
-0.023418273776769638,
-0.08876517415046692,
-0.07302337884902954,
0.09162282198667526,
-0.11785074323415756,
0.18005475401878357,
-0.04252246022224426,
0.0704759880900383,
0.12048054486513138,
0.07245752215385437,
-0.06212114915251732,
0.061459992080926895,
0.03296166658401489,
-0.06643841415643692,
0.03999544307589531,
0.08310592919588089,
-0.029080405831336975,
0.03580754995346069,
0.0364837683737278,
-0.1218988448381424,
0.032072558999061584,
-0.07009761780500412,
-0.05440477654337883,
-0.041996635496616364,
-0.043330252170562744,
-0.05355597659945488,
0.11858376115560532,
0.22038497030735016,
-0.02553378976881504,
0.011790585704147816,
-0.0759049579501152,
0.0007484169909730554,
0.04757663980126381,
0.016001509502530098,
-0.07893349230289459,
-0.23261138796806335,
0.0029817826580256224,
0.05285237357020378,
-0.017904141917824745,
-0.2266920953989029,
-0.0889548733830452,
-0.0016265064477920532,
-0.07622618228197098,
-0.10240919888019562,
0.08156115561723709,
0.07113739848136902,
0.05170882120728493,
-0.05117121338844299,
-0.0724811777472496,
-0.08120547980070114,
0.15952250361442566,
-0.1556859165430069,
-0.08754152804613113
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-squad
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the squad dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1523
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 1.2171 | 1.0 | 5533 | 1.1511 |
| 0.952 | 2.0 | 11066 | 1.1180 |
| 0.7707 | 3.0 | 16599 | 1.1523 |
### Framework versions
- Transformers 4.9.2
- Pytorch 1.9.0+cu102
- Datasets 1.11.0
- Tokenizers 0.10.3
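As a usage sketch (not included in the original card), the checkpoint can be used for extractive question answering via `pipeline`; the question and context below are made up:
```python
from transformers import pipeline

# Load the SQuAD fine-tune for extractive question answering.
qa = pipeline(
    "question-answering",
    model="fadhilarkan/distilbert-base-uncased-finetuned-squad",
)

# The model returns a span extracted from the context, plus a confidence score.
result = qa(
    question="What do extractive QA models return?",
    context="Extractive question answering models return a span of text copied from the context.",
)
print(result["answer"], result["score"])
```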
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model_index": [{"name": "distilbert-base-uncased-finetuned-squad", "results": [{"task": {"name": "Question Answering", "type": "question-answering"}, "dataset": {"name": "squad", "type": "squad", "args": "plain_text"}}]}]}
|
question-answering
|
fadhilarkan/distilbert-base-uncased-finetuned-squad
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #distilbert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
distilbert-base-uncased-finetuned-squad
=======================================
This model is a fine-tuned version of distilbert-base-uncased on the squad dataset.
It achieves the following results on the evaluation set:
* Loss: 1.1523
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3
### Training results
### Framework versions
* Transformers 4.9.2
* Pytorch 1.9.0+cu102
* Datasets 1.11.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.9.2\n* Pytorch 1.9.0+cu102\n* Datasets 1.11.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #distilbert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.9.2\n* Pytorch 1.9.0+cu102\n* Datasets 1.11.0\n* Tokenizers 0.10.3"
] |
[
56,
98,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3### Training results### Framework versions\n\n\n* Transformers 4.9.2\n* Pytorch 1.9.0+cu102\n* Datasets 1.11.0\n* Tokenizers 0.10.3"
] |
[
-0.10705362260341644,
0.08642617613077164,
-0.0019497052999213338,
0.12088190019130707,
0.16487732529640198,
0.023926420137286186,
0.09646541625261307,
0.1237303763628006,
-0.09583912789821625,
0.030682923272252083,
0.13164585828781128,
0.1725284904241562,
0.0014034198829904199,
0.06260303407907486,
-0.046532340347766876,
-0.2176804095506668,
-0.0197356678545475,
0.05550070479512215,
-0.09573041647672653,
0.14594170451164246,
0.08441159874200821,
-0.15058943629264832,
0.07064444571733475,
0.0026585650630295277,
-0.21669349074363708,
0.015164773911237717,
0.0025265924632549286,
-0.03662744536995888,
0.1419433057308197,
0.001985737821087241,
0.12078787386417389,
0.0011728659737855196,
0.06436105072498322,
-0.18098999559879303,
0.014821168966591358,
0.04970328509807587,
0.009376498870551586,
0.08294200152158737,
0.049711763858795166,
0.007905444130301476,
0.10354091972112656,
-0.08558771014213562,
0.040785662829875946,
0.0262104831635952,
-0.12860290706157684,
-0.25578513741493225,
-0.10602656751871109,
0.015139518305659294,
0.0654694139957428,
0.12071394920349121,
-0.0052402629517018795,
0.15999731421470642,
-0.1173391044139862,
0.0887121632695198,
0.2535759210586548,
-0.2953091561794281,
-0.06985343992710114,
0.0330313965678215,
0.027709294110536575,
0.0725269541144371,
-0.10707271844148636,
-0.032564517110586166,
0.05216984823346138,
0.05382472276687622,
0.10757655650377274,
-0.04613222926855087,
-0.12844090163707733,
0.03627266734838486,
-0.14919820427894592,
-0.04196754842996597,
0.14769722521305084,
0.04810914397239685,
-0.025381309911608696,
-0.02513941563665867,
-0.059023331850767136,
-0.1273024082183838,
-0.022567089647054672,
-0.018867578357458115,
0.047368716448545456,
-0.04896422475576401,
-0.08322884887456894,
-0.004689880646765232,
-0.10715536028146744,
-0.08304325491189957,
-0.07341541349887848,
0.1329653561115265,
0.041193850338459015,
0.03117016702890396,
-0.04723931476473808,
0.10155755281448364,
0.008000984787940979,
-0.13084359467029572,
0.013906402513384819,
0.03287709504365921,
-0.01716935634613037,
-0.034180060029029846,
-0.06618786603212357,
-0.06048790365457535,
0.02539178542792797,
0.11794868856668472,
-0.07573472708463669,
0.03353269398212433,
0.05489823594689369,
0.041960131376981735,
-0.08697438985109329,
0.1789291352033615,
-0.07115312665700912,
-0.005240211263298988,
-0.005402738694101572,
0.0408657006919384,
0.0005355849862098694,
0.0027468781918287277,
-0.09764566272497177,
-0.006681119557470083,
0.09413602203130722,
0.02257387898862362,
-0.03649115189909935,
0.05305568501353264,
-0.051307108253240585,
-0.02829805761575699,
0.0019010143587365746,
-0.08408232778310776,
0.028822651132941246,
-0.0017927183071151376,
-0.0923161432147026,
-0.016438324004411697,
0.007014493457973003,
0.014326117932796478,
-0.015818027779459953,
0.09484752267599106,
-0.09182373434305191,
0.038419175893068314,
-0.09660188853740692,
-0.09849677234888077,
0.0263642780482769,
-0.0894007533788681,
0.028788752853870392,
-0.07989021390676498,
-0.15778681635856628,
-0.01351870596408844,
0.049795132130384445,
-0.025749336928129196,
-0.049776751548051834,
-0.032351743429899216,
-0.09150053560733795,
-0.013265893794596195,
-0.019873524084687233,
0.16068367660045624,
-0.059537623077631,
0.11866495013237,
0.03918783366680145,
0.06486118584871292,
-0.044445719569921494,
0.060257744044065475,
-0.10458165407180786,
0.014094223268330097,
-0.17121462523937225,
0.03698045387864113,
-0.056339092552661896,
0.06332071125507355,
-0.10142675042152405,
-0.12318113446235657,
0.026366669684648514,
-0.020798826590180397,
0.08536434173583984,
0.0960046648979187,
-0.1686159372329712,
-0.05689946562051773,
0.14272886514663696,
-0.051902204751968384,
-0.14702355861663818,
0.12120480090379715,
-0.054985251277685165,
0.034819599241018295,
0.06487157195806503,
0.17154933512210846,
0.05769319459795952,
-0.08649806678295135,
0.012693027034401894,
0.00015909463400021195,
0.0353405699133873,
-0.08103270828723907,
0.07234657555818558,
-0.008505509234964848,
0.02899428829550743,
0.025746896862983704,
-0.061729490756988525,
0.05647541955113411,
-0.11129213124513626,
-0.09641294181346893,
-0.04998811334371567,
-0.106446273624897,
0.04362386837601662,
0.09032674133777618,
0.0696628987789154,
-0.10130131244659424,
-0.0668836236000061,
0.0788530483841896,
0.07055991142988205,
-0.05555995553731918,
0.03020990826189518,
-0.06079702824354172,
0.07214636355638504,
-0.07054517418146133,
-0.029756775125861168,
-0.19741395115852356,
-0.03161172196269035,
0.006132589653134346,
0.0016259511467069387,
0.011525372974574566,
0.047803446650505066,
0.07450693100690842,
0.040761761367321014,
-0.053407590836286545,
-0.022477339953184128,
-0.05230104923248291,
-0.007154479622840881,
-0.13175255060195923,
-0.18763092160224915,
-0.03859427943825722,
-0.013990151695907116,
0.09929783642292023,
-0.1770028918981552,
0.02430090494453907,
-0.011846322566270828,
0.06863868236541748,
-0.007940981537103653,
-0.014104013331234455,
-0.035152602940797806,
0.08532548695802689,
-0.017078038305044174,
-0.048104576766490936,
0.08010081201791763,
-0.0031806393526494503,
-0.09105556458234787,
-0.05952604487538338,
-0.05989661067724228,
0.14917483925819397,
0.13060101866722107,
-0.11907023936510086,
-0.062262170016765594,
0.008860456757247448,
-0.072929747402668,
-0.03914691135287285,
-0.03998178616166115,
0.041032832115888596,
0.17827971279621124,
-0.0040969084948301315,
0.12297860532999039,
-0.08482526242733002,
-0.051230937242507935,
0.010234813205897808,
-0.032902657985687256,
0.04353863745927811,
0.12846434116363525,
0.12255699932575226,
-0.0621359683573246,
0.14226031303405762,
0.15808096528053284,
-0.09311322122812271,
0.10119669884443283,
-0.06402391940355301,
-0.08695408701896667,
-0.03272510692477226,
0.0031451815739274025,
-0.007456344552338123,
0.12160729616880417,
-0.14851056039333344,
0.017792554572224617,
0.03243545815348625,
0.020606746897101402,
0.026502549648284912,
-0.23197223246097565,
-0.062006331980228424,
0.01709997095167637,
-0.04970545694231987,
-0.030256427824497223,
-0.00239208503626287,
0.020901255309581757,
0.10108381509780884,
-0.0038939465302973986,
-0.059709277004003525,
0.04561899974942207,
-0.0017187908524647355,
-0.06682553142309189,
0.2184758484363556,
-0.07526005059480667,
-0.1254199743270874,
-0.094719298183918,
-0.04555321857333183,
-0.052239466458559036,
-0.007215939927846193,
0.0673716813325882,
-0.09399743378162384,
-0.013393103145062923,
-0.047499414533376694,
0.01952977478504181,
-0.012330112047493458,
0.02118721976876259,
-0.0015438346890732646,
-0.0036645778454840183,
0.07651335746049881,
-0.11892422288656235,
0.007656872738152742,
-0.058123666793107986,
-0.07646090537309647,
0.05483946204185486,
0.05542798712849617,
0.13111157715320587,
0.14213572442531586,
-0.013177195563912392,
0.0069211432710289955,
-0.02272847108542919,
0.25134915113449097,
-0.0702657625079155,
-0.039946649223566055,
0.14465570449829102,
0.014385288581252098,
0.059205565601587296,
0.10434228181838989,
0.07397183030843735,
-0.09202170372009277,
0.005222668405622244,
0.03815104439854622,
-0.03968149796128273,
-0.24625565111637115,
-0.03230684623122215,
-0.06055295094847679,
-0.027167487889528275,
0.06976820528507233,
0.0244212057441473,
0.03334059193730354,
0.07314952462911606,
0.04392261058092117,
0.039242926985025406,
-0.0703417956829071,
0.04013306275010109,
0.11037158966064453,
0.04307779297232628,
0.11677433550357819,
-0.046103473752737045,
-0.05851686745882034,
0.027720250189304352,
0.0017232088139280677,
0.2433367222547531,
-0.012260407209396362,
0.14298272132873535,
0.08375278860330582,
0.22033469378948212,
-0.01632688194513321,
0.08038175851106644,
-0.012282149866223335,
-0.04716859012842178,
-0.005035204812884331,
-0.03305821120738983,
-0.027583355084061623,
0.004373906645923853,
-0.03588651493191719,
0.0717795193195343,
-0.09873504191637039,
-0.01898931711912155,
0.05804295465350151,
0.27385497093200684,
0.028827782720327377,
-0.29747670888900757,
-0.08714888244867325,
-0.010121333412826061,
-0.02386089786887169,
-0.003528148867189884,
0.018887881189584732,
0.12392908334732056,
-0.09558102488517761,
0.0010316878324374557,
-0.06822389364242554,
0.09815212339162827,
-0.009702709503471851,
0.037318453192710876,
0.0779808834195137,
0.08910394459962845,
0.0179166030138731,
0.0931445062160492,
-0.31035757064819336,
0.2663770914077759,
0.0007324166363105178,
0.0732073113322258,
-0.07605110853910446,
-0.015041698701679707,
0.009985620155930519,
0.03408130258321762,
0.08340337872505188,
[… truncated list of embedding values (768-dimensional embeddings column) …] |